Set up Davis alerts based on events

  • Tutorial
  • 4min

Ingested logs can be triggers for opening new Davis problems.

With Davis events based on logs, you get an immediate alert as soon as a log record matching your definition is ingested.

Follow this guide to learn more about extracting events from logs.

If you need to set thresholds for your alerts, you should follow the instructions in Set up Davis alerts based on metrics.

Prerequisites


Steps

In this example, we open a new Davis problem whenever a log record containing a specific phrase is ingested.

You can find the matching log records by opening Logs and running the following DQL query.

fetch logs
| filter matchesPhrase(content, "Dropping data because sending_queue is full")
| sort timestamp desc

Log results

If your DQL query uses parse, fieldsAdd, or other transformations, add a processing rule so those fields are set on ingest; the matcher can only reference fields that already exist on the incoming record.
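For instance, a query like the following extracts a numeric field with parse at query time. A Davis event matcher can't run the parse step itself, so the field would first have to be added by a processing rule. The queue_size pattern below is purely illustrative, not part of the original example.

```
fetch logs
| parse content, "LD 'queue_size=' INT:queue_size"
| filter queue_size > 5000
```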

  1. Add Davis event data extraction configuration in OpenPipeline.

1. Open Settings > Process and contextualize > OpenPipeline > Logs and select the Pipelines tab.
    2. Find the pipeline you want to modify, or add a new pipeline.
    3. Select > Edit. The pipeline configuration page appears.
    4. Select the Data extraction tab and add a Davis event processor.
  2. Set the DQL matcher. A matcher defines the condition under which the event is extracted, expressed as filtering conditions in a single DQL statement.

    In Matching condition, use the matcher as shown below.

    matchesPhrase(content, "Dropping data because sending_queue is full")

    If you use segments or your permissions are set at the record level, you should include those conditions in the matcher.

    In some situations, a matcher can't easily be derived from a DQL statement. In these cases, you can instead create log alerts based on a log event or a summary of log data.
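    As a sketch, a matcher that additionally restricts the alert to a single Kubernetes namespace could combine conditions like the following; the k8s.namespace.name field and its value are a hypothetical illustration, not part of the original example.

```
matchesPhrase(content, "Dropping data because sending_queue is full")
and k8s.namespace.name == "production"
```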

  3. Set event properties.

    Event properties are metadata that your event will contain when it is triggered. You can remap any field from the log record.

    In our example, we remap the dt.source_entity field so that alerts are connected to entities for Davis root cause analysis.

    In Event template, set the following key/value pairs.

    • Set event.type to CUSTOM_ALERT.
    • Set event.description to Dropping data because sending_queue is full. Try increasing queue_size.
    • Set event.name to OpenTelemetry exporter failure.
    • Set dt.source_entity to {dt.source_entity}.

    Pipeline data extraction

When the first Davis event is extracted, a new problem is opened. If no new events arrive within the timeout period defined by dt.davis.event_timeout, the problem is closed automatically.

The default timeout is 15 minutes.
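If the default doesn't fit your use case, the timeout can be adjusted per event by adding dt.davis.event_timeout to the event template alongside the other properties. The value below (30, assumed to be interpreted in minutes) is only an illustration:

```
dt.davis.event_timeout: 30
```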

Problem in Problems app
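Once the processor is active, you can verify that events are being extracted by querying events with DQL. The filter below assumes the event.name set in the event template; adjust it if you used a different name.

```
fetch events
| filter event.name == "OpenTelemetry exporter failure"
| sort timestamp desc
```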

Conclusion

Extracting Davis events from logs is ideal for simple alerting when thresholds are not important.

  • It provides immediate, real-time alerting.
  • No additional observation of matching data over time is required.

Once you're extracting events, you can use these to trigger automations using simple workflows as described in Create a simple workflow in Dynatrace Workflows.

Further reading

More information about event properties is available at:

Related tags
Log Analytics