Observe real-time log data with DQL

In this use case, you want to observe mission-critical information over time in your logs, which are sent to Dynatrace through the log ingest API.

  1. Send your log data to the generic log ingestion API.

    The Log Monitoring API - POST ingest logs method lets you stream log records to Dynatrace. The ingestion endpoint, located on your ActiveGate, tries to automatically transform any log data based on the API schema. For a minimal ingestion sketch, see the first example after this list.

  2. optional Use a log shipper together with generic ingestion.

    You can use the generic ingestion API, together with a common log shipper like Fluentd or Logstash, to stream logs to Dynatrace.

  3. optional Forward logs from cloud environments.

    Use the generic ingestion API to stream logs from cloud providers to Dynatrace.

  4. Parse out attributes from raw logs with Log Processing.

    Log Processing lets you manipulate all incoming log data during ingestion: extract numeric or other attributes from raw log content, perform mathematical operations, or drop, add, or mask attributes before the log record is persisted.

  5. Analyze log data in the Logs and events viewer.

    Go to Logs or Logs & Events (latest Dynatrace) to see the ingested log data. In Simple mode, select filters such as log source, severity level, or topology entity. In Advanced mode, craft DQL queries to extract any information from historical logs or build aggregations and statistics. For a sketch of such a query, see the second example after this list.

  6. Pin DQL query results to a dashboard.

    You can reuse the DQL query in your workflows by pinning the result to a dashboard, either as a table of records or as a visualization such as a bar chart.

  7. optional Use log-based metrics for observability.

    As an alternative to pinning a DQL query, you can create log-based metrics and use them in dashboards and alerting like any other metric. For a sketch that reads such a metric back over the API, see the last example after this list.
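
A minimal sketch of the generic ingestion call from step 1, assuming a SaaS environment URL and an API token with the logs.ingest scope. The placeholders, the sample log line, and the extra attributes (log.source, severity, service.name) are assumptions for illustration; point the request at your ActiveGate address instead if that is where your ingestion endpoint lives.

```python
import requests

# Assumptions: replace with your environment URL (or ActiveGate address)
# and an API token that has the logs.ingest scope.
DYNATRACE_URL = "https://{your-environment-id}.live.dynatrace.com"
API_TOKEN = "dt0c01.your-token"

# A small batch of log records for the generic log ingest endpoint.
# "content" carries the raw log line; the remaining attributes are
# optional and purely illustrative.
records = [
    {
        "content": "2024-05-01T12:00:00Z ERROR payment failed for order 1234",
        "log.source": "/var/log/myapp/app.log",
        "severity": "error",
        "service.name": "payment-service",
    }
]

response = requests.post(
    f"{DYNATRACE_URL}/api/v2/logs/ingest",
    headers={
        "Authorization": f"Api-Token {API_TOKEN}",
        "Content-Type": "application/json; charset=utf-8",
    },
    json=records,
    timeout=10,
)
response.raise_for_status()
# 204 means the records were accepted; other 2xx responses carry details
# about records that could not be ingested.
print(response.status_code)
```

A log shipper from step 2 typically does the same thing at scale: its HTTP output batches records and posts them to this ingest endpoint.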
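The DQL query is the reusable piece of step 5. The sketch below counts error records per log source over the last two hours and submits the query through the platform Query API; the endpoint path, request body fields, token scopes, and the loglevel and log.source field values are assumptions to verify against your environment.

```python
import requests

# Assumptions: a Dynatrace platform URL and an OAuth bearer (or platform)
# token with permission to read log storage.
DYNATRACE_APPS_URL = "https://{your-environment-id}.apps.dynatrace.com"
BEARER_TOKEN = "your-oauth-or-platform-token"

# Count ERROR records per 15-minute bucket and log source, last 2 hours.
dql = """
fetch logs, from: now() - 2h
| filter loglevel == "ERROR"
| summarize errors = count(), by: { bin(timestamp, 15m), log.source }
| sort errors desc
"""

response = requests.post(
    f"{DYNATRACE_APPS_URL}/platform/storage/query/v1/query:execute",
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    json={"query": dql},
    timeout=30,
)
response.raise_for_status()
payload = response.json()

# Short queries return results directly; long-running queries instead
# return a request token to poll for the result.
for record in payload.get("result", {}).get("records", []):
    print(record)
```

The same query text can be pasted into Advanced mode in the Logs and events viewer, and its result can be pinned to a dashboard as described in step 6.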
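For step 7, a log-based metric behaves like any other metric once it exists, so it can be read back with the Metrics API v2. In this sketch the metric key log.payment_errors is hypothetical; substitute the key of the log metric you actually configured and use a token with the metrics.read scope.

```python
import requests

# Assumptions: environment URL and an API token with the metrics.read scope.
DYNATRACE_URL = "https://{your-environment-id}.live.dynatrace.com"
API_TOKEN = "dt0c01.your-token"

params = {
    # "log.payment_errors" is a hypothetical log-based metric key.
    "metricSelector": "log.payment_errors:splitBy()",
    "from": "now-2h",
    "resolution": "15m",
}

response = requests.get(
    f"{DYNATRACE_URL}/api/v2/metrics/query",
    headers={"Authorization": f"Api-Token {API_TOKEN}"},
    params=params,
    timeout=10,
)
response.raise_for_status()

# Each result entry holds the metric ID and its data points
# (timestamps and values per dimension tuple).
for series in response.json().get("result", []):
    print(series["metricId"], series.get("data", []))
```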