Log processing with OpenPipeline

Dynatrace version 1.295+

OpenPipeline is the Dynatrace solution for processing log data from various sources. It enables data handling at any scale and in any format on the Dynatrace platform. Processing logs with OpenPipeline combines traditional log processing capabilities with OpenPipeline's advanced data handling features, giving you a single place to manage, process, and analyze logs and gain deeper insights into your log data.

OpenPipeline provides the following advantages:

  • Contextual data transformation: OpenPipeline extracts data with context and transforms it into more efficient formats, for example, converting logs to business events.
  • Unified processing language: DQL (Dynatrace Query Language) is used as a processing language, offering one syntax for all Dynatrace features and more advanced options for processing.
  • Pipeline concepts: Log ingest traffic can be split into different pipelines with dedicated processing, data and metric extraction, permissions, and storage.
  • Additional processors: You can use additional processors such as fieldsAdd, fieldsRemove, and more. For a complete list, see the OpenPipeline processors.
  • Enhanced data extraction: Extract business events from logs with more data extraction options.
  • Increased limits: Benefit from increased default limits, including content size up to 524,288 bytes, attribute size up to 2,500 bytes, and up to 250 log attributes.
  • Improved performance and higher throughput.
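For example, a single DQL processor can chain several of these commands in one processing rule. The following is a minimal sketch; the field names and values are illustrative assumptions, not taken from this document:

```
fieldsAdd environment = "production"
| fieldsRemove debug_payload
```

The same result could be achieved with separate Add fields and Remove fields processors; using DQL keeps related transformations in one rule.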


The stages of log processing with OpenPipeline are the following:

| Stage | Description | Processors in the stage | Executed processors | Supported data types |
|---|---|---|---|---|
| Processing | Prepare data for analysis and storage by parsing values into fields, transforming the schema, and filtering the data records. Fields are edited, and sensitive data is masked. | DQL, Add fields, Remove fields, Rename fields, Drop record | All matches | Logs, Events, Business events, Metrics¹ |
| Metric extraction | Extract metrics from the records that match the query. | Counter metric, Value metric | All matches | Logs, Events, Business events, System events |
| Data extraction | Extract and resend log data into another pipeline. | Davis event, Business event | All matches | Logs, Events, Business events, System events |
| Permissions | Apply security context to the records that match the query. | Set dt.security_context | First match only | Logs, Events, Business events, Metrics¹ |
| Storage | Assign records to the best-fit bucket. | Bucket assignment, No storage assignment | First match only | Logs, Events, Business events, Spans |

¹ Specific metric fields are excluded from matching and processing. To learn more, see OpenPipeline limits.

Log and business event processing pipeline conditions are included in the built-in OpenPipeline pipelines. Processing is based on available records, and doesn't take into account record enrichment from external services.

If you have defined any new pipelines and your logs are routed to them by a dynamic route definition, those logs will not be processed by the classic pipeline. Logs that aren't routed to any of the newly defined pipelines are processed by the classic pipeline.
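A dynamic route matches records with a DQL condition. As a hedged sketch (the field value and wildcard pattern below are assumptions for illustration), a route that sends nginx logs to a dedicated pipeline might use a matching condition such as:

```
matchesValue(log.source, "/var/log/nginx/*")
```

Records matching this condition are diverted to the route's target pipeline; all remaining records fall through to the classic pipeline.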

Check access to OpenPipeline

To access the OpenPipeline application, in the tenant, go to Apps > OpenPipeline app (new).


If you don't have access to OpenPipeline, the following message is displayed when you open the app: OpenPipeline isn't enabled. To enable OpenPipeline, all rules in your log pipeline must have matching conditions in DQL. Rules with matching conditions in LQL for processing, events extraction, or metrics extraction are not allowed in OpenPipeline. See the LQL to DQL migration guide.

Enable built-in rules for OpenPipeline

OpenPipeline provides built-in rules for common technologies and log formats that you can manually enable.

Follow the steps below to enable them:

  1. Go to Apps > OpenPipeline app (new).
  2. Select the Pipelines tab, and select Add Pipeline to add a new record.
  3. Input a title for the pipeline.
  4. Select Add Processor in the Processing tab, and choose Technology bundle.
  5. Choose the technology for which you want to enable an OpenPipeline built-in rule.
  6. Provide a fragment of the sample log manually in the Paste a log / JSON sample text box. Make sure it's in JSON format. Any textual log data should be inserted into the content field of the JSON.
  7. Select Run sample data to test it, and view the result.
  8. Select Save.
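The sample you paste in step 6 must be valid JSON, with any textual log line placed in the content field. A minimal hedged example (all field values here are invented for illustration):

```json
{
  "content": "2024-05-01 12:00:00 ERROR Connection to database failed",
  "log.source": "/var/log/app/app.log",
  "loglevel": "ERROR"
}
```

Running the sample shows how the selected technology bundle would parse this record before you save the pipeline.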

Add rule

Follow the steps below to create a new rule:

  1. Go to Apps > OpenPipeline app (new).
  2. Select the Pipelines tab, and select Add Pipeline to add a new record.
  3. Input a title for the pipeline.
  4. Select one of the tabs representing the stages of log processing: Processing, Metric extraction, Data extraction, Permissions, or Storage.
  5. Select Add Processor and choose from the available processors.
  6. Choose the technology for which you want to enable an OpenPipeline rule and provide the processing rule definition. The processing rule definition is a log processing instruction about how Dynatrace should transform or modify your log data.
  7. Test the rule definition by providing a fragment of the sample log manually in the Paste a log / JSON sample text box. Make sure it's in JSON format. Any textual log data should be inserted into the content field of the JSON.
  8. Select Run sample data to test the JSON sample, and view the result.
  9. Select Save.
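As an illustration of a processing rule definition (step 6), a DQL processor could extract values from the content field with the parse command. This is a sketch; the log layout and field names are assumptions, not taken from this document:

```
parse content, "LD 'userId=' INT:user_id LD 'status=' WORD:http_status"
```

Testing this against a sample such as "request userId=42 status=OK" in step 7 would show the extracted user_id and http_status fields in the result.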

You can review or edit any pipeline by selecting the record and making the necessary changes.

If you haven't upgraded to OpenPipeline yet, if Grail is not yet supported in your cloud or region, or if you use Dynatrace version 1.293 or earlier, see Log processing.