Dynatrace version 1.295+
OpenPipeline is the Dynatrace solution for processing log data from various sources. It enables data handling at any scale and in any format on the Dynatrace platform. Using OpenPipeline to process logs in Dynatrace combines traditional log processing capabilities with OpenPipeline's advanced data handling features, giving you deeper insight into your log data.
OpenPipeline provides several advantages, including processors such as `fieldsAdd`, `fieldsRemove`, and more. For a complete list, see the OpenPipeline processors.

The stages of log processing with OpenPipeline are the following:
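To illustrate what processing-stage processors do, the sketch below adds a static attribute and drops a field that is no longer needed. The field names are hypothetical, and the sketch assumes a matcher has already selected the records:

```dql
// Hypothetical processing-stage processors:
// fieldsAdd attaches a static attribute to every matched record,
// fieldsRemove drops a field that is no longer needed.
fieldsAdd app.owner = "team-checkout"
| fieldsRemove raw.payload
```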
| Stage | Description | Processors executed in the stage | Supported data types |
|---|---|---|---|
| Processing | Prepare data for analysis and storage by parsing values into fields, transforming the schema, and filtering the data records. Fields are edited, and sensitive data is masked. | All matches | Logs, Events, Business events, Metrics¹ |
| Metric extraction | Extract metrics from the records that match the query. | All matches | Logs, Events, Business events, System events |
| Data extraction | Extract and resend log data into another pipeline. | All matches | Logs, Events, Business events, System events |
| Permissions | Apply security context to the records that match the query. | First match only | Logs, Events, Business events, Metrics¹ |
| Storage | Assign records to the best-fit bucket. | First match only | Logs, Events, Business events, Spans |

¹ Specific metric fields are excluded from matching and processing. To learn more, see OpenPipeline limits.
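To illustrate what the metric extraction stage accomplishes, the ad-hoc DQL query below counts error records — the kind of value a pipeline could instead extract continuously as a metric. This is an illustrative equivalent, not the stage's actual configuration:

```dql
// Ad-hoc equivalent of a metric-extraction rule:
// count log records whose loglevel is ERROR.
fetch logs
| filter matchesValue(loglevel, "ERROR")
| summarize error.count = count()
```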
Log and business event processing pipeline conditions are included in the built-in OpenPipeline pipelines. Processing is based on available records, and doesn't take into account record enrichment from external services.
If you have defined any new pipelines and your logs are routed to them by the dynamic route definition, they will not be processed by the classic pipeline. If logs aren't routed to any of the newly defined pipelines, they will be processed by the classic pipeline.
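A dynamic route diverts records to a pipeline based on a DQL matching condition. A minimal sketch, with a hypothetical log source path:

```dql
// Records whose log.source matches this condition are routed to the
// custom pipeline and are not processed by the classic pipeline.
matchesValue(log.source, "/var/log/nginx/access.log")
```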
To access the OpenPipeline application, in your tenant go to Apps > OpenPipeline.
If you don't have access to OpenPipeline, the following message is displayed when you open the app: OpenPipeline isn't enabled.
To enable OpenPipeline, all rules in your log processing pipeline must have matching conditions defined in DQL. Rules with matching conditions in LQL for processing, event extraction, or metric extraction are not allowed in OpenPipeline. See the LQL to DQL migration guide.
The OpenPipeline processing examples on the Dynatrace platform provide guidance for processing logs for enhanced observability, including filtering, enrichment, and routing. Each scenario includes detailed steps, configurations, and examples. For full instructions, see OpenPipeline processing examples and Parse log lines and extract a metric.
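As a sketch of the parse-and-extract pattern — the log format and field name here are assumptions, not taken from the linked examples — a parse processor can pull a numeric value out of the log line so that a later metric-extraction rule can read it:

```dql
// Hypothetical log line: "GET /api/users 200 durationMs=41"
// The DPL pattern skips the leading text (LD), then captures the
// integer after "durationMs=" into a field named duration.
parse content, "LD 'durationMs=' INT:duration"
```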
OpenPipeline provides built-in rules for common technologies and log formats that you can enable manually. These rules parse values from the `content` field of the JSON.

You can also create a new rule; like the built-in rules, it operates on the `content` field of the JSON.

You can review or edit any pipeline by selecting the record and making the necessary changes.
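For JSON payloads, a custom rule might first parse the `content` field and then flatten the resulting object into top-level fields. A minimal sketch, with assumed field names:

```dql
// Parse the JSON payload held in the content field, then promote
// its keys (e.g. payload.level, payload.message) to fields.
parse content, "JSON:payload"
| fieldsFlatten payload
```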
If you haven't upgraded to OpenPipeline yet, if Grail is not yet supported in your cloud or region, or if you use Dynatrace version 1.293 or earlier, see Log processing.