Dynatrace version 1.295+
OpenPipeline is the Dynatrace solution for processing log data from various sources. It enables effortless data handling at any scale and in any format on the Dynatrace platform. Processing logs with OpenPipeline gives you a powerful way to manage, process, and analyze logs: it combines traditional log processing capabilities with the advanced data handling features of OpenPipeline to provide deeper insights into your log data.
This article is intended for administrators and app users.
In this article, you will learn to process logs for enhanced observability, including filtering, enrichment, and routing.
OpenPipeline provides several advantages, including a rich set of processors such as `fieldsAdd`, `fieldsRemove`, and more. For a complete list, see the OpenPipeline processors.
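The following is a minimal, illustrative sketch of how such processors could be combined in a DQL processor applied to matching log records. The field names (`client_ip`, `request`, `status_code`, `temp_debug_payload`) and values are placeholders, not part of any built-in rule.

```
// Hypothetical DQL processor body for a processing rule:
// parse the raw line in `content` into structured fields,
// enrich the record with a static field, and drop a field
// that should not be stored.
parse content, "IPADDR:client_ip ' ' LD:request ' ' INT:status_code"
| fieldsAdd deployment.environment = "production"
| fieldsRemove temp_debug_payload
```

Processors like these run in the Processing stage described in the table below.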
Specific fields are excluded from matching and processing, or are restricted. To learn more, see Limits specific to fields.

The stages of log processing with OpenPipeline are the following:
| Stage | Description | Processors in the stage | Executed processors | Supported data types |
| --- | --- | --- | --- | --- |
| Processing | Prepare data for analysis and storage by parsing values into fields, transforming the schema, and filtering the data records. Fields are edited, and sensitive data is masked. | | All matches | Logs, Events—Generic, Events—Davis events, Events—Davis, Events—SDLC events, Events—Security events (legacy), Security events (new)¹, Business events, Spans¹, Metrics |
| Metric extraction | Extract metrics from the records that match the query. | | All matches | Logs, Events—Generic, Events—SDLC events, Events—Security events (legacy), Security events (new)¹, Business events, System events, User events, User sessions |
| | | | All matches | Spans |
| Data extraction | Extract a new record from a pipeline and re-ingest it as a different data type into another pipeline. | | All matches | Logs, Events—Generic, Events—SDLC events, Events—Security events (legacy), Security events (new)¹, Business events, System events, Spans¹ |
| Davis | Extract a new record from a pipeline and re-ingest it as a Davis event into another pipeline. | | All matches | Logs, Events—Generic, Events—SDLC events, Events—Security events (legacy), Security events (new)¹, Business events, System events, Spans¹ |
| Cost allocation | Advanced option to assign cost center usage to specific records that match a query. Make sure to review the Cost allocation documentation when choosing the best approach for your environment. | | First match only | Logs, Events—Generic, Events—SDLC events, Events—Security events (legacy), Security events (new)¹, Business events, System events, Spans¹ |
| Product allocation | Advanced option to assign product or application usage to specific records that match a query. Make sure to review the Cost allocation documentation when choosing the best approach for your environment. | | First match only | Logs, Events—Generic, Events—SDLC events, Events—Security events (legacy), Security events (new)¹, Business events, System events, Spans¹ |
| Permissions | Apply security context to the records that match the query. | | First match only | Logs, Events—Generic, Events—Davis events, Events—Davis, Events—SDLC events, Events—Security events (legacy), Security events (new)¹, Business events, Spans¹, Metrics, User events, User sessions |
| Storage | Assign records to the best-fit bucket. | | First match only | Logs, Events—Generic, Events—Davis events, Events—Davis, Events—SDLC events, Events—Security events (legacy), Security events (new)¹, Business events, Spans¹ |
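To make the matching behavior concrete, the sketch below shows the kind of DQL matcher condition a stage can use to select records. The attribute values are placeholders and not part of any built-in pipeline.

```
// Hypothetical matcher: a stage's processor runs only on records
// that satisfy this condition. In stages marked "All matches",
// every processor whose matcher succeeds is executed; in
// "First match only" stages, execution stops after the first hit.
matchesValue(k8s.namespace.name, "payments") and loglevel == "ERROR"
```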
The data remains in its original, structured form. This is important for detailed analysis and troubleshooting, as it ensures that no information is lost or altered.
Log and business event processing pipeline conditions are included in the built-in OpenPipeline pipelines. Processing is based on the available records and doesn't take into account record enrichment from external services.
If you have defined new pipelines and your logs are routed to them by a dynamic route definition, those logs are not processed by the classic pipeline. Logs that aren't routed to any of the newly defined pipelines are processed by the classic pipeline.
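For example, a dynamic route could use a matcher such as the hypothetical condition below to send web-server access logs to a custom pipeline instead of the classic pipeline. The attribute value and the target pipeline name are placeholders for illustration.

```
// Hypothetical dynamic route condition: log records matching this
// DQL condition are routed to a custom pipeline (for example, one
// named "web-access-logs") and bypass the classic pipeline.
matchesValue(log.source, "/var/log/nginx/access.log")
```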
OpenPipeline provides built-in rules for common technologies and log formats that you can manually enable.
To process logs, you need to enable dynamic routing. To learn how to enable it, see Route data.
Follow the steps below to enable them:
Follow the steps below to create a new rule:
`content` field of the JSON.

You can review or edit any pipeline by selecting the record and making the necessary changes.
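When you define such a rule manually, the raw log line typically arrives in the `content` field of the ingested JSON record. The sketch below is a hypothetical processing snippet that parses that field and masks a sensitive value; the pattern and field names are invented for illustration.

```
// Hypothetical processing snippet: extract fields from the raw
// line carried in `content`, then overwrite the user identifier
// so that sensitive data is not stored in its original form.
parse content, "LD:prefix ' user=' LD:user_id ' ' LD:rest"
| fieldsAdd user_id = "REDACTED"
```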
If you haven't upgraded to OpenPipeline yet, if Grail is not yet supported in your cloud or region, or if you use Dynatrace version 1.293 or earlier, see Log processing.