OpenPipeline offers pre-defined parsers to structure technology-specific logs according to the Dynatrace Semantic Dictionary. The parser library supports a broad range of technologies—including well-known data formats, popular third-party services, and cloud providers—for example, AWS Lambda, Python, Cassandra, and Apache Tomcat.
This article is intended for administrators and app users.
In this article, you will learn to structure technology-specific logs in OpenPipeline and analyze them in Notebooks.
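For example, once syslog records have been structured by the pre-defined syslog parser, the extracted fields can be queried directly with DQL. The following is a minimal preview; the field names (such as syslog.appname and status) and the source value are the ones used in the queries later in this article:

// Preview the fields extracted from syslog records (assumed source value)
fetch logs
| filter dt.openpipeline.source == "extension:syslog"
| fields timestamp, status, syslog.appname, content
| limit 10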
You successfully configured a new pipeline, Syslog - Pipeline, with a processor that structures syslog logs according to pre-defined rules matching the Dynatrace Semantic Dictionary. The new pipeline is in the pipeline list.
Go to OpenPipeline > Logs > Dynamic routing.
To create a new route, select Dynamic route and enter the following:
Name: Syslog
Pipeline: Syslog - Pipeline
Select Add.
Make sure to place the new route in the correct position on the list. Routes are evaluated from top to bottom. Data is dynamically routed into a pipeline according to the first applicable matching condition. Routed data is not evaluated against any subsequent conditions.
Select Save.
You successfully configured a new route. All syslog logs are routed to the pipeline for processing. The new route is in the route list.
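For reference, a dynamic route's matching condition is a DQL condition evaluated against each incoming record. A minimal sketch, assuming syslog records can be identified by the dt.openpipeline.source field used in the Notebooks queries below, could look like this:

matchesValue(dt.openpipeline.source, "extension:syslog")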
Once logs are processed by the technology bundle, several attributes are extracted from the log content into new fields that match the Dynatrace Semantic Dictionary. You can easily filter logs by status, application, or technology-specific attributes, as shown in the examples below.
Go to Notebooks and open a new or existing notebook.
Select DQL and enter one of the following queries:
Fetch syslog warn logs
fetch logs
| filter dt.openpipeline.source == "extension:syslog"
| filter status == "WARN"
| sort timestamp desc
Group syslog logs by application
fetch logs
| filter dt.openpipeline.source == "extension:syslog" and isNotNull(syslog.appname)
| summarize totalCount = count(), by: {syslog.appname}
| sort totalCount desc
Sort applications by the percentage of syslog error logs
fetch logs
| filter dt.openpipeline.source == "extension:syslog" and isNotNull(syslog.appname)
| summarize TotalCount = count(), Count = countIf(status == "ERROR"), by: {syslog.appname}
| fieldsAdd Percentage = (Count * 100 / TotalCount)
| sort Percentage desc
| fieldsRemove TotalCount
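If you want to scope a query to a specific timeframe directly in DQL instead of relying on the notebook's timeframe selector, a minimal variant of the error query for the last 24 hours could look like this (same source filter and field names as above):

// Hypothetical variant: error counts per application over the last 24 hours
fetch logs, from:now() - 24h
| filter dt.openpipeline.source == "extension:syslog" and isNotNull(syslog.appname)
| summarize errorCount = countIf(status == "ERROR"), by: {syslog.appname}
| sort errorCount desc
| limit 10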
You successfully structured syslog logs according to pre-defined processing rules in OpenPipeline. Incoming records that match the routing conditions are routed to the syslog pipeline, where new attributes specific to the syslog technology are extracted. The new attributes match the Dynatrace Semantic Dictionary, allowing for smooth analysis. You can filter syslog logs in Notebooks and get the most out of your structured logs.