This guide introduces the advanced Log Management and Analytics features for log configurations at scale.
Dynatrace Hub and the Dynatrace web UI support the decision-making process in selecting the proper log ingest channel. While OneAgent log ingestion is the recommended strategy for automatically ingesting contextualized logs for analysis, there might be use cases where OneAgent installation isn't possible. In such cases, log ingestion using API integration, extensions, or native syslog integration might be the best options.
Dynatrace Hub lists possible log integrations tailored to your needs and use cases, providing descriptions and links to implementation details.
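If OneAgent installation isn't possible, logs can be pushed through the generic Log Ingest API instead. The sketch below builds a minimal payload for the `/api/v2/logs/ingest` endpoint; the environment URL, token value, and attribute names are placeholder assumptions, so verify them against your tenant's API documentation before use.

```python
import json
from urllib import request

# Hypothetical environment URL and token -- replace with your own values.
ENV_URL = "https://abc12345.live.dynatrace.com"
API_TOKEN = "dt0c01.your-token"  # needs the log ingest scope

def build_log_payload(content, source, severity="info", **attributes):
    """Build a single log record for the /api/v2/logs/ingest endpoint."""
    record = {"content": content, "log.source": source, "severity": severity}
    record.update(attributes)  # custom attributes, e.g. host.name
    return [record]  # the API accepts a JSON array of records

def ingest(records):
    """POST the records to the log ingest endpoint."""
    req = request.Request(
        f"{ENV_URL}/api/v2/logs/ingest",
        data=json.dumps(records).encode("utf-8"),
        headers={
            "Authorization": f"Api-Token {API_TOKEN}",
            "Content-Type": "application/json; charset=utf-8",
        },
    )
    return request.urlopen(req)

payload = build_log_payload(
    "Payment failed: timeout", "/var/log/payments.log",
    severity="error", **{"host.name": "payments-01"},
)
```

Note that, unlike OneAgent ingestion, records sent this way carry only the context you attach yourself, so enrichment attributes such as the host name must be supplied explicitly.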
The Dynatrace platform provides apps such as Logs, Clouds, Infrastructure & Operations, and Kubernetes, each designed for specific stakeholders. Log signals are available for in-context analysis through the log component that is part of each app. If log data isn't available, there are two possible reasons: either no data exists for the given time frame, or log ingestion isn't configured yet. In the latter case, the Dynatrace web UI displays the Set up log ingestion link, which leads to the recommended method for filling the monitoring gap with logs.
Log ingestion via OneAgent uses a central configuration of log ingest rules, so accessing the host is not required once OneAgent is installed. Thousands of deployed OneAgents can be controlled from the Dynatrace tenant, with granularity at the host, host group, and tenant scope.
OneAgent is the recommended ingest integration for host, process, and Kubernetes observability, with minimal manual configuration. Logs ingested by OneAgent automatically preserve the full topology context. Additionally, trace identifiers can easily be added to logs and stored in Grail for in-context log analysis and Davis AI automated problem resolution.
See Log ingestion via OneAgent to learn more.
OneAgent automatically discovers all new log files written by important processes, saving you the effort of configuring log sources individually. If a log source isn't automatically detected, for example because it isn't part of any process or belongs to a short-lived process, you can use the central custom log sources configuration to ingest it. Custom log source configuration also supports binary content and lets you define attributes that enrich your logs.
After OneAgent discovers all relevant log files, the only required step is to create central log ingest rules in your Dynatrace tenant. The flexible matcher mechanism allows you to include or exclude log streams based on attributes like process groups, log severity, log content, log source, and more.
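To illustrate the idea behind include/exclude rules, here is a simplified matcher in Python. This is only a conceptual sketch of first-match-wins rule evaluation; it is not Dynatrace's actual rule engine, and the rule shapes and attribute keys are illustrative.

```python
# Illustrative first-match-wins rule evaluation over log attributes --
# a conceptual sketch, not Dynatrace's actual matcher implementation.
RULES = [
    {"action": "exclude", "match": {"log.source": "/var/log/debug.log"}},
    {"action": "include", "match": {"severity": "error"}},
    {"action": "include", "match": {"process.group": "payments"}},
]

def should_ingest(record):
    """Return True if the first matching rule includes the record."""
    for rule in RULES:
        if all(record.get(k) == v for k, v in rule["match"].items()):
            return rule["action"] == "include"
    return False  # unmatched records are not ingested

should_ingest({"severity": "error", "log.source": "/var/log/app.log"})  # True
```

Because the exclude rule is evaluated first, even error-severity records from the excluded source are dropped, which mirrors the general idea of combining include and exclude conditions.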
OneAgent also has a security mechanism that prevents non-log data from being detected and ingested outside your environment. The security rules configuration lets you relax these restrictions for specific use cases.
OneAgent is pre-configured to ingest log sources of monitored hosts, systems, processes, and containers. The default configuration includes topology enrichment and log sources autodiscovery to enable easy log ingest setup with minimal effort. However, to tailor the OneAgent log configuration to specific needs, Dynatrace offers advanced OneAgent log settings to customize monitoring data acquisition. For example, you can modify the log autodetection mechanism to turn off IIS log detection if this capability is not required. Additionally, essential log mechanisms like severity or timestamp detection can be adjusted to specific use cases.
See OneAgent settings for more details.
Timestamp configuration allows you to adjust timestamp detection in log records, which is helpful if you can't configure the log producer. Timestamp detection also influences log splitting: if no timestamp is detected, adjacent lines are merged into a single log record. The OneAgent splitting patterns configuration lets you define log boundaries to fine-tune the ingestion of multiline logs.
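The merging behavior can be sketched as follows: lines that begin with a recognizable timestamp start a new record, while other lines (for example, stack-trace continuations) are appended to the previous one. This is a simplified illustration of the concept, not OneAgent's actual detection logic.

```python
import re

# Lines starting with an ISO-8601-like timestamp begin a new record;
# lines without one are merged into the preceding record.
TIMESTAMP = re.compile(r"^\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}")

def split_records(lines):
    records = []
    for line in lines:
        if TIMESTAMP.match(line) or not records:
            records.append(line)
        else:
            records[-1] += "\n" + line  # e.g. a stack-trace continuation
    return records

log = [
    "2024-05-01 12:00:00 ERROR NullPointerException",
    "  at com.example.Service.run(Service.java:42)",
    "2024-05-01 12:00:01 INFO request handled",
]
# split_records(log) yields 2 records: the stack-trace line is merged
# into the first record.
```

A splitting-pattern configuration plays the same role as the regular expression above: it tells the agent where one multiline record ends and the next begins.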
Dynatrace distributed traces provide you with a combination of analysis tools to gain insight into your environment's transactions. You can combine logs, code-level visibility, topology information, and metadata at the highest level of data granularity and fidelity to get a complete observability picture.
When using Dynatrace OneAgent for automated distributed tracing, all your logs are automatically enriched with tracing metadata, allowing you to analyze log records in the context of the traces that produced them.
Span and trace identifiers are preserved during log ingestion if these attributes are already part of the log structure. Dynatrace also allows you to configure log enrichment with trace context centrally, which is useful if you can't influence log producers.
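Conceptually, enrichment just attaches trace and span identifiers as attributes on each log record. The attribute keys below are illustrative assumptions following common W3C-style conventions; check your tracer and Dynatrace configuration for the exact names expected in your environment.

```python
# Sketch: enriching an application log record with tracing metadata so
# the log can be correlated with the surrounding distributed trace.
# Attribute names here are illustrative, not authoritative.
def enrich_with_trace(record, trace_id, span_id):
    record = dict(record)  # don't mutate the caller's record
    record["trace_id"] = trace_id
    record["span_id"] = span_id
    return record

log_record = {"content": "order created", "severity": "info"}
enriched = enrich_with_trace(
    log_record,
    trace_id="4bf92f3577b34da6a3ce929d0e0e4736",
    span_id="00f067aa0ba902b7",
)
```

With OneAgent, this attachment happens automatically; the sketch only shows what the resulting record conceptually contains.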
Go to Connecting log data to traces to learn more.
Dynatrace OpenPipeline is a data-handling solution that ingests and processes logs and other observability data from different sources, at any scale, and in any format in the Dynatrace Platform, and it supports a broad range of log use cases.
Dynatrace Log Management and Analytics can reshape incoming log data for better understanding, analysis, or further processing, based on rules you create.
You can also apply access management rules and assign proper log buckets to fit retention needs. Dynatrace can store your logs for up to 10 years.
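As a simplified illustration of such reshaping, the sketch below extracts structured fields from raw access-log content. It is similar in spirit to a parsing rule in OpenPipeline, which in practice you would define in the Dynatrace UI rather than in Python.

```python
import re

# Illustrative reshaping step: extract structured fields from raw log
# content so they can be queried and analyzed individually.
LINE = re.compile(
    r"(?P<ip>\d+\.\d+\.\d+\.\d+) (?P<method>[A-Z]+) (?P<path>\S+) (?P<status>\d{3})"
)

def process(record):
    """Add parsed fields to the record when the content matches."""
    m = LINE.search(record["content"])
    if m:
        record.update(m.groupdict())
        record["status"] = int(record["status"])  # numeric for aggregation
    return record

rec = process({"content": "10.0.0.5 GET /api/orders 200"})
```

Once fields like `status` are extracted at ingest time, downstream queries can filter and aggregate on them directly instead of re-parsing raw content on every read.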
See Log processing and OpenPipeline to learn more.
Secure your sensitive data with built-in Dynatrace features. Your logs may contain sensitive data that requires masking before you can use it for analytics, observability, and security use cases. Dynatrace provides tools that allow you to meet data protection and other compliance requirements while still getting value from your logs.
By configuring relevant log processing rules, you can mask data at capture with OneAgent's sensitive data masking capabilities and add a second layer of security on the Dynatrace tenant. Choose one or both methods based on your information architecture and log setup.
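As a rough sketch of what a masking rule does, the example below redacts e-mail addresses and card-like numbers with regular expressions. The patterns are illustrative only; real compliance requirements need carefully reviewed and much broader rules.

```python
import re

# Example masking step: redact e-mail addresses and card-like numbers
# before log content is used for analytics. Patterns are illustrative.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def mask(content):
    content = EMAIL.sub("<masked-email>", content)
    content = CARD.sub("<masked-card>", content)
    return content

masked = mask("user bob@example.com paid with 4111 1111 1111 1111")
# masked == "user <masked-email> paid with <masked-card>"
```

Whether this runs at capture (OneAgent) or on the tenant, the principle is the same: the sensitive value is replaced before it becomes queryable.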
You can also create a sensitive data scanner based on Dynatrace Workflows to perform regular checks for stored data against corporate and compliance rules.
See Methods of masking sensitive data to learn more.
Logs are stored in buckets. The best practice is to create buckets based on retention periods for easier control over how long your data is stored in Grail. Grail contains built-in buckets with default retention periods; the default built-in bucket intended for log data is default_logs, with a retention period of 35 days. Custom log buckets give you additional control over your log data.
Buckets can improve query performance by reducing query execution time and the scope of data read. You can create up to 80 log buckets with retention of up to 10 years by applying configurations in Storage Management or via the API.
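As a sketch, a custom log bucket definition might look like the following. The field names and the exact maximum retention are assumptions based on the public storage-management API, so verify them against your environment's API reference before use.

```python
# Sketch of a custom log bucket definition for Grail storage management.
# Field names and limits are assumptions -- verify against the API docs.
def bucket_definition(name, retention_days, display_name=None):
    if not 1 <= retention_days <= 3660:  # roughly 10 years
        raise ValueError("retention must be between 1 day and ~10 years")
    return {
        "bucketName": name,
        "table": "logs",                 # the bucket holds log data
        "retentionDays": retention_days,
        "displayName": display_name or name,
    }

# A long-retention bucket for audit logs, e.g. POSTed as JSON to the
# storage-management bucket-definitions endpoint of your environment.
audit_bucket = bucket_definition("logs_audit", 3650, "Audit logs (10y)")
```

Defining one bucket per retention class (for example, 35 days for operational logs, 10 years for audit logs) keeps retention control simple and lets queries target only the bucket they need.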
See Bucket assignment to learn more.
Log management in Dynatrace includes mechanisms for assigning proper access rights to users. The Dynatrace permission model is configured in Account Management by accessing Identity and Access Management (IAM), where you can create policies and assign them to roles in your organization.
Permissions can be assigned at the bucket, table, field, and entity levels. Without permissions, your users can't fetch data from a bucket or table.
Dynatrace allows you to fine-tune your ingested log data by adding a dt.security_context attribute to specific log records. This lets you set additional options, such as permissions for individual records.
See Log security context and Permissions in Grail to learn more.
Access the built-in Log ingest overview dashboard for a high-level overview of ingested log volumes, top log producers, and log processing status.
Additionally, the Log ingest health section helps you determine whether there are any log ingest issues and what remediation options you have.
The Log ingest overview dashboard is distributed automatically with Logs.
The default dashboard is available in your environment and receives improvements over time. If you need to customize it, we recommend making a copy.