Log Management and Analytics

Dynatrace SaaS only

Dynatrace offers two pricing options for Log Management and Analytics:

  • Ingest & Process, Retain, and Query

  • Ingest & Process, and Retain with Included Queries

The unit of measure for consumed data volume is gibibytes (GiB) as described below.
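Since all three billing dimensions are metered in gibibytes, a binary unit (1 GiB = 1024³ bytes), a quick conversion sketch may help when working with raw byte counts from the billing metrics shown later in this document:

```python
# Sketch: converting a raw byte count to gibibytes (GiB), the binary
# unit used by the billing dimensions below (1 GiB = 1024^3 bytes).
def bytes_to_gib(num_bytes: int) -> float:
    """Convert a byte count to gibibytes."""
    return num_bytes / (1024 ** 3)

# Example: 5,368,709,120 bytes is exactly 5 GiB.
print(bytes_to_gib(5_368_709_120))  # 5.0
```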

Ingest & Process
  Definition: Ingested data is the amount of raw data in bytes sent to Dynatrace before enrichment and transformation.
  Unit of measure: per gibibyte (GiB)

Retain / Retain with Included Queries
  Definition: Retained data is the amount of data saved to storage after data parsing, enrichment, transformation, and filtering but before compression.
  Unit of measure: per gibibyte-day (GiB-day)

Query
  Definition: Queried data is the volume of data read during the execution of a DQL query.
  Unit of measure: per gibibyte scanned (GiB scanned)

Ingest & Process

What's included with the Ingest & Process data-usage dimension?

Concept
Explanation
Data delivery
Delivery of log data via OneAgent or Generic Log Ingestion API (via ActiveGate)
Topology enrichment
Enrichment of log events with data source and topology metadata
Data transformation
- Add, edit, or drop any log attribute
- Perform mathematical transformations on numerical values (for example, creating new attributes based on calculations of existing attributes)
- Mask sensitive data by replacing either the whole log record, one specific log record attribute, or certain text with a masked string
- Extract business, infrastructure, application, or other data from raw logs. This can be a single character, string, number, array of values, or other. Extracted data can be turned into a new attribute allowing additional querying and filtering. Metrics can be created from newly extracted attributes (see Conversion to time series below)
Data-retention control
- Filter and exclude incoming logs based on content, topology, or metadata (filtering generates usage for Ingest & Process, but not for Retain)
- Manage data retention periods of incoming logs based on data-retention rules
Conversion to time series
Create metrics from log records or attributes (note that creating custom metrics generates additional consumption as described here)

Apply the following calculation to determine your consumption for the Ingest & Process data-usage dimension:

consumption = (number of GiBs ingested) × (GiB price as per your rate card)

Data enrichment and processing can increase your data volume significantly. Depending on the source of the data, the technology, the attributes, and metadata added during processing, the total data volume after processing can increase by a factor of 2 or more.
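The formula above can be sketched in a few lines of code. The price used here is a hypothetical placeholder, not an actual Dynatrace rate; substitute the GiB price from your own rate card. Note that Ingest & Process is billed on the raw bytes ingested, so the processing expansion described above affects Retain volume rather than this dimension:

```python
# Sketch of the Ingest & Process calculation above. The rate below is a
# hypothetical placeholder -- use the GiB price from your own rate card.
GIB = 1024 ** 3

def ingest_consumption(raw_bytes_ingested: int, gib_price: float) -> float:
    """consumption = (number of GiBs ingested) x (GiB price per rate card)."""
    return (raw_bytes_ingested / GIB) * gib_price

# Example: 250 GiB of raw logs at a hypothetical $0.20 per GiB.
print(ingest_consumption(250 * GIB, 0.20))  # 50.0
```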

Dynatrace reserves the right to work with customers to adjust or disable parsing rules, processors, or pipelines that are experiencing service degradation.

Retain

Here's what's included with the Retain data-usage dimension:

Concept
Explanation
Data availability
Retained data is accessible for analysis and querying until the end of the retention period.
Retention periods
Choose a desired retention period. For log buckets, the available retention period ranges from 1 day to 10 years.

Apply the following calculation to determine your daily consumption for the Retain data-usage dimension:

consumption per day = (volume of uncompressed logs stored in GiB) × (GiB-day price for Retain on your rate card)

Query

Query data usage occurs when:

  • You execute DQL queries in Notebooks, Workflows, Custom Apps, or via the API
  • Dashboard tiles based on log data trigger DQL queries on refresh (these include sampled data)
  • You submit DQL queries by selecting ‘Run query’ (for example, in the Logs & Events viewer in simple or advanced mode, or on unified analysis pages)

What's included with the Query data-usage dimension?

Concept
Explanation
On-read parsing
Use DQL to query historical logs in storage and extract business, infrastructure, or other data across any timeframe, and use extracted data for follow-up analysis
Aggregation
Perform aggregation, summarization, or statistical analysis of data in logs across specific timeframes or time patterns (for example, data occurrences in 30-second or 10-minute intervals)
Reporting
Create reports or summaries with customized fields (columns) by adding, modifying, or dropping existing log attributes
Context
Use DQL to analyze log data in context with relevant data on the Dynatrace platform, for example, user sessions or distributed traces

Apply the following calculation to determine your consumption for the Query data-usage dimension:

consumption = (number of GiB of uncompressed data read during query execution) × (GiB scanned price as per your rate card)

Query consumption is based on the GiB of data scanned to return a result. The highest potential cost for a query is equal to the volume of logs within the query’s search range times the price on your rate card. As each scan is executed, Grail applies various proprietary optimizations to improve response time and reduce cost. In some cases, these optimizations will identify portions of data that are not relevant to the query result; the cost for scanning that data is discounted by 98%. The impact of Grail’s scan optimizations varies based on data and query attributes and may evolve over time as Dynatrace improves Grail’s query intelligence.
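The paragraph above can be illustrated with a simplified model: data that Grail's optimizations identify as irrelevant is billed at 2% of the scanned price (a 98% discount), while the rest is billed in full. This is an illustrative sketch with hypothetical numbers, not Dynatrace's actual metering logic:

```python
# Simplified model of Query billing: irrelevant data identified by Grail's
# scan optimizations is discounted by 98% (billed at 2%). Illustrative only.
GIB = 1024 ** 3

def query_consumption(relevant_gib: float,
                      optimized_away_gib: float,
                      gib_scanned_price: float) -> float:
    """Billed GiB = fully-billed relevant data + 2% of optimized-away data."""
    billed_gib = relevant_gib + 0.02 * optimized_away_gib
    return billed_gib * gib_scanned_price

# Example: a query scans 100 GiB, of which optimizations skip 60 GiB,
# at a hypothetical $0.0035 per GiB scanned.
print(round(query_consumption(40, 60, 0.0035), 4))  # 0.1442
```

The highest potential cost (no optimization at all) would be the full 100 GiB times the rate-card price, as stated above.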

Retain with Included Queries

Beginning with Dynatrace SaaS version 1.303, you can choose to subscribe to Retain with Included Queries or to the existing usage-based model with Ingest & Process, Retain, and Query.

Customers who choose Retain with Included Queries are not charged for the included queries run within the Dynatrace Platform. In any 24-hour period, customers with this pricing option are entitled to run queries with an aggregate scanned-GiB volume of up to 15 times the volume of log data retained at that time. If usage exceeds the included query volume, Dynatrace reserves the right to throttle query throughput.

Are there any limitations related to bucket management for Retain with Included Queries?

Yes, the retention period for log buckets ranges from a minimum of 10 days to a maximum of 35 days. If you need to retain your data for more than 35 days, consider switching to our usage-based model with Ingest & Process, Retain, and Query.

How do I know how much query usage I have available?

The data volume stored in a bucket configured for Retain with Included Queries defines the query volume that is included in your Retain with Included Queries consumption.

Included query usage per day = (GiB of logs retained) × 15

How do I calculate the cost of Retain with Included Queries?

Apply the following calculation to determine your daily consumption for the Retain with Included Queries data-usage dimension:

consumption per day = (volume of uncompressed logs stored in GiB) × (GiB-day price for Retain with Included Queries on your rate card)
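Combining the two formulas above, a sketch can report both the daily consumption and the included query budget (15× the retained volume per 24-hour period). The GiB-day price is a hypothetical placeholder:

```python
# Sketch combining the Retain with Included Queries formulas above:
# daily consumption plus the included query budget of 15x the retained
# volume per 24-hour period. The price is a hypothetical placeholder.
GIB = 1024 ** 3

def retain_with_queries(stored_bytes_uncompressed: int, gib_day_price: float):
    retained_gib = stored_bytes_uncompressed / GIB
    consumption_per_day = retained_gib * gib_day_price
    included_query_gib_per_day = retained_gib * 15
    return consumption_per_day, included_query_gib_per_day

# Example: 200 GiB retained at a hypothetical $0.001 per GiB-day.
cost, query_budget = retain_with_queries(200 * GIB, 0.001)
print(round(cost, 4), query_budget)  # 0.2 3000.0
```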

Consumption details

Your organization's consumption of each Dynatrace capability accrues costs towards your annual commitment as defined in your contract. Your Dynatrace Platform Subscription provides daily updates about accrued usage and related costs. You can access these details anytime via Account Management (Subscription > Overview > Cost and usage details > [select DPS capability] > Actions > View details) or the Dynatrace Platform Subscription API.

On the Capability cost and usage analysis page, select a specific environment to analyze that environment’s cost and usage for a specific capability. At the environment level, Dynatrace provides built-in metrics and/or pre-made Notebooks for each capability that you can use for detailed analysis (Actions > View details).

The table below shows the list of metrics you can use to monitor the consumption details for Log Management and Analytics. To use them in Data Explorer, enter Log Management and Analytics into the Search field. These metrics are also available via the Environment API.

Log Management and Analytics usage - Ingest & Process

Key: builtin:billing.log.ingest.usage

Dimension: Byte

Resolution: 1 hour

Description: Number of raw bytes sent to Dynatrace before enrichment and transformation in hourly intervals.

Log Management and Analytics usage - Retain

Key: builtin:billing.log.retain.usage

Dimension: Byte

Resolution: 1 hour

Description: Number of bytes saved to storage after data parsing, enrichment, transformation, and filtering but before compression.

Log Management and Analytics usage - Query

Key: builtin:billing.log.query.usage

Dimension: Byte

Resolution: 1 hour

Description: Number of bytes read during the execution of a DQL query, including sampled data.

Ingest & Process

You can monitor the total number of bytes ingested for Ingest & Process in hourly intervals for any selected timeframe using the metric Log Management and Analytics usage - Ingest & Process. The example below shows usage aggregated in 1-hour intervals between 2023-09-04 and 2023-09-11 (Last 7 days).

Log Management & Analytics (DPS)

Retain

You can monitor the total bytes stored for Retain in hourly intervals for any selected timeframe using the metric Log Management and Analytics usage - Retain. The example below shows usage aggregated in 1-hour intervals between 2023-09-04 and 2023-09-11 (Last 7 days).

Log Management & Analytics (DPS)

The following DQL query provides the hourly Retain usage by bucket:

fetch dt.system.events
| filter event.kind == "BILLING_USAGE_EVENT" and event.type == "Log Management & Analytics - Retain"
| summarize {usage.event_bucket = takeLast(usage.event_bucket), billed_bytes = takeLast(billed_bytes)}, by:{billing_period = bin(timestamp, 1h), event.id}
| fieldsAdd bytes_and_bucket = record(bucket = usage.event_bucket, billed_bytes = billed_bytes)
| summarize {`total billed_bytes` = sum(billed_bytes), `billed_bytes by bucket` = collectDistinct(bytes_and_bucket)}, by:{billing_period}
| fields billing_period, `total billed_bytes`, `billed_bytes by bucket`

The example below shows the hourly usage by bucket visualized in a nested table view:

Retain with included queries (LMA)

Query

You can monitor the total scanned bytes for Query in hourly intervals for any selected timeframe using the metric Log Management and Analytics usage - Query. The example below shows usage aggregated in 1-hour intervals between 2023-09-04 and 2023-09-11 (Last 7 days).

Log Management & Analytics (DPS)

The following DQL query provides an overview of total Log Management & Analytics – Query usage in gibibytes scanned:

fetch dt.system.events
| filter event.kind == "BILLING_USAGE_EVENT"
| filter event.type == "Log Management & Analytics - Query"
| dedup event.id
| summarize {
    data_read_GiB = sum(billed_bytes) / (1024 * 1024 * 1024)
  }, by: {
    startHour = bin(timestamp, 1d)
  }

The example below shows the daily query usage visualized in a line chart for the last 30 days:

Retain with included Queries (LMA)

Retain with Included Queries

You can monitor the total bytes stored in hourly intervals for any selected timeframe using the metric Log Management and Analytics usage - Retain. The example below shows usage aggregated in 1-hour intervals between 2024-09-15 and 2024-10-15 (Last 30 days).

Retain with included Queries (LMA)

The following DQL query provides an overview of the Log Management & Analytics – Retain with Included Queries hourly usage in gibibytes retained by bucket:

fetch dt.system.events
| filter event.kind == "BILLING_USAGE_EVENT" and event.type == "Log Management & Analytics - Retain"
| summarize {usage.event_bucket = takeLast(usage.event_bucket), billed_bytes = takeLast(billed_bytes)}, by:{billing_period = bin(timestamp, 1h), event.id}
| fields billing_period, billed_bytes, usage.event_bucket
| makeTimeseries max(billed_bytes), by:{usage.event_bucket}, time: billing_period, interval:1h

The example below shows the hourly usage visualized in a bar chart for the last 30 days:

Retain with included Queries (LMA)