Log Management and Analytics

Dynatrace SaaS only

The consumption model for Log Management and Analytics is based on three dimensions of data usage (Ingest & Process, Retain, and Query). The unit of measure for consumed data volume is gibibytes (GiB) as further described below.
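Because billing is in gibibytes (a binary unit, 2^30 bytes) rather than decimal gigabytes, raw byte counts need the right divisor when you estimate volumes. A minimal sketch (the helper function is ours for illustration, not a Dynatrace API):

```python
# Sketch: Dynatrace bills in gibibytes (GiB, binary prefix), so conversion
# from raw bytes uses 2**30, not the decimal 10**9 of a gigabyte (GB).
def bytes_to_gib(n_bytes: float) -> float:
    """Convert a raw byte count to gibibytes (1 GiB = 2**30 bytes)."""
    return n_bytes / 2**30

# 1 TB (decimal) of logs is roughly 931.3 GiB when measured in binary units:
tb_in_gib = bytes_to_gib(10**12)
```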

Ingest & Process

  • Definition: Ingested data is the amount of raw data in bytes (logs and events) sent to Dynatrace before enrichment and transformation.
  • Unit of measure: per gibibyte (GiB)

Retain

  • Definition: Retained data is the amount of data saved to storage after data parsing, enrichment, transformation, and filtering but before compression.
  • Unit of measure: per gibibyte-day (GiB-day)

Query

  • Definition: Queried data is the data read during the execution of a DQL query, including sampled data.
  • Unit of measure: per gibibyte scanned (GiB scanned)

Ingest & Process

What's included with the Ingest & Process data-usage dimension?


Data delivery

  • Delivery of log data via OneAgent or Generic Log Ingestion API (via ActiveGate)

Topology enrichment

  • Enrichment of log events with data source and topology metadata

Data transformation

  • Add, edit, or drop any log attribute
  • Perform mathematical transformations on numerical values (for example, creating new attributes based on calculations of existing attributes)
  • Extract business, infrastructure, application, or other data from raw logs. Extracted data can be a single character, a string, a number, an array of values, or another type, and can be turned into a new attribute for additional querying and filtering. Metrics can be created from newly extracted attributes (see Conversion to time series below), or extracted attributes can be queried for ad hoc analysis
  • Mask sensitive data by replacing either the whole log record, one specific log record attribute, or certain text with a masked string

Data-retention control

  • Filter incoming logs based on content, topology, or metadata to reduce noise. Log filtering generates usage for Ingest & Process, but not for Retain.
  • Manage data retention periods of incoming logs based on data-retention rules

Conversion to time series

  • Create metrics from log records or attributes (note that creating custom metrics generates additional consumption beyond what is consumed for ingestion and processing)

Apply the following calculation to determine your consumption for the Ingest & Process data-usage dimension:
(number of GiBs ingested) × (GiB price as per your rate card) = consumption in your local currency
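The formula is a straight multiplication. A minimal sketch, using a made-up placeholder price (substitute the GiB rate from your own rate card):

```python
# Sketch of the Ingest & Process calculation. The price is a hypothetical
# placeholder, not a real Dynatrace rate.
def ingest_process_consumption(gib_ingested: float, price_per_gib: float) -> float:
    return gib_ingested * price_per_gib

# 15,000 GiB ingested in a month at an assumed 0.10 (local currency) per GiB:
monthly_cost = ingest_process_consumption(15_000, 0.10)  # ≈ 1500.0
```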

Be aware that data enrichment and processing can increase your data volume significantly. Depending on the source of the data, the technology, the attributes, and metadata added during processing, the total data volume after processing can increase by a factor of 2 or more.

Retain

Here's what's included with the Retain data-usage dimension:


Data availability

  • Retained data is accessible for analysis and querying until the end of the retention period.

Retention periods

  • Choose a desired retention period. For log buckets, the available retention period ranges from 1 day to 10 years, with an additional week.

Apply the following calculation to determine your consumption for the Retain data-usage dimension:
(number of GiBs added to storage per day) × (retention period in days) × (GiB-day price as per your rate card) × (number of days that data is stored) = consumption in your local currency

  • Retention period in days is based on the retention period of the storage bucket under analysis (for example, 35 days if you're analyzing the default_logs bucket).

  • Number of days data is stored reflects the period during which the data is stored. (For example, 30 days if you're analyzing the monthly cost, or 365 days for a full year.)
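Putting the formula and its two day-count terms together, a minimal sketch (the GiB-day price is a made-up placeholder, not an actual rate):

```python
# Sketch of the Retain calculation; prices are hypothetical placeholders.
def retain_consumption(gib_added_per_day: float, retention_days: int,
                       price_per_gib_day: float, days_stored: int) -> float:
    # Steady-state storage = daily additions × retention window; it is then
    # billed per GiB-day over the period under analysis (e.g. 30 days).
    return gib_added_per_day * retention_days * price_per_gib_day * days_stored

# 900 GiB/day, 35-day retention, assumed 0.001 per GiB-day, over a 30-day month:
monthly_cost = retain_consumption(900, 35, 0.001, 30)  # ≈ 945.0
```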

Query

Query data usage occurs when:

  • Accessing the Logs & Events viewer in simple mode with filters.
  • Submitting custom DQL queries in the Logs & Events viewer in advanced mode.
  • Viewing the log data of a specific entity on a unified analysis page.
  • Refreshing dashboard tiles that are based on log data (each refresh triggers DQL query execution and includes sampled data).
  • Executing DQL queries in Notebooks, Workflows, Custom Apps, and via API.

What's included with the Query data-usage dimension?


On-read parsing

  • Use DQL to query historical logs in storage and extract business, infrastructure, or other data across any timeframe, and use extracted data for follow-up analysis.
  • No upfront indexes or schema required for on-read parsing

Aggregation

  • Perform aggregation, summarization, or statistical analysis of log data across specific timeframes or time patterns (for example, data occurrences in 30-second or 10-minute intervals) using mathematical or logical functions.

Reporting

  • Create reports or summaries with customized fields (columns) by adding, modifying, or dropping existing log attributes.

Context

  • Use DQL to analyze log data in context with relevant data on the Dynatrace platform, for example, user sessions or distributed traces.

Apply the following calculation to determine your consumption for the Query data-usage dimension:
(number of GiB of uncompressed data read during query execution) × (GiB scanned price as per your rate card) = consumption in your local currency
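As with ingestion, the Query formula is a single multiplication. A minimal sketch with a placeholder price (substitute the GiB-scanned rate from your rate card):

```python
# Sketch of the Query calculation; the price is a hypothetical placeholder.
def query_consumption(gib_scanned: float, price_per_gib_scanned: float) -> float:
    return gib_scanned * price_per_gib_scanned

# 750,000 GiB scanned in a month at an assumed 0.002 per GiB scanned:
monthly_cost = query_consumption(750_000, 0.002)  # ≈ 1500.0
```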

Consumption examples

The following example calculations show how each data-usage dimension contributes to overall usage and consumption.

Step 1 – Ingest & Process

For example, say that you produce 500 GiB of log data per day which you ingest into Log Management and Analytics for processing. The monthly consumption for Ingest & Process is calculated as follows:

  • Ingest volume per day: 500 GiB
  • Ingest volume per month: 15,000 GiB = 500 (GiB per day) × 30 (days)
  • Consumption per month: 15,000 (GiB per month) × ingest price as per your rate card
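The Step 1 arithmetic can be checked in a couple of lines:

```python
# Step 1: monthly ingest volume from the figures above.
daily_ingest_gib = 500
monthly_ingest_gib = daily_ingest_gib * 30  # 15,000 GiB
```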

Step 2 – Retain

Following the Ingest & Process step, your data is retained and enriched on an ongoing basis. If you ingested 500 GiB of raw data in step 1, then 900 GiB of enriched data (500 GiB × 1.8 for enrichment) is added to your storage daily. In this example, your enriched data is retained for 35 days. The monthly consumption (after a ramp-up period of 35 days) for Retain is calculated as follows:

  • Retained volume for 1 day: 900 GiB = 500 (GiB per day) × 1.8 (enrichment)
  • Retained volume for 35 days: 31,500 GiB = 900 (GiB per day) × 35 (days)
  • Consumption per day: 31,500 (GiB) × retain price per day as per your rate card
  • Consumption per month: 31,500 (GiB) × retain price per day as per your rate card × 30 (days)
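The 35-day retention arithmetic, checked in code:

```python
# Step 2 (35-day retention): steady-state storage volume.
daily_retained_gib = 500 * 1.8              # 900.0 GiB added to storage per day
steady_state_gib = daily_retained_gib * 35  # 31,500 GiB held once ramped up
```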

If the same amount of processed data is to be retained for a year, the monthly consumption (after a ramp-up of 365 days in this case) for Retain is calculated as follows:

  • Retained volume for 1 day: 900 GiB = 500 (GiB per day) × 1.8 (enrichment)
  • Retained volume for 365 days: 328,500 GiB = 900 (GiB per day) × 365 (days)
  • Consumption per day: 328,500 (GiB) × retain price per day as per your rate card
  • Consumption per month: 328,500 (GiB) × retain price per day as per your rate card × 30 (days)
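The same calculation with the longer retention window shows how the steady-state volume scales linearly with the retention period:

```python
# Step 2 variant (365-day retention): same daily additions, larger window.
daily_retained_gib = 500 * 1.8               # 900.0 GiB per day
steady_state_gib = daily_retained_gib * 365  # 328,500 GiB held once ramped up
```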

Step 3 – Query

Let's assume that, to resolve incidents and analyze performance issues, your team executes DQL queries that read a total of 25,000 GiB of data per day. The monthly consumption for Query is calculated as follows:

  • Data volume read per day: 25,000 GiB
  • Data volume read per month: 750,000 GiB = 25,000 (GiB per day) × 30 (days)
  • Consumption per month: 750,000 (GiB per month) × query price as per your rate card
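The Step 3 arithmetic in code:

```python
# Step 3: monthly query volume from the figures above.
daily_read_gib = 25_000
monthly_read_gib = daily_read_gib * 30  # 750,000 GiB scanned per month
```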

Step 4 – Total consumption

The total monthly consumption for this example scenario of 35 days of data retention is the sum of the monthly consumption for Ingest & Process, Retain, and Query.
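Combining the three steps, a sketch of the total with made-up placeholder prices (substitute the values from your own rate card):

```python
# Total monthly consumption for the 35-day-retention scenario above.
# All three prices are hypothetical placeholders, not real Dynatrace rates.
ingest = 15_000 * 0.10          # Ingest & Process: GiB × price per GiB
retain = 31_500 * 0.001 * 30    # Retain: GiB × price per GiB-day × days in month
query  = 750_000 * 0.002        # Query: GiB scanned × price per GiB scanned
total_monthly = ingest + retain + query
```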

Consumption details

Dynatrace provides built-in metrics that help you understand and analyze your organization's consumption of Ingest & Process, Retain, and Query for Log Management and Analytics. To use them in Data Explorer, enter Log Management and Analytics into the Search field. These metrics are also available via the Environment API and are linked in Account Management (Usage summary > Log Management and Analytics – Ingest & Process, Retain, Query > Actions > View details). The table below lists the metrics you can use to monitor the consumption details for Log Management and Analytics.
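As an illustration of fetching one of these billing metrics programmatically, here is a sketch that builds a Metrics API v2 query URL. The environment host is a placeholder, and the token scope shown in the comment is an assumption to verify against your environment's API documentation:

```python
from urllib.parse import urlencode

# Sketch: query a billing metric via the Environment API v2 metrics endpoint.
# Replace {your-environment-id} with your actual environment host.
base = "https://{your-environment-id}.live.dynatrace.com/api/v2/metrics/query"
query = urlencode({
    "metricSelector": "builtin:billing.log.ingest.usage",
    "resolution": "1h",   # matches the metric's 1-hour resolution
    "from": "now-7d",     # e.g. the last 7 days
})
url = f"{base}?{query}"
# The actual request needs an API token, e.g.
# headers = {"Authorization": "Api-Token <your-token>"}  # metrics.read scope
```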

Log Management and Analytics usage - Ingest & Process

Key: builtin:billing.log.ingest.usage

Dimension: Byte

Resolution: 1 hour

Description: Number of raw bytes sent to Dynatrace before enrichment and transformation in hourly intervals.

Log Management and Analytics usage - Retain

Key: builtin:billing.log.retain.usage

Dimension: Byte

Resolution: 1 hour

Description: Number of bytes saved to storage after data parsing, enrichment, transformation, and filtering but before compression.

Log Management and Analytics usage - Query

Key: builtin:billing.log.query.usage

Dimension: Byte

Resolution: 1 hour

Description: Number of bytes read during the execution of a DQL query, including sampled data.

Ingest & Process

You can monitor the total number of bytes ingested for Ingest & Process in hourly intervals for any selected timeframe using the metric Log Management and Analytics usage - Ingest & Process. The example below shows usage aggregated in 1-hour intervals between 2023-09-04 and 2023-09-11 (Last 7 days).

Log Management & Analytics (DPS)

Retain

You can monitor the total bytes stored for Retain in hourly intervals for any selected timeframe using the metric Log Management and Analytics usage - Retain. The example below shows usage aggregated in 1-hour intervals between 2023-09-04 and 2023-09-11 (Last 7 days).

Log Management & Analytics (DPS)

Query

You can monitor the total scanned bytes for Query in hourly intervals for any selected timeframe using the metric Log Management and Analytics usage - Query. The example below shows usage aggregated in 1-hour intervals between 2023-09-04 and 2023-09-11 (Last 7 days).

Log Management & Analytics (DPS)