Query data usage occurs when you run DQL queries against data stored in Grail. Here's what's included with the Query data-usage dimension:

On-read parsing: Parse and extract fields from raw event data at query time (schema on read).

Aggregation: Perform aggregation, summarization, or statistical analysis of data in events across specific timeframes or time patterns (for example, data occurrences in 30-second or 10-minute intervals), or apply mathematical or logical functions.

Reporting: Create reports or summaries with customized fields (columns) by adding, modifying, or dropping existing event attributes.

Context: Use DQL to analyze event data in context with relevant data on the Dynatrace platform, for example, user sessions or distributed traces.
The usage of the Distributed Tracing and Services apps is included with Dynatrace. No consumption is generated by these apps.
Queried data is the data read during the execution of a DQL query. It is calculated per gibibyte scanned (GiB scanned).
Apply the following calculation to determine your consumption for the Query data-usage dimension:
(number of GiB of uncompressed data read during query execution) × (GiB scanned price as per your rate card) = consumption in your local currency
Grail applies various optimizations to improve response time and reduce cost. In some cases, these optimizations will identify portions of data that are not relevant to the query result—the price for scanning that data is discounted by 98%.
The impact of Grail's scan optimizations varies based on data and query attributes. It may evolve as Dynatrace improves Grail's query intelligence.
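To make the pricing mechanics concrete, here is a minimal sketch of how the 98% discount on optimized-away data affects the billed amount. The rate and data volumes are made-up illustrative values, not figures from any rate card:

```python
# Hypothetical consumption calculation for the Query dimension.
# RATE_PER_GIB is an assumed placeholder, not a real rate card price.
RATE_PER_GIB = 0.01
DISCOUNT = 0.98  # discount applied to data Grail's optimizations skip

def query_cost(relevant_gib: float, optimized_away_gib: float,
               rate: float = RATE_PER_GIB) -> float:
    """Cost = fully priced relevant data + 98%-discounted skipped data."""
    full_price = relevant_gib * rate
    discounted = optimized_away_gib * rate * (1 - DISCOUNT)
    return full_price + discounted

# A query touching 100 GiB, of which 80 GiB is found irrelevant:
cost = query_cost(relevant_gib=20, optimized_away_gib=80)
print(round(cost, 4))
```

In this made-up example, the 80 GiB of optimized-away data costs only 2% of its full-rate price, so the bulk of the charge comes from the 20 GiB that was actually relevant to the result.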
Let's assume that, to resolve incidents and analyze performance issues, your team executes DQL queries with a total of 60 GiB of data read per day. The yearly consumption for Query is calculated as follows:

60 GiB of data read per day
60 GiB × 365 days = 21,900 GiB per year
21,900 GiB × (Query price as per your rate card) = yearly cost
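The arithmetic above can be reproduced in a short script; the rate used here is a placeholder for your actual rate card price:

```python
# Yearly Query consumption for 60 GiB of data read per day.
daily_gib = 60
yearly_gib = daily_gib * 365        # 21,900 GiB per year
rate_per_gib = 0.01                 # placeholder, not a real price
yearly_cost = yearly_gib * rate_per_gib
print(yearly_gib, yearly_cost)
```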
This section describes the different Dynatrace tools that you can use to track consumption and costs.
The following DQL query provides an overview of total Query usage in gibibytes scanned:
fetch dt.system.events
| filter event.kind == "BILLING_USAGE_EVENT"
| filter event.type == "Events - Query" or event.type == "Events - Query - SaaS"
| dedup event.id
| summarize {data_read_bytes = sum(billed_bytes)}, by: {startHour = bin(timestamp, 1d)}
The example below shows the daily usage visualized in a line chart.
The following DQL query provides an overview of the Query usage by application:
fetch dt.system.events
| filter event.kind == "BILLING_USAGE_EVENT"
| filter event.type == "Events - Query" or event.type == "Events - Query - SaaS"
| fieldsAdd query_id = if(event.version == "1.0", event.id, else: query_id)
| dedup event.id
| summarize {data_read_bytes = sum(billed_bytes), Query_count = countDistinctExact(query_id)}, by: {App_context = client.application_context, application_detail = client.source, User = user.email}
| fieldsAdd split_by_user = record(data_read_bytes, App_context, application_detail, User, Query_count)
| summarize {split_by_user = arraySort(collectArray(split_by_user), direction: "descending"), data_read_bytes = sum(data_read_bytes), Query_count = sum(Query_count)}, by: {App_context, application_detail}
| fieldsAdd split_by_user = record(App_context = split_by_user[][App_context], application_detail = split_by_user[][application_detail], User = split_by_user[][User], data_read_bytes = split_by_user[][data_read_bytes], data_read_pct = (split_by_user[][data_read_bytes] / data_read_bytes * 100), Query_count = split_by_user[][Query_count])
| fieldsAdd split_by_user = if(arraySize(split_by_user) == 1, arrayFirst(split_by_user)[User], else: split_by_user)
| fieldsAdd application_details = record(data_read_bytes, App_context, application_detail, split_by_user, Query_count)
| summarize {application_details = arraySort(collectArray(application_details), direction: "descending"), data_read_bytes = sum(data_read_bytes), Query_count = toLong(sum(Query_count))}, by: {App_context}
| fieldsAdd application_details = record(App_context = application_details[][App_context], application_detail = application_details[][application_detail], split_by_user = application_details[][split_by_user], data_read_bytes = application_details[][data_read_bytes], data_read_pct = application_details[][data_read_bytes] / data_read_bytes * 100, Query_count = application_details[][Query_count])
| fieldsAdd key = 1
| fieldsAdd total = lookup([
    fetch dt.system.events
    | filter event.kind == "BILLING_USAGE_EVENT" and (event.type == "Events - Query" or event.type == "Events - Query - SaaS")
    | dedup event.id
    | summarize total = sum(billed_bytes)
    | fieldsAdd key = 1
  ], sourceField: key, lookupField: key)[total]
| fields App_context, application_details, data_read_bytes, data_read_pct = data_read_bytes / total * 100, Query_count
| sort data_read_bytes desc
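The per-application percentage that the query derives (data_read_pct) is simply each application's share of the total bytes read. A minimal sketch of the same arithmetic, with made-up application identifiers and byte counts:

```python
# Share of total scanned bytes per application (illustrative values only).
usage_by_app = {
    "app.alpha": 6 * 2**30,   # 6 GiB expressed in bytes
    "app.beta": 3 * 2**30,    # 3 GiB
    "app.gamma": 1 * 2**30,   # 1 GiB
}

total = sum(usage_by_app.values())
shares = {app: bytes_read / total * 100 for app, bytes_read in usage_by_app.items()}

# Sorted descending by bytes read, mirroring the query's final sort:
for app, pct in sorted(shares.items(), key=lambda kv: -usage_by_app[kv[0]]):
    print(f"{app}: {pct:.1f}%")
```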
The example below shows the usage by application visualized in a nested table view.
Your Dynatrace Platform Subscription provides daily updates about accrued usage and related costs. You can access these details anytime via Account Management (Subscription > Overview > Cost and usage details > Events – Query > Actions > View details) or the Dynatrace Platform Subscription API.
On the Capability cost and usage analysis page, select a specific environment to analyze that environment’s cost and usage for a specific capability.
You can query metrics via the Environment API - Metrics API v2.
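As a sketch of how such a metrics query might be assembled, the snippet below builds a request URL for the Metrics API v2 query endpoint using Python's standard library. The environment URL is a placeholder, and the metric selector is an assumed example; check your environment's metric browser for the actual billing metric key:

```python
from urllib.parse import urlencode

# Placeholder environment URL; replace with your own.
ENV = "https://abc12345.live.dynatrace.com"

def metrics_query_url(metric_selector: str, resolution: str = "1d") -> str:
    """Build a GET URL for the Metrics API v2 query endpoint."""
    params = urlencode({
        "metricSelector": metric_selector,
        "resolution": resolution,
    })
    return f"{ENV}/api/v2/metrics/query?{params}"

# The metric key below is a placeholder, not a confirmed billing metric.
url = metrics_query_url("builtin:billing.placeholder_metric_key")
```

An actual request must also carry an `Authorization: Api-Token <token>` header with a token that has metrics-read permission.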