This page lists the default limits of Dynatrace OpenPipeline.
Data-type-specific limitations might override the generic OpenPipeline limits. For limits specific to a data type, see the documentation for that data type.
The following fields are excluded for metrics in OpenPipeline.

Fields excluded from matching and processing:

- dt.entity.*

Fields excluded from processing:

- value
- metric.key
- dt.system.monitoring_source
- metric.type
- timestamp
The following fields are added after the Processing stage, when Dynatrace runs its entity detection. Because they are not available earlier, you cannot use them in pre-processing, routing, or the Processing stage. You can use them in the Metric extraction, Data extraction, Permissions, and Storage pipeline stages.
- dt.entity.<genericEntityType>
- dt.entity.aws_lambda_function
- dt.entity.cloud_application
- dt.entity.cloud_application_instance
- dt.entity.cloud_application_names
- dt.entity.custom_device
- dt.entity.kubernetes_cluster
- dt.entity.kubernetes_node
- dt.entity.kubernetes_service
- dt.entity.service
- dt.env_vars.dt_tags
- dt.kubernetes.cluster.id
- dt.kubernetes.cluster.name
- dt.loadtest.custom_entity.enriched_custom_device_name
- dt.process.name
- dt.source_entity
- host.name
- k8s.cluster.name
If the timestamp is more than 10 minutes in the future, it's adjusted to the ingest server time plus 10 minutes.
| Item | Earliest timestamp |
|------|--------------------|
| Logs, Events, Business Events, System events | The ingest time minus 24 hours |
| Metrics, extracted metrics, and Davis events | The ingest time minus 1 hour |

Records with timestamps outside these timeframes are dropped.
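As an illustration, the timestamp acceptance rules above can be sketched in Python. This is a simplified client-side model, not Dynatrace's actual server-side implementation; the `kind` labels are hypothetical names for the two rows of the table.

```python
from datetime import datetime, timedelta, timezone

# Acceptance windows from the table above (illustrative sketch).
EARLIEST = {
    "logs": timedelta(hours=24),    # logs, events, business events, system events
    "metrics": timedelta(hours=1),  # metrics, extracted metrics, Davis events
}
MAX_FUTURE = timedelta(minutes=10)

def adjust_timestamp(record_ts: datetime, ingest_ts: datetime, kind: str):
    """Return the effective timestamp, or None if the record is dropped."""
    if record_ts > ingest_ts + MAX_FUTURE:
        # More than 10 minutes in the future: clamp to ingest time + 10 minutes.
        return ingest_ts + MAX_FUTURE
    if record_ts < ingest_ts - EARLIEST[kind]:
        # Older than the earliest accepted timestamp: the record is dropped.
        return None
    return record_ts
```

For example, a metric timestamped two hours before ingest is dropped, while a log record with the same timestamp is kept.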
Both numerical and string timestamp values are supported. OpenPipeline parses the timestamp as follows.

Numerical values:

- Values up to 100_000_000_000 are parsed as seconds
- Values up to 100_000_000_000_000 are parsed as milliseconds
- Values up to 9_999_999_999_999_999 are parsed as microseconds

String values:

- UNIX epoch in milliseconds or seconds
- RFC3339 formats
- RFC3164 formats

If the timestamp can't be parsed, the timestamp field is overwritten with the ingest time. If the record doesn't have a timestamp field, the field timestamp is set to the ingest time.
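The magnitude-based thresholds for numerical timestamps can be sketched in Python. This is an illustration of threshold-based unit detection under the limits stated above, not Dynatrace's implementation; the normalization to milliseconds is a choice made here for the example.

```python
def parse_epoch(value: int) -> tuple[int, str]:
    """Guess the unit of a numeric UNIX timestamp from its magnitude
    and normalize it to milliseconds (illustrative sketch)."""
    if value < 100_000_000_000:        # fits the seconds range
        return value * 1_000, "seconds"
    if value < 100_000_000_000_000:    # fits the milliseconds range
        return value, "milliseconds"
    if value < 9_999_999_999_999_999:  # fits the microseconds range
        return value // 1_000, "microseconds"
    raise ValueError("timestamp out of supported range")
```

For instance, 1_700_000_000 falls below the first threshold and is treated as seconds, while the same moment expressed as 1_700_000_000_000 lands in the milliseconds range.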
Each record can occupy a maximum of 16MB of processing memory. Each change to the record (for example, parsing a field) decreases the available processing memory. Once the available processing memory is exhausted, the record is dropped.
The maximum size of a record after processing is 16MB.
| Item | Maximum limit |
|------|---------------|
| Request payload size | 10MB |
| Pipeline number | 100 |
| Stage size | 512KB per stage |
| Processor number | 1,000 per stage |
| Endpoint number | 100 per data type |
| Dynamic routes number | 100 per data type |
| Matching condition length | 1,000 UTF-8 encoded bytes per condition |
| DQL processor script length | 8,192 UTF-8 encoded bytes per script |
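A few of these limits can be pre-checked on the client before sending a configuration or payload. The helper below is hypothetical (the authoritative enforcement happens server-side), and it assumes binary megabytes for the 10MB payload limit.

```python
# Hypothetical client-side pre-checks against the limits table above.
MAX_PAYLOAD_BYTES = 10 * 1024 * 1024  # 10MB request payload (assumed binary MB)
MAX_CONDITION_BYTES = 1_000           # per matching condition, UTF-8 encoded
MAX_DQL_SCRIPT_BYTES = 8_192          # per DQL processor script, UTF-8 encoded

def check_limits(payload: bytes, conditions: list[str], scripts: list[str]) -> list[str]:
    """Return a list of limit violations (an empty list means all checks pass)."""
    errors = []
    if len(payload) > MAX_PAYLOAD_BYTES:
        errors.append("request payload exceeds 10 MB")
    for c in conditions:
        # The limit counts UTF-8 encoded bytes, not characters.
        if len(c.encode("utf-8")) > MAX_CONDITION_BYTES:
            errors.append(f"matching condition too long: {c[:30]}...")
    for s in scripts:
        if len(s.encode("utf-8")) > MAX_DQL_SCRIPT_BYTES:
            errors.append("DQL processor script exceeds 8,192 bytes")
    return errors
```

Note that the byte limits are measured on the UTF-8 encoding, so multi-byte characters consume more of the budget than their character count suggests.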
The endpoint path is a unique name, starting with a letter, that defines the endpoint. It's case-insensitive and supports alphanumeric characters and the dot (.). For example: Endpoint.1.

The endpoint path doesn't support:

- A dot (.) as the last character
- Two consecutive dots (..)
- Null or empty input