OpenPipeline limits

This page lists the default limits of Dynatrace OpenPipeline.

Data type limits

Data-type-specific limitations might override the generic OpenPipeline limits. For limits specific to a data type, see the documentation for that data type.

Limits specific to fields

Excluded fields for metrics

The following fields are excluded for metrics in OpenPipeline.

  • Fields excluded from matching and processing

    • dt.entity.*
  • Fields excluded from processing

    • value
    • metric.key
    • dt.system.monitoring_source
    • metric.type
    • timestamp

The following fields are added after the Processing stage, when Dynatrace runs its entity detection. Because they aren't available earlier, you cannot use them in pre-processing, routing, or the Processing stage. You can use them in the Metric extraction, Data extraction, Permissions, and Storage pipeline stages.

  • dt.entity.<genericEntityType>
  • dt.entity.aws_lambda_function
  • dt.entity.cloud_application
  • dt.entity.cloud_application_instance
  • dt.entity.cloud_application_names
  • dt.entity.custom_device
  • dt.entity.kubernetes_cluster
  • dt.entity.kubernetes_node
  • dt.entity.kubernetes_service
  • dt.entity.service
  • dt.env_vars.dt_tags
  • dt.kubernetes.cluster.id
  • dt.kubernetes.cluster.name
  • dt.loadtest.custom_entity.enriched_custom_device_name
  • dt.process.name
  • dt.source_entity
  • host.name
  • k8s.cluster.name

Ingestion

Record maximum timestamp

If the timestamp is more than 10 minutes in the future, it's adjusted to the ingest server time plus 10 minutes.
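
As a minimal sketch of this rule (the function name and the epoch-millisecond convention are assumptions, not part of any Dynatrace API):

```python
# Illustrative only: clamp a record timestamp that is too far in the
# future, per the rule above. Assumes epoch milliseconds.
MAX_FUTURE_MS = 10 * 60 * 1000  # 10 minutes

def clamp_future_timestamp(record_ts_ms: int, ingest_ts_ms: int) -> int:
    """Cap the record timestamp at ingest server time plus 10 minutes."""
    return min(record_ts_ms, ingest_ts_ms + MAX_FUTURE_MS)
```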

Record minimum timestamp

Item                                            Earliest timestamp
Logs, Events, Business Events, System events    Ingest time minus 24 hours
Metrics, extracted metrics, and Davis events    Ingest time minus 1 hour

Records outside of these timeframes are dropped.
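
The drop rule can be sketched as a simple age check. The per-type windows come from the table above, while the dictionary keys and function name are illustrative assumptions:

```python
# Illustrative only: a record older than its data type's window is dropped.
MAX_AGE_MS = {
    "logs_events": 24 * 60 * 60 * 1000,  # logs, events, business/system events
    "metrics_davis": 60 * 60 * 1000,     # metrics, extracted metrics, Davis events
}

def is_dropped(record_ts_ms: int, ingest_ts_ms: int, kind: str) -> bool:
    """True if the record falls outside the allowed timeframe."""
    return record_ts_ms < ingest_ts_ms - MAX_AGE_MS[kind]
```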

Ingest API

Timestamp value

Numerical and string timestamp values are supported. OpenPipeline parses the timestamp as follows.

  • Numerical values
    • Values up to 100_000_000_000 are parsed as SECONDS.
    • Values up to 100_000_000_000_000 are parsed as MILLISECONDS.
    • Values up to 9_999_999_999_999_999 are parsed as MICROSECONDS.
  • String values are parsed as one of the following:
    • UNIX epoch milliseconds or seconds
    • RFC 3339 formats
    • RFC 3164 formats
  • For values that cannot be parsed, the timestamp is overwritten with the ingest time.

If the record doesn't have a timestamp field, the timestamp field is set to the ingest time.
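
The numeric thresholds amount to magnitude-based unit detection. A minimal sketch under that reading (the exact boundary handling and the function name are assumptions):

```python
# Illustrative only: map a raw numeric timestamp to a unit by magnitude,
# following the thresholds listed above.
def detect_timestamp_unit(value: int) -> str:
    if value <= 100_000_000_000:
        return "SECONDS"
    if value <= 100_000_000_000_000:
        return "MILLISECONDS"
    if value <= 9_999_999_999_999_999:
        return "MICROSECONDS"
    # Larger values cannot be parsed; the record's timestamp field
    # is overwritten with the ingest time.
    return "UNPARSEABLE"
```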

Processing

Size of working memory during processing per record

Each record can occupy a maximum of 16MB of processing memory. Each change to the record (for example, parsing a field) decreases the available processing memory. Once the available processing memory is exhausted, the record is dropped.
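
Conceptually, this works like a per-record budget that every mutation draws from. A sketch of that accounting model (the class and charging scheme are assumptions, not OpenPipeline internals):

```python
# Illustrative only: per-record working-memory budget, drawn down by each
# change during processing; exhausting it means the record is dropped.
PROCESSING_BUDGET_BYTES = 16 * 1024 * 1024  # 16MB

class RecordBudget:
    def __init__(self) -> None:
        self.remaining = PROCESSING_BUDGET_BYTES

    def charge(self, delta_bytes: int) -> bool:
        """Deduct the cost of a change; False means the record is dropped."""
        self.remaining -= delta_bytes
        return self.remaining >= 0
```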

Size of record after processing

The maximum size of a record after processing is 16MB.

Configuration

Item                            Maximum limit
Request payload size            10MB
Number of pipelines             100
Stage size                      512KB per stage
Number of processors            1,000 per stage
Number of endpoints             100 per data type
Number of dynamic routes        100 per data type
Matching condition length       1,000 UTF-8 encoded bytes per condition
DQL processor script length     8,192 UTF-8 encoded bytes per script
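
Note that the matching-condition and DQL-script limits count UTF-8 encoded bytes, not characters, so multi-byte characters weigh more than one. A quick way to check a script against the limit (the helper name is illustrative):

```python
# Illustrative only: the limits above are measured in UTF-8 bytes.
DQL_SCRIPT_LIMIT_BYTES = 8_192

def fits_dql_limit(script: str) -> bool:
    """True if the script's UTF-8 encoding is within the 8,192-byte limit."""
    return len(script.encode("utf-8")) <= DQL_SCRIPT_LIMIT_BYTES
```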

Allowed characters in the endpoint path

The endpoint path is a unique name that defines the endpoint. It starts with a letter, is case-insensitive, and supports alphanumeric characters and the dot (.). For example: Endpoint.1.

The endpoint path doesn't support:

  • Dot (.) as the last character
  • Whitespace
  • Consecutive dots (..)
  • Null or empty input
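
These rules can be condensed into a single pattern. A hypothetical validator derived from them (the regex is an assumption, not an official specification):

```python
import re

# Illustrative only: a leading letter, then alphanumerics with single dots
# between them; this rejects trailing dots, consecutive dots, whitespace,
# and empty input. Both cases are accepted, since the path is
# case-insensitive.
ENDPOINT_PATH = re.compile(r"[A-Za-z](?:\.?[A-Za-z0-9])*")

def is_valid_endpoint_path(path: str) -> bool:
    return ENDPOINT_PATH.fullmatch(path) is not None
```

Under this sketch, is_valid_endpoint_path("Endpoint.1") returns True, while "Endpoint.", "End..1", and "" all return False.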