The following configuration example shows how to configure the Collector to read data from Kafka topics and relay it via OTLP.
This setup uses the kafkareceiver component and expects your Kafka broker address in the BROKER_ADDRESS environment variable.
For more information, see the Apache Kafka quickstart guide.

Here is an example YAML file for a basic Collector configuration that can be used to receive OpenTelemetry traces, metrics, and logs from Kafka.
```yaml
receivers:
  kafka:
    tls:
      insecure: true # Only necessary if your Kafka server does not provide a certificate that's trusted by the Collector.
    traces:
    metrics:
    logs:
    brokers: ["${env:BROKER_ADDRESS}"]

exporters:
  otlphttp:
    endpoint: ${env:DT_ENDPOINT}
    headers:
      Authorization: "Api-Token ${env:DT_API_TOKEN}"

service:
  pipelines:
    traces:
      receivers: [kafka]
      exporters: [otlphttp]
    metrics:
      receivers: [kafka]
      exporters: [otlphttp]
    logs:
      receivers: [kafka]
      exporters: [otlphttp]
```
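The example above is intentionally minimal. If the Dynatrace endpoint is temporarily unreachable, you may want the Collector to buffer and retry exports. As a sketch, the otlphttp exporter section could be extended with the standard queuing and retry settings; the values below are illustrative, not recommendations.

```yaml
exporters:
  otlphttp:
    endpoint: ${env:DT_ENDPOINT}
    headers:
      Authorization: "Api-Token ${env:DT_API_TOKEN}"
    # Standard exporter retry/queue settings; the values are illustrative.
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s
      max_elapsed_time: 120s
    sending_queue:
      enabled: true
      num_consumers: 4
      queue_size: 1000
```

Both variants rely on the same environment variables, which are described next.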
For this configuration to work, you need to set the following environment variables.
- BROKER_ADDRESS: Specific to your Kafka server.
- DT_ENDPOINT: The base URL of the Dynatrace API endpoint (for example, https://{your-environment-id}.live.dynatrace.com/api/v2/otlp).
- DT_API_TOKEN: The API token.

Validate your settings to avoid any configuration issues.
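If you often run the Collector locally for testing, you can provide a fallback value directly in the configuration instead of failing on an unset variable. Recent Collector versions support default values in environment variable expansion using the ${env:VAR:-default} syntax; a minimal sketch, assuming a local broker on the default Kafka port:

```yaml
receivers:
  kafka:
    # Falls back to a local broker when BROKER_ADDRESS is not set.
    # Requires a Collector version whose configuration resolver supports the :-default syntax.
    brokers: ["${env:BROKER_ADDRESS:-localhost:9092}"]
```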
For our configuration, we set up certain components as described in the sections below.
Under receivers, we specify kafka as the active receiver component for our deployment.
This is required to receive data from the Kafka server.
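Beyond the broker list, the kafka receiver accepts further options, for example, a consumer group ID and per-signal topic names. The exact fields available depend on your Collector version, so treat the following as a sketch and check the kafkareceiver documentation for your release; the topic names shown are examples, not required values.

```yaml
receivers:
  kafka:
    brokers: ["${env:BROKER_ADDRESS}"]
    # Consumer group the Collector joins when reading from Kafka.
    group_id: otel-collector
    # Per-signal topic overrides (example names).
    traces:
      topic: otlp_spans
    metrics:
      topic: otlp_metrics
    logs:
      topic: otlp_logs
```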
Under exporters, we specify the otlphttp exporter to forward data into Dynatrace.
For this purpose, we set the following two environment variables and reference them in the configuration values for endpoint and Authorization.
- DT_ENDPOINT contains the base URL of the Dynatrace API endpoint (for example, https://{your-environment-id}.live.dynatrace.com/api/v2/otlp).
- DT_API_TOKEN contains the API token.

Under service, we assemble our receiver and exporter objects into service pipelines, which will perform these steps: