Advanced AWS log use cases

  • Latest Dynatrace
  • Tutorial
  • Published Sep 11, 2025
  • Preview

Direct push to Amazon Data Firehose stream: Linking and enriching logs records

By default, log records cannot be linked or enriched in this mode. To accomplish that, you need to create an AWS connection to the designated account.

To enable linking and enrichment for specific log sources, you need to create one HTTP endpoint delivery Firehose data stream per log source resource. The stream configuration should include two parameters (which Firehose then translates to HTTP headers):

  • aws.arn
  • aws.resource.type

Those two parameters are parsed by the Dynatrace SaaS ingest APIs, enabling the linking and enrichment of log records.

You can obtain the resource ARN (aws.arn) from the AWS Console. The aws.resource.type value follows the CloudFormation resource naming convention. The HTTP endpoint URL is unique to each environment; use the one that the AWS connection Firehose stream points to.
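
For illustration, such a stream can also be created with the AWS SDK. The following is a minimal sketch using boto3; the stream name, endpoint URL, access token, ARNs, and backup bucket are placeholders you would replace with your own values:

    import boto3

    firehose = boto3.client("firehose", region_name="us-east-1")

    # Hypothetical HTTP endpoint delivery stream for a single log source.
    firehose.create_delivery_stream(
        DeliveryStreamName="dynatrace-lambda-logs",  # placeholder name
        DeliveryStreamType="DirectPut",
        HttpEndpointDestinationConfiguration={
            "EndpointConfiguration": {
                "Name": "Dynatrace",
                # Use the environment-specific URL that the AWS connection
                # Firehose stream points to.
                "Url": "https://<environment-ingest-endpoint>",
                "AccessKey": "<api-token>",
            },
            "RequestConfiguration": {
                "ContentEncoding": "GZIP",
                # Firehose translates these common attributes to HTTP headers,
                # which the Dynatrace SaaS ingest APIs parse for linking
                # and enrichment.
                "CommonAttributes": [
                    {"AttributeName": "aws.arn",
                     "AttributeValue": "arn:aws:lambda:us-east-1:123456789012:function:my-fn"},
                    {"AttributeName": "aws.resource.type",
                     "AttributeValue": "AWS::Lambda::Function"},  # CloudFormation naming
                ],
            },
            # Firehose requires an S3 destination for (at least) failed records.
            "S3BackupMode": "FailedDataOnly",
            "S3Configuration": {
                "RoleARN": "arn:aws:iam::123456789012:role/firehose-backup",
                "BucketARN": "arn:aws:s3:::my-firehose-backup",
            },
        },
    )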

Account-level log source subscriptions

Account-level log source subscription is a global method that defines a policy for subscribing log groups; Amazon Data Firehose supports this capability. This allows for a more efficient, policy-driven subscription filter.
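
As a sketch of what this looks like on the AWS side, the CloudWatch Logs put_account_policy API creates such an account-level subscription filter policy. The following boto3 example uses placeholder ARNs and a hypothetical policy name:

    import json

    import boto3

    logs = boto3.client("logs", region_name="us-east-1")

    # Account-level subscription filter policy: every matching log group in
    # the account is subscribed to the Firehose delivery stream.
    logs.put_account_policy(
        policyName="dynatrace-log-subscription",  # hypothetical name
        policyType="SUBSCRIPTION_FILTER_POLICY",
        scope="ALL",
        # Optional: exclude specific log groups from the policy.
        selectionCriteria='LogGroupName NOT IN ["excluded-log-group"]',
        policyDocument=json.dumps({
            "DestinationArn": "arn:aws:firehose:us-east-1:123456789012:deliverystream/dynatrace-logs",
            "RoleArn": "arn:aws:iam::123456789012:role/cwl-to-firehose",
            "FilterPattern": "",  # empty pattern forwards all log events
            "Distribution": "Random",
        }),
    )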

Log processing in OpenPipeline (optional)

Your AWS logs are ingested by default with loglevel and status set to NONE. Dynatrace offers built-in technology bundles that you can set up in OpenPipeline to parse your logs as they’re ingested and enrich these attributes accordingly.

To set up log parsing, follow the instructions below.

  1. Go to Settings > Process and contextualize > OpenPipeline > Logs.

  2. Create a new pipeline. To do so, go to the Pipelines tab, select Pipeline, and name it (for example, AWS log processing).

  3. Within the Processing tab, select Processor > Technology bundle.

  4. Go to the AWS section and select a bundle for an AWS service you’re ingesting logs from, then select Choose.

  5. Repeat steps 3 and 4 to add a technology bundle for each AWS service you ingest logs from.

    Once you're done, select Save in the upper-right corner of the screen.

  6. Go to the Dynamic routing tab, select Dynamic route to add a new route, and name it (for example, AWS logs route).

  7. For the matching condition, enter isNotNull(dt.da.aws.data_firehose.arn).

  8. In the Pipeline drop-down list, select the pipeline you created in step 2.

  9. Select Add to save the dynamic route.

You have now configured OpenPipeline log parsing for all logs flowing via Amazon Data Firehose to Dynatrace; the loglevel and status attributes will be enriched accordingly in your log records.

As logs are ingested, refresh the page and you'll see the Records over time metric in Settings reflect log parsing activity.
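
If you want to generate traffic to verify the route, you can push a test record through the delivery stream. A minimal sketch, reusing the hypothetical stream name from above and assuming a JSON payload your endpoint accepts:

    import boto3

    firehose = boto3.client("firehose", region_name="us-east-1")

    # Send one test record through the hypothetical delivery stream so that
    # parsing activity shows up in the Records over time metric.
    firehose.put_record(
        DeliveryStreamName="dynatrace-lambda-logs",  # placeholder name
        Record={"Data": b'{"message": "ERROR test log line"}'},
    )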

Related tags
Infrastructure Observability, Logs, Settings, OpenPipeline