Mask sensitive data in logs

Your logs may contain sensitive data that requires masking before you can use it for analytics, observability, security, and other purposes. Dynatrace provides you with tools that enable you to meet your data protection and other compliance requirements while still getting value from your logs.

Choose one or both methods, based on your information architecture and log setup:

  • Masking at capture
    This approach masks sensitive data at the environment, host, or process level before it's transferred to the Dynatrace SaaS environment.
  • Masking at ingest
    This approach masks sensitive data after it arrives in the Dynatrace SaaS environment and before it's stored in Grail.

Mask your logs at capture

At-capture masking requires identifying and masking sensitive parts of your log records before the data is transferred to Dynatrace. To achieve this, collect your logs with OneAgent: it has a built-in sensitive data masking mechanism that can be configured granularly at the host, host group, or environment level. Use at-capture masking with OneAgent to:

  • Make sure your sensitive data never leaves your environment.
  • Avoid using multiple tools.
  • Simplify configuration.

To mask your data this way, follow the steps described in Sensitive data masking in OneAgent.
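Conceptually, at-capture masking boils down to rewriting sensitive substrings before a record leaves the host. The sketch below is illustrative only: the regex rules and placeholder labels are assumptions for the example, not OneAgent's actual configuration format, which is set declaratively as described in Sensitive data masking in OneAgent.

```python
import re

# Hypothetical masking rules (regex pattern -> replacement). OneAgent applies
# equivalent rules declaratively; this sketch only shows the effect of
# rewriting sensitive substrings before a record leaves the host.
MASKING_RULES = [
    (re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b"), "<ip-masked>"),   # IPv4 addresses
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<email-masked>"),    # email addresses
]

def mask_at_capture(line: str) -> str:
    """Apply every masking rule to a raw log line."""
    for pattern, replacement in MASKING_RULES:
        line = pattern.sub(replacement, line)
    return line

masked = mask_at_capture("login from 10.0.0.12 by alice@example.com")
# masked == "login from <ip-masked> by <email-masked>"
```

Because the rewrite happens on the host, the original values never reach Dynatrace, which is the defining property of at-capture masking.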

Mask before sending logs using Fluent Bit, OpenTelemetry, or the log ingest API

If you send logs to Dynatrace using Fluent Bit, OpenTelemetry, or the log ingest API and need to mask sensitive data at capture, you need to do one of the following:

  • Mask your data by configuring a log producer.
  • Set up a log shipper or forwarder that masks sensitive data before it's sent to Dynatrace.
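As an illustration of the second option, the sketch below masks a naive credit-card pattern in each record before building a request to the generic log ingest endpoint (`/api/v2/logs/ingest`). The environment URL, token, and masking rule are placeholder assumptions; a production shipper such as Fluent Bit would apply an equivalent transformation in its own filter configuration before forwarding.

```python
import json
import re
import urllib.request

# Placeholder environment URL and token -- substitute your own values.
DT_ENV_URL = "https://example.live.dynatrace.com"
API_TOKEN = "dt0c01.XXXX"

# Naive card-number pattern (13-16 digits, optional space/dash separators);
# an assumption for this example, not a production-grade detector.
CARD_PATTERN = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def mask_record(record: dict) -> dict:
    """Mask the content attribute of a single log record before shipping."""
    masked = dict(record)
    masked["content"] = CARD_PATTERN.sub("<card-masked>", record["content"])
    return masked

def build_request(records: list[dict]) -> urllib.request.Request:
    """Build the ingest request; masking happens before serialization."""
    body = json.dumps([mask_record(r) for r in records]).encode()
    return urllib.request.Request(
        f"{DT_ENV_URL}/api/v2/logs/ingest",
        data=body,
        headers={
            "Content-Type": "application/json; charset=utf-8",
            "Authorization": f"Api-Token {API_TOKEN}",
        },
        method="POST",
    )

req = build_request([{
    "content": "payment card 4111 1111 1111 1111 declined",
    "log.source": "/var/log/app.log",
}])
# The serialized body contains "<card-masked>" instead of the card number.
```

The key point is ordering: masking runs before serialization and transport, so the sensitive value is never part of the request that leaves your environment.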

Mask your logs at ingest using OpenPipeline

With this method, you can mask the data once it reaches Dynatrace by configuring processors in OpenPipeline. After data is processed, it is stored in Grail and is available for further analysis. The key advantage of this method is that it works across log ingest channels.

To mask sensitive data with this method, configure a processor in OpenPipeline (for example, a DQL processor in the Processing stage), following the steps described in OpenPipeline processing examples.
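To illustrate why ingest-time masking works across log ingest channels, here is a minimal sketch (in Python, not DQL) of what such a processor does conceptually: it rewrites configured attributes of every record after arrival, regardless of which channel produced it. The attribute names are assumptions for the example.

```python
# Assumed attribute names to mask; in OpenPipeline this selection is part of
# the processor configuration rather than application code.
SENSITIVE_ATTRIBUTES = {"user.email", "client.ip"}

def process(record: dict) -> dict:
    """Mask configured attributes of one ingested record; keep the rest."""
    return {
        key: ("<masked>" if key in SENSITIVE_ATTRIBUTES else value)
        for key, value in record.items()
    }

processed = process({
    "content": "request served",
    "client.ip": "203.0.113.7",
    "user.email": "bob@example.com",
    "log.source": "otel",
})
# processed["client.ip"] == "<masked>"; content is unchanged.
```

Because the transformation operates on records after they reach Dynatrace, it applies uniformly whether the record came from OneAgent, Fluent Bit, OpenTelemetry, or the log ingest API.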

Summary

There are two complementary ways to mask sensitive data in logs: masking at capture (for example, with OneAgent, so sensitive values never leave your environment) and masking at ingest (by configuring OpenPipeline processors that transform log attributes before data is stored in Grail). If you send logs using Fluent Bit, OpenTelemetry, or the log ingest API, apply masking before data is sent, and use ingest masking as an additional layer for centralized control across ingest channels.

Related tags
Log Analytics