Your logs may contain sensitive data that requires masking before you can use it for analytics, observability, security, and other purposes. Dynatrace provides you with tools that enable you to meet your data protection and other compliance requirements while still getting value from your logs.
Choose one or both methods, based on your information architecture and log setup:
At-capture masking means identifying and masking sensitive parts of your log records before data is transferred to Dynatrace. To achieve this, you can use OneAgent to collect your logs. OneAgent has a built-in mechanism for sensitive data masking that can be configured granularly at the host, host group, or environment level. Use at-capture masking with OneAgent to:
To mask your data this way, follow the steps described in Sensitive data masking in OneAgent.
If you send logs to Dynatrace using Fluent Bit, OpenTelemetry, or the log ingest API and need to mask sensitive data at capture, do one of the following:
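For illustration, the following Python sketch shows the kind of masking you would apply in your own pipeline before log records are sent to the ingest API. The patterns, placeholder format, and `log.source` attribute are assumptions for the example; adapt them to what your logs actually contain and to your sender's configuration (in Fluent Bit, for instance, the same idea is typically expressed as a filter).

```python
import json
import re

# Example patterns for common sensitive values (assumption: your logs
# contain email addresses and IPv4 addresses worth masking).
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def mask(content: str) -> str:
    """Replace each matched sensitive value with a type-specific placeholder."""
    for name, pattern in PATTERNS.items():
        content = pattern.sub(f"<{name}-masked>", content)
    return content

def build_payload(lines):
    """Build log records with content masked before they leave your environment."""
    return [{"content": mask(line), "log.source": "app"} for line in lines]

payload = build_payload(["User jane@example.com logged in from 10.0.0.12"])
print(json.dumps(payload))
```

Because the masking runs before transmission, the original values never reach Dynatrace, which is the defining property of at-capture masking.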
With this method, you can mask the data once it reaches Dynatrace by configuring processors in OpenPipeline. After data is processed, it is stored in Grail and is available for further analysis. The key advantage of this method is that it works across log ingest channels.
To mask sensitive data in Dynatrace with this method, configure a processor in OpenPipeline (for example, a DQL processor in the Processing stage).
To mask your data this way, follow the steps described in OpenPipeline processing examples.
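To make the processor's effect concrete, here is a Python sketch that models the transformation an OpenPipeline masking processor performs on a record's `content` attribute after ingestion. The card-number pattern and the keep-last-four placeholder are illustrative assumptions; in OpenPipeline itself you would express the equivalent logic as a DQL processor in the Processing stage, as described in the linked examples.

```python
import re

# Illustrative pattern for card-like numbers in groups of four digits,
# separated consistently by a space, a hyphen, or nothing (assumption).
CARD = re.compile(r"\b\d{4}([ -]?)\d{4}\1\d{4}\1\d{4}\b")

def process_record(record: dict) -> dict:
    """Mask card-like numbers in 'content', keeping only the last four digits."""
    def keep_last4(match):
        digits = re.sub(r"\D", "", match.group(0))
        return "****-****-****-" + digits[-4:]
    record = dict(record)  # work on a copy, as a processor would emit a new record
    record["content"] = CARD.sub(keep_last4, record.get("content", ""))
    return record

masked = process_record({"content": "payment with card 4111 1111 1111 1111 approved"})
print(masked["content"])
```

The transformed record, not the original, is what gets stored in Grail, so downstream queries only ever see the masked value.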
In summary, there are two complementary ways to mask sensitive data in logs: at-capture masking (for example, with OneAgent, so sensitive values never leave your environment) and at-ingest masking (with OpenPipeline processors that transform log attributes before data is stored in Grail). If you send logs using Fluent Bit, OpenTelemetry, or the log ingest API, mask data before it is sent, and use at-ingest masking as an additional layer for centralized control across all ingest channels.