The Dynatrace full-stack observability platform, combined with Traceloop's OpenLLMetry OpenTelemetry SDK, provides comprehensive insight into Large Language Models (LLMs) in production environments. By observing AI models, businesses can make informed decisions, optimize performance, and ensure compliance with emerging AI regulations.
Instrument your application
Create a Dynatrace token so OpenLLMetry can report data to your Dynatrace tenant.
To create a Dynatrace token:

1. In Dynatrace, go to Access Tokens. To find Access Tokens, press Ctrl/Cmd+K, then search for and select Access Tokens.
2. In Access Tokens, select Generate new token.
3. Enter a Token name for your new token.
4. Give your new token the following permissions: search for and select all of the following scopes.
5. Copy the generated token to the clipboard. Store the token in a password manager for future use.

You can only access your token once, upon creation. You can't reveal it afterward.
Initialize OpenLLMetry with the token to collect all the relevant KPIs.
How you initialize the framework depends on the language.
The Dynatrace backend works exclusively with delta values and requires the corresponding aggregation temporality. Make sure your metrics exporter is configured accordingly, or set the OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE environment variable to DELTA.
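For example, when configuring the exporter through the environment, the preference can be set before starting your application:

```shell
# Tell OTLP metric exporters to send delta (rather than cumulative)
# aggregation temporality, as required by the Dynatrace backend.
export OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE=DELTA
```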
For Python applications, OpenLLMetry provides OpenTelemetry-based autoinstrumentation that collects traces and metrics from your AI workloads. Install it with the following command:
pip install traceloop-sdk
Afterward, add the following code at the beginning of your main file.
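A minimal initialization sketch is shown below. The environment-variable names (`DT_ENDPOINT`, `DT_API_TOKEN`) and the app name are placeholders, and the `api_endpoint`/`headers` arguments assume the Traceloop SDK's standard configuration options; adapt them to your tenant's OTLP endpoint and the access token created above.

```python
import os

from traceloop.sdk import Traceloop

# DT_ENDPOINT and DT_API_TOKEN are placeholder names: point them at your
# tenant's OTLP endpoint (e.g. https://<tenant>.live.dynatrace.com/api/v2/otlp)
# and the Dynatrace access token created above.
Traceloop.init(
    app_name="my-llm-app",  # hypothetical service name
    api_endpoint=os.environ["DT_ENDPOINT"],
    headers={"Authorization": f"Api-Token {os.environ['DT_API_TOKEN']}"},
)
```

With this in place, supported LLM libraries (for example, the OpenAI client) are instrumented automatically, and spans and metrics are exported to your Dynatrace tenant.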
For Node.js applications, OpenLLMetry provides OpenTelemetry-based autoinstrumentation that collects traces and metrics from your AI workloads. Install it, along with the OTLP trace exporter, with the following command:
npm i @opentelemetry/exporter-trace-otlp-proto @traceloop/node-server-sdk
Afterward, add the following code at the beginning of your main file.
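A minimal sketch, using the same placeholder environment variables (`DT_ENDPOINT`, `DT_API_TOKEN`) as above; it wires the OTLP trace exporter installed in the previous step into the Traceloop SDK. The exact option names assume the SDK's standard initialization API, so check them against your installed version.

```javascript
const traceloop = require("@traceloop/node-server-sdk");
const {
  OTLPTraceExporter,
} = require("@opentelemetry/exporter-trace-otlp-proto");

// DT_ENDPOINT and DT_API_TOKEN are placeholders for your tenant's OTLP
// endpoint and the Dynatrace access token created above.
traceloop.initialize({
  appName: "my-llm-app", // hypothetical service name
  exporter: new OTLPTraceExporter({
    url: `${process.env.DT_ENDPOINT}/v1/traces`,
    headers: { Authorization: `Api-Token ${process.env.DT_API_TOKEN}` },
  }),
});
```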