The Dynatrace full-stack observability platform combined with Traceloop's OpenLLMetry OpenTelemetry SDK can seamlessly provide comprehensive insights into Large Language Models (LLMs) in production environments. By observing AI models, businesses can make informed decisions, optimize performance, and ensure compliance with emerging AI regulations.
Create a Dynatrace token so OpenLLMetry can report data to your Dynatrace tenant.
To create a Dynatrace token, generate an access token in your Dynatrace tenant with the following scopes:

- metrics.ingest
- logs.ingest
- events.ingest
- openTelemetryTrace.ingest
- metrics.read
- settings.write

You can only access your token once, upon creation. You can't reveal it afterward.
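If you prefer to create this token from a script instead of the web UI, the following sketch shows one way to do it with the Dynatrace Access Tokens API. The endpoint path, the payload and response field names, the token name openllmetry-ingest, and the requirement for an existing token with the apiTokens.write scope are assumptions based on the Dynatrace API v2, so verify them against your tenant's API documentation.

import requests

DT_ENV = "https://<YOUR_ENV>.live.dynatrace.com"  # your Dynatrace tenant (placeholder)
ADMIN_TOKEN = "<TOKEN_WITH_apiTokens.write>"      # existing token allowed to create tokens (assumed scope)

response = requests.post(
    f"{DT_ENV}/api/v2/apiTokens",
    headers={
        "Authorization": f"Api-Token {ADMIN_TOKEN}",
        "Content-Type": "application/json",
    },
    json={
        "name": "openllmetry-ingest",
        "scopes": [
            "metrics.ingest",
            "logs.ingest",
            "events.ingest",
            "openTelemetryTrace.ingest",
            "metrics.read",
            "settings.write",
        ],
    },
)
response.raise_for_status()
# The token value is returned only once; store it securely right away.
print(response.json()["token"])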
Initialize OpenLLMetry with the token to collect all the relevant KPIs.
How you initialize the framework depends on the language.
The Dynatrace backend exclusively works with delta values and requires the respective aggregation temporality. Make sure your metrics exporter is configured accordingly, or set the OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE environment variable to DELTA.
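If you configure this from code rather than in your deployment environment, a minimal sketch is to set the variable before the SDK (and with it the OTLP metrics exporter) is initialized:

import os

# Prefer delta aggregation temporality for the OTLP metrics exporter;
# this must run before Traceloop.init() creates the exporter.
os.environ.setdefault("OTEL_EXPORTER_OTLP_METRICS_TEMPORALITY_PREFERENCE", "DELTA")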
You can leverage OpenTelemetry auto-instrumentation to collect traces and metrics from your AI workloads. In particular, OpenLLMetry can be installed with the following command:
pip install traceloop-sdk
Afterward, add the following code at the beginning of your main file.
from traceloop.sdk import Traceloop

headers = {"Authorization": "Api-Token <YOUR_DT_API_TOKEN>"}

Traceloop.init(
    app_name="<your-service>",
    api_endpoint="https://<YOUR_ENV>.live.dynatrace.com/api/v2/otlp",
    headers=headers,
)
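As a quick check that spans reach Dynatrace, you can wrap an LLM call in OpenLLMetry's workflow decorator. The import from traceloop.sdk.decorators and the OpenAI client usage below follow OpenLLMetry's documented patterns, but treat the specific names (the workflow name, the model, the OPENAI_API_KEY environment variable) as assumptions to adapt to the LLM client you actually use.

from openai import OpenAI
from traceloop.sdk.decorators import workflow

client = OpenAI()  # assumes OPENAI_API_KEY is set; supported LLM clients are auto-instrumented

@workflow(name="suggest_title")
def suggest_title(topic: str) -> str:
    # The chat completion call inside this workflow is traced automatically by OpenLLMetry.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Suggest a blog title about {topic}."}],
    )
    return completion.choices[0].message.content

print(suggest_title("LLM observability"))

Each run should then show up in your Dynatrace tenant as a trace containing the workflow span and the nested LLM call span, along with the associated metrics.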