Ollama

  • Concept
  • 1-min read
  • Published Oct 22, 2024

Ollama is a platform that allows users to run and manage AI models locally on their own machines. It provides tools for deploying, interacting with, and fine-tuning various AI models, particularly those related to natural language processing.
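As a minimal sketch of interacting with a locally running model, the snippet below calls Ollama's HTTP generate endpoint on its default port (11434). The model name "llama3" is only an example; substitute whatever model you have pulled.

```python
# Query a locally running Ollama instance over its HTTP API.
# Assumes Ollama is listening on the default port and a model has been pulled.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # example model name; use the model you have pulled
        "prompt": "Explain observability in one sentence.",
        "stream": False,    # return a single JSON object instead of a stream
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["response"])  # the completion text
```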

By monitoring your Ollama models with Dynatrace, you can record prompts and completions, track errors, collect performance metrics for your AI services, and more.
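One way to get traces into Dynatrace is to export OpenTelemetry spans over OTLP. The sketch below assumes a standard OpenTelemetry Python setup; the ingest endpoint path and token placeholder are assumptions, so check your Dynatrace environment settings for the exact URL and required token scopes.

```python
# Minimal sketch: wire an OpenTelemetry tracer to a Dynatrace OTLP trace endpoint.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    # Assumed ingest URL; replace <your-environment> and <your-token>.
    endpoint="https://<your-environment>.live.dynatrace.com/api/v2/otlp/v1/traces",
    headers={"Authorization": "Api-Token <your-token>"},
)
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("ollama-demo")
```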

Ollama Observability

Explore the sample dashboard on the Dynatrace Playground.

Spans

The following attributes are available for GenAI Spans.

| Attribute | Type | Description |
| --- | --- | --- |
| gen_ai.completion.0.content | string | The full response received from the GenAI model. |
| gen_ai.completion.0.role | string | The role used by the GenAI model. |
| gen_ai.prompt.0.content | string | The full prompt sent to the GenAI model. |
| gen_ai.prompt.0.role | string | The role setting for the GenAI request. |
| gen_ai.request.model | string | The name of the GenAI model a request is being made to. |
| gen_ai.response.model | string | The name of the model that generated the response. |
| gen_ai.system | string | The GenAI product as identified by the client or server instrumentation. |
| gen_ai.usage.completion_tokens | integer | The number of tokens used in the GenAI response (completion). |
| gen_ai.usage.prompt_tokens | integer | The number of tokens used in the GenAI input (prompt). |
| llm.request.type | string | The type of the operation being performed. |
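The sketch below shows how these attributes could be set manually on a span wrapped around a single Ollama call, using the tracer from the snippet above. It is an illustration under those assumptions, not the Dynatrace-provided instrumentation; the token counts rely on the prompt_eval_count and eval_count fields that Ollama returns when available.

```python
# Record the GenAI span attributes listed above around one Ollama request.
import requests
from opentelemetry import trace

tracer = trace.get_tracer("ollama-demo")
prompt = "Summarize what Ollama does."

with tracer.start_as_current_span("ollama.generate") as span:
    span.set_attribute("gen_ai.system", "ollama")
    span.set_attribute("gen_ai.request.model", "llama3")  # example model name
    span.set_attribute("llm.request.type", "completion")
    span.set_attribute("gen_ai.prompt.0.role", "user")
    span.set_attribute("gen_ai.prompt.0.content", prompt)

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=60,
    ).json()

    span.set_attribute("gen_ai.response.model", resp.get("model", "llama3"))
    span.set_attribute("gen_ai.completion.0.role", "assistant")
    span.set_attribute("gen_ai.completion.0.content", resp.get("response", ""))
    # Ollama reports token counts as prompt_eval_count / eval_count when available.
    span.set_attribute("gen_ai.usage.prompt_tokens", resp.get("prompt_eval_count", 0))
    span.set_attribute("gen_ai.usage.completion_tokens", resp.get("eval_count", 0))
```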