Pushes custom logs to Dynatrace.
This endpoint is available in your SaaS environment or, alternatively, you can expose it on an Environment ActiveGate with the Log Analytics Collector module enabled. This module is enabled by default on all of your ActiveGates.
The request consumes one of the following payload types:
text/plain: limited to a single log event.
application/json: supports multiple log events in a single payload.
Be sure to set the correct Content-Type header and encode the payload with UTF-8, for example: application/json; charset=utf-8.
POST | SaaS | https://{your-environment-id}.live.dynatrace.com/api/v2/logs/ingest |
POST | Environment ActiveGate or Cluster ActiveGate | https://{your-activegate-domain}:9999/e/{your-environment-id}/api/v2/logs/ingest |
To execute this request, you need an access token with the logs.ingest scope. To learn how to obtain and use it, see Tokens and authentication.
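A minimal sketch of such a request in Python, assuming the requests library, placeholder values for the environment ID and token, and the usual Api-Token authorization header used with Dynatrace API tokens (see Tokens and authentication for the authoritative details):

import requests

ENVIRONMENT_ID = "abc12345"        # placeholder
API_TOKEN = "dt0c01.sample.token"  # placeholder; never hard-code real tokens

# SaaS endpoint; for an ActiveGate, swap in the ActiveGate URL shown above.
url = f"https://{ENVIRONMENT_ID}.live.dynatrace.com/api/v2/logs/ingest"
headers = {
    "Authorization": f"Api-Token {API_TOKEN}",
    "Content-Type": "application/json; charset=utf-8",
}

# A single log event; an array of such objects is also accepted.
event = {
    "content": "Exception: Custom error log sent via Generic Log Ingest",
    "log.source": "/var/log/syslog",
    "severity": "error",
}

response = requests.post(url, json=event, headers=headers, timeout=10)
print(response.status_code, response.text)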
When using log processing with the custom processing pipeline (OpenPipeline), ingest supports all JSON data types for attribute values. This requires SaaS version 1.295+ when using the SaaS API endpoint or ActiveGate version 1.295+ when using the ActiveGate API endpoint. In all other cases, all ingested values are converted to the string type.
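For illustration, with OpenPipeline-backed processing an event may carry non-string attribute values such as numbers or booleans (the attribute names below are purely illustrative); without OpenPipeline, the same values would be stored as strings:

{
  "content": "Order processed",
  "severity": "info",
  "order.total": 129.99,
  "order.items": 3,
  "order.is_gift": false
}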
The body of the request. Contains one or more log events to be ingested.
The endpoint accepts one of the following payload types, defined by the Content-Type header:
text/plain: supports only one log event.
application/json: supports multiple log events in a single payload.

LogMessageJson object
The log message in JSON format. Use one object representing a single event or an array of objects representing multiple events.
The object might contain the following types of keys (the possible key values are listed below):
Timestamp: if the provided value is missing or can't be parsed, the current timestamp is used and the original value is kept in the unparsed_timestamp attribute (for Log Monitoring Classic, this attribute isn't indexed).
Severity: if not set or not recognized, NONE is used.
Attribute: only values of the string type are supported. All attributes are indexed and can be used in queries. Keys are case-insensitive (lowercased).
Semantic attribute: only values of the string type are supported. Semantic attributes are indexed and can be used in queries. If an unsupported key occurs, it is not indexed and can't be used in indexing and aggregations. Keys are case-insensitive (lowercased).
Attributes structure
Complex objects are flattened to simplify handling and representation. The following guidelines outline the process (a short sketch of the logic follows the examples below):
The keys are concatenated using a dot (.) until a simple value is reached in the hierarchy. For example:
Base JSON:
{"test": { "attribute": {"one": "value 1", "two": "value 2"}}}
Result:
{"test.attribute.one": "value 1", "test.attribute.two": "value 2" }
When an array is encountered, a multi-value attribute is created at that level. If the array contains non-simple values, their JSON-stringified form is kept.
Name conflicts are resolved as follows:
When a key would be overwritten, the conflicting value is instead kept under the same key prefixed with "overwritten" and an index. For example:
Base JSON:
{"host.name": "abc", "host": { "name": "xyz"}}
Result:
{"host.name": "abc", "overwritten1.host.name": "xyz"}
The index starts at 1 and increments with each additional conflict on the same key:
Base JSON:
{"service.instance.id": "abc", "service": { "instance.id": "xyz", "instance": { "id": "123"}}}
Result:
{"service.instance.id": "abc", "overwritten1.service.instance.id": "xyz", "overwritten2.service.instance.id": "123" }
Limitations
The object value can be a single constant or an array of constants. The length of the value is limited. Any content exceeding the limit is trimmed. Default limits:
Supported timestamp keys:
Supported content keys:
Supported severity keys:
Supported semantic attribute keys:
This is a model of the request body, showing the possible elements. It must be adjusted for use in an actual request.
[{"content": "Exception: Custom error log sent via Generic Log Ingest","log.source": "/var/log/syslog","timestamp": "2022-01-17T22:12:31.0000","severity": "error","custom.attribute": "attribute value"},{"content": "Exception: Custom error log sent via Generic Log Ingest","log.source": "/var/log/syslog","timestamp": "2022-01-17T22:12:35.0000"},{"content": "Exception: Custom error log sent via Generic Log Ingest","log.source": "/var/log/syslog"},{"content": "Exception: Custom error log sent via Generic Log Ingest"}]
Only a part of the input events was ingested because some events were invalid. For details, check the response body.
Success. Response doesn't have a body.
Failed. This is due either to the status of your licensing agreement or to an exhausted DPS license.
Failed. The requested resource doesn't exist. This may happen when no ActiveGate is available with the Log Analytics Collector module enabled.
Failed. Request payload size is too big. This may happen when the payload byte size exceeds the limit or when the ingested payload is a JSON array with the size exceeding the limit.
Failed. Too Many Requests. This may happen when ActiveGate is unable to process more requests at the moment or when log ingest is disabled.
Failed. The server either does not recognize the request method, or it lacks the ability to fulfil the request. In Log Monitoring Classic, this may happen when indexed log storage is not enabled.
Failed. The server is currently unable to handle the request. This may happen when ActiveGate is overloaded.
SuccessEnvelope object
details: Success object containing the following fields:
code: The HTTP status code
message: Detailed message

{
  "details": {
    "code": 1,
    "message": "string"
  }
}