Our Site Reliability Guardian & Workflows incorporate the Cloud Automation use cases. As Cloud Automation support will be discontinued on December 31, 2024, we recommend a timely upgrade from Cloud Automation to Site Reliability Guardian. Please contact your account team for additional information and assistance.
With Cloud Automation, you can use quality gates to automatically validate your builds, deployments, and releases based on service-level objectives (SLOs). Cloud Automation quality gates are based on Keptn, a Cloud Native Computing Foundation open-source project.
Quality gates are benchmarks in the software delivery lifecycle that define specific, measurable, and achievable success criteria that a service must meet before it is advanced to the next phase of the software delivery pipeline. Quality gates can validate any service-level objective (SLO), giving you the ability to ensure automated and consistent evaluation of software quality.
Establishing clear, consistent, and effective quality gates that are automatically validated at each phase of the delivery pipeline is essential for improving software quality and speeding up delivery.
Dynatrace Cloud Automation is licensed based on the consumption of Cloud Automation units (CAUs). For details, see Cloud Automation monitoring (CAUs).
To enable Cloud Automation quality gates for a specific Dynatrace-monitored service, you need to connect your service with quality gates by adding two specific tags (keptn_managed and keptn_service) to the service in Dynatrace. This automatically adds the service to the dynatrace Cloud Automation project and the quality-gate stage. Targeting a different project or stage isn't possible. To display a quality gate result in the release inventory, two additional tags (keptn_project and keptn_stage) are required.
The following steps describe how to define those tags manually. To automate this process, you can also use auto-tagging rules or the DT_TAGS environment variable.
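For illustration, the same tags could be supplied through the DT_TAGS environment variable when starting the monitored process. The service name my-service below is a placeholder; DT_TAGS takes a space-separated list of tags in key=value form.

```shell
# Hypothetical values; replace my-service with your own service name.
# DT_TAGS expects a space-separated list of tags in key=value form.
export DT_TAGS="keptn_managed keptn_service=my-service keptn_project=dynatrace keptn_stage=quality-gate"
```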
- keptn_managed
- keptn_service; Value: <your_service_name>
- keptn_project; Value: dynatrace
- keptn_stage; Value: quality-gate

When entering a value, be sure to use lowercase, with no spaces or special characters between words.
For more information about tagging, see Best practices and recommendations for tagging.
After adding the above-mentioned tags to your service, the service shows up in the Cloud Automation bridge. Automatic synchronization takes place every minute.
After connecting a service with a quality gate, you can customize the quality gate configuration by using a Dynatrace dashboard and supported tiles. Cloud Automation analyzes the dashboard and automatically creates the SLIs and SLOs that define the quality gate configuration.
To learn how to set up SLOs and SLIs on the Dynatrace-monitored service for which you want to run quality gate evaluations, see Service-level objectives and Service-level indicators.
To add and configure a dashboard for quality gate evaluations
Go to Dashboards or Dashboards Classic (latest Dynatrace) and select Create dashboard.
For Dashboard name, use the following pattern: KQG;project=<project>;service=<service>;stage=<stage>. Be sure to replace the placeholders (<...>) with real values derived from your Cloud Automation instance (example: KQG;project=dynatrace;service=iampservice;stage=quality-gate).
Select Edit to edit the dashboard.
Add and configure any of the following tiles:
Select Done.
To exclude a tile from the quality gate evaluation, add the key-value pair exclude=true to the tile title. For example, a Response time; exclude=true title excludes the Response time tile from the quality gate evaluation.
You can use the Markdown tile to configure the comparison and scoring strategy.
By default, Cloud Automation performs a quality gate evaluation using the following comparison and scoring properties:
comparison:
  compare_with: "single_result"
  number_of_comparison_results: 1
  include_result_with_score: "pass"
  aggregate_function: avg
total_score:
  pass: 90%
  warning: 75%
For details about comparison and scoring, consult the Cloud Automation documentation.
To override the default values, add a Markdown tile to the dashboard containing one or more of the following semicolon-separated key-value pairs:

- KQG.Compare.Results (a number greater than 0): sets <value> as the number_of_comparison_results value under comparison. The compare_with value is set automatically according to this value.
- KQG.Compare.WithScore (pass, all, or pass_or_warn): sets <value> as the include_result_with_score value under comparison.
- KQG.Compare.Function (avg, p50, p90, or p95): sets <value> as the aggregate_function value under comparison.
- KQG.Total.Pass (a percentage value, including %): sets <value> as the pass value under total_score.
- KQG.Total.Warning (a percentage value, including %): sets <value> as the warning value under total_score.

Example override:
KQG.Total.Pass=90%;KQG.Total.Warning=75%;KQG.Compare.WithScore=pass;KQG.Compare.Results=1;KQG.Compare.Function=avg
While you can have multiple Markdown tiles on the dashboard, only one Markdown tile may be used to configure comparison and scoring. This Markdown tile must consist of key-value pairs only.
You can use the SLO tile to define a metric with its target and warning state.
An SLO tile produces an SLI with the same name as the underlying SLO. The SLO's target and warning thresholds are mapped to the Pass and Warning criteria. Querying remote environments or using custom management zones or timeframes is not supported.
You can use Data Explorer tiles to define a metric and its thresholds for the Pass, Warning, and Fail states.
Data Explorer tiles must include a single query (one metric). Metric selectors provided via the Advanced mode are supported.
The Data Explorer tile allows you to define thresholds that are used when evaluating the metric as part of the quality gate evaluation. To define the Pass, Warning, and Fail criteria, you must define the threshold colors and values. The threshold values depend on the selected metric and must be strictly monotonically increasing. The threshold colors have to reflect one of the following state orders: Pass > Warning > Fail, or Fail > Warning > Pass.
To select a color for the quality gate evaluation state, open the color panel and select one of the available cell colors. Any configuration that uses different colors will fail.
The available cell color ranges correspond to the Pass, Warning, and Fail states.
When configuring the thresholds for validating a metric, you can also define the metric unit so that the metric value is correctly converted from the base unit to the selected unit.
You can use the Problems tile to derive the current number of problems when executing a quality gate evaluation.
A Problems tile on the dashboard is mapped to an SLI called Problems with the total count of open problems.
To link the new Dynatrace dashboard (which represents the quality gate configuration) to your Cloud Automation instance
Retrieve your current dynatrace.conf.yaml configuration file or, if you don't have one, create it.
In your dynatrace project, create a dynatrace.conf.yaml file with the following content:
spec_version: '0.1.0'
Adapt the configuration file by adding the dashboard property and its value.
To set the value, you have two options:
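For illustration, a completed dynatrace.conf.yaml could look like the following sketch. The query value tells Cloud Automation to look up the dashboard by its KQG;project=...;service=...;stage=... name; per the Keptn dynatrace-service convention, a specific dashboard ID can typically be used instead, though you should verify this against the Cloud Automation documentation.

```yaml
spec_version: '0.1.0'
# 'query' resolves the dashboard via its KQG name pattern
dashboard: query
```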
Upload the updated configuration file (which will override an existing one).
Run the command below, making sure to replace:

- <your_cloud_automation_url> with your Cloud Automation instance URL
- <your_service_name> with your service name
- <a_valid_bearer_token> with a valid OAuth 2.0 Bearer token (see Authentication for details)
- <your_configuration_file_base64_encoded> with the base64-encoded content of the configuration file

curl --request 'POST' \
  'https://<your_cloud_automation_url>/resource-service/v1/project/dynatrace/stage/quality-gate/service/<your_service_name>/resource' \
  --header 'accept: application/json' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer <a_valid_bearer_token>' \
  --data-raw '{ "resources": [ { "resourceContent": "<your_configuration_file_base64_encoded>", "resourceURI": "dynatrace/dynatrace.conf.yaml" } ] }'
Example command:
Base64-encode the content of the configuration file (e.g. dynatrace.conf.yaml) that you created.

cat ./dynatrace.conf.yaml
spec_version: '0.1.0'
dashboard: query

base64 ./dynatrace.conf.yaml
c3BlY192ZXJzaW9uOiAnMC4xLjAnCmRhc2hib2FyZDogcXVlcnkKCg==
Use the value c3BlY192ZXJzaW9uOiAnMC4xLjAnCmRhc2hib2FyZDogcXVlcnkKCg== as resourceContent in the payload of your request.
curl --request 'POST' \
  'https://abc12345.cloudautomation.live.dynatrace.com/api/resource-service/v1/project/dynatrace/stage/quality-gate/service/iampapservice/resource' \
  --header 'accept: application/json' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer eyJhbGciO[...]fVTfPKN1rOpfaO94FW0' \
  --data-raw '{ "resources": [ { "resourceContent": "c3BlY192ZXJzaW9uOiAnMC4xLjAnCmRhc2hib2FyZDogcXVlcnkKCg==", "resourceURI": "dynatrace/dynatrace.conf.yaml" } ] }'
In the example above, we add (or overwrite) the resource dynatrace/dynatrace.conf.yaml on the Cloud Automation instance abc12345 for the IAM PAP Service service running in the quality-gate stage of the dynatrace project, using a valid Bearer token.
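As a quick sanity check, you can decode the resourceContent from the example payload and confirm that it round-trips back to the original configuration file:

```shell
# Decode the base64 resourceContent used in the example request above;
# the output should match the contents of dynatrace.conf.yaml.
echo 'c3BlY192ZXJzaW9uOiAnMC4xLjAnCmRhc2hib2FyZDogcXVlcnkKCg==' | base64 -d
```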
After connecting your Dynatrace-monitored service with Cloud Automation quality gates, you can trigger quality gate evaluations.
How the evaluation works:
Cloud Automation queries service-level indicators from Dynatrace and compares them against service-level objectives. If the objectives are met, the quality gate evaluation result is succeeded; otherwise, it is failed.
A Cloud Automation quality gate automates the validation of different software versions or builds. To evaluate different software versions or builds, you can run multiple quality gates.
You have two options to trigger a quality gate evaluation for release validation: using the REST API or the Cloud Automation bridge. See below for instructions.
To learn how to set up SLOs and SLIs on the Dynatrace-monitored service for which you want to run quality gate evaluations, see Service-level objectives and Service-level indicators.
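For the REST API option, a minimal sketch of a trigger request is shown below. It assumes the Keptn evaluation endpoint (POST .../api/v1/project/{project}/stage/{stage}/service/{service}/evaluation) is exposed by your Cloud Automation instance, which is not confirmed on this page; the placeholders, timestamps, and releasesVersion label value are illustrative. Consult the Cloud Automation documentation for the authoritative request format.

```shell
# Hedged sketch: trigger a quality gate evaluation over a fixed timeframe.
# <your_cloud_automation_url>, <your_service_name>, <a_valid_bearer_token>,
# the timestamps, and the releasesVersion label value are placeholders.
curl --request 'POST' \
  'https://<your_cloud_automation_url>/api/v1/project/dynatrace/stage/quality-gate/service/<your_service_name>/evaluation' \
  --header 'Content-Type: application/json' \
  --header 'Authorization: Bearer <a_valid_bearer_token>' \
  --data-raw '{ "start": "2024-01-15T10:00:00", "end": "2024-01-15T10:30:00", "labels": { "releasesVersion": "1.2.3" } }'
```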
Cloud Automation bridge
After running the quality gate evaluation, you can view evaluation results on the Cloud Automation bridge in Services. When you select a service, a heatmap and a chart appear on the evaluation board, displaying the evaluation comparison for the quality-gate stage.
For more information, see Services view.
Release inventory
The release inventory displays the quality gate result by showing a colored traffic light. Select the link next to the traffic light to open the Cloud Automation bridge for more details on the quality gate result.
To make this work, make sure to

- add all four tags (keptn_managed, keptn_service, keptn_project, and keptn_stage) to the service as explained above.
- set the label releasesVersion with the right release version when triggering a quality gate.

For additional insights into Dynatrace quality gates, check our Dynatrace University tutorials: