In this use case, you'll use Dashboards to observe GitLab pipelines and Workflows.

SDLC events are events with a separate event kind in Dynatrace that follow well-defined semantics for capturing data points from a software component's software development lifecycle. The SDLC event specification defines the semantics of those events.
The main benefit is data normalization and tool agnosticism. As a result, Dashboards, apps, and Workflows can be built on SDLC events with well-defined properties rather than tool-specific details.
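For illustration only, a normalized SDLC event could carry properties like the ones below. Every field name and value in this fragment is a hypothetical example; the authoritative property list is defined by the SDLC event specification.

```json
{
  "event.kind": "SDLC_EVENT",
  "event.provider": "gitlab.com",
  "event.category": "pipeline",
  "event.status": "finished",
  "duration": 182000
}
```

Because consumers query these normalized properties rather than GitLab's native payload fields, the same dashboard can also serve events from other CI/CD tools.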
This information is intended for platform engineers and DevOps engineers who use GitLab in their Internal Development Platform (IDP).
In this tutorial, you'll learn how to use Dashboards to analyze data and drive improvements.

Install the Configuration as Code tool of your choice: either install the Terraform CLI or install the Monaco CLI. Depending on the tool you choose, pick the corresponding setup section below.
Create a new platform token with the following permissions and store it in a secure place:
- app-engine:apps:run
- settings:objects:read
- settings:objects:write
- document:documents:write
- document:documents:read

Clone the Dynatrace configuration as code sample repository using the following command.
git clone https://github.com/Dynatrace/dynatrace-configuration-as-code-samples.git
You can choose between two options:
Set up Terraform.
Prepare the Terraform configuration.
The configuration consists of the following steps.
Move to the gitlab_pipeline_observability_terraform directory with the following command.
cd dynatrace-configuration-as-code-samples/gitlab_pipeline_observability_terraform
Store the retrieved platform token in an environment variable.
$env:DYNATRACE_PLATFORM_TOKEN='<YOUR_PLATFORM_TOKEN>'
Store your Dynatrace environment URL in an environment variable.
Make sure to replace <YOUR-DT-ENV-ID> with your Dynatrace environment ID, e.g. abc12345.
$env:DYNATRACE_ENV_URL='https://<YOUR-DT-ENV-ID>.apps.dynatrace.com'
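The two commands above use PowerShell syntax. A rough equivalent for a POSIX shell (Linux/macOS), assuming the same placeholder conventions, is:

```shell
# POSIX-shell equivalents of the PowerShell commands above.
# Replace the placeholders with your real values before running.
export DYNATRACE_PLATFORM_TOKEN='<YOUR_PLATFORM_TOKEN>'
export DYNATRACE_ENV_URL='https://<YOUR-DT-ENV-ID>.apps.dynatrace.com'

# Sanity check: both variables must be non-empty.
[ -n "$DYNATRACE_PLATFORM_TOKEN" ] && [ -n "$DYNATRACE_ENV_URL" ] && echo "environment ready"
```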
Check the OpenPipeline configuration for SDLC events.
These steps modify the OpenPipeline configuration for SDLC events. If your OpenPipeline configuration contains only default/built-in values, you can apply it directly. If you have any dynamic routes, first download your configuration and then manually merge it into the provided configuration.
Step 3 indicates if a configuration merge is needed or if you can apply the provided configuration directly.
Go to Settings > Process and contextualize > OpenPipeline > Software Development Lifecycle.
Check the Dynamic routing section: are there any routes other than Default route?
If the answer is yes, follow the steps in the next section. Otherwise, skip ahead.
Merge OpenPipeline routing entries.
If you have any OpenPipeline routing entries that aren't default, then you need to merge them before you can proceed.
Download your OpenPipeline configuration.
You need to set up the DYNATRACE_ENV_URL and DYNATRACE_PLATFORM_TOKEN environment variables according to the Dynatrace Terraform provider installation guide.
Then you can use the export utility of the Dynatrace Terraform provider to download all routing entries for SDLC events.
Navigate to the directory that includes the terraform-provider-dynatrace_vx.y.z binary and use it to export the dynatrace_openpipeline_v2_events_sdlc_routing configurations, for example,
./terraform-provider-dynatrace_v1.86.0 -export dynatrace_openpipeline_v2_events_sdlc_routing.
You should see an output similar to this example:
The environment variable DYNATRACE_TARGET_FOLDER has not been set. Use the 'configuration' folder as default.
Downloading "dynatrace_openpipeline_v2_events_sdlc_routing"
Count: 1
Post-Processing Resources ...
- [POSTPROCESS] dynatrace_openpipeline_v2_events_sdlc_routing - openpipeline_v2_events_sdlc_routing
Post-Processing Resources - Group child configs with parent configs ...
Finishing touches ...
Writing ___resources___.tf
Writing ___datasources___.tf
Writing main.tf
Writing ___variables___.tf
Writing main ___providers___.tf
Writing modules ___providers___.tf
Remove Non-Referenced Modules ...
Finish Export ...
Terraform executable path: /usr/local/bin/terraform
Executing 'terraform init'
... finished after 3 seconds
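The first line of the output above mentions the DYNATRACE_TARGET_FOLDER environment variable. If you prefer a different output directory, you can set it before running the export. This is a sketch assuming a POSIX shell; the folder path is an arbitrary example.

```shell
# Direct the export utility's output to a custom folder instead of the
# default 'configuration' folder. The variable name is taken from the
# tool's own log message; the path below is an arbitrary example.
export DYNATRACE_TARGET_FOLDER='./sdlc-routing-export'
echo "export target: ${DYNATRACE_TARGET_FOLDER}"
```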
The export utility has created a configuration folder. You can now open the following files:
- ./configuration/modules/openpipeline_v2_events_sdlc_routing/openpipeline_v2_events_sdlc_routing.openpipeline_v2_events_sdlc_routing.tf
- main.tf, which contains the routing configurations for the GitLab pipelines.

Merge all routing_entry blocks of your downloaded routing file into the routing_entries block of the resource events_sdlc_global_routing_table in main.tf, and then save the file.
This is mandatory because the Dynamic Routing table is a global configuration, and the order of the entries, as well as the matcher clauses, determines the overall routing.
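As a sketch of what the merge result could look like: one entry copied from the downloaded file precedes the GitLab entry from the sample. The resource and block names come from the steps above, but the inner attributes of each routing_entry (note, matcher, pipeline reference) are assumptions for illustration; copy the real attributes from your downloaded file.

```hcl
# Hypothetical merge result (attribute names inside routing_entry are
# illustrative; take the real ones from your downloaded routing file).
resource "dynatrace_openpipeline_v2_events_sdlc_routing" "events_sdlc_global_routing_table" {
  routing_entries {
    # Entry copied verbatim from your downloaded routing file.
    routing_entry {
      note        = "Existing custom SDLC route"
      matcher     = "matchesValue(event.provider, \"jenkins\")"
      pipeline_id = "pipeline_My_Custom_SDLC_Pipeline"
    }
    # Entry provided by the sample for GitLab SDLC events.
    routing_entry {
      note        = "GitLab SDLC events"
      matcher     = "matchesValue(event.provider, \"gitlab.com\")"
      pipeline_id = "pipeline_GitLab"
    }
  }
}
```

Because the table is evaluated in order, place more specific matchers before broader ones so existing routes keep receiving their events.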
Apply the Terraform configuration.
Run this command to verify the provided Terraform configuration.
terraform plan
Run this command to apply the provided Terraform configuration.
terraform apply
To receive events processed by OpenPipeline, you need an access token with the following OpenPipeline scopes:
- openpipeline.events_sdlc
- openpipeline.events_sdlc.custom

To generate an access token:

Go to Access Tokens. You can only access your token once upon creation. You can't reveal it afterward.
Select these scopes:
- openpipeline.events_sdlc
- openpipeline.events_sdlc.custom

Save the generated token securely for subsequent steps. We refer to it as <YOUR-ACCESS-TOKEN>.
In GitLab, create the webhook with the following settings:
Enter the URL, replacing the <YOUR-DT-ENV-ID> placeholder with your Dynatrace environment ID.
https://<YOUR-DT-ENV-ID>.live.dynatrace.com/platform/ingest/custom/events.sdlc/gitlab
You can enter an optional webhook name and description, but skip the Secret token setting since a custom header manages request validation.
In the Trigger section, select the following events to trigger the webhook.
Add a custom header to your webhook with the name Authorization and value Api-Token <YOUR-ACCESS-TOKEN>.
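Once the webhook is saved, you can assemble the same request by hand to double-check the URL and the Authorization header. The sketch below only prints the curl command instead of executing it, and the JSON body is a stand-in, not a real GitLab webhook payload.

```shell
# Build the ingest URL and authorization header from your own values.
DT_ENV_ID='<YOUR-DT-ENV-ID>'        # replace with your environment ID
ACCESS_TOKEN='<YOUR-ACCESS-TOKEN>'  # access token created earlier
URL="https://${DT_ENV_ID}.live.dynatrace.com/platform/ingest/custom/events.sdlc/gitlab"

# Print (not run) the request for review; drop the leading 'echo' to send it.
echo curl -X POST "$URL" \
  -H "Authorization: Api-Token ${ACCESS_TOKEN}" \
  -H 'Content-Type: application/json' \
  -d '{"object_kind": "pipeline"}'
```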
Now that you've successfully configured GitLab and Dynatrace, you can use
Dashboards and SDLC events to monitor your GitLab pipelines and merge requests within the entire development organization.
In Dynatrace, open the GitLab Pipeline Pulse, GitLab Merge Request, and GitLab Deployments dashboards. To try them out, go to the Dynatrace Playground.
Use these insights for the following improvement areas:
Increase CI/CD pipeline efficiency.
Observing workflow executions lets you identify bottlenecks and inefficiencies in your CI/CD pipelines.
Knowing about these bottlenecks and inefficiencies helps optimize build and deployment processes, leading to faster and more reliable releases.
Improve developer productivity.
Automated pipelines reduce the manual effort required for repetitive tasks, such as running tests and checking coding standards. This automation allows developers to focus more on writing code and less on administrative tasks.
Get data-driven development insights. Analyzing telemetry data from CI/CD pipelines provides valuable insights into the development process. You can use the telemetry data to make informed decisions and continuously improve the development flows.
Check and adjust your CI/CD pipelines regularly to make sure they're running smoothly.
In Dynatrace, adjust the timeframe of the relevant dashboards to monitor the long-term impact of your improvements.
We highly value your insights on pipeline observability. Your feedback is crucial in helping us enhance our tools and services. Visit the Dynatrace Community page to share your experiences, suggestions, and ideas directly on the Feedback channel for CI/CD Pipeline Observability.