Four key metrics | Technology Radar | Thoughtworks (2024)

Last updated: Mar 29, 2022

NOT ON THE CURRENT EDITION

This blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely that it is still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today. Unfortunately, we simply don't have the bandwidth to continuously review blips from previous editions of the Radar. Understand more

Mar 2022

Adopt We feel strongly that the industry should be adopting these items. We use them when appropriate on our projects.

To measure software delivery performance, more and more organizations are defaulting to the four key metrics as defined by the DORA research program: change lead time, deployment frequency, mean time to restore (MTTR) and change fail percentage. This research and its statistical analysis have shown a clear link between high delivery performance and these metrics; they provide a great leading indicator for how a delivery organization as a whole is doing.

We're still big proponents of these metrics, but we've also learned some lessons. We keep observing misguided approaches with tools that help teams measure these metrics based purely on their continuous delivery (CD) pipelines. In particular, when it comes to the stability metrics (MTTR and change fail percentage), CD pipeline data alone doesn't provide enough information to determine whether a deployment failure had real user impact. Stability metrics only make sense if they include data about real incidents that degrade service for the users.
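
To make that concrete, here is a minimal sketch, assuming a hypothetical data shape in which deployment records from the pipeline are joined with incident records from an incident-management system. The field names and values below are illustrative assumptions, not the output of any particular tool:

    from datetime import datetime, timedelta

    # Hypothetical records: deployments come from the CD pipeline, incidents
    # from an incident-management system. The field names are assumptions.
    deployments = [
        {"id": "d1", "deployed_at": datetime(2022, 3, 1, 10, 0)},
        {"id": "d2", "deployed_at": datetime(2022, 3, 2, 15, 30)},
        {"id": "d3", "deployed_at": datetime(2022, 3, 4, 9, 15)},
    ]
    incidents = [
        # Each incident is linked to the deployment that caused it and records
        # when user-facing service degraded and when it was restored.
        {"caused_by": "d2",
         "started_at": datetime(2022, 3, 2, 16, 0),
         "restored_at": datetime(2022, 3, 2, 17, 30)},
    ]

    # Change fail percentage: share of deployments that led to a real incident.
    failed = {i["caused_by"] for i in incidents}
    change_fail_percentage = 100 * len(failed) / len(deployments)

    # Mean time to restore: average time from service degradation to restoration.
    mttr = sum((i["restored_at"] - i["started_at"] for i in incidents),
               timedelta()) / len(incidents)

    print(f"Change fail percentage: {change_fail_percentage:.0f}%")  # 33%
    print(f"MTTR: {mttr}")                                           # 1:30:00

The point of the sketch is that both stability metrics are computed from the incident records, which only an incident or on-call system can supply; the pipeline alone merely knows that a deployment happened.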

We recommend always keeping in mind the ultimate intention behind a measurement and using it to reflect and learn. For example, before spending weeks building up sophisticated dashboard tooling, consider just regularly taking the DORA quick check in team retrospectives. This gives the team the opportunity to reflect on which capabilities they could work on to improve their metrics, which can be much more effective than overdetailed out-of-the-box tooling. Keep in mind that these four key metrics originated out of organization-level research into high-performing teams; used at a team level, they should be a way for the team to reflect on its own behaviors, not just another set of metrics to add to the dashboard.

Oct 2021

Adopt We feel strongly that the industry should be adopting these items. We use them when appropriate on our projects.

To measure software delivery performance, more and more organizations are turning to the four key metrics as defined by the DORA research program: change lead time, deployment frequency, mean time to restore (MTTR) and change fail percentage. This research and its statistical analysis have shown a clear link between high delivery performance and these metrics; they provide a great leading indicator for how a team, or even a whole delivery organization, is doing.

We're still big proponents of these metrics, but we've also learned some lessons since we first started monitoring them. And we're increasingly seeing misguided measurement approaches with tools that help teams measure these metrics based purely on their continuous delivery (CD) pipelines. In particular, when it comes to the stability metrics (MTTR and change fail percentage), CD pipeline data alone doesn't provide enough information to determine whether a deployment failure had real user impact. Stability metrics only make sense if they include data about real incidents that degrade service for the users.

And as with all metrics, we recommend always keeping in mind the ultimate intention behind a measurement and using it to reflect and learn. For example, before spending weeks building up sophisticated dashboard tooling, consider just regularly taking the DORA quick check in team retrospectives. This gives the team the opportunity to reflect on which capabilities they could work on to improve their metrics, which can be much more effective than overdetailed out-of-the-box tooling.

Apr 2019

Adopt We feel strongly that the industry should be adopting these items. We use them when appropriate on our projects.

The thorough State of DevOps reports have focused on data-driven and statistical analysis of high-performing organizations. The result of this multiyear research, published in Accelerate, demonstrates a direct link between organizational performance and software delivery performance. The researchers have determined that only four key metrics differentiate between low, medium and high performers: lead time, deployment frequency, mean time to restore (MTTR) and change fail percentage. Indeed, we've found that these four key metrics are a simple and yet powerful tool to help leaders and teams focus on measuring and improving what matters. A good place to start is to instrument the build pipelines so you can capture the four key metrics and make the software delivery value stream visible. GoCD pipelines, for example, provide the ability to measure these four key metrics as first-class citizens in GoCD analytics.
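
As a rough illustration of what such instrumentation could produce, the sketch below assumes a pipeline step that records one event per production deployment (commit SHA, commit time, deploy time). The event shape and the calculations are assumptions for illustration, not GoCD's analytics API:

    from datetime import datetime
    from statistics import median

    # Hypothetical deployment events, one appended per production deploy by a
    # pipeline step. The field names are assumptions for illustration.
    events = [
        {"commit_sha": "a1b2c3", "committed_at": datetime(2019, 4, 1, 9, 0),
         "deployed_at": datetime(2019, 4, 1, 14, 0)},
        {"commit_sha": "d4e5f6", "committed_at": datetime(2019, 4, 2, 11, 0),
         "deployed_at": datetime(2019, 4, 3, 10, 0)},
        {"commit_sha": "0a1b2c", "committed_at": datetime(2019, 4, 4, 8, 0),
         "deployed_at": datetime(2019, 4, 4, 16, 0)},
    ]

    # Change lead time: elapsed time from commit to that commit running in production.
    lead_times = [e["deployed_at"] - e["committed_at"] for e in events]
    print("Median lead time:", median(lead_times))

    # Deployment frequency: deployments per day over the observed window.
    window_days = max((events[-1]["deployed_at"] - events[0]["deployed_at"]).days, 1)
    print("Deployments per day:", round(len(events) / window_days, 2))

Capturing the commit timestamp alongside the deploy timestamp is the important part: deployment frequency falls out of simply counting the events, while lead time needs both ends of the window.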

Nov 2018

Trial Worth pursuing. It is important to understand how to build up this capability. Enterprises should try this technology on a project that can handle the risk.

The State of DevOps report, first published in 2014, states that high-performing teams create high-performing organizations. Recently, the team behind the report released Accelerate, which describes the scientific method they've used in the report. A key takeaway of both is the set of four key metrics that support software delivery performance: lead time, deployment frequency, mean time to restore (MTTR), and change fail percentage. As a consultancy that has helped many organizations transform, we've seen these metrics come up time and time again as a way to help organizations determine whether they're improving their overall performance. Each metric creates a virtuous cycle and focuses the teams on continuous improvement: to reduce lead time, you reduce wasteful activities which, in turn, lets you deploy more frequently; deployment frequency forces your teams to improve their practices and automation; your speed to recover from failure is improved by better practices, automation and monitoring, which in turn reduces the frequency of failures.

Published: Nov 14, 2018

The key concepts that recur across the entries above are worth spelling out:

  1. DORA Research Program: The article repeatedly refers to the DORA (DevOps Research and Assessment) research program. DORA is known for its State of DevOps reports, which provide valuable insights into the practices and performance of high-performing organizations in the software delivery domain, and it has defined key metrics to measure software delivery performance.
  2. Four Key Metrics: The four key metrics identified by DORA as crucial for measuring software delivery performance are:
      • Change Lead Time: The time it takes to go from code commit to code successfully running in production.
      • Deployment Frequency: How often code is deployed to production.
      • Mean Time to Restore (MTTR): The average time it takes to restore service when there is a service incident or outage.
      • Change Fail Percentage: The percentage of changes to production that result in service degradation or incidents.
  3. Link Between Metrics and Performance: The article emphasizes the statistical analysis conducted by DORA, showing a clear link between high software delivery performance and the four key metrics. These metrics serve as leading indicators for assessing the overall performance of a delivery organization or team.
  4. Misguided Approaches in Measurement: The article warns against measuring these metrics, especially the stability metrics (MTTR and change fail percentage), purely from continuous delivery (CD) pipeline data. That data alone isn't enough; stability metrics should include real incident data that reflects impact on users.
  5. Lessons Learned: The article notes that, despite remaining proponents of the four key metrics, its authors have learned lessons over time. It stresses the importance of understanding the ultimate intention behind a measurement and recommends a reflective, learning-oriented approach.
  6. Application of Metrics at Different Levels: While the four key metrics originated from organization-level research, using them at a team level should serve as a way for teams to reflect on their own behaviors, rather than just adding metrics to a dashboard. The article advocates a thoughtful, context-aware use of these metrics.

In summary, the article advocates for the adoption of the four key metrics from the DORA research program to measure and improve software delivery performance. It also provides insights into the pitfalls of measurement approaches and emphasizes the need for a reflective and learning-focused mindset in utilizing these metrics effectively.
