The Metrics behind High-Performing DevOps Organizations
The 2019 Accelerate State of DevOps report was recently released. This annual research compilation is a great resource for seeing what's going on in the world of agile and DevOps.
The report evaluates organizations against five key metrics, collectively called software delivery and operational performance metrics:
- Lead time for changes: How long does it take for a change to get from dev to production?
- Deployment frequency: How often is change deployed?
- Time to restore service: If production has an issue, how long until it is corrected?
- Change failure rate: What percentage of changes deployed to production don’t work properly?
- Availability: How much of the time can end users actually access the service?
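To make these definitions a bit more concrete, here is a minimal sketch of how the first four metrics could be computed from a team's own deployment history. The data structure and field names are purely hypothetical; in practice the inputs would come from a CI/CD system and an incident tracker.

```python
from datetime import datetime
from statistics import median

# Hypothetical deployment records for one service. Every field name here is
# illustrative; real data would come from your CI/CD system and incident tracker.
deployments = [
    {"committed": datetime(2019, 9, 2, 9, 0),  "deployed": datetime(2019, 9, 2, 15, 0),
     "failed": False, "restore_minutes": None},
    {"committed": datetime(2019, 9, 3, 10, 0), "deployed": datetime(2019, 9, 4, 11, 0),
     "failed": True,  "restore_minutes": 45},
    {"committed": datetime(2019, 9, 5, 8, 30), "deployed": datetime(2019, 9, 5, 16, 0),
     "failed": False, "restore_minutes": None},
]

# Lead time for changes: elapsed time from commit to running in production.
lead_time_hours = [(d["deployed"] - d["committed"]).total_seconds() / 3600
                   for d in deployments]
median_lead_time = median(lead_time_hours)

# Deployment frequency: deploys per day over the observed window.
window_days = (max(d["deployed"] for d in deployments)
               - min(d["deployed"] for d in deployments)).days or 1
deploys_per_day = len(deployments) / window_days

# Change failure rate: share of production deploys that didn't work properly.
failures = [d for d in deployments if d["failed"]]
change_failure_rate = len(failures) / len(deployments)

# Time to restore service: how long it took to correct failed deploys.
mean_restore_minutes = (sum(d["restore_minutes"] for d in failures) / len(failures)
                        if failures else 0)

print(f"Median lead time:      {median_lead_time:.1f} hours")
print(f"Deployment frequency:  {deploys_per_day:.2f} deploys/day")
print(f"Change failure rate:   {change_failure_rate:.0%}")
print(f"Mean time to restore:  {mean_restore_minutes:.0f} minutes")
```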
What I’ve always liked about the first four metrics is that they measure the effectiveness of the software development and delivery process by evaluating a combination of speed and quality. Many organizations believe there is a significant tradeoff between the two, but in reality, high-performing organizations achieve both speed and quality.
The metric I always jump to first when reading this report is change failure rate, since it is the one most closely tied to testing and quality. It measures the quality of the entire software development, testing, and delivery process.
Unfortunately, in 2019, much like in past years, the gap between the highest- and lowest-performing organizations remains significant when it comes to change failure rates. The highest-performing organizations have a change failure rate seven times lower than that of low performers. And while the gap for lead time for changes has closed, the quality gap has not.
If you’ve spent any time talking to low-performing organizations, the reasons should be obvious. While there are usually many causes of their struggles, one the Accelerate report calls out is a lack of automation.
We know that automation does not solve all of our quality challenges, but the report highlights how much more the highest-performing organizations (the top 20% of respondents) leverage automation compared to the lowest performers (the bottom 12%).
What the report calls elite performers are embracing automation more in a variety of key areas:
- Builds (about 44% more)
- Unit tests (about 53% more)
- Acceptance tests (about 107% more)
- Performance tests (about 56% more)
- Security tests (about 107% more)
- Provisioning and deployment to test environments (about 85% more)
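This is not the report's methodology, but as a rough illustration of what automating these areas means in practice, the stages map onto an ordered pipeline. The sketch below uses hypothetical `make` targets as stand-ins for whatever build, test, provisioning, and deployment tools a team actually uses; in real life these stages typically live in a CI/CD system rather than a script.

```python
import subprocess
import sys

# Illustrative only: the stage names mirror the automation areas listed above,
# and each command is a placeholder (hypothetical Makefile target), not a real tool.
PIPELINE = [
    ("build",                  ["make", "build"]),
    ("unit tests",             ["make", "unit-test"]),
    ("provision test env",     ["make", "provision-test-env"]),
    ("deploy to test env",     ["make", "deploy-test"]),
    ("acceptance tests",       ["make", "acceptance-test"]),
    ("performance tests",      ["make", "performance-test"]),
    ("security tests",         ["make", "security-test"]),
]

def run_pipeline():
    """Run each stage in order, stopping at the first failure."""
    for name, command in PIPELINE:
        print(f"--- {name} ---")
        result = subprocess.run(command)
        if result.returncode != 0:
            print(f"Stage '{name}' failed; stopping the pipeline.")
            sys.exit(result.returncode)
    print("All stages passed.")

if __name__ == "__main__":
    run_pipeline()
```

The point is less the mechanics than the habit: every one of these checks runs automatically on every change, rather than depending on someone remembering to run it by hand.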
Low performers recognize that they need to improve quality, but their approaches are not very effective. They may fire all of the QA staff and turn testing over to developers (or outsource it), create a new, centralized automation group for the entire organization, or purchase an automation tool and foist it upon the QA organization, telling them to just get started.
Getting into automation requires education and support for the QA team, and it requires everyone to embrace the culture behind automation and the practices that will drive it.