DORA's State of DevOps in 2024
Evidence of an AI-led paradigm shift in software development, tempered by caution
Google's 2024 Accelerate State of DevOps report highlights significant advancements and challenges within the DevOps landscape, particularly focusing on the transformative role of Artificial Intelligence (AI). This comprehensive analysis draws from insights provided by over 39,000 professionals worldwide, offering a detailed examination of software delivery performance, AI adoption, platform engineering, and developer experience.
Background on DORA Metrics
The DORA (DevOps Research and Assessment) metrics have been a cornerstone of DevOps performance measurement for over a decade. DORA began its research in 2014, aiming to identify the practices and capabilities that enable high-performing technology teams. By analyzing data from thousands of organizations across industries, the team developed a framework to measure software delivery performance. The result was the introduction of the "Four Key Metrics," which became the gold standard for assessing DevOps maturity:
Change Lead Time: The time it takes for a code commit to be deployed in production.
Deployment Frequency: How often an organization deploys code changes to production.
Change Failure Rate: The percentage of deployments causing failures in production that require remediation.
Mean Time to Recovery (MTTR): The time it takes to recover from a failure in production.
These metrics were designed to measure both throughput (speed) and stability (reliability), reflecting the trade-offs organizations face in delivering high-quality software quickly.
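To make these definitions concrete, here is a minimal sketch of how a team might compute the four key metrics from its own delivery data. The data structures, field names, and Python implementation are illustrative assumptions, not something prescribed by DORA; in practice the records would come from a CI/CD system and an incident tracker.

```python
from datetime import datetime
from statistics import median

# Hypothetical deployment records; field names are illustrative only.
deployments = [
    {"committed_at": datetime(2024, 6, 1, 9),  "deployed_at": datetime(2024, 6, 1, 15), "failed": False},
    {"committed_at": datetime(2024, 6, 2, 10), "deployed_at": datetime(2024, 6, 3, 11), "failed": True},
    {"committed_at": datetime(2024, 6, 4, 8),  "deployed_at": datetime(2024, 6, 4, 9),  "failed": False},
]
# Hypothetical incident records: when a production failure started and was resolved.
incidents = [
    {"started_at": datetime(2024, 6, 3, 12), "resolved_at": datetime(2024, 6, 3, 14)},
]
observation_days = 30  # length of the measurement window

# Change lead time: commit-to-production duration (median across deployments).
lead_time = median(d["deployed_at"] - d["committed_at"] for d in deployments)

# Deployment frequency: deployments per day over the observation window.
deploy_frequency = len(deployments) / observation_days

# Change failure rate: share of deployments that caused a production failure.
change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

# Time to recover: how long production failures took to resolve.
mttr = median(i["resolved_at"] - i["started_at"] for i in incidents)

print(f"Lead time: {lead_time}, Deploys/day: {deploy_frequency:.2f}, "
      f"CFR: {change_failure_rate:.0%}, MTTR: {mttr}")
```

The median is used for the duration metrics because a handful of outlier deployments or incidents can otherwise dominate an average.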
The Evolution of DORA
Over the years, DORA metrics have undergone refinement and expansion:
2018: The publication of Accelerate: The Science of Lean Software and DevOps by Nicole Forsgren, Jez Humble, and Gene Kim solidified the Four Key Metrics as essential indicators of DevOps success. The book emphasized that elite performers achieve both high throughput and stability, debunking the myth that speed compromises quality.
2019-2021: DORA reports highlighted the growing adoption of cloud technologies, continuous integration/continuous delivery (CI/CD), and automation as enablers of high performance. These practices were strongly correlated with improvements in the Four Key Metrics.
2022-2023: The metrics were further contextualized to account for organizational culture, leadership styles (e.g., transformational leadership), and developer experience. This period also saw increased emphasis on user-centricity, showing how aligning software development with user needs improves product quality and reduces burnout.
2024 DORA Performance Metrics
The 2024 Accelerate State of DevOps report introduced significant updates to how software delivery performance is measured:
Rework Rate: Added as a complementary metric to Change Failure Rate, rework rate measures unplanned deployments needed to fix user-facing bugs. This addition provides deeper insight into software delivery stability (a small illustration follows this list).
Throughput vs. Stability Factors: Recognizing that throughput (speed) and stability (reliability) are not always perfectly correlated, DORA now analyzes these dimensions separately. This nuanced approach helps organizations tailor their improvement strategies based on specific goals.
AI's Impact on Metrics: The report also explored how AI adoption influences these metrics. While AI improves productivity and code quality, it has introduced challenges like larger change lists, which negatively impact throughput and stability.
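As a rough illustration of the distinction the new rework rate metric draws, the sketch below (with hypothetical deployment labels, not data from the report) counts rework, meaning unplanned deployments shipped only to fix user-facing bugs, separately from deployments that themselves failed and required remediation.

```python
# Hypothetical deployment log; the "failed" and "unplanned_bugfix" labels are
# assumptions for illustration, not fields defined by the DORA report.
deployments = [
    {"id": 1, "failed": False, "unplanned_bugfix": False},  # planned feature release
    {"id": 2, "failed": True,  "unplanned_bugfix": False},  # failed in production
    {"id": 3, "failed": False, "unplanned_bugfix": True},   # hotfix for a user-facing bug
    {"id": 4, "failed": False, "unplanned_bugfix": False},
]

total = len(deployments)

# Change failure rate: deployments that caused a failure needing remediation.
change_failure_rate = sum(d["failed"] for d in deployments) / total

# Rework rate: unplanned deployments shipped only to fix user-facing bugs.
rework_rate = sum(d["unplanned_bugfix"] for d in deployments) / total

print(f"Change failure rate: {change_failure_rate:.0%}, Rework rate: {rework_rate:.0%}")
```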
In the 2024 report, four distinct clusters emerged from the data, highlighting the performance levels of responding organizations: elite, high, medium, and low performers.
Over time, the performance levels continue to improve. For instance, elite performers now have a change failure rate of <5%, versus <15% four years ago.
When compared to low performers, elite performers realize:
127x faster lead time
182x more deployments per year
8x lower change failure rates
2293x faster deployment recovery times
Artificial Intelligence: Transformative Impact on DevOps
AI is revolutionizing DevOps by automating processes and enhancing decision-making capabilities. Its integration has led to significant improvements in productivity, code quality, and system monitoring while also presenting new challenges.
Enhanced Productivity and Efficiency
AI-driven tools automate mundane and repetitive tasks such as code generation, testing, and deployment. This automation allows developers to focus on more strategic initiatives and reduces the time to market for new software features. AI copilots assist developers by providing intelligent code suggestions and autocompletion, thereby improving coding speed and accuracy. In the survey, 75% of respondents reported positive productivity gains from AI in the preceding three months.
AI has a substantial and beneficial impact on flow, productivity, and job satisfaction. Productivity, for example, is estimated to increase by approximately 2.1% when an individual's AI adoption increases by 25% (see Figure 7 of the report). This might seem small, but it is an individual-level effect. Imagine this pattern extended across tens of developers, or even tens of thousands of developers.
Counter to the popular belief that AI will free people up to do more valuable work, the survey highlights that AI is accelerating work people already consider valuable, while time spent on toilsome work appears to be unaffected.
Improved Code Quality and Testing
AI enhances code quality by identifying errors and vulnerabilities early in the development cycle. Automated code reviews and real-time suggestions ensure adherence to coding standards and reduce the likelihood of defects in production. AI-powered testing frameworks generate adaptive test cases, improving test coverage and accuracy while minimizing manual effort.
Overall, the patterns here suggest a very compelling story for AI. A 25% increase in AI adoption is associated with a:
7.5% increase in documentation quality
3.4% increase in code quality
3.1% increase in code review speed
1.3% increase in approval speed
1.8% decrease in code complexity
Increased Batch Size = Negative Impact on Software Delivery Performance
Historically, DORA research has found that improvements to the software development process, including improved documentation quality, code quality, code review speed, approval speed, and reduced code complexity, lead to improvements in software delivery.
So it is surprising to see that an increase in AI adoption is associated with decreases in both delivery throughput and delivery stability; the latter is a significant decrease of 7.2%.
As AI increases batch sizes, with a greater amount of code changed in the same amount of time, it creates larger change lists. DORA has consistently shown that larger changes are slower and more prone to creating instability.
That is an important lesson for organizations to learn as AI makes it easier to make larger changes to the codebase.
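One practical way for organizations to watch for this effect is to track batch size directly. The sketch below is a minimal example, assuming that merge commits on the main branch correspond to change lists; it measures each merge as the diff against its first parent, which is only an approximation of pull request size.

```python
import subprocess

def merge_commit_sizes(repo_path=".", branch="main", limit=50):
    """Return (sha, lines_changed) for recent merge commits, measured as the
    diff against the merge's first parent (an approximation of PR size)."""
    shas = subprocess.run(
        ["git", "-C", repo_path, "log", branch, "--merges",
         f"--max-count={limit}", "--pretty=format:%H"],
        capture_output=True, text=True, check=True,
    ).stdout.split()

    sizes = []
    for sha in shas:
        numstat = subprocess.run(
            ["git", "-C", repo_path, "diff", "--numstat", f"{sha}^1", sha],
            capture_output=True, text=True, check=True,
        ).stdout
        changed = 0
        for line in numstat.splitlines():
            added, deleted, _path = line.split("\t", 2)
            if added != "-":  # binary files report "-" for line counts
                changed += int(added) + int(deleted)
        sizes.append((sha, changed))
    return sizes

if __name__ == "__main__":
    for sha, lines in merge_commit_sizes():
        print(f"{sha[:10]}  {lines:>6} lines changed")
```

Plotting these sizes over time, alongside lead time and change failure rate, makes it easier to spot whether AI-assisted changes are quietly growing larger.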
Recommendations on AI Adoption Strategies
The 2024 State of DevOps report highlights the transformative impact of AI on software development practices, but it also presents new challenges that organizations must address to optimize their DevOps processes. Companies should not underestimate the roadblocks, growing pains, and potential negative impacts of AI.
Adopting AI at scale requires a measured, transparent, and adaptable strategy. Recommendations on how to develop that strategy include:
Define a clear AI mission and policies to empower your organization and team.
Create a culture of continuous learning and experimentation with AI.
Recognize and leverage AI’s trade-offs for competitive advantage.
Emphasizing user-centric development and maintaining stable priorities are key strategies for leveraging these advancements effectively. As AI continues to evolve, its role in shaping the future of DevOps will likely expand further, offering new opportunities for innovation and excellence.