AI Won’t Fix Broken Systems: Lessons from the 2025 DORA Report
AI adoption is nearly universal, but the 2025 DORA Report shows that faster coding doesn’t always mean increased productivity.

TLDR
- AI adoption is nearly universal, but productivity gains are mostly perceived, not always measured.
- Individual speed ≠ system performance. Faster code means little if pipelines, reviews, and release processes can’t keep up.
- Delivery instability is rising, especially where teams adopt AI without rethinking workflows or quality gates.
- Strong systems amplify AI’s value: mature version control, healthy data ecosystems, and robust internal platforms make the biggest difference.
- AI mirrors the system it enters—fixing processes and culture is the real unlock for long-term impact.
AI is rapidly reshaping software engineering. The 2025 DORA report shows that adoption has become nearly universal: 90% of survey respondents use AI, and more than 80% believe it has increased their productivity.
The keyword here is ‘believe,’ as various studies have shown that perceived AI productivity gains can be deceptive. One study by METR found that developers believed AI tools made them 20% faster, when in reality the tools slowed them down by 19%. Moreover, individual productivity gains in writing code often do not translate into increased productivity across the entire software delivery lifecycle, and this year’s DORA report dedicates a whole section to that gap.
Engineering organizations don’t need faster typers
AI adoption, as per the DORA research, correlates with improvements in several key areas:
- Higher levels of individual effectiveness
- Higher code quality
- Better team and organizational performance
Individual developers report producing more code, faster. The top use case for AI tools is writing new code, cited by 71% of respondents who code. Yet software delivery remains a system problem. As Chris Westerhold, Global Practice Director for Engineering Excellence at Thoughtworks, put it in The Hangar podcast:
Most engineering organizations do not need faster typers.
The common engineering bottlenecks are flaky pipelines, missing testing strategies, poor documentation, and organizational structures: the usual roadblocks to delivering business value. Your team might get marginally faster at writing code, but unless you address those systemic issues, you’re never going to realize the full value of AI tools.
The DORA report also found no clear link between AI adoption and reductions in friction or burnout and even observed increased delivery instability in some organizations.
While a tool designed to automate repetitive duties might seem like a clear path to a smoother workflow, our data indicates that workplace friction is a much larger and more complex issue than the mere completion of rote tasks. As we’ve indicated, some research points to friction as a product of processes beyond the individual.
AI Engineering Waste
In his analysis on the Thoughtworks blog, Chris warns that AI tools can even contribute to the emergence of a new kind of waste in engineering organizations – AI engineering waste.
Examples of AI engineering waste could be:
- Prompt-response latency: Engineers spend valuable time waiting for AI models to generate responses, delaying workflows and breaking focus.
- Context loss: If AI systems lose track of conversations or project-specific context, developers must repeatedly re-explain issues, leading to frustration and wasted effort.
- AI toolchain fragmentation: Teams juggle multiple, disconnected AI tools and platforms, which leads to frequent context switching and increased cognitive load.
- Validation overhead: Thoroughly reviewing and validating AI-generated code for correctness, security, and coherence adds significant effort to the process.
Without the right structure and processes, AI can turn speed into chaos.
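One way to contain validation overhead is to put the same automated gate in front of every change, whether a human or an AI wrote it. The sketch below is a hypothetical example, not something from the DORA report: it statically scans a generated Python snippet for a couple of review-worthy smells before a human ever looks at it.

```python
# Sketch of a lightweight pre-review gate for AI-generated Python snippets.
# The specific checks are illustrative assumptions, not DORA guidance.
import ast

def validate_snippet(source: str) -> list[str]:
    """Return a list of problems found in a generated snippet."""
    problems = []
    # 1. The snippet must at least parse.
    try:
        tree = ast.parse(source)
    except SyntaxError as exc:
        return [f"syntax error: {exc.msg} (line {exc.lineno})"]
    # 2. Flag bare `except:` clauses, which silently swallow errors.
    for node in ast.walk(tree):
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            problems.append(f"bare except at line {node.lineno}")
    # 3. Flag eval()/exec() calls, which warrant a security review.
    for node in ast.walk(tree):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in {"eval", "exec"}:
                problems.append(f"{node.func.id}() call at line {node.lineno}")
    return problems
```

Checks like these don’t replace human review; they simply ensure reviewers spend their attention on design and correctness rather than on mechanical smells.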
The DORA report also acknowledges that successfully adopting AI in software development is not as simple as just using new tools.
The research identifies seven DORA AI Capabilities that amplify the positive impact of AI adoption across organizations:
- Clear AI strategy and communication
- A healthy, accessible data ecosystem
- Strong version control practices
- Working in small batches
- User-centered design focus
- High-quality internal platforms
- Tight alignment between teams and systems
Organizations that have these capabilities in place tend to amplify AI’s impact; those that don’t often see uneven or unstable results.
For example, strong version control becomes even more critical when AI-generated code dramatically increases the volume of commits. Similarly, working in small batches reduces friction for AI-assisted teams and supports faster, safer iteration.
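The small-batch discipline can even be enforced mechanically. As a minimal sketch (the 200-line threshold and the idea of parsing `git diff --numstat` output are assumptions for illustration, not a DORA recommendation), a pre-merge check could flag changes that have grown past a review-friendly size:

```python
# Illustrative small-batch guard: sum the added/deleted lines from
# `git diff --numstat` output and warn when a change is too large.
# The 200-line limit is an arbitrary example threshold.

MAX_CHANGED_LINES = 200

def changed_lines(numstat: str) -> int:
    """Sum added + deleted lines from `git diff --numstat` output."""
    total = 0
    for line in numstat.splitlines():
        parts = line.split("\t")
        if len(parts) < 3:
            continue
        added, deleted = parts[0], parts[1]
        # Binary files show "-" in numstat; skip them.
        if added.isdigit():
            total += int(added)
        if deleted.isdigit():
            total += int(deleted)
    return total

def batch_ok(numstat: str, limit: int = MAX_CHANGED_LINES) -> bool:
    return changed_lines(numstat) <= limit
```

Wired into CI, a guard like this nudges AI-assisted teams back toward small, reviewable commits instead of letting generated code balloon into unreviewable batches.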
How to Adopt AI Well
AI doesn’t inherently make engineering better—it magnifies whatever system it operates within. In teams with well-defined processes and clean architectures, AI can enhance quality and flow. In teams with tangled pipelines or unclear governance, it can accelerate chaos.
To translate AI adoption into lasting organizational performance, teams must treat it as a systems design problem, not a tooling upgrade.
The report points to several key enablers, some of which are:
- Redesign workflows to match new development speeds. Don’t assume existing processes can carry increased output.
- Invest in internal platforms that centralize documentation, tools, and data.
- Clarify governance and roles so that AI usage aligns with quality and compliance standards.
- Use Value Stream Management (VSM) to ensure local productivity gains translate to system-level improvement.
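To make the VSM point concrete, system-level improvement shows up in delivery metrics such as lead time for changes (commit to deploy), not in individual coding speed. Here is a minimal, hypothetical sketch of that measurement; the timestamps are invented sample data:

```python
# Hypothetical value-stream check: compute lead time for changes
# (commit -> deploy) from paired timestamps. If AI only speeds up
# typing, this system-level number barely moves.
from datetime import datetime
from statistics import median

def lead_times_hours(changes: list[tuple[str, str]]) -> list[float]:
    """Each tuple is (commit ISO timestamp, deploy ISO timestamp)."""
    times = []
    for committed, deployed in changes:
        delta = datetime.fromisoformat(deployed) - datetime.fromisoformat(committed)
        times.append(delta.total_seconds() / 3600)
    return times

# Invented sample data for three changes.
changes = [
    ("2025-01-06T09:00", "2025-01-06T17:00"),  # 8 h
    ("2025-01-07T10:00", "2025-01-09T10:00"),  # 48 h
    ("2025-01-08T12:00", "2025-01-08T15:00"),  # 3 h
]
print(f"median lead time: {median(lead_times_hours(changes)):.1f} h")
# prints "median lead time: 8.0 h"
```

Tracking a number like this before and after an AI rollout is one simple way to test whether local productivity gains are actually reaching the end of the value stream.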
The AI Mirror
The 2024 DORA report already stated explicitly that AI reflects and amplifies your organization’s true capabilities. This is why AI functions both as a mirror and a multiplier. It shines a light on what’s working, accelerating what’s already in motion, but it also surfaces what needs to change.
We are seeing that AI’s effects on performance depend on the system in which the work takes place.
Without intentional changes to workflows, roles, governance, and cultural expectations, AI tools are likely to remain isolated boosts in an otherwise unchanged system—a missed opportunity.