
January 29, 2026

The Innovator’s Dilemma in Quality Measurement

Last week, NCQA convened a meeting of HEDIS® technology vendors to discuss the transition to digital quality. It was attended by a range of vendors, some old and many new. Our very own Bob Cleric, Senior Director of Engineering and one of Astrata’s subject matter experts in digital quality, was there (he’s the guy in the tie!). I had the opportunity to debrief with Bob at length when he returned. The discussion focused on both technical necessities that vendors are still waiting for from NCQA and practical imperatives to drive plan readiness. And reading between the lines, I think the range of approaches vendors are taking is starting to emerge. All of this is reminiscent of a story I’ve seen many times.

Rebecca Jacobson, MD, MS, FACMI

Co-Founder, CEO, and President

“We are in this together.”

 

Digital Quality Measurement and the Innovator’s Dilemma (A Familiar Story in Healthcare) 

There’s a pattern you start to notice if you’ve spent any time in healthcare technology. The companies that are really good at solving yesterday’s problems often struggle the most when the problem itself changes. Clay Christensen famously called this the Innovator’s Dilemma: the idea that successful companies don’t fail because they’re poorly run or short on talent, but because they’re doing exactly what made them successful in the first place, despite fundamental technology shifts. It is almost always the newer companies, starting with a clean technology slate, that are best positioned to win in times of change. And the shift to digital quality measurement is a textbook example of this phenomenon with very clear implications for health plans as they select partners and technologies.

 

The Old World of Quality Measurement 

For a long time, quality measurement followed a very specific script. Care happened. Time passed. And quality teams submitted their reports to the regulatory bodies after the fact. Data came from claims, supplemental feeds such as lab results or immunization registries, and manual chart review. Measurement cycles ran months behind care delivery. And the work was hard (really hard), which meant organizations were willing to pay for tools, services, and expertise to manage the complexity. HEDIS® incumbents were built for that world. And to be clear, they were very good at it. They built platforms that could aggregate enormous amounts of data, support regulatory reporting at scale, and wrap everything in consulting and services to make it all usable. Vendors used proprietary data models, which led to vendor lock-in and even more service costs, often late-breaking and unanticipated. That’s what success looked like. But here’s the thing Christensen warned us about: success locks in assumptions.

 

The New World of Quality Measurement 

Digital quality measurement (dQM) quietly asks a fundamentally new question. Instead of “How do we measure quality after care happens?” it asks: “How do we enable quality as care happens?” That’s a much bigger shift than it sounds. Instead of retrospective reports, you’re dealing with near-real-time signals. Instead of claims, you’re working with clinical data. Instead of abstractors and analysts, you’re leaning on automation and embedded workflows. And ultimately, you’re reaping the benefits of faster learning with lower burden. How do you make this happen? You need to leverage new technology.

Enter digital quality measurement with a new set of assumptions. Digital quality products are built FHIR- and CQL-native, and they are meant to support the future of healthcare quality measurement. Key differences in the new technology include near real-time measurement, use of clinical data over claims data and abstraction, and a shift away from manual ETL to automation. All of this is possible because digital quality is standards-based, leveraging the HL7 FHIR data standard, US Core, and the NCQA HEDIS® Implementation Guide (IG). By relying on standards, we say goodbye to proprietary models and vendor lock-in. That means product companies will have to compete on product fit for the emerging opportunity and improve computational efficiency to bring down price. That’s exactly what Christensen describes as disruptive innovation.
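To make the standards-based idea concrete, here is a minimal sketch of measure logic reading FHIR-shaped resources directly rather than a proprietary vendor schema. Everything in it is illustrative: the patient and observation contents, the LOINC code, and the toy numerator/denominator rules are stand-ins, not NCQA’s actual HEDIS® logic (which is distributed as CQL measure packages and run by a CQL engine).

```python
# Illustrative only: a toy "digital measure" evaluated over FHIR-shaped
# resources. The rules below are NOT real HEDIS logic.
from datetime import date

# FHIR-shaped resources, abbreviated to the fields this sketch uses
patients = [
    {"resourceType": "Patient", "id": "p1", "birthDate": "1958-04-02"},
    {"resourceType": "Patient", "id": "p2", "birthDate": "1990-07-15"},
]
observations = [
    {"resourceType": "Observation",
     "subject": {"reference": "Patient/p1"},
     "code": {"coding": [{"system": "http://loinc.org", "code": "4548-4"}]},
     "valueQuantity": {"value": 7.2, "unit": "%"},  # hypothetical HbA1c result
     "effectiveDateTime": "2025-11-03"},
]

def age_on(birth_date: str, as_of: date) -> int:
    """Age in whole years as of a given date."""
    b = date.fromisoformat(birth_date)
    return as_of.year - b.year - ((as_of.month, as_of.day) < (b.month, b.day))

def in_denominator(patient: dict, as_of: date = date(2025, 12, 31)) -> bool:
    """Toy denominator: members aged 18-75 at period end."""
    return 18 <= age_on(patient["birthDate"], as_of) <= 75

def in_numerator(patient: dict) -> bool:
    """Toy numerator: any HbA1c result (LOINC 4548-4) under 8.0%."""
    ref = f"Patient/{patient['id']}"
    for obs in observations:
        if obs["subject"]["reference"] != ref:
            continue
        codes = {c["code"] for c in obs["code"]["coding"]}
        if "4548-4" in codes and obs["valueQuantity"]["value"] < 8.0:
            return True
    return False

denominator = [p for p in patients if in_denominator(p)]
numerator = [p for p in denominator if in_numerator(p)]
rate = len(numerator) / len(denominator)
```

The point of the sketch is the binding, not the arithmetic: the logic reads standard resource shapes (Patient, Observation, LOINC codings) that any conformant data source can supply, so nothing ties the measure to one vendor’s data model, and the same evaluation can run as often as the data refreshes.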

Early on, these new solutions don’t look as comprehensive as the incumbents. The new products don’t cover all the measure sets yet. They have less history. The products aren’t shrink-wrapped, and the pricing isn’t 100% worked out. 

And that’s where the dilemma kicks in. 

Incumbents ignore the new tech, not because they are blind, but because it conflicts with their à la carte, service-based pricing models, and because the new technology has high development costs and plenty of execution risk. For them, the rational move is to keep improving on what already works. For incumbent vendors, that might mean transforming FHIR data back into the incumbent’s proprietary data model, or recoding NCQA’s CQL measure packages back into proprietary measure logic. “Stay with us and you can avoid the pain of this change,” they say. But that rarely works. Because when the new technology is inherently better and cheaper, no amount of incremental product improvement is enough. That’s when disruption gets a foothold. And it’s also when smart buyers begin to see the writing on the wall.

 

Why Pivoting Is Harder Than It Looks 

When people ask why these big platforms haven’t “just modernized,” they usually underestimate what they’re asking for. These systems were designed for aggregation and reporting, not real-time activation. Retrofitting them isn’t a feature release—it’s a fundamental rewrite. And rewriting a platform that still generates meaningful revenue is one of the hardest decisions a company can make. There’s no villain here. Just incentives. When your business model is aligned to manual work, customization, and services, automation isn’t exciting. It’s existentially threatening. When you’ve built your revenue on proprietary data models, open data standards like FHIR and US Core aren’t empowering, they’re terrifying. All of this is what leads previously successful companies to miss the opportunity, while newer entrants quickly take their place.  

 

Where Are the Innovators Going?

This is where newer quality engines like Astrata’s eMeasure come in—not as “better versions” of the old world, but as products built for a different world entirely. These new engines are made to run with clinical data, as often as daily, with much less data transformation and fewer manual operations, and minimal lag from release of the measure specs and digital measure packages to running in production environments. We are already seeing the opportunity for cost savings compared to traditional measurement—a combination of computational efficiency in how they handle data and labor savings as ETL fades and loading becomes automated.

Health plans that see all of this have an opportunity to get to the head of the line with new tech, and develop partnerships that help them influence the direction of these products. The endgame here isn’t a prettier dashboard or a faster report. It’s a world where quality measurement doesn’t feel so separated from healthcare delivery.

It’s continuous.
It’s embedded.
It’s part of how care gets delivered, not how it gets audited months later.

When quality becomes infrastructure instead of paperwork, the winners won’t be the companies most optimized for yesterday’s model. They’ll be the ones that were designed for this future from the beginning. Digital quality measurement isn’t just a new feature set. It’s the Innovator’s Dilemma playing out in healthcare—slowly, quietly, and then all at once.

And by the time it feels obvious, the leading health plans will already be well ahead.