March 17, 2026 | CQL and FHIR, Digital Quality Measures (DQM), Digital Quality Transformation, Quality Improvement, Star Ratings, Uncategorized, Year Round Prospective HEDIS®
For years, the annual cycle for National Committee for Quality Assurance (NCQA) HEDIS® measure development and deployment has been optimized for retrospective measurement. That design made sense when quality measurement was largely about looking backward at performance. But as health plans increasingly invest in digital HEDIS® measurement, the timeline that governs measure releases and implementations must change too.
Rebecca Jacobson, MD, MS, FACMI
Co-Founder, CEO, and President
This two-part blog addresses two related “time” problems facing health plans pursuing digital HEDIS® measurement:

1. The annual HEDIS® release cycle, which must deliver validated digital measures early enough to support prospective, in-year quality improvement (the subject of Part 1).
2. The multi-year transition each health plan must plan and execute to move from traditional to digital HEDIS® operations (the subject of this post).
These two timelines are deeply connected. Without a stable annual cycle, the industry will struggle to operationalize digital measurement. And without a realistic multi-year transition plan, health plans will never reach the point where they can leverage the full benefits of digital quality.
Before diving in, you can read Part 1 here.
Even with a stable annual cadence in place, health plans face a larger challenge: transitioning their operations to digital HEDIS® measurement. That is a substantial transformation.
At Astrata, we work with health plans through this transition using a digital quality maturity model that helps assess readiness, map data to FHIR resources, and build toward parity with traditional reporting. Along the way, we’ve identified five principles that are essential for making the transition feasible.
But principles alone don’t create a plan. Health plans also need a concrete sense of what the multi-year journey actually looks like so they can allocate resources, set milestones, and know whether they’re on track.
Astrata’s DQM maturity model structures this journey into four distinct phases, each with a defined scope and duration. The table below maps those phases to a realistic calendar, assuming a health plan beginning its transition now.
| Phase | Typical Duration | Approximate Timing | What Happens |
|---|---|---|---|
| Phase 1: Initial Mapping and Testing | 6-12 months | 2025-2026 | Map core FHIR resource types; run an initial subset of measures against a limited population; identify data quality issues early |
| Phase 2: Full Mapping and Benchmarking | 8-12 months | 2026-2027 | Complete final data mapping and benchmarking; expand to the full measure set; scale to the full member population; prepare for parallel reporting |
| Phase 3: NCQA Parallel Reporting Year | One-time submission | 2028-2029 | Report from both traditional and digital engines simultaneously for one year; complete auditor review; resolve remaining edge cases |
| Phase 4: Production Deployment | Ongoing | By 2029-2030 | Digital measurement fully deployed and integrated into health plan operations |
A few things are worth noting about this roadmap. Phase 1 is not optional runway. It is where the foundation gets built, and skipping or compressing it creates serious risk downstream. The combination of phases 1 and 2 can span anywhere from 14 to 24 months depending on data infrastructure maturity, which means plans that haven’t started yet have little margin to spare.
The parallel reporting year is also a harder lift than it may appear. Running two measurement systems simultaneously, reconciling results, and completing an auditor review in a single submission cycle takes sustained organizational effort. Plans should budget and staff for it accordingly, not treat it as a formality.
Most importantly, the ability to hit these milestones depends on the annual HEDIS® release cycle becoming more predictable over this period. If measure specifications, implementation guides, and CQL packages continue to arrive late in the measurement year, health plans will find it increasingly difficult to complete Phase 2 and enter the parallel reporting year on schedule. This is exactly why the two time problems described in this post must be solved together.
The five principles below are what make this transition feasible in practice.
One of the most important lessons from early adopters is that organizations must be able to run measures digitally during the transition, not just after it.
Mapping data to the NCQA HEDIS® IG cannot be done in isolation. Teams must iteratively test their mappings against real measures written and executed in CQL. Without a digital engine to run measures during development, there’s no way to validate whether data mapping actually works.
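To make this concrete, here is a toy sketch (not Astrata's implementation, and all field and function names are hypothetical) of the kind of check a mapping test harness performs: transform a source lab-result row into a minimal FHIR Observation, then confirm the fields a CQL measure would reference are actually populated.

```python
# Hypothetical sketch of a mapping test: map a source lab-result row to a
# minimal FHIR Observation and flag any fields a measure needs that are empty.
# This is illustrative only, not a real measure engine or the HEDIS IG.

def map_lab_row_to_observation(row: dict) -> dict:
    """Map a source lab-result row to a minimal FHIR Observation resource."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org", "code": row["loinc"]}]},
        "subject": {"reference": f"Patient/{row['member_id']}"},
        "effectiveDateTime": row["service_date"],
        "valueQuantity": {"value": row["result_value"], "unit": row["unit"]},
    }

def validate_for_measure(obs: dict) -> list[str]:
    """Return the measure-relevant fields that are missing or empty."""
    required = ["code", "subject", "effectiveDateTime", "valueQuantity"]
    return [field for field in required if not obs.get(field)]

# Example: an HbA1c result (LOINC 4548-4) with all fields present.
row = {"member_id": "M123", "loinc": "4548-4",
       "service_date": "2026-03-01", "result_value": 6.8, "unit": "%"}
obs = map_lab_row_to_observation(row)
print(validate_for_measure(obs))  # an empty list means the mapping is complete
```

The point of running checks like this iteratively, against real CQL logic, is that a mapping can look complete in isolation and still fail the moment a measure tries to read it.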
At the same time, most health plans don’t want to purchase a second engine just to complete this process. Astrata’s pricing model is designed with that reality in mind.
Successful programs build gradually:

- Start with a small subset of measures run against a limited member population.
- Expand to the full measure set as mappings stabilize.
- Scale to the full member population before entering parallel reporting.
This staged approach lets teams move quickly while managing compute costs and operational risk. Astrata’s methodology builds incrementally toward full-population coverage to help you gain momentum early.
Many FHIR resource types appear repeatedly across measures; others appear only occasionally. But eventually, all of them matter. Health plans should have a deliberate plan to test mappings across the entire resource set early in the process. Otherwise, rarely used resources can cause unexpected failures late in implementation.
Astrata’s approach is designed to surface those resources early in the transition process.
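One simple way to think about surfacing those resources is a coverage check: count how often each resource type is referenced across the measure set, then flag the low-frequency types and any types not yet mapped. The sketch below is purely illustrative; the measure names and resource lists are hypothetical, not the actual HEDIS® IG requirements.

```python
from collections import Counter

# Hypothetical sketch: tally FHIR resource-type usage across a measure set
# to surface unmapped and rarely used types early. Illustrative data only.

measure_requirements = {            # not the real HEDIS IG requirements
    "BCS-E": ["Patient", "Procedure", "Observation"],
    "CBP":   ["Patient", "Observation", "Encounter"],
    "COL-E": ["Patient", "Procedure", "Observation", "DiagnosticReport"],
}
mapped_types = {"Patient", "Observation", "Encounter", "Procedure"}

usage = Counter(t for types in measure_requirements.values() for t in types)
unmapped = sorted(set(usage) - mapped_types)        # required but not yet mapped
rare = sorted(t for t, n in usage.items() if n == 1)  # used by only one measure

print("unmapped:", unmapped)
print("rare (test deliberately):", rare)
```

Even a crude inventory like this makes the long tail visible up front, instead of letting a single-measure resource type break a submission late in the cycle.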
A disciplined methodology prioritizes the most frequent data quality and completeness issues upfront. Once those are resolved, teams can work through the long tail of edge cases without losing momentum.
Astrata’s eMeasure software provides the visibility needed to make informed decisions about mapping adjustments throughout this process.
One of the most important organizational challenges is building alignment with internal IT teams. Standards-based engineering for digital quality measurement is fundamentally different from the interoperability work many IT organizations have done to meet federal API requirements.
For example, compliance with interoperability rules from the Assistant Secretary for Technology Policy often focused on exposing APIs such as patient access APIs, with no requirement for the completeness or accuracy of data delivered through them.
Digital quality measurement is different: the completeness and accuracy of the data flowing through these pipelines directly determine measure results, and therefore reported quality performance.
This is not a box-checking exercise. It is mission-critical analytics infrastructure, and IT needs to understand that distinction early so they can allocate the time and resources the work actually requires.
The Astrata eMeasure team can help bridge these competing priorities and support faster organizational alignment.
The path to digital quality measurement depends on solving these two interrelated time problems simultaneously.
First, the industry needs a stable annual timeline that puts validated measures in health plans’ hands early enough to support prospective care improvement, not just retrospective reporting.
Second, health plans need a deliberate multi-year transition strategy that systematically moves them from traditional HEDIS® operations to a fully digital measurement ecosystem.
Solve both, and digital measurement can finally deliver on its promise: using quality data during the measurement year to improve care, not just to report on it after the fact.