The Difference Between Cleansing Your Data and Actually Keeping It Current

Most health systems have run an item master cleanup project at some point. They dedicate staff, bring in a vendor, resolve duplicates, normalize supplier names, fill in missing fields, and finish with a cleaner catalog than they started with. Then they move on.

Six months later, the gains are already eroding. Twelve months later, a significant share of those records is outdated again.

This is not a failure of execution. It is a fundamental misunderstanding of what data quality actually requires.

Data Decays Whether You Are Watching or Not

Item master data does not hold still. A survey by the Association for Health Care Resource & Materials Management (AHRMM) found that fewer than 40 percent of health system supply chain leaders rated their item master data as "highly accurate," despite most having completed at least one formal cleanup initiative in the prior three years. The gap between cleanup and confidence is not a mystery: industry estimates consistently put annual item master decay rates above 30 percent, meaning nearly a third of your enriched records will drift toward inaccuracy within the year if you do not actively manage them.

The sources of decay are mundane but relentless. Manufacturers update product specifications and packaging configurations. The Centers for Medicare & Medicaid Services (CMS) releases a new Healthcare Common Procedure Coding System (HCPCS) code set every January 1, retiring codes that were valid the year before. The FDA updates its device registration data continuously as products receive clearance, change ownership, or are withdrawn from the market. GS1, the global standards body that manages Global Trade Item Number (GTIN) assignments, processes hundreds of thousands of product changes annually across the medical device and supply categories that health systems purchase. None of those changes automatically updates your item master. They accumulate silently until a claim gets rejected, a scan fails at the point of use, or a stockout triggers emergency freight.

A one-time cleanup project addresses the backlog. It does not address what happens the day after the project closes.

What Gets Missed When Data Is Treated as a Project

The consequences of treating data quality as periodic rather than continuous are concrete and traceable to the income statement.

Take HCPCS codes. CMS adds, revises, and deletes codes each calendar year. A health system that validated its billing codes in Q1 of one year is operating on outdated data by Q1 of the next. Retired codes generate claim denials. Denials generate rework, write-offs, or delayed revenue. For high-cost implantable devices, where reimbursement can run into the tens of thousands of dollars per case, even a modest denial rate from stale billing attributes is a meaningful revenue problem.
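
What annual revalidation looks like mechanically is straightforward. The sketch below is a minimal illustration, not a production implementation: it assumes the item master exports to a CSV with a hcpcs_code column and that the active code set has been flattened to one code per line, both hypothetical formats (the real CMS release ships in its own layout and would need its own parser).

```python
import csv

def load_active_hcpcs(path: str) -> set[str]:
    """Load the current-year HCPCS release as a set of active codes.

    Assumes a plain text file with one code per line -- a simplification;
    the actual CMS distribution requires a format-specific parser.
    """
    with open(path) as f:
        return {line.strip().upper() for line in f if line.strip()}

def flag_retired_codes(item_master_path: str, active_codes: set[str]) -> list[dict]:
    """Return item master rows whose HCPCS code is absent from the current release."""
    stale = []
    with open(item_master_path, newline="") as f:
        for row in csv.DictReader(f):
            code = (row.get("hcpcs_code") or "").strip().upper()
            if code and code not in active_codes:
                stale.append(row)
    return stale

if __name__ == "__main__":
    active = load_active_hcpcs("hcpcs_active_codes.txt")     # hypothetical file
    flagged = flag_retired_codes("item_master.csv", active)  # hypothetical file
    print(f"{len(flagged)} records carry codes not in the current HCPCS release")
```

Run against the January release, a pass like this surfaces retired codes before they reach a claim. Run once and never again, it is the same pass that produced the clean baseline that is now quietly decaying.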

GTIN accuracy follows the same pattern. The FDA's Global Unique Device Identification Database (GUDID) is updated continuously as manufacturers submit new product registrations and modify existing ones. Item masters that are validated against this database once and not refreshed will carry GTINs that no longer match current product configurations, producing barcode scan failures at the point of use. When scans fail, consumption data becomes unreliable. When consumption data is unreliable, replenishment triggers are late, and the resulting shortages get covered by emergency purchasing at premium cost. A study published in the Journal of Healthcare Management found that hospitals with lower item master completeness rates incurred significantly higher emergency freight expenses than peer institutions, with the gap persisting until underlying data quality was resolved.
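
GTIN problems come in two flavors, and it is worth separating them. A malformed GTIN (a transcription error) can be caught locally with the GS1 mod-10 check digit, the standard algorithm built into every GTIN. A stale GTIN is well formed but no longer matches a registered device configuration, and catching it requires revalidation against a current GUDID extract (publicly available through AccessGUDID). A minimal sketch of the local check:

```python
def gtin_is_well_formed(gtin: str) -> bool:
    """Validate a GTIN (8, 12, 13, or 14 digits) with the GS1 mod-10 check digit.

    Weights of 3 and 1 alternate starting from the digit nearest the check
    digit; the check digit makes the weighted sum a multiple of 10.
    """
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        return False
    *body, check = (int(d) for d in gtin)
    total = sum(d * (3 if i % 2 == 0 else 1) for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10 == check

# A well-formed GTIN is not necessarily a current one: this catches keying
# errors, while retired or reassigned identifiers only surface against a
# fresh GUDID extract.
assert gtin_is_well_formed("00012345678905")
assert not gtin_is_well_formed("00012345678904")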

The Difference Continuous Enrichment Makes

The argument for continuous enrichment is not that periodic cleanup is worthless. A well-executed baseline remediation is necessary. The argument is that it is not sufficient on its own.

Continuous enrichment means that as manufacturers update specifications, those updates are reflected in the item master automatically rather than accumulating until the next project. It means that when CMS publishes its annual HCPCS update, billing code validation runs against the current release rather than last year's. It means that GTIN coverage stays current as products are added to the catalog and existing products change. The Healthcare Industry Resilience Collaborative, which coordinates shared data standards across more than 1,200 hospitals, frames this as the underlying logic for sustained investment in shared product data standards: recurring value requires recurring maintenance, not periodic remediation.
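
In practice, "runs against the current release" means a recurring validation pass whose reference data is reloaded every cycle. The sketch below illustrates the shape of that pass under stated assumptions: item master rows carry item_id, hcpcs_code, and gtin fields (hypothetical names), and the loaders that refresh the reference sets from the CMS release and a GUDID extract are assumed rather than shown.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class RecordIssue:
    """One exception from a validation pass, timestamped for audit and triage."""
    item_id: str
    field_name: str
    detail: str
    found_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def revalidate(item_master: list[dict],
               active_hcpcs: set[str],
               registered_gtins: set[str]) -> list[RecordIssue]:
    """Re-run the checks that a one-time cleanup runs only once.

    The reference sets are meant to be reloaded on every pass: active_hcpcs
    from each CMS release, registered_gtins from a current GUDID extract.
    """
    issues = []
    for item in item_master:
        code = item.get("hcpcs_code", "")
        if code and code not in active_hcpcs:
            issues.append(RecordIssue(item["item_id"], "hcpcs_code",
                                      "code absent from current HCPCS release"))
        gtin = item.get("gtin", "")
        if gtin and gtin not in registered_gtins:
            issues.append(RecordIssue(item["item_id"], "gtin",
                                      "no match in current GUDID extract"))
    return issues
```

Scheduled daily or weekly through whatever the shop already uses for recurring jobs, a pass like this surfaces exceptions within a cycle of the upstream change instead of at the next cleanup project. The code is not the hard part; committing to the cadence is.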

The operational difference is also real. McKinsey research found that knowledge workers in data-intensive roles spend an average of 20 percent of their time correcting data quality problems. In a supply chain department, that capacity cost has a direct financial equivalent. A team spending a fifth of its time on data corrections is a team that is not doing contract optimization, clinical value analysis, or resiliency planning. Continuous data quality does not eliminate supply chain work. It redirects it toward the work that generates margin rather than the work that repairs broken records.

Clean Data Is Not a Destination

The health systems that get the most durable value from item master investment are the ones that treat data quality as an ongoing operational function rather than a project with a completion date. The margin gains from contract compliance, HCPCS-based reimbursement, GTIN-driven inventory accuracy, and reduced maverick spend are all recurring. They compound over time when the underlying data stays current. They erode almost immediately when it does not.

A one-time cleanup captures the backlog. Continuous enrichment captures the value every month after that. The distinction between the two is not technical. It is strategic.
