ERP and EHR Transformation Without Item Master Chaos: A Data-First Playbook for Supply Chain Leaders
A step-by-step framework for treating product data as a first-class workstream so your transformation raises the standard instead of carrying old problems forward
Every ERP and EHR transformation does one of two things to your item master. It exposes the problems that were always there, or it fixes them. There is no neutral outcome. The data your organization migrates into a new platform shapes how that platform performs from day one, and the teams, workflows, and patients who depend on it feel the difference immediately.
Most health systems go into these projects knowing their item master has issues. Duplicate records, missing GTINs, outdated classifications, inconsistent vendor names. The plan is usually to address them during the project. What actually happens is that the project timeline fills up with configuration decisions, training programs, and integration work, and the item master cleanup gets deferred until after go-live, which means it gets done under pressure, in a live operational environment, at significantly higher cost. Gartner research consistently identifies poor data quality as the leading cause of ERP implementation underperformance, with remediation costs post-go-live running multiples higher than pre-migration cleanup would have required.
The alternative is a data-first approach that treats the item master as a first-class workstream from the start of the project, not an afterthought at the end of it. This playbook describes what that looks like in practice.
The Core Reality: An ERP or EHR transformation is the most significant opportunity a health system gets to reset its data standard. Organizations that recognize this use the migration to build a foundation that outlasts the implementation. Those that do not spend the next two years correcting problems they moved from the old system into the new one.
Phase 1: Assess What You Are Actually Working With
The first step is an honest inventory of your current item master. Not the one that exists in the project charter, but the one that exists in your system today, with all of its gaps, inconsistencies, and accumulated drift.
A meaningful assessment looks at attribute completeness across key fields: GTINs, UNSPSC classifications, HCPCS codes validated against current CMS standards, manufacturer identifiers, vendor mappings, and clinical attributes like sterility and implantable status. It also looks at duplicate rates, contract alignment gaps, and items that exist in the item master with no active purchasing history, which are often legacy records that have been carried through previous system migrations without ever being reviewed.
The goal of assessment is not to produce a list of problems. It is to produce a prioritized view of which data gaps create the most migration risk and operational impact, so the cleanup work that follows is sequenced by consequence rather than alphabetical order.
What to Measure: Attribute completeness rate, duplicate item count, contract alignment gaps, inactive legacy records, and classification accuracy across UNSPSC and HCPCS. These five signals give a clear picture of where the item master stands and what the migration is inheriting.
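As a rough sketch of how these signals can be computed from an item master export, the snippet below scores completeness, duplicates, and inactive records over a small sample. The field names and records are illustrative assumptions, not a prescribed schema:

```python
from collections import Counter

# Hypothetical item master export; field names and values are illustrative only.
items = [
    {"id": "A1", "gtin": "00312345678906", "unspsc": "42295427", "mfr": "Acme Med", "active": True},
    {"id": "A2", "gtin": None,             "unspsc": "42295427", "mfr": "ACME MEDICAL", "active": True},
    {"id": "A3", "gtin": "00312345678906", "unspsc": None,       "mfr": "Acme Med", "active": False},
]

REQUIRED_FIELDS = ["gtin", "unspsc", "mfr"]

def completeness_rate(records, field):
    """Share of records with the field populated."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

def duplicate_gtin_count(records):
    """Count records that share a GTIN with at least one other record."""
    counts = Counter(r["gtin"] for r in records if r.get("gtin"))
    return sum(n for n in counts.values() if n > 1)

def inactive_legacy_count(records):
    """Records carried forward with no active status."""
    return sum(1 for r in records if not r["active"])

for field in REQUIRED_FIELDS:
    print(f"{field} completeness: {completeness_rate(items, field):.0%}")
print("duplicate GTIN records:", duplicate_gtin_count(items))
print("inactive legacy records:", inactive_legacy_count(items))
```

Even at this simplicity, the numbers are enough to sequence cleanup by consequence: a low GTIN completeness rate on high-velocity items, for example, outranks a pile of inactive legacy records.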
Phase 2: Define the Target Data Model
Before any data is cleaned or migrated, the project team needs to agree on what good looks like in the new system. That means defining a target data model: the specific fields, formats, classifications, and governance standards that every item record must meet in the new environment.
This is where ERP and EHR transformation projects often underinvest. The tendency is to model the new system around the data that already exists rather than the data the organization actually needs. The result is a new platform configured to accommodate old limitations rather than overcome them. The AHRMM Cost, Quality, and Outcomes (CQO) Movement describes the target state as a unified product record that connects clinical, operational, and financial attributes in a single, trusted source. Defining that standard before migration begins is what makes it achievable.
A well-defined target model also makes every subsequent phase of the project cleaner. Cleansing work has a clear benchmark to hit. Validation has a defined pass-fail standard. Integration testing has a data specification to verify against. The target model is not just a data architecture decision. It is the quality contract the project holds itself to.
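One way to make that quality contract executable is to express the target model as a set of per-field validation rules that every record is checked against. The fields and rules below are assumptions chosen for the sketch, not a fixed industry standard:

```python
import re

# Illustrative target data model: each field maps to a pass/fail rule.
# Field names and rules are assumptions for this sketch, not a standard.
TARGET_MODEL = {
    "gtin":    lambda v: bool(v) and re.fullmatch(r"\d{14}", v) is not None,
    "unspsc":  lambda v: bool(v) and re.fullmatch(r"\d{8}", v) is not None,
    "mfr":     lambda v: bool(v) and v == v.strip(),
    "sterile": lambda v: v in (True, False),
}

def violations(record):
    """Return the list of target-model fields the record fails."""
    return [f for f, rule in TARGET_MODEL.items() if not rule(record.get(f))]

rec = {"gtin": "00312345678906", "unspsc": "4229", "mfr": "Acme Med", "sterile": True}
print(violations(rec))  # the 4-digit UNSPSC fails the 8-digit rule
```

The same rule set then serves three phases at once: it benchmarks cleansing, defines the pass-fail standard for validation, and gives integration testing a specification to verify against.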
Phase 3: Cleanse and Enrich Before You Migrate
This is the phase most projects skip, delay, or underscope. It is also the phase that most directly determines post-go-live performance.
Cleansing addresses what is wrong: duplicates that need to be consolidated, manufacturer names that need to be normalized, vendor mappings that need to be corrected, and inactive records that need to be retired before they move into the new system. Enrichment addresses what is missing: GTINs that need to be sourced and verified, classifications that need to be updated, clinical attributes that need to be populated, and substitute relationships that need to be mapped.
Both require external reference data. Cleaning manufacturer names against the GS1 global data network produces a different result than cleaning them against internal records alone. Validating GTINs against FDA device registration data catches errors that internal validation misses. Updating UNSPSC codes against the current release ensures classifications reflect the product landscape as it exists today, not as it was catalogued three years ago.
The scale of this work surprises most project teams. In a recent 60-day engagement, Symmetric delivered more than 98,000 GTIN and packaging updates for a single health system preparing for an ERP transition. That volume is not unusual for a large health system with a mature item master that has been through previous migrations without dedicated data remediation.
From a Recent Engagement: 2,747 duplicate records resolved and 98,000+ GTIN and packaging updates delivered in 60 days for a single health system ERP transition, stabilizing scanning and protecting go-live readiness across all active supply locations.
Phase 4: Validate With the People Who Will Use the Data
Data that looks clean in a spreadsheet does not always look right to the people who work with products every day. Clinical staff recognize when a product description does not match how the item is actually used. Supply chain managers catch when a substitute relationship is mapped to a product that is not clinically equivalent. Finance teams notice when a classification does not match how spend has historically been tracked.
Stakeholder validation is not a nice-to-have step in a data-first migration. It is where the gap between technically correct and operationally accurate gets closed. Research on ERP implementation outcomes consistently finds that projects with structured stakeholder engagement in data validation achieve significantly higher user adoption rates post-go-live, because the system reflects how those users actually work rather than how the data team assumed they worked.
Validation should be structured, time-boxed, and focused on the items and categories that carry the most clinical and operational weight. Critical care products, high-spend categories, single-source items, and anything connected to patient charging or reimbursement should be reviewed by someone who understands both the data and the clinical context. Everything else can be reviewed through automated validation rules.
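The triage rule above, route high-consequence items to human review and everything else to automated checks, can be sketched in a few lines. The category names and spend threshold here are illustrative assumptions, not recommended cutoffs:

```python
# Illustrative triage logic; categories and threshold are assumptions.
HIGH_RISK_CATEGORIES = {"critical care", "implantable"}
HIGH_SPEND_THRESHOLD = 50_000  # annual spend cutoff, illustrative only

def review_route(item):
    """Route an item to human or automated validation by consequence."""
    if (item["category"] in HIGH_RISK_CATEGORIES
            or item["annual_spend"] >= HIGH_SPEND_THRESHOLD
            or item["single_source"]
            or item["chargeable"]):
        return "clinical/supply-chain review"
    return "automated rules"

print(review_route({"category": "implantable", "annual_spend": 1200,
                    "single_source": False, "chargeable": False}))
# routes to human review because the item is implantable
```

Time-boxing the human queue keeps validation from stalling the project while still putting expert eyes on the records where an error would reach a patient or a claim.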
Phase 5: Integrate Through Tested Connections
The item master does not live in isolation. It connects to ERP procurement workflows, EHR clinical documentation, contract management systems, inventory platforms, and charge capture processes. Each of those connections is a point where data quality problems either surface or get suppressed until they cause a more serious failure downstream.
Integration testing with clean, enriched item master data is meaningfully different from integration testing with legacy data. Tests run against complete, standardized records produce results that reflect actual go-live conditions. Tests run against records with missing attributes, incorrect classifications, or mismatched vendor names validate system behavior under conditions that will not exist after migration, which means problems that should surface in testing are not discovered until they affect real transactions. The Healthcare Information and Management Systems Society (HIMSS) has noted that integration failures are among the most disruptive post-go-live issues in healthcare system implementations, and they are disproportionately driven by data quality gaps rather than technical configuration errors.
The practical implication is straightforward: integration testing should not begin until the item master has reached its target completeness standard. Testing on dirty data produces false confidence. Testing on clean data produces genuine readiness.
Phase 6: Maintain Quality After Go-Live
A data-first migration that ends at go-live is only half a solution. Item master data decays at an estimated rate of more than 30% per year as manufacturers update specifications, products are discontinued, regulatory classifications change, and new items enter the catalog without complete attributes. Without a maintenance mechanism, the data quality work done before migration begins to erode immediately.
Sustainable maintenance requires three things working together. First, governance workflows that validate new records against the target data model before they enter the system. Second, clear ownership of data quality by category so there is accountability for keeping specific sections of the item master current. Third, a connection to authoritative external sources, including FDA registration databases, GS1 product data, and manufacturer feeds, so that changes in the market are reflected in item records automatically rather than through reactive cleanup cycles.
Health systems that build this infrastructure as part of the migration project protect the investment they made in pre-migration data quality. Those that treat the cleanup as a project with an end date watch that investment degrade within the first year and find themselves facing the same item master problems at the next system refresh.
The Governance Imperative: The go-live moment is the best opportunity a health system will ever have to establish a new data governance standard from a clean baseline. Organizations that use it well maintain the quality gains. In organizations that do not, decay begins on day one.
What the Playbook Requires From a Data Partner
Executing a data-first migration requires a partner with capabilities that most ERP implementation teams and general data consultants do not have: deep familiarity with healthcare product data specifically, access to authoritative external reference sources, and the capacity to work at the scale and speed that large health system migrations demand.
The work is not generic data management. It requires knowledge of how GTINs, UNSPSC codes, HCPCS codes, and clinical attributes interact with purchasing systems, clinical documentation, and billing workflows. It requires the ability to resolve duplicate records in a way that preserves the right purchasing history. It requires validated substitute mapping that meets clinical standards, not just operational convenience. And it requires integration experience with the ERP and EHR platforms the health system is migrating to.
It also benefits from participation in shared industry standards. The Healthcare Industry Resilience Collaborative (HIRC), which connects more than 1,200 hospitals, has built its work around normalized item data and validated substitute relationships as a shared infrastructure. A data partner connected to that network brings standards and reference data that no single health system could build independently.
How Symmetric Health Solutions Fits Into This Playbook
Symmetric Health Solutions is the specialized data partner built for exactly this work. The platform enriches item masters with verified manufacturer data, FDA registration information, and GS1 global product identifiers, resolves duplicates at scale, validates and backfills GTINs and packaging strings, normalizes manufacturer and supplier records against active contract data, and maps clinically validated substitute relationships across categories.
Symmetric supports each phase of the playbook described here. Assessment surfaces the completeness gaps and risk areas that need to be addressed before migration. Cleansing and enrichment brings item records to the target data model standard. Validation workflows engage clinical and supply chain stakeholders against enriched data rather than legacy records. Integration support ensures the connected systems the new platform depends on receive clean, complete data from day one.
Post-go-live, Symmetric's continuous enrichment engine keeps the item master current as the market evolves, protecting the data investment the migration required. And as a member of HIRC, Symmetric's normalization work contributes to shared standards across more than 1,200 hospitals, giving health system data a connection to the broader industry infrastructure rather than existing in isolation.
The outcome is a transformation that delivers what it promised: a new platform running on clean, complete, and continuously maintained product data, with the operational stability, user confidence, and financial accuracy that depend on getting the item master right before go-live rather than fixing it after.

