Section 20.4: Data Governance and Validation Controls
Implementing robust data governance frameworks specific to specialty pharmacy data (dispensing, clinical, operational) to ensure quality, integrity, security, and meet payer/pharma reporting requirements.
From Data Entry Clerk to Data Steward: Ensuring the Foundation is Sound.
20.4.1 The “Why”: Garbage In, Garbage Out, Contracts Lost
In the previous sections, we’ve built our “Integrated Smart Home” (Section 20.1) and connected it to the outside world via digital bridges (Sections 20.2 & 20.3). We now have data flowing seamlessly: electronic referrals arrive, clinical updates are exchanged, prior authorizations are automated. Our technology stack is impressive. But all of this technology is utterly worthless—in fact, it can be actively dangerous—if the data flowing through it is inaccurate, incomplete, or inconsistent.
This is the concept of “Garbage In, Garbage Out” (GIGO). If a technician incorrectly types the patient’s date of birth during intake, every system downstream—the PMS, the CRM, the RTVB check, the ePA submission, the adherence report—will inherit that error. The result? Billing rejections, delayed therapy, potentially catastrophic clinical errors, and ultimately, failed audits.
In community pharmacy, data quality is important for billing and basic safety. In specialty pharmacy, data quality is existential. Your contracts with payers and pharmaceutical manufacturers are entirely dependent on your ability to prove your performance through data. If your reported adherence scores are based on inaccurate dispense dates, if your reported “Time to First Fill” (TTFF) excludes referrals lost in a work queue, if your clinical assessments have missing fields, you will not just fail an audit; you risk losing your network contracts and your access to limited distribution drugs (LDDs).
Furthermore, accreditation bodies like URAC and ACHC have stringent requirements for data integrity, validation, and reporting. Poor data quality is a primary reason pharmacies fail accreditation surveys.
This section is your masterclass in building a culture and a system that prioritizes data quality above all else. We will introduce the formal discipline of Data Governance—the policies, processes, roles, and controls that ensure your pharmacy’s data assets are accurate, secure, and fit for purpose. You will transition your mindset from simply being a user of data to being a Data Steward—someone who takes ownership of the quality and integrity of the information that powers patient care and business success.
Pharmacist Analogy: The Building Inspector vs. The Construction Crew
Imagine your specialty pharmacy is a complex, high-tech skyscraper being built (our “Integrated Smart Home”).
- The IT Team and Software Vendors are the Construction Crew. They lay the foundation (infrastructure), erect the frame (PMS), install the wiring (APIs), and put in the smart devices (CRM, CTI, Analytics). They build the structure.
- The Pharmacists, Technicians, and Intake Coordinators are the Workers Inside. They are the ones actually using the building day-to-day, moving materials (prescriptions), talking to residents (patients), and generating activity (data).
Who ensures the building is safe, up to code, and functioning as designed? That’s the Building Inspector. This inspector doesn’t pour concrete or install wires, but they have a crucial role:
- They establish the Building Codes (Data Governance Policies): Rules that everyone must follow (e.g., “All electrical wiring must use copper,” “All patient DOBs must be entered MM/DD/YYYY”).
- They conduct Inspections (Data Validation Controls) at critical points: Checking the foundation before framing starts (input validation), inspecting the plumbing before walls go up (process validation), and doing a final walkthrough before issuing an occupancy permit (output validation).
- They appoint Site Foremen (Data Stewards) in each area (e.g., the electrical foreman, the plumbing foreman) who are responsible for the quality of work in their domain.
- They maintain the Blueprints (Data Dictionary) to ensure everyone is working from the same plan and using the same definitions.
Data Governance is the practice of being the “Building Inspector” for your pharmacy’s data. It’s not just an IT function; it is a clinical and operational imperative. As an advanced specialty pharmacist, you are often a key “Site Foreman”—a Data Steward responsible for the quality of clinical data. You need to understand the “Building Codes” and how to perform the “Inspections” to ensure the entire structure is sound.
20.4.2 Defining Data Governance: Policies, Roles, and Quality Dimensions
Data Governance is the formal orchestration of people, processes, and technology to manage an organization’s data assets. It’s about establishing clear rules of the road and accountability for data quality. A robust Data Governance framework has several key components:
Core Components of a Data Governance Framework
1. Roles and Responsibilities
- Data Governance Council: Senior leadership (Pharmacy Director, IT Director, Compliance Officer) who set the strategy and approve policies.
- Data Owners: Business leaders responsible for a specific data domain (e.g., Director of Clinical Services owns clinical data; Director of Operations owns operational data). They are ultimately accountable for data quality in their domain.
- Data Stewards: Subject Matter Experts (SMEs), often pharmacists or lead technicians, assigned responsibility for defining and controlling specific data elements within a domain. (e.g., A clinical pharmacist might be the steward for “Adherence Calculation Logic”).
- Data Custodians: The IT team responsible for the technical aspects – storage, security, backup, and transport of the data.
2. Policies and Standards
- Data Governance Policy: The overarching document establishing the program’s authority, scope, and objectives.
- Data Quality Standards: Defines the acceptable quality thresholds for key data elements (e.g., “Patient phone number must be 95% complete and 98% valid”).
- Data Definition Standards (Data Dictionary): A central glossary defining every key term and metric (e.g., How exactly is “Time to First Fill” calculated? What counts as the ‘start’ and ‘end’ time?).
- Data Security Policy: Defines access controls, encryption standards, and breach protocols (integrates with HIPAA).
3. Processes and Controls
- Data Quality Monitoring: Regular processes (often automated) to measure data quality against the defined standards.
- Data Validation Rules: System-based rules to prevent bad data entry (see Section 20.4.4).
- Issue Resolution Process: A defined workflow for identifying, logging, prioritizing, and fixing data quality problems.
- Change Management Process: A formal process for requesting and approving changes to data definitions, systems, or reports (see Section 20.5).
The Six Dimensions of Data Quality: Your Inspection Checklist
How do you measure “good” data? Data Governance defines six key dimensions. These are the categories your “Building Inspector” checks. As a Data Steward, you must understand these dimensions to identify and fix problems.
Accuracy
Does the data correctly reflect the real-world object or event? Example: Is the patient’s DOB entered in the system (05/15/1980) their actual date of birth?
Completeness
Are all the required data elements present? Is data missing? Example: Does every prescription record have an associated ICD-10 code, as required for PA?
Consistency
Does the same data element have the same value across different systems or reports? Example: Does the patient’s address in the PMS match their address in the CRM?
Timeliness
Is the data available when it is needed? Is it up-to-date? Example: Did the lab result HL7 feed arrive within 1 hour of the result being finalized?
Validity / Conformity
Does the data conform to the defined format, rules, or standards? Example: Is the phone number entered as “555-123-4567” (valid format) or “5551234567 Ext 12” (invalid format)?
Uniqueness
Is this the one and only record for this entity, or are there duplicates? Example: Do we have two different patient profiles for “Jane Doe” because her name was misspelled during intake once?
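The dimensions above aren't just abstract categories — they can be measured. Below is a minimal Python sketch of a data-quality profiling check for two of the dimensions (Completeness and Validity) against a handful of hypothetical patient records; the field names and sample values are illustrative, not a specific PMS schema.

```python
import re

# Hypothetical patient records extracted from a PMS; fields are illustrative.
records = [
    {"patient_id": 1, "dob": "05/15/1980", "phone": "555-123-4567"},
    {"patient_id": 2, "dob": "",           "phone": "5551234567 Ext 12"},
    {"patient_id": 3, "dob": "12/01/1975", "phone": "555-987-6543"},
]

PHONE_RE = re.compile(r"^\d{3}-\d{3}-\d{4}$")  # the "valid format" from above

def completeness(records, field):
    """Percent of records where the field is present and non-blank."""
    filled = sum(1 for r in records if r.get(field))
    return 100.0 * filled / len(records)

def validity(records, field, pattern):
    """Percent of non-blank values that conform to the expected format."""
    values = [r[field] for r in records if r.get(field)]
    valid = sum(1 for v in values if pattern.match(v))
    return 100.0 * valid / len(values) if values else 0.0

print(f"DOB completeness: {completeness(records, 'dob'):.1f}%")   # 2 of 3 filled
print(f"Phone validity:   {validity(records, 'phone', PHONE_RE):.1f}%")
```

A nightly job running checks like these against each critical field, compared to the thresholds in your Data Quality Standards (e.g., "95% complete, 98% valid"), turns the inspection checklist into an operational control.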
20.4.3 Critical Data Domains in Specialty Pharmacy: Where Quality Matters Most
Data Governance isn’t just theory. It must be applied rigorously to the specific data elements that drive your pharmacy’s clinical and business operations. Let’s examine the key “data domains” and the unique quality challenges within each.
1. Dispensing Data Domain
This is the core data generated by the PMS dispense event. It’s the foundation for billing, inventory, and adherence calculations.
| Key Data Elements | Data Quality Dimensions at Risk | Impact of Poor Quality |
|---|---|---|
| NDC (National Drug Code) | Accuracy, Validity | Billing rejection (wrong NDC billed), clinical error (wrong drug dispensed), incorrect formulary check. |
| Dispense Date | Accuracy, Timeliness | Incorrect adherence (PDC) calculation, incorrect refill scheduling, failed audits. |
| Days Supply | Accuracy, Consistency | Incorrect adherence calculation, billing rejections (days supply exceeds plan limit), early/late refill reminders. |
| Lot Number & Expiration Date | Accuracy, Completeness | Inability to manage recalls effectively (URAC/ACHC failure), dispensing expired product (patient safety). |
| Shipping Address & Tracking Number | Accuracy, Completeness | Lost/delayed shipments, patient complaints, wasted high-cost medication, inability to prove delivery for audits. |
2. Clinical Data Domain
This data documents the patient’s condition, response to therapy, and your clinical interventions. It’s critical for proving outcomes to payers/pharma and for patient safety.
| Key Data Elements | Data Quality Dimensions at Risk | Impact of Poor Quality |
|---|---|---|
| ICD-10 Diagnosis Code | Accuracy, Completeness, Validity | PA denial (wrong diagnosis submitted), incorrect clinical protocol assignment, inability to report outcomes by disease state. |
| Clinical Assessments (Structured Data) | Completeness, Consistency, Timeliness | Inability to report clinical outcomes (e.g., side effect prevalence), missed safety signals, failed pharma data contracts, accreditation failure (lack of documented care plan). |
| Lab Results (from HL7 or manual entry) | Accuracy, Timeliness, Validity (Units) | Incorrect clinical decision-making (e.g., dosing based on wrong CrCl), missed drug toxicity signals, patient harm. |
| Adherence Scores / Barriers (Documented) | Accuracy, Consistency, Timeliness | Inaccurate reporting to payers/pharma, missed opportunities for intervention, incorrect assumptions about therapy failure. |
3. Operational Data Domain
This data tracks the efficiency and effectiveness of your internal workflows. It’s critical for pharma service level agreements (SLAs) and internal process improvement.
| Key Data Elements | Data Quality Dimensions at Risk | Impact of Poor Quality |
|---|---|---|
| Referral Received Date/Time | Accuracy, Timeliness, Consistency | Incorrect “Time to First Fill” (TTFF) calculation, inaccurate reporting to pharma, inability to identify intake bottlenecks. |
| Workflow Statuses & Timestamps (e.g., “PA Submitted,” “Welcome Call Complete”) | Accuracy, Timeliness, Completeness | Inaccurate Turnaround Time (TAT) reporting, inability to manage work queues effectively, failed SLA audits. |
| Call Center Metrics (ASA, Abandon Rate, AHT from CTI) | Accuracy, Completeness | Inaccurate staffing models, poor patient service, failed accreditation metrics. |
4. Financial Data Domain
This tracks the revenue cycle and patient financial assistance. Accuracy here is fundamental to business survival.
| Key Data Elements | Data Quality Dimensions at Risk | Impact of Poor Quality |
|---|---|---|
| Patient Insurance Information (BIN, PCN, Group, Member ID) | Accuracy, Completeness, Validity | Billing rejections, delays in therapy, incorrect RTVB results, compliance issues (billing wrong plan). |
| Prior Authorization Number & Dates | Accuracy, Completeness, Timeliness | Billing rejections (“PA Not on File”), claim denials requiring appeals, lost revenue. |
| Copay Assistance Applied (Manufacturer Card, Foundation Grant) | Accuracy, Consistency | Billing errors (patient charged wrong amount), reconciliation failures with assistance programs, compliance risks. |
| Reimbursement Data (Allowed Amount, Paid Amount from Remittance) | Accuracy, Timeliness | Inaccurate financial reporting, inability to identify payer underpayments, poor contract negotiation position. |
20.4.4 Input Validation Controls: Building the “Code Check” at the Door
The most effective way to ensure data quality is to prevent bad data from entering the system in the first place. These are your “input validation controls”—the building codes enforced the moment data is typed or received.
1. System-Level Validation Rules (Your Best Friend)
This is where your IT team and PMS/CRM vendor earn their keep. The system itself should be configured to reject or flag invalid data *at the point of entry*. This is far more reliable than relying on humans.
- Required Fields: The system simply won’t let you save a new patient record if the DOB field is blank. It forces Completeness.
- Format Masks: The “Phone Number” field only accepts digits and formats it automatically as XXX-XXX-XXXX. The “DOB” field requires MM/DD/YYYY. This enforces Validity.
- Dropdown Lists (Controlled Vocabularies): This is critical. Instead of letting a user type the “State” (allowing “GA,” “Georgia,” “ga”), you provide a dropdown list of valid state abbreviations. Instead of a free-text “Reason for PA Denial,” you provide a dropdown of common denial codes. This enforces Consistency and Validity.
- Range Checks: The “Days Supply” field cannot accept a value less than 1 or greater than 90. This enforces Validity.
- Lookup Validations: When a user types an NPI number, the system does a real-time check against the national NPI database to ensure it’s a valid number and populates the prescriber’s name. This enforces Accuracy and Validity.
- Duplicate Checks: When creating a new patient, the system checks if a patient with the same Name + DOB already exists. This prevents violations of Uniqueness.
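To make these rules concrete, here is a minimal Python sketch of a point-of-entry validator combining several of the controls above (required fields, format masks, controlled vocabulary, range check, duplicate check). The field names and the abbreviated state list are illustrative assumptions, not a particular PMS's configuration.

```python
import re
from datetime import datetime

VALID_STATES = {"GA", "FL", "AL", "TN"}  # abbreviated for illustration
PHONE_RE = re.compile(r"^\d{3}-\d{3}-\d{4}$")

def validate_intake(record, existing_patients):
    """Return a list of validation errors for a new patient intake record.
    'existing_patients' is a set of (last_name, dob) keys already on file."""
    errors = []

    # Required fields -> Completeness
    for field in ("last_name", "first_name", "dob", "phone"):
        if not record.get(field):
            errors.append(f"Required field missing: {field}")

    # Format masks -> Validity
    if record.get("phone") and not PHONE_RE.match(record["phone"]):
        errors.append("Phone must be formatted XXX-XXX-XXXX")
    if record.get("dob"):
        try:
            datetime.strptime(record["dob"], "%m/%d/%Y")
        except ValueError:
            errors.append("DOB must be MM/DD/YYYY")

    # Controlled vocabulary -> Consistency / Validity
    if record.get("state") not in VALID_STATES:
        errors.append(f"State must be one of {sorted(VALID_STATES)}")

    # Range check -> Validity
    days = record.get("days_supply")
    if days is not None and not (1 <= days <= 90):
        errors.append("Days supply must be between 1 and 90")

    # Duplicate check -> Uniqueness
    if (record.get("last_name"), record.get("dob")) in existing_patients:
        errors.append("Possible duplicate: same name + DOB already on file")

    return errors
```

In a real system these rules live in the PMS/CRM configuration rather than application code, but the logic is the same: the record cannot be saved until the error list is empty.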
2. Interface Validation (Checking Incoming Data)
When data arrives electronically (HL7, C-CDA, SCRIPT), your interface engine must act as the gatekeeper.
- Schema Validation: Does the incoming message conform to the standard? (e.g., Is this valid XML? Does it have all the required HL7 segments?). If not, the message is rejected back to the sender.
- Required Field Check: Even if the message is valid, does it contain the critical data your pharmacy needs? (e.g., Does this `NewRx` message have an ICD-10 code in the `<Diagnosis>` tag?). If not, the interface engine might route it to an “error queue” for manual review.
- Cross-Reference Validation: Does the patient MRN (`PID-3` in HL7) match an existing patient in your system? If not, does it need to create a new patient or flag it as a potential mismatch?
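A gatekeeping routine like this can be sketched in a few lines. Here the incoming message is represented as an already-parsed Python dict standing in for the NCPDP SCRIPT XML, and the ICD-10 pattern is a simplified format check, not a full code-set lookup; both are assumptions for illustration.

```python
import re

# Simplified ICD-10 shape check: letter, digit, alphanumeric,
# optional dot plus 1-4 alphanumerics. A real check validates
# against the published code set, not just the format.
ICD10_RE = re.compile(r"^[A-Z]\d[0-9A-Z](\.[0-9A-Z]{1,4})?$")

def triage_new_rx(message, patient_index):
    """Decide whether a parsed NewRx-like message can auto-process
    or must be routed to a queue. 'patient_index' is a set of
    (name, dob) keys for existing patient profiles."""
    # Required-field check: reject critical omissions to the error queue.
    if not message.get("diagnosis_code"):
        return ("error_queue", "Missing <Diagnosis> code")
    if not ICD10_RE.match(message["diagnosis_code"]):
        return ("error_queue", "Invalid ICD-10 format")

    # Cross-reference validation: match the patient to an existing profile.
    key = (message.get("patient_name"), message.get("patient_dob"))
    if key not in patient_index:
        return ("review_queue", "No matching patient; possible new profile")

    return ("auto_process", "OK")
```

The key design point is that nothing silently disappears: every message ends up either auto-processed, in a review queue, or in an error queue with a reason attached.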
3. Human Validation Processes (The Necessary Safety Net)
Systems can’t catch everything. You still need well-trained humans performing checks.
- Intake Data Verification: The classic “Read Back.” When taking info over the phone, reading back the name spelling, DOB, and address is a simple but powerful accuracy check.
- “Two-Tech Check” / Pharmacist Verification: Standard pharmacy practice. A second person reviews critical data entry (Rx typing, billing info) before it proceeds.
- Training & Competency: Formal training on *why* data quality matters, common error types, and how to use system validation tools. Regular competency assessments ensure staff understand the standards.
Tutorial: Designing an Input Control for ICD-10 Codes
Let’s apply this. ICD-10 codes are critical but prone to error. How would a Data Steward design input controls?
- System Control 1 (Required): Make the `Primary_ICD10` field mandatory in the PMS/CRM Patient Profile before a specialty drug can be prescribed. (Ensures Completeness).
- System Control 2 (Validity): Implement a lookup validation. As the user types “M05…”, the system suggests valid ICD-10 codes (M05.79, M05.80, etc.) from a built-in library. Don’t allow free text. (Ensures Validity & Accuracy).
- System Control 3 (Consistency): For key drugs, implement a cross-check. If the drug is Humira and the ICD-10 entered is for Hypertension, flash a warning: “Warning: ICD-10 does not match common indications for this drug. Please verify.” (Enhances Accuracy).
- Interface Control: Configure the interface engine to reject any incoming `NewRx` message that is missing the `<Diagnosis>` tag or contains an invalid ICD-10 code format. Route these to an error queue.
- Human Control: Train intake staff on the importance of the ICD-10 for PAs. Add a step to the Benefits Investigation workflow: “Verify ICD-10 code is present and clinically plausible.”
By layering these controls, you dramatically reduce the chance of an incorrect or missing ICD-10 code causing a PA denial downstream.
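The drug-vs-diagnosis cross-check (System Control 3) can be sketched as a lookup against a map of common indications. The map below is a tiny illustrative sample keyed on the generic name (adalimumab, for Humira), not a clinical reference — a production check would use a maintained drug-indication database.

```python
# Illustrative sample only: ICD-10 categories commonly associated
# with adalimumab (RA, Crohn's, ulcerative colitis, psoriasis).
COMMON_INDICATIONS = {
    "adalimumab": {"M05", "M06", "K50", "K51", "L40"},
}

def indication_warning(drug, icd10):
    """Return a warning string if the ICD-10 category is not a common
    indication for the drug; None if it looks plausible or the drug
    has no entry in the map."""
    category = icd10.split(".")[0][:3]
    expected = COMMON_INDICATIONS.get(drug.lower())
    if expected and category not in expected:
        return ("Warning: ICD-10 does not match common indications "
                "for this drug. Please verify.")
    return None
```

Note this is a soft control: it warns rather than blocks, because the prescriber may legitimately be using the drug off-label.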
20.4.5 Process Validation Controls: Checking the Work-in-Progress
Input controls are vital, but errors can still creep in as data moves through your complex workflows or gets transformed between systems. Process controls are checks performed during the workflow to catch inconsistencies or inaccuracies.
1. Reconciliation Reports
These are automated or manual processes designed to compare data between two related-but-separate systems or process steps to ensure they match. They target Consistency and Completeness.
- PMS Dispense vs. Shipping Manifest: Does every prescription marked as “Shipped” in the PMS have a corresponding tracking number and delivery confirmation from the FedEx/UPS shipping system? If not, investigate why (e.g., shipment voided, tracking number not uploaded).
- CRM PA Status vs. PBM Portal Status: Does the “Approved” status logged in the CRM match the official status on the payer’s portal? (Often run by RPA bots). If not, which is correct?
- Inventory On Hand (PMS) vs. Physical Count: Does the perpetual inventory count in the system match the actual count in the refrigerator? If not, investigate potential dispensing errors, theft, or receiving errors.
- Remittance Advice vs. Claims Billed: Does the payment received from the PBM (on the 835 remittance file) match the amount you expected to be paid for each claim? If not, flag for investigation by the billing team.
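The first reconciliation above — PMS dispense status vs. carrier confirmations — reduces to a set comparison. Here is a minimal sketch; the input shapes (rx number mapped to tracking number, and a set of delivered tracking numbers from the carrier feed) are assumptions for illustration.

```python
def reconcile_shipments(pms_shipped, carrier_delivered):
    """Compare prescriptions marked 'Shipped' in the PMS against carrier
    delivery confirmations. 'pms_shipped' maps rx_number -> tracking_number
    (or None); 'carrier_delivered' is a set of confirmed tracking numbers.
    Returns (rx_number, reason) exceptions for investigation."""
    exceptions = []
    for rx, tracking in pms_shipped.items():
        if not tracking:
            exceptions.append((rx, "Shipped in PMS but no tracking number"))
        elif tracking not in carrier_delivered:
            exceptions.append((rx, "Tracking number has no delivery confirmation"))
    return exceptions
```

Every other reconciliation in the list follows the same pattern: pull both sides, match on a shared key, and route the mismatches to a human for resolution.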
2. Workflow Audits & Exception Queues
These controls monitor the flow of work itself, looking for bottlenecks or data inconsistencies between steps. They target Timeliness and Consistency.
- “Stuck” Order Reports: A report that flags any patient case in the CRM that has been in the same status (e.g., “PA Pending”) for longer than a defined threshold (e.g., 3 days). This prompts manual investigation.
- PMS/CRM Mismatch Queues: An automated check that compares key fields (e.g., Current Address, Primary Insurance) between the PMS and CRM daily. Any discrepancies are routed to a “Data Integrity Queue” for a human to resolve the conflict and update the “Single Source of Truth.”
- Missing Data Alerts: Automated alerts triggered if a critical data point is missing at a key workflow step (e.g., if a prescription moves to “Ready to Bill” but the `PA_Number` field is blank, flag it for review).
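A “stuck” order report is just a timestamp comparison against per-status thresholds. The sketch below assumes a simple case shape (case id, current status, and when that status was entered); real CRM status histories are richer, but the logic is the same.

```python
from datetime import datetime, timedelta

def stuck_cases(cases, thresholds, now=None):
    """Flag cases sitting in one status beyond its threshold.
    'thresholds' maps status name -> max days allowed in that status."""
    now = now or datetime.now()
    flagged = []
    for case in cases:
        limit = thresholds.get(case["status"])
        if limit is None:
            continue  # no threshold defined for this status
        age = now - case["status_entered"]
        if age > timedelta(days=limit):
            flagged.append((case["case_id"], case["status"], age.days))
    return flagged
```

Run daily, the output becomes the work queue for manual investigation (“why has this PA been pending for 5 days?”).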
3. Automated Data Quality Rules
These are logic checks, often run nightly by the IT team against the database, looking for data that violates predefined business rules. They target Validity and Accuracy.
- Logical Date Checks: Flag any records where `Date_Filled` is before `Date_Written`, or `Date_Shipped` is before `Date_Filled`.
- Outlier Detection: Flag any dispense records with a `Days_Supply` > 90 or a calculated `Cost` > $100,000 (potential typo).
- Format Validation: Scan all `Phone_Number` fields for records that don’t match the standard format (indicating a failure in the input mask or a bad data import).
- Referential Integrity Checks: Ensure that every `Patient_ID` listed in the `Dispense_Log` table actually exists in the `Patient_Master` table. (This prevents “orphan” records).
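In practice these nightly rules are often SQL against the warehouse, but the logic can be sketched in Python. The dispense-record schema below is illustrative; each violation is emitted as a (rule, record) pair destined for the Data Quality Issue Log.

```python
def nightly_quality_checks(dispenses, patient_ids):
    """Run rule-based checks over dispense records.
    'patient_ids' is the set of valid IDs from the Patient_Master table."""
    violations = []
    for d in dispenses:
        # Logical date check: a fill cannot precede the written date.
        if d["date_filled"] < d["date_written"]:
            violations.append(("filled_before_written", d["rx_number"]))
        # Outlier detection: suspicious days supply or cost (likely typos).
        if d["days_supply"] > 90 or d["cost"] > 100_000:
            violations.append(("outlier", d["rx_number"]))
        # Referential integrity: dispense must point at a real patient.
        if d["patient_id"] not in patient_ids:
            violations.append(("orphan_patient", d["rx_number"]))
    return violations
```

Each rule maps directly onto a quality dimension: the date check targets Accuracy, the outlier check targets Validity, and the integrity check prevents orphan records.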
Process Mining: The Advanced Technique
A cutting-edge approach used by larger organizations is Process Mining. This uses specialized software to analyze the timestamps of every status change in your workflow systems (PMS, CRM).
The software automatically generates a visual flowchart of how work actually flows through your pharmacy, highlighting bottlenecks, deviations from the standard process, and steps where data quality issues are commonly introduced.
For example, Process Mining might reveal that PAs submitted via ePA get approved in 2 days, while those submitted via the PBM portal take 7 days, and those still done by fax take 14 days. This provides objective data to justify investing in more ePA integrations.
20.4.6 Output Validation Controls: Certifying the Final Report
Your pharmacy’s reputation, contracts, and accreditation often hinge on the accuracy of the reports you submit to external parties (payers, pharma, URAC/ACHC). Output controls ensure that these final reports are accurate, properly defined, and securely delivered.
1. Report Logic Validation & Certification
Before any report is sent externally, it must be validated. This means confirming that the logic used to generate the numbers is correct and matches the agreed-upon definitions.
- Source-to-Target Mapping: Documenting exactly which tables and fields in the Data Warehouse are used to calculate each metric in the report.
- Manual Recalculation (Spot Check): For critical KPIs (like Adherence or TTFF), take a small sample of patients from the report and manually recalculate their metric by tracing the data back to the source systems (PMS, CRM). Does your manual calculation match the report’s number?
- Data Steward Sign-Off: The designated Data Steward for that domain (e.g., Clinical Pharmacist for adherence report) reviews the logic and methodology and formally “certifies” the report’s accuracy before it is submitted.
2. Data Dictionary & Metric Definition
This is arguably the most important non-technical control. Everyone must agree on what the words mean. A Data Dictionary is the central, governed repository for these definitions.
- Clear Business Definitions: What exactly constitutes the “Start Time” for TTFF? Is it the timestamp on the fax image, or the time the patient was entered in the CRM? What defines an “Adherent Day” for PDC? Does it count doses taken early? These must be explicitly defined and approved by the Data Governance Council.
- Technical Definitions: The exact SQL query logic or BI tool formula used to calculate the metric.
- Consistency: The definition used for internal reporting must match the definition required by the external party. If a pharma partner defines adherence differently than URAC, you may need to calculate and report it both ways, clearly labeling each.
3. Secure Data Transmission & Access Controls
How the data leaves your building is just as important as its accuracy. This involves both technical and administrative controls.
- Secure File Transfer Protocol (SFTP): Never email patient-level data files. Use SFTP sites provided by the payer/pharma partner, which encrypt the data in transit.
- Data Aggregators: Often, you don’t send data directly to 50 pharma partners. You send one standardized feed to a “Data Aggregator” (like IntegriChain, Komodo Health), who then securely distributes it according to each manufacturer’s specific requirements.
- Role-Based Access (Reporting Tools): Who can run which reports? A clinical pharmacist may need access to adherence reports, but not to financial margin reports. Access to the BI tool must be tightly controlled based on job function.
- De-Identification/Aggregation: For some uses (e.g., research, benchmarking), data may need to be de-identified (removing PHI according to HIPAA standards) or aggregated (reporting only summary totals, not individual patient data).
The Dangers of “Shadow IT” and Excel Reporting
One of the biggest risks to output quality is “Shadow IT”—when individuals bypass the governed Data Warehouse and reporting tools to create their own reports, often using manual extracts and Microsoft Excel.
While Excel is familiar, it is a terrible tool for governed reporting:
- No Audit Trail: You don’t know where the data came from or how it was manipulated.
- Prone to Errors: Manual copy/paste and complex formulas are easily broken.
- Inconsistent Definitions: Different people may calculate the same metric differently in their spreadsheets.
- Security Risks: Saving patient-level data in unsecured Excel files on local desktops is a major HIPAA violation.
Data Governance aims to eliminate this by providing reliable, validated reports through the central BI platform, making “Shadow IT” unnecessary.
20.4.7 Master Data Management (MDM): The Quest for the “Golden Record”
In Section 20.1, we introduced the concept of a “Single Source of Truth” within the PMS database, enabled by Primary and Foreign Keys. This works well within one system. But what happens in our “Integrated Smart Home” where we have multiple systems (PMS, CRM, Dispensing Automation, Patient Portal), all potentially storing copies or versions of the same core data (Patient demographics, Prescriber details, Drug formulary)?
If Jane Doe updates her address via the Patient Portal, how do you ensure that change synchronizes correctly to the PMS and the CRM and the shipping software without creating conflicts or duplicates? This is the challenge addressed by Master Data Management (MDM).
MDM is a discipline (and often a dedicated software platform) focused on creating and maintaining a single, trusted “master” record—the “Golden Record”—for critical shared data entities across the entire organization.
MDM vs. Single Source of Truth
- Single Source of Truth (SSOT): Typically refers to the designated system or database table that is the authoritative source for a specific piece of data (e.g., the PMS is the SSOT for dispense history). Other systems refer to it (via FKs or APIs) but don’t own it.
- Master Data Management (MDM): Focuses on creating a consolidated, clean, authoritative “Golden Record” for core business entities (Patient, Provider, Product, Location) that might exist in multiple systems. It involves processes for matching, merging, cleaning, and synchronizing this master data across the enterprise.
How MDM Works in Practice (Patient Example)
- Data Sources: Patient data might arrive from the PMS (via e-Rx), the CRM (via web form), or a Patient Portal. Each system has its own internal patient ID.
- Matching Engine: The MDM tool uses sophisticated algorithms (probabilistic matching) to identify that “J. Doe” at 123 Main St (from PMS) and “Jane Marie Doe” at 123 Main Street #2 (from CRM) are likely the same person, even if the data isn’t identical.
- Golden Record Creation: The MDM system creates a single “Golden Record” for Jane Doe, linking the internal IDs from the PMS and CRM. It uses survivorship rules to pick the “best” version of each attribute (e.g., use the address from the CRM as it was updated more recently).
- Synchronization: The MDM tool then pushes updates back to the source systems. If Jane updates her phone number in the Portal, the MDM detects this, updates the Golden Record, and then uses APIs to push the new phone number to both the PMS and the CRM, ensuring consistency everywhere.
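The matching and survivorship steps can be sketched with standard-library string similarity. Real MDM engines use far more sophisticated, field-weighted probabilistic algorithms; the weights and record fields below are illustrative assumptions.

```python
from difflib import SequenceMatcher

def match_score(rec_a, rec_b):
    """Toy probabilistic-style match score between two patient records.
    Weights (name 0.4, address 0.3, DOB 0.3) are arbitrary for illustration."""
    name = SequenceMatcher(None, rec_a["name"].lower(), rec_b["name"].lower()).ratio()
    addr = SequenceMatcher(None, rec_a["address"].lower(), rec_b["address"].lower()).ratio()
    dob = 1.0 if rec_a["dob"] == rec_b["dob"] else 0.0
    return 0.4 * name + 0.3 * addr + 0.3 * dob

def build_golden_record(records):
    """Survivorship rule sketch: for each attribute, keep the non-blank
    value from the most recently updated source record."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field in ("name", "address", "phone", "dob"):
            if rec.get(field):
                golden[field] = rec[field]  # later (newer) records win
    return golden
```

Candidate pairs scoring above a high threshold are auto-merged; pairs in a gray zone are routed to a Data Steward for manual review, which is exactly the kind of judgment call pharmacists are well placed to make.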
Implementing a full MDM solution is a massive undertaking, typically reserved for larger organizations. However, the principles of MDM—identifying critical data entities, defining survivorship rules, and creating processes to synchronize data—are crucial even for smaller pharmacies striving for data consistency across their integrated systems.
20.4.8 Practical Tutorial: Auditing Adherence Data Quality
Let’s put your Data Steward hat on. You are a clinical pharmacist tasked with validating the accuracy of the Q3 Adherence Report before it’s sent to a key pharma partner. The report shows your Humira patients have an average PDC of 92%. How do you audit this?
Pharmacist Playbook: The Adherence Audit Trail
This is a source-to-target validation, working backwards from the final number.
- Step 1: Understand the Definition (Data Dictionary). First, pull up the pharmacy’s official Data Dictionary definition for “PDC – Humira.” It should specify:
- Numerator: Sum of Days Supply dispensed during the measurement period (Q3: July 1 – Sept 30).
- Denominator: Number of days in the measurement period (92 days).
- Inclusion Criteria: Patients with >= 2 Humira fills, continuously eligible during Q3.
- Handling Rules: How are early fills handled? Is there a cap on Days Supply? (e.g., Cap at 92 days).
- Step 2: Sample Patients from the Report (Output Validation). Ask the analytics team for a patient-level detail file that backs up the 92% average. Randomly select 5-10 patients from this list. Let’s pick Jane Doe; the report says her PDC was 95%.
- Step 3: Check the Data Warehouse Calculation (Process Validation). Ask the analytics team for the specific query or calculation used in the Data Warehouse for Jane Doe. It might show:
- Fill 1: July 15, Days Supply = 28
- Fill 2: Aug 12, Days Supply = 28
- Fill 3: Sept 9, Days Supply = 28
- Fill 4: Oct 7, Days Supply = 28 (Outside period)
- Total Days Supply in Q3 = 28 + 28 + 28 = 84 days.
- PDC = 84 / 92 = 91.3%. Finding: the correct calculation yields 91.3%, not the 95% shown on the report—a calculation error in the report logic.
- Step 4: Validate Warehouse Data vs. PMS (Input/Process Validation). Now, check if the Data Warehouse data is even correct. Log into the live PMS (the SSOT). Look up Jane Doe’s dispense history for Q3.
- July 15: Humira, DS 28 – Matches.
- Aug 12: Humira, DS 28 – Matches.
- Sept 9: Humira, DS 28 – Matches.
- Step 5: Document and Remediate. Document both findings (Calculation Error, Source Data Verified) in the Data Quality Issue Log. Assign the report logic error to the BI team to fix. Re-run the report once corrected. Sign off on the final, validated report.
This meticulous, step-by-step tracing from the final output back to the original source transaction is the core work of data validation. It requires understanding the definitions, the systems, and the potential points of failure.
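The tutorial's recalculation can be expressed as a short function, which is also how a Data Steward might spot-check a sample of patients programmatically. The calendar year (2024) is assumed for illustration, and this mirrors the tutorial's simplified PDC definition — real methodologies handle overlapping fills and carry-in supply more carefully.

```python
from datetime import date

def pdc(fills, period_start, period_end):
    """PDC per the tutorial's simplified definition: sum of days supply
    for fills dated within the period, divided by days in the period,
    capped at 100%."""
    period_days = (period_end - period_start).days + 1
    covered = sum(ds for fill_date, ds in fills
                  if period_start <= fill_date <= period_end)
    return min(covered / period_days, 1.0)

# Jane Doe's fills from the tutorial (the Oct 7 fill falls outside Q3).
fills = [(date(2024, 7, 15), 28), (date(2024, 8, 12), 28),
         (date(2024, 9, 9), 28), (date(2024, 10, 7), 28)]
score = pdc(fills, date(2024, 7, 1), date(2024, 9, 30))
print(f"PDC = {score:.1%}")  # 84 covered days / 92 period days = 91.3%
```

Encoding the Data Dictionary definition as code like this has a side benefit: when the BI team's report disagrees with it, you have an executable reference point for the audit conversation.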
20.4.9 Section Summary: The Pharmacist as Data Steward
We have established that in specialty pharmacy, data is not just a byproduct of operations; it is a critical asset that dictates clinical success, business viability, and regulatory compliance. The concept of “Garbage In, Garbage Out” has profound consequences when dealing with high-cost therapies and complex patient journeys.
This section equipped you with the framework of Data Governance—the policies, roles (Owners, Stewards, Custodians), and standards needed to manage this asset effectively. We dissected the Six Dimensions of Data Quality (Accuracy, Completeness, Consistency, Timeliness, Validity, Uniqueness) that serve as your inspection checklist.
We then applied these concepts to the critical Data Domains within specialty pharmacy (Dispensing, Clinical, Operational, Financial), identifying the unique risks within each.
Most importantly, we explored the practical Validation Controls needed at each stage of the data lifecycle:
- Input Controls: Preventing bad data at the source through system rules, interface validation, and human checks.
- Process Controls: Catching errors during workflows through reconciliations, audits, and automated quality rules.
- Output Controls: Ensuring the accuracy and integrity of final reports through logic validation, clear definitions (Data Dictionary), and secure transmission.
We also touched upon the advanced concept of Master Data Management (MDM) as the strategy for ensuring consistency across multiple integrated systems.
Your role as an advanced specialty pharmacist transcends clinical expertise. You are now also a Data Steward. You have a responsibility to understand how data is generated, how it flows, how its quality is measured, and how to identify and fix errors. By embracing this role, you ensure that the sophisticated technology ecosystem we are building rests on a foundation of trusted, reliable data, ultimately leading to safer patient care and a more successful pharmacy operation.