Data Quality

Once the data has been collected, states (or other payers) and providers must establish a process for verifying and checking data accuracy. Failure to verify data entry for accuracy will limit the validity of performance management feedback reports related to process improvement. Data quality represents the foundation of a sound PIPM system. While states and other systems vary in their approach to ensuring data verification and accuracy, this step is widely recognized as critical in the overall development of a PIPM system.

States participating as grantees in the CSAT-funded Strengthening Treatment Access and Retention—State Implementation (STAR-SI) project offer the following examples of ways to ensure data integrity:

  • Automatic linkages. In the state of Washington, providers have the capability to automatically upload information into the state systems. This reduces the potential state burden by shifting the responsibility for data verification to the providers.
  • Built-in checks. Ohio and Maine have incorporated edit checks into their systems to ensure the accuracy of data as it is entered. Oklahoma includes a review of the data as part of monthly meetings with providers; it also follows up with agencies whose data changes to verify accuracy.
  • Feedback reports. New York provides feedback reports to providers on data quality, and then allows providers to re-submit corrected data. These bimonthly reports allow New York Office of Alcoholism and Substance Abuse Services (OASAS) staff to review discrepancies with providers between STAR-QI data submissions and other data submitted to OASAS (e.g., CDS reports). These reviews help to prioritize corrective action.
  • Incentives. Oklahoma uses incentives to encourage agencies to submit process improvement data.
  • Ongoing training and technical assistance. High staff turnover at the provider and sometimes the state level makes training an important part of ensuring the ongoing quality of information collected for use in PIPM systems. Ongoing training promotes continued use of a consistent method for verifying data accuracy.
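The "built-in checks" that Ohio and Maine describe can be illustrated with a small validation routine. The sketch below is purely hypothetical: the field names, required fields, and rules are illustrative assumptions, not taken from any state's actual system.

```python
from datetime import date

# Illustrative required fields for a treatment-episode record (assumption).
REQUIRED_FIELDS = {"client_id", "provider_id", "admission_date"}

def edit_check(record: dict) -> list:
    """Return a list of error messages; an empty list means the record passes."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing required fields: %s" % sorted(missing))
    admission = record.get("admission_date")
    discharge = record.get("discharge_date")
    # Reject obviously impossible dates before the record enters the system.
    if admission and admission > date.today():
        errors.append("admission_date is in the future")
    if admission and discharge and discharge < admission:
        errors.append("discharge_date precedes admission_date")
    return errors
```

Run at the point of data entry, a check like this catches discrepancies immediately, rather than leaving them to be reconciled later through feedback reports.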

Several states have created teams at the state level to assist with data reporting and integrity. For example:

  • In Oklahoma, the state formed a Data Integrity Review Team (DIRT) to provide on-site review and technical assistance on all data reporting issues for providers. As a part of this project, the six National Outcome Measures (NOMs) show improvements in addiction treatment facilities statewide since the team started the visits in July of 2007.
  • In Maine, the OSA Agency Monitoring Team (AMT) became the Change Team charged with the task of monitoring data and performance of the contracted agencies. Their Change Project Form reflects the change cycles of the AMT to address development and testing of the feedback reports. Maine also created frequently asked questions for providers related to their TDS system.
  • New York developed a series of data entry and report analysis training modules to support their providers in using the STAR-QI system. The training modules have been repeated due to staff turnover. The state also presented information and led question and answer sessions on data management for all attendees at New York’s two STAR-SI learning collaborative sessions. These discussions helped address issues that needed attention across groups, and encouraged sharing of strategies for efficient data collection, entry, and interpretation.
  • In Ohio, the department offers technical assistance and follow-up through site visits, telephone calls, conferences, and written correspondence involving Ohio Department of Alcohol and Drug Addiction Services (ODADAS) staff, boards, peer mentors, and the STAR-SI coach.