Data Quality

Data Quality is the collection of relevant, accurate data that can help program staff answer questions about recruitment, retention and the effectiveness of their programs and components.

Tools & Tips

Resources for Reporting Measurable Skill Gains (MSG) Types 3, 4, and 5

Adult education programs can now report additional types of MSG for workplace literacy and integrated education and training (IET) participants in NRS Tables 4 and 4c. The NRS tip sheet provides an overview of the MSG primary indicator of performance and reporting requirements, focusing on MSG types 3, 4 and 5. It includes a discussion of the validation and documentation required and examples of the types of MSG outcomes that can count.

Enhancing Intake to Improve Services: Collecting Data on Barriers to Employment

WIOA recognizes that certain groups face more barriers to entering the workforce than others and focuses funding and services on these most vulnerable populations. This tip sheet explains the 11 barriers to employment as defined in the legislation, the purpose and benefits for states of collecting these measures, and the challenges of, and strategies for, collecting these data, which can help improve local program services.

NRS Tips: Increasing Posttesting to Improve Measurable Skill Gains

The primary indicators of performance for which adult education programs must collect data include measurable skill gains (MSG), or “a measure of a participant’s interim progress towards a credential or employment.” Although many factors affect an adult education program’s MSG outcome, a critical one is a program’s pretesting and posttesting rate. This NRS Tips summarizes strategies that two states—Maine and Rhode Island—have used to successfully increase their posttesting rates.
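Because the tip sheet treats the posttesting rate as a lever for MSG performance, it can help to see how a program might track that rate locally. The sketch below is illustrative only, not an official NRS calculation; the roster fields are hypothetical, and it simply computes the share of pretested participants who also received a posttest.

```python
# Illustrative sketch (assumed field names, not an official NRS routine):
# a posttesting rate tracked as posttested / pretested participants.

def posttest_rate(participants):
    """Return the fraction of pretested participants who were also posttested."""
    pretested = [p for p in participants if p["pretested"]]
    if not pretested:
        return 0.0
    posttested = [p for p in pretested if p["posttested"]]
    return len(posttested) / len(pretested)

roster = [
    {"id": 1, "pretested": True, "posttested": True},
    {"id": 2, "pretested": True, "posttested": False},
    {"id": 3, "pretested": True, "posttested": True},
    {"id": 4, "pretested": False, "posttested": False},  # never pretested
]
print(posttest_rate(roster))  # 2 of the 3 pretested participants were posttested
```

A program could run a check like this monthly to flag classes whose rates are falling before the reporting period closes.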

NRS Tips: Collecting Data for Post-Exit Indicators in Practice

Under WIOA, adult education programs must collect data on program participants after program exit for each performance indicator. This NRS Tips provides examples of successful strategies from one local program and one state for collecting required data on two performance indicators—credential attainment and employment.

NRS Tips: Tracking Students Over Time

Tracking student outcomes takes time and effort; however, it is important for local programs to obtain this information to support program management and improvement efforts. This NRS Tips will help you better understand the WIOA requirements related to following up with students after they exit and tracking their successes. The tip sheet includes information about why it is important to track students, a table with guidelines for whom to track and when, and a description of the two primary methods for tracking students’ outcomes: data matching and supplemental data collection.
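Of the two follow-up methods named above, data matching is the more automated: a list of exited students is matched against a partner agency’s records on a shared identifier. The sketch below is a simplified illustration, assuming hypothetical field names ('student_id', 'exit_quarter'); real matches typically run against state wage records under strict confidentiality agreements.

```python
# Illustrative sketch of data matching for post-exit employment follow-up.
# All record layouts here are hypothetical.

exited_students = [
    {"student_id": "A100", "exit_quarter": "2023Q2"},
    {"student_id": "A101", "exit_quarter": "2023Q2"},
]

# Hypothetical extract from a partner agency's wage-record file:
# earnings keyed by student identifier, then by quarter.
wage_records = {"A100": {"2023Q4": 6200.00}}

def matched_employment(students, wages, quarter):
    """Flag each exited student as employed if wage records
    show earnings in the given follow-up quarter."""
    results = []
    for s in students:
        record = wages.get(s["student_id"], {})
        results.append({**s, "employed": quarter in record})
    return results

for row in matched_employment(exited_students, wage_records, "2023Q4"):
    print(row["student_id"], row["employed"])
```

Supplemental data collection (surveys or phone follow-up) fills the gaps this kind of match cannot reach, such as students employed out of state or in jobs not covered by wage records.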

Local Data Quality Checklist

States are required to submit an annual data quality checklist to the Office of Career, Technical, and Adult Education. Similarly, some states use a local data quality checklist focused on data collection and reporting activities to help programs stay informed about what they need to know and do to maintain data quality. This self-monitoring tool is available for local staff responsible for data collection and reporting.

NRS Tips: Reporting Race and Ethnicity

The U.S. Department of Education (ED) released guidance on how educational institutions and other ED-funded programs should collect and report data on race and ethnicity beginning July 1, 2010. This NRS Tips describes, based on ED’s guidance, the changes to the way federally funded adult education programs collect and report race and ethnicity data, and the implications of the new standards.
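ED’s guidance describes a two-part collection scheme: respondents first answer an ethnicity question (Hispanic/Latino: yes or no), then select one or more races, and programs roll the responses into aggregate reporting categories. The helper below is a hypothetical sketch of that roll-up, not an official NRS routine; consult the tip sheet for the authoritative category definitions.

```python
# Illustrative sketch of rolling up a two-part race/ethnicity response
# into a single aggregate reporting category. Hypothetical helper.

def reporting_category(hispanic, races):
    """hispanic: bool answer to the ethnicity question;
    races: list of one or more selected race categories."""
    if hispanic:
        # Ethnicity takes precedence regardless of race selections.
        return "Hispanic/Latino of any race"
    if len(races) > 1:
        return "Two or more races"
    return races[0]

print(reporting_category(True, ["White"]))            # Hispanic/Latino of any race
print(reporting_category(False, ["Asian", "White"]))  # Two or more races
print(reporting_category(False, ["Black or African American"]))
```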

Guides

Demonstrating Success: A Technical Assistance Guide for Collecting Postexit Indicators

Performance on the postexit indicators demonstrates the success of programs in preparing participants for success in employment and postsecondary education. However, collecting the data for these indicators is new to adult education and poses many challenges to states. This guide serves as a technical assistance resource to help state and local staff understand the postexit indicators and suggests ways to improve the quality of these data. It explains the indicators and how each is collected and calculated, and provides guidance to help states and programs overcome the many challenges they face in collecting these data. It also offers approaches to enhance the completeness and quality of data at each step of the data collection process.

Linking Data Quality with Action: Evaluating and Improving Local Program Performance

This guide offers new approaches and tools to identify and prevent data quality problems. It introduces a local data quality checklist, modeled after OCTAE’s state checklist, which states can use to understand and evaluate local data collection practices. This guide also brings together previously developed material on improving data quality into a single resource, a data quality toolkit that permits easy access to this content, to support ongoing state and local training around data quality practices.

Webinars

Promising Approaches to Data Quality

With the new regulations under WIOA, problems in your data quality can affect not only you and your program, but your partner agencies too. It’s more important than ever that states and local programs ensure that their data are of high quality and quickly address any data issues they find. In this webinar, staff from Georgia, Missouri, and New York discussed the approaches they use to ensure data quality, including what their processes are, how those processes were put into place, who is involved, and what results they have seen in their data since implementing these processes.