Training evaluation data must be analyzed to determine if project goals are met. Explore techniques for analyzing data and using those insights to draw conclusions on learning effectiveness.
- It's important to collect data about the training programs you evaluate, but that data is useless unless you analyze it to draw actionable conclusions. Key action number five in the ATD Competency Model is Analyzes and Interprets Data. Here, we try to make sense of all the information we collect and put it into a format, such as a summary, graph, or report, that allows our stakeholders to understand it easily. Keep in mind that an analysis is more than just a data presentation.
Let's say we create a training program to help managers reduce errors on expense reports. The data presentation might tell us that the error rate went down from 18% before the training to 3% after the training. That sounds great. But an analysis can tell us why this happened and how we can make the training even better. Perhaps we learn that when errors still happen, they tend to fall into one of two categories: expense reports submitted after their due date and expense reports that contain a request for an exception to the company policy.
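The drill-down described above amounts to tallying the remaining errors by category. Here is a minimal sketch of that idea; the individual records are hypothetical examples, and only the two category names come from the scenario:

```python
# Sketch: tally remaining expense-report errors by category to see
# what is driving them. The records below are hypothetical; the two
# category names come from the scenario described above.
from collections import Counter

errors = [
    "late submission",
    "policy exception request",
    "late submission",
    "late submission",
    "policy exception request",
]

by_category = Counter(errors)
print(by_category.most_common())
# [('late submission', 3), ('policy exception request', 2)]
```

A simple frequency count like this is often enough to show which error category to target next.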
Using this data, we could reduce the error rate even further. So, how do you analyze and interpret your data? I like to take a simple approach whenever possible. First, go back to the original research questions in your evaluation plan. Next, look for data that can provide straightforward answers to those questions. Finally, drill down into the data to find out why things turned out the way they did. I've used an interviewing skills training program as an example throughout this course.
You can download a sample evaluation data spreadsheet to try your hand at finding the answers to our research questions for this project. Did we reduce new hire turnover from 30% to 15%? Are supervisors using the new interviewing procedures? And do supervisors love the new procedures? Let's first look at turnover. Here are the results six months after the training. Supervisors in the other three regions did not attend the training, so we can compare those groups to the Midwest region.
Turnover is down across the board, but look at the Midwest region, where we did the training. It went all the way down to 12%, beating our 15% goal. Part of a good data analysis is digging deeper to find out what's driving the results. For example, let's look at one of our other research questions. Here we see that the 43 supervisors who were observed using the new procedures had an average new hire turnover rate of just 10%, while the seven who didn't use the new procedures had a 27% turnover rate.
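This group comparison can be sketched in a few lines of code. The per-supervisor records below are a hypothetical reconstruction, not the course's actual spreadsheet; only the group sizes (43 and 7) and rates (10% and 27%) mirror the figures quoted above:

```python
# Sketch: drill into turnover by whether supervisors used the new
# interviewing procedures. Group sizes and rates mirror the figures
# quoted above; the record structure itself is a hypothetical example.
from statistics import mean

# Each record: (used_new_procedures, that supervisor's new-hire turnover rate)
supervisors = [(True, 0.10)] * 43 + [(False, 0.27)] * 7

def turnover_by_group(records):
    """Average turnover rate for each group (used vs. didn't use)."""
    groups = {}
    for used, rate in records:
        groups.setdefault(used, []).append(rate)
    return {used: round(mean(rates), 2) for used, rates in groups.items()}

print(turnover_by_group(supervisors))
# {True: 0.1, False: 0.27}

# The size-weighted overall rate lands close to the Midwest's 12%.
overall = mean(rate for _, rate in supervisors)
print(round(overall, 3))
# 0.124
```

Grouping the raw records first, rather than averaging a single column, is what lets the analysis answer "why" questions like this one instead of just reporting the headline number.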
This is just scratching the surface of what you can do to analyze your data. I've created a sample evaluation data spreadsheet that you can download to analyze the interviewing skills project more closely. I've also included a sample evaluation report that shows some additional analysis examples. I encourage you to practice with this data and then apply these concepts to one of your own evaluation projects.
Check the exercise files for sample evaluation plans, reports, checklists, and worksheets that you can use to evaluate your own employee development program.
Lynda.com is a PMI Registered Education Provider. This course qualifies for professional development units (PDUs).
The PMI Registered Education Provider logo is a registered mark of the Project Management Institute, Inc.
- Common learning assessment models: Kirkpatrick, Phillips, Brinkerhoff, and alternatives
- Identifying expectations
- Collecting data
- Analyzing data
- Making recommendations