From the course: Data-Driven Learning Design

Understanding the need for new learning analytics

- Imagine you have a stomachache. You visit your doctor, and without even examining you, she says, "Yes, I believe you have a stomachache. Here are some pills. Let's see if they work." Now if this really happened, you would be extremely surprised and probably wouldn't go back. And although this scenario sounds unrealistic, we do this every day as learning professionals. Too often, we only look at what happened with the instructional design after a program has launched, after it fails. Let's take a deeper look.

The standards for measurement in learning are all focused on post-learning evaluation. We track metrics such as completed hours of learning, pre- and post-test scores, and participant feedback. These are all very valuable and will reveal some insights into the success of a learning program. We can report on the average star rating of a course. These metrics are also what we use to keep our stakeholders informed on the progress of L&D in the organization. But what if the results of these metrics are disappointing? What if the scores show failure? Our stakeholders are disappointed, and they lose trust in us as learning professionals. More importantly, we've earned a poor reputation with our learners. This can make it extremely difficult to reengage with them and nearly impossible to build a learning culture.

So, what can we do to manage this risk? The first step is to dedicate more time to reviewing data to better diagnose learning needs. This means making sure the learning department is proactive rather than reactive in assessing learning goals. As you know, the business comes to the learning department with something they believe needs training. Seasoned learning professionals will use performance consulting skills to determine whether the request is valid. Then it's off to design and development. This model means the learning department is always in a reactive position, and that costs both time and money.
The better approach, again, is to monitor your existing data to better understand and predict learner needs. With immediate insights, we can identify and resolve performance gaps quickly. Second, use data to support decisions about learning content. Learning experience designers use good instructional design principles to build content, but they can also make a lot of choices without evidence. These include when and where to insert a video, what tone to write content in, or when to publish a course. All of these decisions have an impact on the final learning experience. In a digital environment, there's actually a lot of data available, both inside and outside of your LMS, to determine audience preferences. Too often we neglect this upfront analysis and go straight to the storyboard. Now that you're more aware of the pitfalls of relying on data only after training is deployed, be more proactive. Start by carving out more time to review existing data and to review how data supports your business's training requests. And like a good physician, imagine how your patient's health, that is, your learners' success, will improve when you take the time to properly diagnose their needs.