It is reasonable to want to show that the learning solution you invested in had an impact that justified the cost.

It is basic good sense to know you got value for money. If it were simple to calculate the impact of learning on an organisation, there would be little reason for this article; we could happily lean on data such as learner engagement, course completion, and learner satisfaction surveys.

The complexity of causation

Impact, for most businesses, cannot be defined in such straightforward terms. Most managers want to see a causal connection between learning and company performance, so customer satisfaction surveys, sales data, and productivity numbers feel like better measures. Yet it is not easy to draw a straight line between the learning solution and these outcomes, because organisations are constantly introducing other strategies to improve performance: a sales uplift after a negotiation course, for example, may owe as much to a new pricing strategy launched in the same quarter.

It is this complexity that means most organisations and learning solutions have yet to fully resolve the question of ROI. Here we explore some of the ways you might draw a straighter line between a solution and an organisational outcome; it all comes down to definitions and assumptions.

Breaking down the problem

  • Begin with recognisably powerful learning methods.

This sounds like common sense, yet it is often overlooked when assessing the viability and efficacy of learning solutions. If the learning methods are grounded in years of credible research, the approach is far more likely to be effective.

It is a different matter if the learning is based on passively reading, listening to, or watching materials followed by a series of multiple-choice questions. That kind of knowledge acquisition has limited impact on whether the ideas are ever applied, so it makes little difference to the learner's abilities. A content bank is also a significant cost to a business to design, deploy, and keep evergreen, and when it comes to ROI it has only a minor impact on performance.

A learning solution must encourage active learning strategies and demand a critical mindset. It needs to support the learner in applying the ideas in context, evaluating the outcomes, and adapting the approach where necessary. Learners need the confidence to treat failure as fuel for improvement, and the resilience to sustain that attitude when the evidence suggests they should give up.

  • Find a way to reward active, transformational learning.

All learning solutions include passive and administrative elements. Learners usually find these parts the least inspiring, but also the most comfortable. There is little challenge in pressing play on a video, and a certain satisfaction in uploading made-up activity logs to the LMS.

The metrics gathered from learning therefore cannot focus on time spent on the system or the number of videos watched. Instead, you need a way of reporting on the amount of challenging learning undertaken, which you can more reasonably assume has an impact.

How about offering an experience points system? When the learner opts for passive or administrative tasks, give them a low score. When the learner takes on active learning, with its inherent challenges, give them a higher score. The metrics reported then show the overall learning score per participant and allow organisations to see the breakdown of how the learner reached that total; a simple sketch of such a scoring scheme follows below.
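Purely as an illustration, here is a minimal sketch of how such a points system might be weighted and reported per learner. The activity types, weights, and field names are assumptions made for the example, not features of any particular platform.

```python
# A minimal sketch of weighting learning activity from a simple event log.
# The activity types and weights below are illustrative assumptions.
from collections import Counter

# Hypothetical weights: passive and administrative work scores low, active learning high.
WEIGHTS = {"passive": 1, "admin": 1, "active": 5}

def learning_score(events):
    """Return (total score, breakdown by activity type) for one learner."""
    breakdown = Counter()
    for event in events:
        breakdown[event["type"]] += WEIGHTS.get(event["type"], 0)
    return sum(breakdown.values()), dict(breakdown)

# Example: mostly passive viewing plus two active, applied tasks.
events = [
    {"type": "passive"},  # watched a video
    {"type": "admin"},    # uploaded an activity log
    {"type": "active"},   # applied an idea in context and evaluated the outcome
    {"type": "active"},
]
total, breakdown = learning_score(events)
print(total, breakdown)  # 12 {'passive': 1, 'admin': 1, 'active': 10}
```

The breakdown matters as much as the total: a high score made up entirely of passive activity tells a very different story from the same score earned through applied work.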

  • Use reflective assessments, and reward them even more highly.

Accountability in learning tends to raise the level of engagement, and you do not need to sign learners up for endless formal qualifications to achieve it; you can reward informal learning instead. To get the most out of a learning solution, an organisation needs to encourage learning every moment of every day.

If the learner undertakes an informal learning journey and then completes a reflective assessment, you can reward that effort with badging, which becomes a metric of commitment to the learning process. If the learner is also required to find third-party validation of their improvement, such as a manager or peer confirming the change in behaviour, this provides direct, observable evidence of impact. A small sketch of how this could sit alongside the points system appears below.
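Again as an illustration only, the sketch below extends the earlier scoring idea so that reflective assessments carry the highest weight and earn a badge once third-party validation is recorded; the weight, badge name, and field names are assumptions for the example.

```python
# Illustrative extension of the scoring sketch: reflective assessments score
# highest and unlock a badge once any of them has third-party validation.
def award_reflection_badge(assessments):
    """Return (score, badge) for a learner's reflective assessments."""
    validated = [a for a in assessments if a.get("third_party_validated")]
    score = 10 * len(assessments)  # hypothetical weight per reflective assessment
    badge = "Reflective Practitioner" if validated else None
    return score, badge

score, badge = award_reflection_badge([
    {"summary": "Applied the coaching model with my team", "third_party_validated": True},
])
print(score, badge)  # 10 Reflective Practitioner
```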

Conclusion

There are ways of better evidencing L&D ROI. Collating passive, active, and reflective learning metrics gives you a measure of both learner engagement and the more powerful learning completed (the active and reflective), and therefore a clearer view of whether the learning is likely to have a meaningful impact.

If you are interested in learning more about a learning solution that does all of this, contact us today at [email protected]. We would love to share our thoughts with you.