evaluating learning: did it work?
If learning is important to organisational success, measuring and evaluating its effectiveness is crucial. This depends, first, on clearly identifying the objectives of learning programmes – such as how they will meet identified needs – and then on evaluating them against costing and benchmarking criteria to answer the basic question: 'Has it worked?'
According to research cited by the AHRI, top-performing organisations are three times more likely than lower-performing organisations to be sophisticated users of analytics in their personnel management. It finds that applying logic and analysis techniques to workforce data – which could include outcomes from learning and development – can provide great insight into the cost impacts and business benefits of HR investments.
However, HR has been slow to implement technologies for analysing the outcomes of learning. As Peter Forbes of Navigo Research comments, while there is no shortage of data available, the real challenge is integrating disparate information to create meaningful insights for managers – and HR departments often lack the technical capability to implement these technologies.
Ilonka Jankovich of the Randstad Innovation Fund – a corporate venture fund focused on innovation using online platforms, big data analytics, machine learning, and sourcing, screening and selection tools – has noticed a tendency to look for sweeping solutions to big problems such as evaluating return on investment. But experience shows that it's better to start small: identify a particular challenge or problem and develop a specific solution for it. You can then broaden out the solution and capabilities from there.
types of evaluation
According to the widely used and influential model developed by Donald Kirkpatrick, evaluation can be carried out on four levels:
| Evaluation type | Captured by | Notes |
| --- | --- | --- |
| reaction | post-training feedback | easy and inexpensive |
| learning | tests or assessments, pre- and post-training | relatively simple to set up, but may not capture complex learning |
| behaviours | observation over time to assess whether change has taken place and whether it is relevant and sustainable | line-manager dependent |
| results | usual management information procedures | designed to relate to the organisation rather than the individual |
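The 'learning' level above rests on comparing pre- and post-training assessment scores. A minimal sketch of that comparison, with hypothetical scores and a hypothetical function name:

```python
# Sketch of a Kirkpatrick 'learning'-level evaluation:
# the mean improvement in assessment scores after training.
# All data and names here are illustrative assumptions.

def average_gain(pre_scores, post_scores):
    """Mean score improvement across participants (post minus pre)."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

pre = [55, 60, 48, 72]    # assessment scores before training
post = [70, 68, 65, 80]   # assessment scores after training
print(average_gain(pre, post))  # prints 12.0 (mean gain in points)
```

A single mean hides variation between participants, which is one reason the table notes that this level "may not capture complex learning".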
Surveys have found that most evaluations focus on 'reactions' because of the difficulty and time cost of measuring other outcomes. This type of evaluation focuses on feedback, with the danger of becoming 'stuck in a recurring loop of evaluation', in the words of the CIPD.
But while other evaluation techniques (such as ROI analysis) have been found equally wanting, there is unanimity on the need to focus on the main outcome: did it work for the organisation?
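The ROI analysis mentioned here typically divides a programme's net benefit by its cost. A minimal sketch, using entirely hypothetical figures:

```python
def training_roi(benefit, cost):
    """Return on investment as a percentage: (benefit - cost) / cost * 100."""
    return (benefit - cost) / cost * 100

# Hypothetical example: a programme costing 20,000 that is estimated
# to have produced 26,000 in measurable business benefit.
print(training_roi(26_000, 20_000))  # prints 30.0 (a 30% ROI)
```

The arithmetic is trivial; the hard part, as the surrounding text suggests, is attributing a credible monetary benefit figure to the training in the first place.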
One way of breaking out of this loop is to harness the data often 'siloed' in organisations. While questions remain about how this will be done and how big data will be managed, it is likely to be as truly transformational as the current shifts to collaboration and social learning, lifelong learning, and the full alignment of learning with business objectives.