Written by:

Holly Macdonald

Date:

June 3, 2009

One of the biggest reasons that organizations cut training budgets is that the track record of training solutions is not strong.  It is really hard to convince your decision-makers that this time the training will work.  It’s the organizational equivalent of crying wolf: “I know it didn’t go so well the last dozen times, but I’m sure it will this time, trust me”.  Who is going to believe you?

In the world of learning & development, this is what we refer to as evaluating the effectiveness of the learning intervention, or “did it work”.  The vast majority of the world’s learning experts use the “Kirkpatrick Levels”:

  • Level 1: Reaction – what was their reaction to the training (commonly described as “did they like it?”)
  • Level 2: Learning – can they recall what they learned?
  • Level 3: Behaviour – have they changed their behaviour as a result of the training?
  • Level 4: Business Results – did the training impact the business?

Industry statistics frequently show that the majority of training departments do “level 1”, significantly fewer get to “level 2”, a small number shoot for “level 3” and almost none of them attempt “level 4”.   Some organizations attempt “level 4” and think that they can skip levels 2 and 3 altogether.

If this sounds like your organization, then you need to do some evaluation-rehab.  You need to figure out if your training programs work!  You need to draw a clear and distinct line between what was trained and the business results.  So, how can you do that without dismantling your entire learning organization or creating the impression that your team isn’t effective?

Well, start with one program, and call it a “training review”. 

  1. Look at your “level 1” assessment information – if you use the smile sheet at the end of a training session – chances are the questions that you ask are pretty generic and measure whether or not someone enjoyed the day, the instructor, the food, etc.  What you *really* want to measure is the reaction to the content.  A couple of specific questions that link the anticipated business results with the learning objectives will help.  Instead of “what did you find useful”, ask “what will customers think of the model presented and how will the training help you use it?”.  Encourage reaction to what you want people to learn, not whether they had a good time.
  2. Select a program that is really critical to your organization’s success and create a survey/questionnaire.  This can be administered online (use a free survey tool like SurveyMonkey) or as a paper questionnaire or interview guide.  Base your questions on the content of the course and the business results that the course is intended to influence.  This is your “level 2” assessment information.
  3. You’ll need to survey a significant number of participants who completed the course to get some statistical validity.
  4. Ask their direct supervisors/managers to provide input as well, based on their observation of behaviour.  This is the basis of your “level 3” assessment.
  5. Compare your learning objectives to the business results – are they aligned?  Is the training showing a positive impact?
  6. Analyze your findings and if they are good, then do a quick ROI calculation.  If they are not good, use the information to make changes to the training course and build your evaluation strategy to review how well the training worked.
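If you want a feel for the numbers behind steps 3 and 6, here is a minimal sketch.  It uses Cochran’s standard sample-size formula (with a finite-population correction) for the “significant number of participants”, and the basic net-benefit-over-cost formula for ROI.  The figures in the examples (200 course completers, $20,000 cost, $50,000 benefit) are made up purely for illustration.

```python
import math

def required_sample_size(population, confidence_z=1.96, margin=0.05, proportion=0.5):
    """Cochran's formula with a finite-population correction.

    population   -- number of people who completed the course
    confidence_z -- z-score for the confidence level (1.96 ~ 95%)
    margin       -- acceptable margin of error (0.05 = +/- 5%)
    proportion   -- expected response split (0.5 is the most conservative)
    """
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin ** 2
    adjusted = n0 / (1 + (n0 - 1) / population)
    return math.ceil(adjusted)

def training_roi(benefits, costs):
    """Basic ROI as a percentage: net benefit divided by cost."""
    return (benefits - costs) / costs * 100

# If 200 people took the course, how many survey responses do we need?
print(required_sample_size(200))     # 132 at 95% confidence, +/- 5%

# If the program cost $20,000 and produced $50,000 in measurable benefit:
print(training_roi(50_000, 20_000))  # 150.0 (%)
```

The hard part, of course, is not the arithmetic but isolating a credible dollar figure for the benefit – which is exactly why the level 2 and level 3 evidence above matters.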

It isn’t really all that hard.  It does take effort and maybe a little expertise, but it can be well worth it in the end.