
Training Evaluation

The goal of evaluation

This page explores the methods of training evaluation. Whenever an organisation trains its employees, it does so with an objective in mind. This objective can vary widely: equipping people to do their jobs to the necessary legal requirement; giving ideas on how things might be done differently; opening minds to new ways of thinking; improving skill levels; or rewarding a job well done. With an objective in mind, training is budgeted, designed, organised and implemented.

In assessing whether the training achieves the objective, many organisations rely on the initial reaction of the participants at the end of the training, along with the observations of the training professionals who organised it. Training professionals can be heard saying, ‘as long as the participants are happy…’, hence the term ‘happy sheets’ to describe the feedback forms completed at the end of most training sessions.

Assessment

With some training, an assessment of its success and impact is intrinsic. Training towards a qualification, for example, requires the participant to pass an exam or complete an assessment. These tests establish whether the participant has gained knowledge or ability. The same test approach can be applied to other areas. For example, if a person is trained to use a software package, they can easily be tested to see whether they have retained the knowledge and can apply it in using the software.

Whether knowledge is actually used in the workplace is a different question. In the latter example we may be able to establish quite simply whether the person has learnt the software, but we cannot predict if, or in what circumstances, they will use it. Moreover, if we discover that this person does apply their new knowledge of the software, we may also wish to know whether this new skill has any positive impact on their work. Then we could go further by asking whether the use of this new skill has had any positive impact for the organisation.

Evaluation is linked to the objective for the training: in order to evaluate, a clear objective is required. Budget holders are likely to be increasingly required to justify and measure the effectiveness of any training investment.

The need to evaluate training

The need to evaluate training starts with the fundamental question, ‘Is the money we are spending well spent?’ Training professionals might rephrase this to ask, ‘Is the training achieving the objective set for it?’ But business professionals are more likely to ask, ‘What value does this investment bring to the organisation?’ When we start to explore evaluation we soon arrive at the question, ‘What are we evaluating?’

So before we evaluate, we perhaps need to ask why we are evaluating. What is the objective of the evaluation? And what is the objective of the training?
In this way a desire to evaluate training can easily be lost in a confusion of objectives. To clarify, we can simplify evaluation by first summarising what is to be evaluated:

1. Has the person learnt anything from the training?
2. Has the person applied this learning in their job?
3. Has the application of this learning had any positive outcomes for the individual?
4. Has the application of this learning had any positive outcomes for the organisation?
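
As a purely illustrative sketch, and not an ITD tool, these four questions could be recorded per participant in a simple structure like the one below. The field names and example data are invented for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EvaluationRecord:
    """One participant's answers to the four evaluation questions above."""
    participant: str
    learnt_something: bool                          # 1. learning took place
    applied_in_job: bool                            # 2. learning applied in the job
    outcome_for_individual: Optional[str] = None    # 3. positive outcome for the individual
    outcome_for_organisation: Optional[str] = None  # 4. positive outcome for the organisation

# Hypothetical example record:
record = EvaluationRecord(
    participant="A. Manager",
    learnt_something=True,
    applied_in_job=True,
    outcome_for_individual="More accurate budget forecasts",
    outcome_for_organisation="Fewer budget overruns this quarter",
)
```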

Learning

Evaluating training could aim to establish whether learning has taken place, whether this learning has been applied, and what impact this application has had.
In the current economic climate, evaluation needs to establish whether a learning objective has been achieved; but to be truly useful to business professionals with budgetary responsibility, it needs to establish the value to the organisation.

A UK transport organisation recently conducted advanced driving training for all its truck drivers. The training taught them how a different driving style can save fuel and shorten journey times. The impact was a 20% reduction in the volume of fuel purchased.
So evaluation is useful to assess whether the objective for the training has been achieved, but it is even better to establish the value to the organisation.
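
As a rough, purely illustrative sketch of what ‘value to the organisation’ could look like in this fuel example: only the 20% reduction comes from the example above; every other figure below is invented.

```python
# Hypothetical figures to illustrate turning the fuel example into organisational value.
# Only the 20% reduction comes from the example above; the other numbers are invented.
annual_fuel_spend_before = 1_000_000   # GBP per year (assumed)
fuel_reduction = 0.20                  # 20% less fuel purchased after the driver training
training_cost = 50_000                 # assumed cost of the advanced driving programme

annual_saving = annual_fuel_spend_before * fuel_reduction
first_year_net_return = annual_saving - training_cost

print(f"Annual saving: £{annual_saving:,.0f}")                  # £200,000
print(f"First-year net return: £{first_year_net_return:,.0f}")  # £150,000
```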

Training Reviews are often used as a way to evaluate

Super training from Martin, as is the norm

Rated 5.0 out of 5
September 30, 2024

Martin Chapman delivered yet another very useful, interesting training on Resilience and Well-being for me and some of my colleagues. His easy, engaging manner ensures that the time flies by; the hour and a half session is always filled with good ideas, tips and humour.

R. Forbes

Presenting Virtually

Rated 5.0 out of 5
September 19, 2024

Martin C. is a fantastic trainer!

NA

Engaging and practical, actionable tips

Rated 5.0 out of 5
September 17, 2024

The session was light and engaging, providing useful actionable tips that are very easy to remember for the next time I need to do a virtual presentation. Really enjoyed the positive atmosphere and the engagement opportunities with the presenter (Martin Chapman) and the team. I would enjoy having a follow up session to focus on practice.

Carla Camanho

Great course for anyone managing people

Rated 4.0 out of 5
September 13, 2024

I attended the People Management Training yesterday and found it incredibly useful. The session provided valuable insights into understanding not only my own management style but also how to approach managing staff more effectively. I learned a lot about leveraging individual skills and personality traits to build stronger, more cohesive teams. The training offered practical strategies that I can immediately implement to foster better communication and collaboration. I am looking forward to attending further sessions and building on what I have learned already. Highly recommended for anyone looking to improve their leadership skills!

Sarah Holdaway

Insightful

Rated 5.0 out of 5
August 23, 2024

Martin recently conducted a training session on impactful presentations and shared some excellent tips. I appreciate his active engagement with the participants and the collaborative environment he fostered. He demonstrated a strong understanding of the subject matter. I particularly enjoyed the emphasis on interactive activities.

Ritika

Is evaluation possible?

If the objective is to establish whether the training achieves a learning objective, this is fairly straightforward, and the post-course evaluation can potentially do this well. For example, a group of managers needs to increase their abilities in budgeting and finance. Training is implemented and the managers are asked, ‘Did you learn what you needed to know about successful budgeting?’ Usually participants on training courses are clear about whether they have learned something useful or not.

Whether the managers will change their behaviour and, in this example, improve their budgeting accuracy is another matter. Achieving a learning objective might be reasonably simple to confirm, but can we establish the value of training to the organisation?

With some subjects this is more easily quantifiable. Negotiation skills training for lawyers can be followed up to discover whether negotiation techniques learnt in training have been applied and have resulted in better outcomes for the firm and for clients. For example, one lawyer following an ITD negotiation skills programme said, ‘The training taught me that in face-to-face negotiation silence can be powerful, so when, in a damages claim, the opposing lawyer offered £10,000, I said nothing, and after a period of silence the offer was increased to £15,000. I could so easily have accepted the first offer on behalf of the client, but by staying silent we achieved a much better result.’ Examples like this demonstrate that learning has taken place and has been applied with a positive result.

 
Sales Training

Sales training can be simpler to evaluate. Take the example where the need is for lawyers to develop a pipeline of business opportunities. Training is developed and implemented. Afterwards, evaluation can compare the lawyers’ pipelines before and after the training.
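
A minimal sketch of that before-and-after comparison, with invented pipeline figures purely for illustration, might look like this:

```python
# Illustrative sketch with invented figures: compare each lawyer's pipeline of
# business opportunities (in GBP) before and after the sales training.
pipeline_before = {"Lawyer A": 120_000, "Lawyer B": 80_000}
pipeline_after = {"Lawyer A": 185_000, "Lawyer B": 140_000}

for lawyer, before in pipeline_before.items():
    after = pipeline_after[lawyer]
    print(f"{lawyer}: £{before:,} before, £{after:,} after (change £{after - before:+,})")
```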

Evaluation is possible, but it tends to be complicated. Exploring how learning has been applied takes time and effort. Any evaluation tool needs to be sophisticated enough to be adapted to different training subjects and methods: it might be useful to have the same tool to measure software training, leadership training and stress management training alike. We may also want a tool that is flexible in the information we require it to discover. For example, we might want to discover whether the participant learnt useful ideas in a training session, but we might also want to learn whether they found the training motivational. Then we might want to explore how the learning has been applied, whether it had any positive results, and whether these results have any impact on the organisation.

One of the challenges with measuring the impact of training is quantifying how much of the impact is directly attributable to the training and how much to other factors. Where possible, the evaluation tool needs to account for this, or at least acknowledge these factors.

 
ITD training evaluation

Pre-evaluation of training

We are currently working on a methodology for pre-evaluating training. This involves measuring participants’ confidence in delivering on the training objectives before and after the training activity. From this, we hope to identify whether we can predict which participants need extra support in delivering the training objectives. It could also help us identify whether any participants might be better not attending the training at all, if there is no realistic chance of them delivering on the objectives. This methodology would also give us further data to support the argument that training needs internal support from line managers and others to be effective, rather than being seen as an isolated event with no follow-up.
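
Since this methodology is still in development, the sketch below only illustrates the general idea; the confidence scale, the data and the support threshold are all assumptions, not ITD’s actual method.

```python
# Illustrative sketch of the pre-evaluation idea: compare each participant's
# self-rated confidence in delivering the training objectives before and after
# the training, and flag those who may need extra support. All data is invented.
confidence = {
    # participant: (confidence before, confidence after), self-rated on a 1-10 scale
    "Participant 1": (3, 8),
    "Participant 2": (2, 4),
    "Participant 3": (9, 9),
}

SUPPORT_THRESHOLD = 6  # assumed cut-off below which extra support is suggested

for name, (before, after) in confidence.items():
    change = after - before
    flag = "flag for extra support from the line manager" if after < SUPPORT_THRESHOLD else "on track"
    print(f"{name}: confidence {before} -> {after} (change {change:+}); {flag}")
```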

 

The MIRTEX approach

MIRTEX is the Measure of Impact and Results of Training Index, scored on a scale of 0 to 10.

ITD developed the MIRTEX evaluation tool following a conversation with a client, where the client stated, ‘What would be great is if we could say that this training scored 6 out of 10 or that training scored 9 out of 10, a bit like the Richter scale but for positive impact. The higher the score achieved the more positive the impact’.

The MIRTEX approach has the objective of providing a tool to evaluate and measure training investment. Training evaluation is inherently difficult and complex, especially when attempting to measure the impact of business skills training. Sales training can be a little simpler, if sales results can be used as a measure of impact.

The MIRTEX approach uses a structured telephone interview technique based on an evaluation scale.
Training participants are called between one and four weeks after a training course. The MIRTEX researcher uses a blend of structured questions and a flexible approach to find evidence of the impact of the training.
The evaluation scale scores the training from 0 to 10, where 0 is no impact and 10 is measurable financial impact.
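
As an illustration only, and not a description of the actual MIRTEX interview structure, a single follow-up result might be recorded in a structure like the one below; the field names are assumptions, and the example reuses the negotiation anecdote from earlier on this page.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MirtexResult:
    """Illustrative record of one MIRTEX follow-up interview; field names are assumptions."""
    participant: str
    weeks_after_training: int                     # interviews take place 1-4 weeks after the course
    score: int                                    # 0 = no impact ... 10 = measurable financial impact
    evidence: str                                 # what the participant reported
    financial_impact_gbp: Optional[float] = None  # recorded when a financial impact can be measured

    def __post_init__(self):
        assert 0 <= self.score <= 10, "MIRTEX scores run from 0 to 10"

# Hypothetical result, reusing the negotiation example from earlier on this page:
result = MirtexResult(
    participant="Lawyer A",
    weeks_after_training=3,
    score=9,
    evidence="Stayed silent after a £10,000 offer; settlement increased to £15,000",
    financial_impact_gbp=5_000.0,
)
```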

Evidence

The MIRTEX approach goes directly to the training participants and seeks evidence of impact. One potential challenge with this approach is that it depends on the reliability of the participants’ answers; in other words, could the participants overstate or understate the impact of the training? Clearly they could, but one way of reducing this possible weakness is the ability of the researcher to ask searching questions that check statements against answers to previous questions. In addition, statements can be validated by seeking corroboration from the participant’s line manager.
The MIRTEX approach goes some way towards collecting evaluation data which is both quantitative and qualitative.
It evaluates how well the training has performed in providing useful ideas for the participants, takes this further by discovering whether those ideas have been applied, goes still further by assessing the impact of this application, and finally measures any financial impact of the training.

 

