03.06.2015, 19:50

Return on Expectation


Martin Davies, Director of Training with BIFM Training (Quadrilect), discusses the implications of adopting different training and learning and development models in facilities management.

 

Most people recognise the importance of training, but how do you gauge the quality, effectiveness and value of training and learning interventions?

Whether training is provided in-house or sourced externally, the individual learner and their organisation need to be sure it is valuable. Feedback from peers and recommendations about established training courses and providers are useful, but can’t replace a systematic evaluation of the quality and effectiveness of training.

Good evaluation can be conducted in a range of ways, from simple but effective methods through to more detailed and in-depth approaches.

 

Training vs. learning and development

Learning and development (L&D) as a term indicates a change of emphasis from traditional ‘training’ to a broader concept that is generally more learner-centred than trainer-centred. To contrast the two:

  • Training is often instructor-led, content-based, and designed to lead to knowledge, skills or behaviour change

  • Learning is often self-directed, work-based, achieved in a variety of ways and often planned to result in increased adaptive potential for today’s changing work environment

I use the term L&D throughout to include all aspects of training, learning & development.

 

Effective evaluation

To be effective, L&D must be designed, implemented and measured against defined objectives – such as developing knowledge, skills, aptitude and competencies – and it should support both the learner’s needs and those of the organisation.

Evaluation can vary in scope from simple but effective questionnaires through to far more detailed measures against performance and organisational goals. Where practical, a learning needs analysis can be undertaken at the outset to ensure the L&D aims and objectives for the learner and the organisation are aligned, and that the learning and other outcomes are well-defined and capable of being measured - perhaps during and certainly after the process.

 

Different dimensions of evaluation

Evaluation can focus on input or output factors, and at different levels, so how the evaluation is devised and set up determines what you will be able to measure. For example, you can look at:

  • ‘Input’, such as the quality of course content and the methods of presentation, or

  • ‘Outcomes’, such as improved skills, qualifications, enhanced performance, productivity or even profitability

It’s essential to define L&D needs and objectives; once you have done so, there is a range of measures to choose from, including:

  • Post-training questionnaires where learners rate the training

  • Interviews or testimonies of learners

  • Observation of the training

  • Assessing behaviour change

  • Metrics such as 360 feedback

  • Performance reviews to measure new or enhanced competencies

  • Impact on key performance indicators (KPIs)

  • Return on investment (ROI)

 

The first few of these can be relatively easy to arrange, yet still provide valuable feedback if designed well, but it takes more to measure ROI or the effect on KPIs, and potentially even business success.

For example, a well-designed post-training questionnaire can cost-effectively produce useful evaluation - if the learner clearly understands their agreed learning needs and objectives, and is then able to articulate whether those needs have been met.

But to illustrate the dangers here - if a course is charismatically presented, and is interesting and enjoyable in its own right, it may achieve very high satisfaction ratings without necessarily meeting the learning needs or outcomes hoped for.

Hence the derogatory term ‘happy sheets’ is sometimes used for such feedback, but that says more about poorly designed and unfocused questionnaires than about the use of questionnaires as such.

Effective questionnaires are carefully designed with a focus on content and on whether clearly defined learning outcomes have been achieved. Such information is therefore not simply a ‘satisfaction’ rating, but can provide valuable feedback, and can form the starting point for more reflective evaluation such as a review with a line manager, HR, or peers at work.

Therefore, whatever method or depth of evaluation is planned, it is essential for learning needs to be defined, for those learning outcomes to be the focus of whatever training intervention is chosen, and for the results to be measured objectively after delivery.


In-depth evaluation options to consider

The Kirkpatrick model is one of the oldest methods but still a well-respected option, and suggests four ‘levels’ of training evaluation:

  • Reactions – liking or feelings for a programme

  • Learning – principles & facts absorbed

  • Behaviour - using learning on the job

  • Results - increased production, reduced costs

 

In practice, reactions are often the main source of evaluation, because of the effort and time involved in measuring the other levels, but if devised and analysed properly even this first level can be useful. Post-training interviews with learners, usually by their line manager or HR / training manager, are a more objective way of challenging and reviewing the learning absorbed. Beyond that, behaviour change can be measured over time, possibly involving 360 feedback, performance appraisals, or reviews that measure the new or enhanced competencies. Ultimately, the most in-depth evaluation can be designed to measure impact on key performance indicators and other business-related metrics.


Return on investment (ROI) is similar to the last of the Kirkpatrick levels, comparing the financial benefits of learning against its cost. It seeks to analyse the L&D contribution to boosting output or performance, so a lot of care has to be taken to devise appropriate measures, and this can be time-consuming if done thoroughly. Here too there are dangers, such as the potential for corruption by other performance-affecting activities or coincidental factors running alongside the L&D. In addition, if it is carried out post-project with no baseline set at the outset, it will be difficult to measure accurately. However, if time and care are taken, holistic studies comparing all the factors that can raise performance can be effective, and whilst not common there are established and rigorous approaches to ROI that can be replicated.
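As a purely illustrative calculation (the figures are invented, and the hard part in practice is isolating the benefit genuinely attributable to the learning): if a programme costs £5,000 in fees, travel and staff time, and the performance improvement credited to it against the pre-training baseline is worth £7,000, the ROI is (£7,000 − £5,000) ÷ £5,000 = 40 per cent. A negative figure would indicate that, on the measures chosen, the learning has not yet paid for itself.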


Easterby-Smith and others suggest another model, with some contrasting aspects to consider:

  • Proving – did the training work, and did it have a measurable impact?

  • Controlling – the time taken, costs, and consistency or compliance measurements

  • Improving – reviewing the training content, methods and trainers

  • Reinforcing – using evaluation and reflection as a part of the learning process itself

This approach is particularly relevant to an ongoing training programme where the evaluation feeds into continuous improvement.


Finally, the Chartered Institute of Personnel & Development (CIPD) ‘RAM’ model, whilst aimed at HR professionals, is another perspective to consider when planning and deciding on any L&D intervention. It combines many aspects of the models described here, and stands for:

  • Relevance: how training will meet business opportunities and challenges

  • Alignment: ensuring through stakeholders that L&D is aligned to key factors such as organisational strategy, reward, marketing or financial strategies

  • Measurement: consider a mixture of methods such as ROI and measures of expected change and improvement such as ‘return on expectation’ and KPIs.

Note that the term ‘return on expectation’ recognises that learners, line managers and trainers may have differing views on the value of learning outcomes, but that all are significant; this is a further dimension to be aware of amongst stakeholders.

 

In conclusion, a firm focus on learning outputs will result in more effective L&D and better value for money, and there are simple as well as sophisticated approaches to evaluate learning outcomes and the extent to which learning provision is aligned with business objectives.

Contact BIFM Training (www.bifm-training.com) for advice about learning needs analysis or any help with training and learning in the profession of facilities management.

