Innovation in training evaluation - using Kirkpatrick's model

The following paper is based on a proposal to develop an evaluation framework for an experiential leadership training programme. It was part of an EU Transfer of Innovation project between Slovenia, Belgium and myself in Scotland. The EU application was not successful, but the ideas and methods within the paper remain useful for structuring a wider approach to training evaluation.

The Evaluation Innovation involved the design and application of a framework for evaluating the impact and effectiveness of a Leadership Development Programme. The evaluation framework was based on an existing, tried-and-tested model to ensure a robust methodology that would generate evidence of results and learning to benefit and inform stakeholders. The framework was also designed to be innovative in how it tackled the evaluation challenges involved. Existing frameworks informing the evaluation of the Leadership Development Programme included:

The Kirkpatrick Four Levels™ Model 

This is an internationally recognised training evaluation framework which provided a template and reference point for designing appropriate evaluation methods for the Leadership Development Programme. The four evaluation levels include:
1. Reactions: are people happy with the training input?
2. Learning: what do people remember from the training sessions?
3. Behaviour: do people use what they have learned in the workplace?
4. Impact: what are the outcomes over a period of time?

The Kirkpatrick model was extended (Phillips & Phillips, 2001) to include a Pre-level relating to needs assessment at individual and business/organisation levels, and to explore the potential for an additional Level 4a relating to Return on Expectations and Investment. The extended Kirkpatrick model is developed in Appendix 1 to show the aims at each level and to highlight the evaluation tasks. Examples of specific evaluation methods and tools are also included. These were tailored to the specific needs of the Leadership Development Programme as part of the Transfer of Innovation project. This framework also provides a template for analysing evaluation evidence and findings.
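
As an illustration only, the extended framework can be sketched as a simple data structure for organising evaluation evidence by level. The level labels follow the model above; the function and variable names are hypothetical, not part of the original framework.

```python
# A minimal sketch (an assumption, not the author's tool) that files
# evaluation evidence under the extended Kirkpatrick levels.

EXTENDED_KIRKPATRICK = {
    "Pre": "Needs assessment at individual and business/organisation levels",
    "Level 1": "Reactions: are people happy with the training input?",
    "Level 2": "Learning: what do people remember from the training sessions?",
    "Level 3": "Behaviour: do people use what they have learned in the workplace?",
    "Level 4": "Impact: what are the outcomes over a period of time?",
    "Level 4a": "Return on Expectations and Investment",
}

def attach_evidence(store: dict, level: str, item: str) -> None:
    """File a piece of evaluation evidence under its framework level."""
    if level not in EXTENDED_KIRKPATRICK:
        raise ValueError(f"Unknown evaluation level: {level}")
    store.setdefault(level, []).append(item)

evidence: dict = {}
attach_evidence(evidence, "Level 1", "End-of-module feedback sheets")
attach_evidence(evidence, "Level 3", "Manager observations in the workplace")
print(evidence["Level 1"])  # ['End-of-module feedback sheets']
```

Grouping evidence by level in this way supports the analysis template mentioned above, since findings can be read off level by level.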

Innovative Evaluation Methods

Many training evaluations are limited to gathering satisfaction feedback on the training itself and/or recording learning and results at a superficial level. We proposed to draw on a range of evaluation methods that have been developed to encourage participation, dialogue and evaluative thinking. In this way we would gain deeper insights into the learning and outcomes of participants, and in addition participants would gain reflective and evaluative skills as an integrated part of the Leadership Development Programme. These methods are more effective because they provide different ways for participants to express what they are thinking, feeling and learning, through different media such as visual tools. Essentially, they ask the evaluation questions in different ways. These methods have been developed from a range of sources, including Reviewing Skills (Roger Greenaway) and Using Visual Approaches (Evaluation Support Scotland).

The following is a selection of these tools:

Training Evaluation: A simple end-of-module evaluation sheet which focuses on two aspects: REFLECTION on participant learning and course FEEDBACK.

H Form Evaluation: A participatory evaluation method that can be used at the end of the Programme to assess and make recommendations about the whole Leadership Development Programme.

Evaluating Transferable Skills: A word sort and prioritisation tool which can be used to identify skills developed, explore gaps and measure change.

Review and Evaluation using Pictures and Objects: A review tool that helps individuals to explore and express deeper insights and learning; it could be used prior to completing the module evaluation form.

Impact Map: A mapping tool that provides a way to measure and show the distance travelled by each participant against the expectations/outcomes of the Programme (quantitative).

Return on Expectations and Investment

Return on investment is an important element of feedback for business, but it is a difficult area to quantify in relation to training. Whilst financial and objective measures can be devised in relation to the cost of training, it is difficult to quantify benefits relating to improved leadership competence. This is an area that can be explored within the Transfer of Innovation project, with the aim of identifying a set of simple measures. In particular, we will consider Return on Expectation as a way of understanding and measuring the specific benefits for individual participants and businesses/organisations. For example, if a stated expectation is that participants become more creative, self-initiated (proactive) and positive in their approach to their work, then this could be explored through the use of before-and-after scales:

Becoming more creative:   ________________________________________________________

Becoming more proactive: ________________________________________________________

Becoming more positive:   _________________________________________________________

Where would you position yourself at the start of the Leadership Development Programme?
Where would you position yourself at the end of the Leadership Development Programme?
What is the change measurement between the start and the end of the programme?
How would you describe what this means?
These responses can provide both quantitative and qualitative measures and can be considered in relation to the inputs, including the cost of training, time, etc.
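
The change measurement above can be sketched in code. This is purely illustrative: the scale range, participant ratings and cost figures below are hypothetical, not drawn from the programme.

```python
# Illustrative sketch: change on before/after expectation scales (here
# assumed to run 1-10), plus a basic return-on-investment ratio.

def change_scores(before: dict, after: dict) -> dict:
    """Change on each expectation scale: end rating minus start rating."""
    return {scale: after[scale] - before[scale] for scale in before}

def simple_roi(benefit: float, cost: float) -> float:
    """Basic ROI: net benefit divided by cost."""
    return (benefit - cost) / cost

# One participant's hypothetical self-ratings at start and end
before = {"creative": 4, "proactive": 3, "positive": 5}
after = {"creative": 7, "proactive": 6, "positive": 8}

print(change_scores(before, after))  # {'creative': 3, 'proactive': 3, 'positive': 3}
print(simple_roi(benefit=12000, cost=8000))  # 0.5
```

Such quantitative change scores would then sit alongside the qualitative responses to "How would you describe what this means?".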

Finally, Appendix 1 (Leadership Programme Evaluation Framework) provides an overview of the project.



References

Anderson, V. (2007). The Value of Learning: From return on investment to return on expectation. 'Research into practice' report. London: CIPD.

Reports on a research project which explored how organisations are measuring and reporting on the contribution of learning to strategic value. The research shows how key organisational stakeholders expect learning to add value, for example by contributing to effectiveness and the achievement of strategic impact. It also demonstrates that learning contributes to longer-term and less tangible organisational outcomes.
Fenman Limited (2003). Train the Trainers – The definitive guide to creating a great learning experience.

Humphris, D., Connell, C. & Meyer, E. (2004). Leadership Evaluation: An Impact Evaluation of a Leadership Development Programme. Health Care Innovation Unit & School of Management, University of Southampton.

This research focused on evaluating the impact of a leadership development intervention on individuals and their organisations. The framework used for the evaluation was Phillips and Phillips' modified version of Kirkpatrick's framework for training evaluation. It provides useful reference points and guidance based on their findings.

Phillips, P. P. & Phillips, J. J. (2001). Symposium on the Evaluation of Training. International Journal of Training and Development, vol. 5, no. 4, pp. 240-247.

© Lesley Greenaway 2011
