Using the Kirkpatrick Model to Evaluate Training

Lauren Hays
All training, whether in-person or online, needs to be evaluated. Evaluation helps you know whether your training was effective and, if so, how people are applying the knowledge they learned.
Donald Kirkpatrick, Professor Emeritus at the University of Wisconsin, developed a four-level evaluation framework that is widely used in corporate training.
The four levels are Reaction, Learning, Behavior, and Results.
Level 1: Reaction
At this level, you want to gauge the audience’s reaction to the training. This initial response helps you know how well the training was received. According to Kirkpatrick Partners, at this level you specifically want to know “the degree to which participants find the training favorable, engaging and relevant to their jobs.”
Questions to ask of trainees include:
- Did you find the session engaging?
- How satisfied are you with the training? (1-10 scale)
- Did you find your level of participation in the training sufficient?
- How relevant did you find the information shared during the training? (1-10 scale)
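If you collect the 1-10 scale responses above electronically, a small script can summarize them for you. The sketch below is illustrative only: the question labels and scores are made-up placeholder data, and it simply reports each question’s average rating and the share of "favorable" ratings (here assumed to mean 8 or higher).

```python
# Hypothetical sketch: summarizing 1-10 scale "Reaction" survey responses.
# Question labels and scores are placeholder data, not real survey results.
from statistics import mean

responses = {
    "satisfaction": [8, 9, 7, 10, 6],
    "relevance": [9, 7, 8, 8, 10],
}

def summarize(scores):
    """Return the average rating and the share of ratings of 8 or higher."""
    avg = mean(scores)
    favorable = sum(1 for s in scores if s >= 8) / len(scores)
    return round(avg, 1), round(favorable, 2)

for question, scores in responses.items():
    avg, favorable = summarize(scores)
    print(f"{question}: average {avg}, {favorable:.0%} rated 8 or higher")
```

The same summary works for any of the scaled questions in the later levels; only the labels and the threshold for "favorable" need to change.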
Level 2: Learning
At this level, you want to understand if the audience learned the information you presented in your training. According to Kirkpatrick Partners, at this level you specifically want to know “the degree to which participants acquire the intended knowledge, skills, attitude, confidence and commitment based on their participation in the training”.
Questions and statements to put to trainees include:
- I am now able to do X (a skill you trained on).
- How comfortable are you doing X (a skill you trained on)? (1-10 scale)
- I am committed to trying X (a skill you trained on).
Level 3: Behavior
At this level, you want to determine if the audience applies the knowledge they learned. According to Kirkpatrick Partners, at this level you specifically want to know “the degree to which participants apply what they learned during training when they are back on the job.”
These questions may need to be asked a week or two after the training session. Questions to ask trainees include:
- Did you put X (a skill you trained on) to use?
- Have you taught X (a skill you trained on) to any of your colleagues?
Level 4: Results
At this level, you want to determine if the learning objectives for your training were met. According to Kirkpatrick Partners, at this level you specifically want to know “the degree to which targeted outcomes occur as a result of the training and the support and accountability package”.
A question to ask trainees is:
- I am able to [the objective you identified before the training].
For example:
- I am able to use interlibrary loan.
- I am able to request a resource from the library.
- I am able to locate information I need for my job.
*For tips on writing learning objectives please see my post from earlier this year titled Setting Learning Goals in Special Libraries.
While evaluation is important for all training, I particularly encourage you to incorporate an evaluation plan if you have recently switched to online training sessions. Evaluation will help you refine your online training so it is always relevant and applicable to your organization.

Lauren Hays
Lauren Hays, PhD, is an Assistant Professor of Instructional Technology at the University of Central Missouri. Please read her other posts about skills for special librarians. And take a look at Lucidea’s powerful integrated library systems, SydneyEnterprise, and GeniePlus, used daily by innovative special librarians in libraries of all sizes and budgets.