In the May/June issue, we examined ways that Kirkpatrick's Level 1 evaluations, also called reaction evaluations, could be enhanced by changing the point of reference from that of the nursing professional development (NPD) practitioner to that of the learner. In that column, we also introduced the concept of calibrated questions and how integrating one or two calibrated questions into Level 1 evaluations can greatly encourage honest feedback from our learners.
This column will continue the journey through Kirkpatrick's Level 2 (learning) and Level 3 (behavior change) evaluations. Again, we will focus on ways to frame the evaluation questions to accurately assess the level and degree of knowledge, attitude, and behavior changes in our learners as a result of the educational session offered (Kirkpatrick & Kirkpatrick, 2005). With that in mind, let us begin.
CURRENT PRACTICE OF LEVEL 2 (LEARNING) EVALUATIONS
Have you wondered how the typical learner gauges the extent of their own learning during an educational session? Most of the time, it is based on their "feeling" of what they sensed they learned. As NPD practitioners, we rely on pretests and posttests to document Level 2 evaluations and to help us assess the efficacy of the session. Here is the issue: How effective are pretests and posttests as a barometer of the learner's learning? Do these scores mean anything to learners, or do they view them as merely an exercise to satisfy the class completion requirements?
OPPORTUNITIES FOR LEVEL 2 EVALUATIONS FROM THE LEARNER'S POINT OF VIEW
Let us consider opportunities to improve upon the current Level 2 evaluation goals. For those of us using pretests and posttests for Level 2 evaluations, consider adding calibrated questions to posttests. These calibrated questions can enhance the learner's experience as they focus on their learning (not the NPD practitioner's view of their learning), which is what Level 2 evaluations seek to assess. Here are several examples to consider adding to posttest questionnaires:
1. My learning of topic "X" was facilitated by the instructor performing "Y."
2. Most helpful to my learning were opportunities to do "X."
3. My learning could have been improved by "X."
4. My most valuable learning in this educational session was "X" because of "Y."
Again, the goal of adding these calibrated questions to the current posttest format is to engage the learners and elicit their honest opinions and perceptions of the educational session to continuously improve our educational offerings.
CURRENT PRACTICE OF LEVEL 3 (BEHAVIOR CHANGE) EVALUATIONS
Kirkpatrick's Level 3 (behavior change) evaluations and the assessment of the actual behavioral change pose another difficult challenge. This time, it involves both the learner and the evaluator. Why? There are many reasons for this, but most obviously, it is because behavioral changes occur over time. Furthermore, these changes may be hoped for but truly cannot be anticipated or predetermined. Generally, behavioral changes can fluctuate widely at first and then even out over time to produce a consistent result.
Let us be honest. How many of us have left an inspiring educational session on effective communication, mindfulness, leadership, teamwork, and/or resilience convinced that we will put what we learned into practice? How many of us on the postsession survey would "strongly agree" that the educational session would lead to a behavior change in our workplace? Chances are, these initial results would be encouraging to the facilitators of the session.
In reality, if someone could track our actual performance during the ensuing 3 months, how different might our actual performance be from the way we answered the initial questionnaire? Is it possible that the actionable performance might differ widely from that questionnaire response? Why might this be? Our experience from the field is that learners do not always have the necessary resources to implement behavior change. Other priorities often supersede behavior changes, making them difficult to implement. Even though learners might have every intention of changing their behavior, external forces might prevent that from occurring.
Level 3 evaluation questions generally seek to measure the efficacy of the intervention, the NPD practitioner, and the educational session and how they impact behavior change. Rarely do Level 3 evaluation questions seek to address or identify potential barriers that learners encounter while trying to implement change in their workplace.
OPPORTUNITIES FOR LEVEL 3 EVALUATIONS FROM THE LEARNER'S POINT OF VIEW
Here are several calibrated questions that might enhance Level 3 evaluations:
1. The areas where I may encounter or have encountered barriers to applying what I learned are "X."
2. The resources that might help or did help me apply what I have learned are "X."
3. The three specific behaviors that I learned and will be able to apply or have applied are "X."
NPD SCOPE AND STANDARDS-EVALUATION
The NPD Scope and Standards of Practice Standard 6 delineates competencies regarding evaluation. One competency that both NPD generalists and specialists have is to "involve learners and stakeholders in the evaluation process" (Harper & Maloney, 2016). Though this can be accomplished in a variety of ways, reframing evaluations to focus on the learner's point of view rather than the facilitator's point of view elevates the implementation of this competency.
EVALUATION OF SIMULATION EDUCATION
Simulation education, as with all other educational modalities, needs to be appropriately evaluated with the same attention given to the learner's point of view. Simulation sessions can offer unique opportunities for Level 2 and Level 3 evaluations, particularly because there is dedicated discussion time during prebriefing and debriefing to engage learners in a more robust manner than can be done during traditional educational sessions. These dedicated discussion times can lend themselves well to verbally asking calibrated questions about learning and behavior change and allow opportunity for follow-up questions and further probing of the learner's point of view.
NEXT STEPS-LEVEL 4 EVALUATIONS
Our final column in this series will explore how NPD practitioners can optimize Level 4 evaluations by looking at organizational impact through different lenses. How have you enhanced your evaluations and evaluation processes? How have you "involved learners and stakeholders in the evaluation process?" Moreover, how has this involvement been, or not been, meaningful to them? What lessons have you learned? E-mail us at [email protected] and [email protected] to continue this exploration.
References