Access to clinical sites and preceptors is a major concern for advanced practice nursing programs (American Association of Colleges of Nursing [AACN], 2015). The AACN recommends that "the use of simulation to replace more traditional clinical experiences … be explored" (p. 33). Greater control over the simulation learning environment can also allow for more standardization and consistency in assessing student competence (Hodges et al., 2019). However, there is limited evidence about student learning outcomes from simulation at the advanced practice level (Hodges et al., 2019; Rutherford-Hemming et al., 2016).
BACKGROUND
With the movement toward competency-based education, leaders in nurse practitioner (NP) education have called for more discussion and evidence on the use of simulation as a method of evaluating clinical competencies (Jeffries et al., 2019). Currently, there is no standardized approach for evaluating student competency in nursing education. Competency-based education focuses on ensuring that all students have attained a level of mastery sufficient for clinical practice (Hodges et al., 2019). Through simulation-based experiences, proficiency in multiple competencies can be assessed simultaneously. Simulation thus has the potential to assess students effectively and prepare NPs to care for patients with increasingly complex health problems in an ever-changing health care system.
To strengthen the evidence about advanced practice student learning through simulation, faculty should use best practices to develop, assess, and report simulation experiences consistently and reliably. The International Nursing Association for Clinical Simulation and Learning (INACSL) has provided leadership in this arena. INACSL began developing standards in 2010 based on the existing literature and input from organization members; a standing committee has been formed to review and update these standards regularly and to gather input from members and external reviewers (INACSL Standards Committee, 2016). Few examples in the literature apply the INACSL standards, and most focus on the older 2013 edition (Barber & Schuessler, 2018; Schram & Mudd, 2015).
An INACSL standard of best practice for simulation describes participant evaluation, which may be formative or summative (INACSL Standards Committee, 2016). The evaluation of clinical competencies poses many challenges, including the lack of a common language for what demonstrates proficiency in knowledge, skills, and abilities (Hodges et al., 2019). The INACSL participant evaluation standard recommends that formative evaluation include ongoing feedback that identifies gaps in knowledge and skills and facilitates learning. Feedback plays an integral role in education and professional development. Formative feedback provides an assessment of student performance in relation to program objectives. When effective, feedback can enhance students' commitment to learning, development of self-efficacy, and willingness to continue working on the task or competency being mastered (see Miles, 2018, which focuses on undergraduate education; Schartel, 2012).
Sustainable, effective, and evidence-based models are needed to measure competencies and provide meaningful feedback to NP students. We describe the implementation of a standardized patient (SP) experience using INACSL Standards of Best Practice: SimulationSM criteria (INACSL Standards Committee, 2016), methods of providing formative feedback to students, and the development of simulation evaluation of clinical competencies.
SIMULATION-BASED EXPERIENCE
This simulation-based experience (SBE) was implemented for 14 students who had completed their chronic disease management didactic course and were beginning the second of three practicum experiences. The goal of the SBE was to stimulate critical thinking, provide formative feedback, and assess students' competency in the independent practice competencies domain of the National Organization of Nurse Practitioner Faculties (NONPF, 2017). To best meet the overall goal and objectives, simulated clinical immersion using trained standardized patients (SPs) was selected as the primary modality. To comply with INACSL standards for the design and implementation of an SP experience, all students underwent a prebriefing session. Facilitators reviewed the purpose of the SP experience, applicable course objectives, expectations, the simulation environment, the fiction contract, student roles, and evaluative methods. Measurable objectives in a competency-based evaluation tool for the SP experience were developed based on overall course objectives, NONPF competencies, and the AACN (2011) Essentials of Master's Education in Nursing.
The scripted scenario was developed by graduate faculty based on clinical experience. SP actors were selected and trained by the simulation staff. One week prior to the simulation, students were given a list of six chronic conditions commonly seen in primary care so they could review guidelines and prepare for the experience. The SP presented for a six-month follow-up appointment. Students were provided with a face sheet documenting the chief complaint; past medical history; medication list; and current, relevant laboratory results. Students were given 15 minutes to obtain the history and perform a physical examination and then five minutes to review the laboratory data, develop a problem list, and determine appropriate interventions to manage the patient's chronic conditions and preventive health maintenance needs based on age, gender, ethnicity, and risk factors. Students were permitted to use resources they typically had available in their practicum settings (e.g., mobile device with medical apps, clinical guidelines, textbook). The SBE concluded with the student giving a concise, three- to five-minute oral presentation to faculty. Faculty were present in the room without intervening in the case. When the time limit was reached, students received immediate, individualized formative feedback from faculty.
Three feedback measures were used: a facilitator evaluation tool, an SP feedback survey, and a group debriefing session. The facilitator evaluation tools were developed by faculty and used to guide formative bedside feedback and determine achievement of the SBE objectives. These tools were informed by evidence-based practice guidelines and reviewed by faculty who currently practice in primary care settings. SPs provided students with feedback regarding their interpersonal skills and communication styles; they completed the evaluation after each visit, sharing feedback with individual students prior to the debriefing. Because of time constraints, the structured group debriefing took place one week following the SBE; it was guided by the debriefing with good judgment approach and led by a facilitator formally trained in simulation debriefing (Maestre & Rudolph, 2015).
DISCUSSION
To gather data on the perceived effectiveness of the simulation and help improve future simulation experiences, students completed a brief evaluation immediately following the SBE. The responses revealed that students felt the simulation bridged the gap between classroom learning and clinical application, making the initial day in practicum less intimidating. Students felt they benefited from reviewing current guidelines on the most commonly seen chronic conditions and expressed greater confidence in managing chronic conditions prior to entering their practicum sites. Students valued the SP experience, noting "the more hands on, the better." Feedback from SPs and faculty enabled students to identify the areas of the encounter they performed well and the information they needed to review prior to entering clinical. The process of developing the SBE resulted in a competency-based evaluation tool that can be modified for future experiences.
Several lessons were learned from this SBE. First, group debriefing should take place as soon as possible, while recollection of the experience is still fresh. Second, although this SBE culminated in formative rather than summative feedback and learning, students reported high levels of anxiety that they felt affected their critical thinking abilities. A structured practice session the week before the SBE would help reduce anxiety by giving students more familiarity with the process. Students also requested the face sheet with patient information, along with a few minutes of preparation time, immediately prior to interacting with the patient. Finally, the SPs asked for more training on how to give effective feedback.
CONCLUSION
This simulation-based experience gave NP students a valuable opportunity to develop confidence in independent decision-making with a complex primary care patient in a safe environment. Students received timely, multimodal feedback from faculty and SPs and reviewed current practice guidelines for chronic conditions commonly seen in primary care prior to entering the second practicum experience. As changes in NP education based on the new Essentials are developed, reliable competency-based SP evaluation tools are needed. This project contributes to the simulation literature by providing an example of an SBE with feedback that uses competency-based education practices to develop advanced practice nurses.
REFERENCES