Course evaluation is an essential element in nursing program evaluation and contributes to the process of judging the worth or value of an educational program (Oermann & Gaberson, 2017). Course evaluation permits faculty to assess multiple aspects of course design and implementation, including the appropriateness of course learning outcomes (CLOs), the suitability of assessment methods, consistency between CLOs and learning activities, student achievement of CLOs, and course redundancies and deficiencies (Iwasiw & Goldenberg, 2015).
This author has developed the Course Evaluation Summary (CES), a useful tool for guiding and documenting nursing course evaluation. As a data collection and analysis tool, the CES has the potential to address the scarcity of research evidence on course evaluation in nursing education reported by Cannon and Boswell (2012). Prior to developing this tool, the author conducted a literature search and found no similar tool for course evaluation.
The CES is used each semester to evaluate nursing courses in associate degree, baccalaureate, and master's entry-level nursing programs in Arizona. In recent successful nursing program reaccreditations, the CES provided the clear evidence of ongoing curriculum evaluation and improvement that accreditors wished to see.
INITIATING THE COURSE EVALUATION SUMMARY
The CES was developed as a means to evaluate the extent to which course design and implementation support student achievement of CLOs as specified in the course syllabus. To begin course evaluation using the CES, the faculty inserts the course name and number, faculty name(s), and semester date at the top of the form. (See Table 1 as Supplemental Digital Content 1, available at http://links.lww.com/NEP/A82, for an example, somewhat abbreviated for the purposes of this article.) In subsequent semesters, the course name/number will not change, but the date will change and faculty names may change.
The second row contains a numbered list of all CLOs. A copy-and-paste action from the syllabus is a simple shortcut for completing this section. Under the section with CLOs are five columns. The first column lists course assignments and assessments under the heading "Assignments & Assessments." Entries include everything that is used in the course to evaluate students: exams, quizzes, skills tests, clinical evaluations, written assignments, poster presentations, portfolios, and others.
The second column requires faculty to specify which CLOs are measured by each assignment and assessment (the number of the CLO is used). In nursing education, multiple approaches are available to assess whether students are learning course content (Cannon & Boswell, 2012). Although there is no rule for how many times a CLO should be measured in a course, each CLO should be measured at least once. In cases where multiple assessments are linked to the same CLO, indicating redundancy in measurement methods, it may be possible to eliminate one assessment (or more) without jeopardizing the measurement of student achievement. Faculty may also find that one CLO (or more) is not measured at all, indicating a gap in measurement and the need to add an assessment method.
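For faculty who maintain the CES in a spreadsheet or wish to automate the review, the gap and redundancy checks described above amount to a simple cross-tabulation of assessments against CLO numbers. The following Python sketch is illustrative only; the course data, assessment names, and CLO numbers are hypothetical, and the CES itself is completed as a form:

```python
# Hypothetical mapping of each assessment to the CLO numbers it measures,
# as entered in the first two CES columns.
assessment_clos = {
    "Exams (4)": [1, 2, 3],
    "Weekly clinical write-ups": [2, 3, 4],
    "Final written assignment": [2, 3, 4],  # overlaps the weekly write-ups
    "Skills test": [5],
}

course_clos = [1, 2, 3, 4, 5, 6]  # numbered CLOs from the syllabus

# A CLO measured by no assessment is a gap in measurement.
measured = {clo for clos in assessment_clos.values() for clo in clos}
gaps = [clo for clo in course_clos if clo not in measured]

# A CLO measured by several assessments may signal redundancy worth reviewing.
redundant = {
    clo: [name for name, clos in assessment_clos.items() if clo in clos]
    for clo in measured
    if sum(clo in clos for clos in assessment_clos.values()) > 1
}

print("Unmeasured CLOs:", gaps)            # a new assessment method is needed
print("Possibly redundant CLOs:", redundant)
```

In this hypothetical course, CLO 6 is never measured (a gap), while CLOs 2-4 are measured by both the weekly write-ups and the final written assignment, mirroring the redundancy the author describes eliminating in the fundamentals course below.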
In the column "Target Class Outcome," faculty specify, for each assignment/assessment, the target outcome (or goal) for the class, that is, what percentage of the class is expected to pass the assessment (or achieve a certain score on the assessment). Determination of the target outcome for each assessment is based on faculty judgment.
The above steps complete the start-of-semester preparation for course evaluation using the CES (indicated in the sample CES by shading). Most of the shaded information is copied into subsequent semester evaluations, with updates as needed (e.g., if faculty alter a percentage for a target outcome). As an example in practice, this author applied the CES upon assuming leadership of a nursing fundamentals course. During the start-of-semester CES preparation, the author identified that a certain subset of CLOs was measured every week in students' clinical written assignments and measured again at the end of the semester in a large written assignment. This large assignment provided no additional CLO assessment data beyond the weekly written work; it was simply more work. The author, with co-faculty support, eliminated this assignment from the course. The author also noted that one CLO was not measured at all; once this gap was identified, faculty developed a new, meaningful assignment to measure this CLO.
Other insights can be derived from the first application of the CES. For example, faculty may become aware that the wording of a CLO makes measurement difficult, indicating the need to rewrite it. Bloom's Taxonomy, considered the gold standard for writing learning outcomes, is an excellent resource for selecting wording that will make CLOs measurable (Cannon & Boswell, 2012).
COMPLETING THE CES AT THE END OF THE SEMESTER
The remainder of the CES is completed at the end of the semester. Data (grades) are collected for all assessments for all students and entered as group (full-class) data in the column "Actual Class Outcome." In the example, a target goal of 85 percent was set for passing the four NRS 101 Exams, but the actual percentage of students passing all exams was 81 percent.
When target outcomes are not achieved, there is no reason to assume that something is wrong with the course; CLOs may go unmet for many reasons. In the example, students withdrew from the course early; in other instances, a new exam format may prove troublesome. The CES allows for entering clarifying data and can help faculty investigate issues when student achievement is lower than expected.
In the final column, "Action Plan/Comments," faculty comment on whether target outcomes were met and generate an action plan as needed. When a target outcome is met, a comment such as "Goal met; will continue with current plan" is appropriate. When there is a mismatch between the target and actual outcomes, a simple explanatory comment or a minor course revision may be sufficient. When a significant problem in the achievement of a target outcome is noted, an action plan for a larger course change is likely needed.
The last rows of the CES provide areas for "Course Review," free-text documentation of factors relevant to the course during the semester. The first row, "Planned and Implemented Course Updates/Changes," contains content brought forward from the previous semester's course evaluation: changes that faculty planned to implement during the current semester. Space for other faculty-generated course changes made during the semester is provided in the "Additional Changes" section.
The section "Administrative/Other Factors Resulting in Course Changes" allows faculty to document course changes that were not faculty-generated but originated from some other source (e.g., administrative decisions, new clinical agency constraints). This section elicits information that can be useful over time to document why changes were implemented. Such information is often lost when faculty or administrators are no longer available to tell the story.
Students often have concerns about courses, and these are documented in the "Student Concerns" section. Evidence of faculty responsiveness to student concerns is good information to document for program quality and accreditation purposes (Keating, 2015). In the example, a student concern regarding "difficulties with accessing and using electronic MAR" is addressed in the plan to streamline the electronic charting requirements.
In the last row, "Proposed Changes for the Next Semester," faculty list changes that are planned based on data collected and reviewed in the rows and columns above. This list permits easy tracking of course revisions semester to semester. In the example, the proposed change, to add an NCLEX-RN exam review book, is derived from both student concerns and the low actual class outcome (48 percent) for the standardized fundamentals exam.
When the CES is used in subsequent semesters, the list of changes in the row "Proposed Changes for the Next Semester" is copied and pasted into the section "Planned and Implemented Course Updates/Changes" at the top of the next semester's "Course Review" section. This duplication of content from the end of one semester to the next provides a clear trail of evidence-based course improvement over time. Included in a self-study report, the course evaluation process documented in the CES can be helpful in nursing program accreditation efforts.
IMPLICATIONS FOR NURSING EDUCATION
The CES is an enlightening and useful tool developed to assess the extent to which CLOs are met within a course. The CES assists faculty in identifying strengths and potential areas of concern in students' achievement of learning outcomes within a course. It also guides faculty to develop an action plan for course improvement when appropriate. The CES prompts faculty to document course changes made in the semester under evaluation, including faculty-planned, administrative, and other changes. Student concerns are documented, and, looking forward, proposed changes for the next semester are listed. Used every semester, the CES provides a history of continuous course evaluation and quality improvement. This author encourages nursing faculty to apply the CES to their courses and gives permission for faculty to revise this tool in any way that makes it more functional for specific academic settings.
REFERENCES