Theme 1: Instructional Design
Hands-on Evaluation of Learning
Course: EAC 584 - Evaluating Training Transfer and Effectiveness
This narrative discusses a team project focused on evaluating a learning event; deliverables included a reflection paper and a presentation. My employment as a curriculum designer and trainer for the North Carolina Electronic Disease Surveillance System (NC EDSS) training program enabled me to obtain approval from the necessary stakeholders to modify existing evaluation instruments and collect data. This was an exciting opportunity for me because I knew firsthand that the training's evaluation techniques needed improvement.
We began by reviewing the existing goals, objectives, and outcomes of the two-day NC EDSS training and decided to focus our evaluation efforts on select objectives from the first day. Our understanding of the Kirkpatrick model and evaluation principles guided us in determining what types of instruments to develop, how to design effective Level 1 (reaction) and Level 2 (learning) evaluation instruments and procedures, and how much data to collect. We revised an existing Level 1 instrument and created a new Level 2 instrument.
Our next step was to attend a training session and use the new instruments to evaluate trainee performance. Several trainees commented that they appreciated having a way to check their knowledge at the end of the first day, before moving on to the second day. The trainer was also able to address foundational areas where trainees struggled before introducing more advanced content, which helped prevent trainees from losing interest due to gaps in understanding.
Having the opportunity to work on this real-life project while taking a course on evaluating training transfer and effectiveness allowed us to apply new knowledge and theory to practice immediately. After analyzing the results from the new evaluation instruments, our team recommended revising the training approach for one learning objective and continuing to use the new instruments at future NC EDSS trainings. Several years later, an NC EDSS training program stakeholder thanked me for creating the improved evaluation instruments and informed me that they are still in use.
Click here to view Evaluation of Learning Reflection Paper
Click here to view Evaluation of Learning Presentation