class: left, bottom, title-slide .title[ # Module 7: Evaluation ] .subtitle[ ## EME5601: Introduction to Instructional Systems ] .author[ ### Dr. Bret Staudt Willet ] .date[ ### November 20, 2024 ] --- class: inverse, center, middle #
**View the slides:** [bretsw.com/eme5601-fs24-module7](https://bretsw.com/eme5601-fs24-module7) --- class: inverse, center, middle #
<br><br> Module 6 <br> Recap --- #
Design & Development Design & Development are **late steps** in the iterative design cycle: <img src="img/Rothwell-fig2-1.png" width="480px" style="display: block; margin: auto;" /> <div class="caption"> Figure 2.1 from Rothwell et al. (2016, p. 20) <br><br> </div> --- #
Design & Development <img src="img/design-book.jpg" width="480px" style="display: block; margin: auto;" /> --- class: inverse, center, middle #
<br><br> Module 7 <br> Evaluation --- #
Evaluation <img src="img/girl-leaf.png" width="100%" style="display: block; margin: auto;" /> --- #
Evaluation <img src="img/girl-elephant.png" width="100%" style="display: block; margin: auto;" /> --- #
Evaluation <img src="img/blind_monks_examining_an_elephant.jpg" width="600px" style="display: block; margin: auto;" /> **"Blind Monks Examining an Elephant" by Itcho Hanabusa (1652–1724).** Image is in the public domain, hosted by [Wikimedia Commons](https://commons.wikimedia.org/wiki/File:Blind_monks_examining_an_elephant.jpg) --- #
Evaluation <img src="img/blind_monks_examining_an_elephant.jpg" width="420px" style="display: block; margin: auto;" /> [Blind Men and An Elephant](https://en.wikipedia.org/wiki/Blind_men_and_an_elephant) > "The parable of the blind men and an elephant is a story of a group of blind men who have never come across an elephant before and who learn and imagine what the elephant is like by touching it. Each blind man feels a different part of the elephant's body, but only one part, such as the side or the tusk. They then describe the elephant based on their limited experience and their descriptions of the elephant are different from each other." --- #
Evaluation <img src="img/blind_monks_examining_an_elephant.jpg" width="600px" style="display: block; margin: auto;" /> ### Exercise:
Walk around and notice the elephants (the things that are tricky to evaluate) --- class: inverse, center, middle #
<br><br> Module 7 <br> Evaluation **Part One: Evaluation of Learning** --- class: inverse, center, middle #
<br><br> Designing Learning Assessments <br> (Ch. 13) --- #
Learning Assessments <img src="img/otter.png" width="360px" style="display: block; margin: auto;" /> -- ### Performance Objectives
Performance Metrics -- - **Why?** Accountability for showing results -- - **What?** Monitoring learner achievement --- #
Learning Assessments <img src="img/otter.png" width="360px" style="display: block; margin: auto;" /> ### Performance Objectives
Performance Metrics <hr> ###
Discuss in groups: What is the best learning assessment or evaluation you have experienced? Why? --- #
Learning Assessments <img src="img/teaching-to-the-test.png" width="360px" style="display: block; margin: auto;" /> ### Performance Metrics are benchmarks that guide ISD -- <hr> ###
Reflect: What about issues of "teaching to the test"? --- #
Learning Assessments <img src="img/girl-elephant.png" width="480px" style="display: block; margin: auto;" /> ### **What** should be measured? -- - **Kirkpatrick's Four Levels:** - Participant reaction (enjoyment) - Participant learning (meeting performance objectives) - Transfer (on-the-job performance change) - Organizational impact --- #
Learning Assessments <img src="img/girl-elephant.png" width="480px" style="display: block; margin: auto;" /> ### **What** sources of information for performance metrics? -- - Performance objectives -- - Learner/worker actual performance -- - Stakeholder preferences -- <hr> **Challenges** affecting performance metrics: pp. 218-220 --- #
Learning Assessments <img src="img/girl-elephant.png" width="480px" style="display: block; margin: auto;" /> ### **Factors** for successful performance metrics: -- - Learner involvement -- - Manager and stakeholder involvement -- - Time and cost factored -- - Relevant quantifiable data --- #
Learning Assessments <img src="img/girl-elephant.png" width="480px" style="display: block; margin: auto;" /> ### **Methods** for performance metrics: -- - Questionnaires, interview guides, observation forms, simulations -- - Criterion-referenced tests: Essay, fill-in-the-blank, completion, multiple-choice, true-false, matching -- - Other: Advisory committee, external assessment center, attitude survey, group discussion, exit interview --- #
Learning Assessments <img src="img/girl-leaf.png" width="560px" style="display: block; margin: auto;" /> ### **Trustworthiness** of performance metrics (pp. 228-229) -- - **Reliability:** consistency of measurement (inter-rater, test-retest, inter-method, internal consistency) -- - **Validity:** accuracy of measurement (construct, convergent, discriminant, content, representation, face, criterion, concurrent, predictive) -- - **Credibility:** belief in measurement (method, tool) --- class: inverse, center, middle #
<br><br> Module 7 <br> Evaluation **Part Two: Evaluation of Design** --- class: inverse, center, middle #
<br><br> Evaluating Instructional and Noninstructional Interventions <br> (Ch. 14) --- #
Evaluating Interventions ### Key Questions <img src="img/Rothwell-table14-1.png" width="560px" style="display: block; margin: auto;" /> <div class="caption"> Table 14.1 from Rothwell et al. (2016, p. 235) <br><br> </div> --- #
Evaluating Interventions ### Types of Data <img src="img/data.jpg" width="420px" style="display: block; margin: auto;" /> -- - Quantitative vs. Qualitative -- - Self-Report, Observational, Extant/Naturalistic -- - Small vs. Big (Human vs. Computational) --- #
Evaluating Interventions ### Methods of Data Collection <img src="img/data-collection-medium.jpg" width="420px" style="display: block; margin: auto;" /> -- - Interviews and Focus Groups -- - Observation -- - Surveys and Questionnaires -- - Tests -- - Data Mining and Analytics --- #
Evaluating Interventions ### **Formative** Evaluation of Design -- - Formative **Product** Evaluation -- - Formative **Process** Evaluation -- <hr> **Planning for formative evaluation:** -- - Expert reviews -- - Management or executive rehearsals -- - Individualized pretests and pilot tests -- - Group pretests and pilot tests --- #
Evaluating Interventions ### **Summative** Evaluation of Design -- **Kirkpatrick's Four Levels:** -- - **Level 1:** Learner satisfaction (did they like it?) -- - **Level 2:** Knowledge or skill acquisition (did they learn it?) -- - **Level 3:** Learner transfer (did they apply it on the job?) -- - **Level 4:** Organizational impact (did it make a difference?) --- #
Evaluating Interventions ### **Summative** Evaluation of Design **Phillips' (2011) ROI Model:** - **Level 1:** Learner satisfaction (did they like it?) - **Level 2:** Knowledge or skill acquisition (did they learn it?) - **Level 3:** Learner transfer (did they apply it on the job?) - **Level 4:** Organizational impact (did it make a difference?) -- - **Level 5:** Return-on-investment (financial impact of Level 4; see the worked example below)
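-- Phillips extends Kirkpatrick's framework by converting Level 4 results to monetary values and comparing net benefits to program costs. A minimal worked example with **hypothetical** figures (illustrative only, not drawn from the readings), assuming program costs of 50,000 and monetized benefits of 80,000 (in dollars):

$$\text{ROI (\%)} = \frac{\text{net program benefits}}{\text{program costs}} \times 100 = \frac{80{,}000 - 50{,}000}{50{,}000} \times 100 = 60\%$$

--- #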
Evaluating Interventions ### **Summative** Evaluation of Design **Brinkerhoff's (2010) Success Case Method (SCM):** -- Holistic and systemic approach to answer: -- - How well is an organization using learning to improve performance? -- - What organizational processes/resources are in place to support performance improvement? -- - What needs to be improved? -- - What organizational barriers stand in the way of performance improvement? --- #
Evaluating Interventions <img src="img/girl-elephant.png" width="720px" style="display: block; margin: auto;" /> ###
Discuss in groups: Develop a formative or summative approach to evaluating interventions... --- class: inverse, center, middle #
<br><br> Revising Instructional and Noninstructional Solutions Based on Data <br> (Ch. 15) --- #
Revising Interventions ### Stakeholder Support <img src="img/Rothwell-exhibit15-1a.png" width="48%" /><img src="img/Rothwell-exhibit15-1b.png" width="48%" /> <div class="caption"> Exhibit 15.1 from Rothwell et al. (2016, pp. 268-269) <br><br> </div> --- class: inverse, center, middle #
<br><br> Implementing Instructional and Noninstructional Interventions <br> (Ch. 16) --- #
Implementing Interventions ### Implementation Plan **Begin with the end in mind** <img src="img/Rothwell-exhibit16-1.png" width="600px" style="display: block; margin: auto;" /> <div class="caption"> Exhibit 16.1 from Rothwell et al. (2016, p. 273) <br><br> </div> --- #
Implementing Interventions ### Implementation Plan **Engage stakeholders:** (Table 16.1, pp. 279-280) -- - Subject matter experts -- - Technology experts -- - Instructional designers -- - Graphic designers -- - Facilitators -- - Communication specialists -- - Project managers -- - Business leaders --- #
Implementing Interventions ### Implementation Dissemination -- - Self-directed learning -- - Self-paced interventions -- - In-person interventions -- - Blended learning -- - Train-the-trainer -- - Monitoring quality (back to performance metrics) -- <hr> ### Implementation Diffusion -- Focus on the speed, depth, and quality of **adoption** --- class: inverse, center, middle #
<br><br> Expanding the <br> ISD Analysis Toolbox --- #
ISD Analysis Toolbox <img src="img/toolbench.jpg" width="280px" style="display: block; margin: auto;" /> - Systems Analysis - Performance Analysis - Needs Assessment / Needs Analysis - Training Requirements Analysis - Root Cause Analysis - Competency Assessment - Learner Assessment - Setting Analysis - Developmental Setting Assessment - Job Analysis - Task Analysis - Content (Subject Matter) Analysis --- #
ISD Analysis Toolbox <img src="img/toolbench.jpg" width="280px" style="display: block; margin: auto;" /> - Goal Analysis - Learning Task Analysis - Hierarchical Analysis - Cluster Analysis - Procedural Analysis - Materials Assessment / Materials Analysis - Cost-Benefit Analysis --- #
ISD Analysis Toolbox <img src="img/toolbench.jpg" width="280px" style="display: block; margin: auto;" /> - **Performance Metrics** - **Formative Product Evaluation** - **Formative Process Evaluation** - **Kirkpatrick's Four Levels Evaluation** - **ROI Five Levels Evaluation** - **Success Case Evaluation** --- class: inverse, center, middle #
<br><br> Course Evaluations --- class: inverse, center, middle #
<br><br> Looking ahead --- #
Semester Schedule <img src="img/across-time.jpg" width="480px" style="display: block; margin: auto;" /> - **Module 1:** Introduction to Instructional Systems Design - **Module 2:** Systems Analysis - **Module 3:** ISD & HPT - **Module 4:** Needs Assessment - **Module 5:** Work Analysis - **Module 6:** Design & Development - **Module 7: Evaluation** --- #
Major Assignments <img src="img/build.jpg" width="320px" style="display: block; margin: auto;" /> - **Assignments** (70%) - Module 2 Assignment: Systems Analysis paper (150 points) - Module 3 Assignment: Annotated Bibliography 1 (50 points) - Module 4 Assignment: Needs Assessment paper (150 points) - Module 5 Assignment: Annotated Bibliography 2 (50 points) - Module 6 Assignment: Work Analysis paper (150 points) - **Module 7 Assignment: ISD Process Model paper (150 points)** --- #
ISD Process Model <img src="img/build.jpg" width="320px" style="display: block; margin: auto;" /> -- **2,000 - 2,500 words** -- - Propose and explain modifications to an existing model, or propose a new model -- <hr> - **Setting:** Organizational Characteristics -- - **Problem Analysis:** Gaps in the Current Education/Training Process -- - **Recommendations:** Recommendations for a New or Revised Model -- - **Conclusion:** Discussion of Your Recommendations --- class: inverse, center, middle #
<br><br> Questions <hr> **What questions can I answer for you now?** **How can I support you this week?** <hr>
[bret.staudtwillet@fsu.edu](mailto:bret.staudtwillet@fsu.edu) |
[bretsw.com](https://bretsw.com) |
[GitHub](https://github.com/bretsw/)