Last week we welcomed Lynn Wahl, Instructional Designer for UNC Charlotte, to discuss how they used design thinking to create online faculty development workshops. In a continuation of the topic, Lynn joins us today to talk about how to assess a learning opportunity such as the online workshops to ensure they continue to engage attendees and meet their needs.
Thank you very much to Lynn for showcasing the creative endeavors and assessment practices going on at UNC Charlotte!
Enjoy the read and enjoy your day,
Lindsey Downs, WCET
A major focus in planning and developing training is ensuring that it fulfills the needs of its intended audience. This is particularly important for online workshops, which often require a higher facilitation effort that you don’t want to see wasted. That places a heavy emphasis on iterative revision of workshops to make sure everything is working well.
One method commonly used to determine where to make changes to training is the end-of-workshop survey; but in online workshops, surveys alone are not enough to show the complete picture of how participants engage with content, assessments, and the facilitator.
Learning analytics tracked within a learning management system can provide the rest of the picture developers need to make meaningful revisions to training.
The Center for Teaching & Learning at UNC Charlotte began delivering online pedagogy workshops in Spring 2017. The first online workshop on writing learning objectives was created using a design thinking methodology. We discussed the development of our online faculty workshops in a post last week on Frontiers. After great participant feedback, two more workshops were transformed to an online offering.
All three of our online workshops utilize different facilitation models matched to the topics and the types of interactions most helpful for participants to learn the required skills:
| Workshop | Facilitation model | Interaction types and skills |
| --- | --- | --- |
| Introduction to Learning Objectives and Backward Design | Facilitated | Contains auto- and manually graded self-checks, discussion, and an application-based assignment. Helps faculty practice a difficult skill and get feedback. |
| Syllabus 101 | Resource-based | Contains optional auto-graded self-checks and a culminating discussion forum. Provides valuable “just-in-time” information. |
| Using Feedback to Improve Teaching and Learning | Discussion-based | Asks faculty to share their experiences in overcoming feedback challenges in their courses. |
Gathering and analyzing data from online workshops can be a little overwhelming, but even simple learner analytics like page views and discussion forum mapping can give you the extra information you need to improve participant experience. This type of information can be pulled from your learning management system (LMS).
For example, we use Canvas to house our faculty workshops. A free Canvas workshop created by Dartmouth College gave us a starting point for pulling data out of Canvas. We were able to pull information out of our institution’s instance of Canvas and use pre-built analytics tools to explore and map the data. This data should be available to your Canvas administrators who may be able to assist you with this process.
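Once page-view data has been exported from the LMS, even a small script can surface patterns like peak working time. The sketch below is a minimal illustration, not the tooling described above: it assumes a hypothetical CSV export with a `timestamp` column in ISO 8601 format and tallies views by hour of day.

```python
import csv
import io
from collections import Counter
from datetime import datetime

def peak_hours(page_view_rows):
    """Tally page views by hour of day; return (hour, count) pairs, busiest first."""
    counts = Counter()
    for row in page_view_rows:
        # Timestamps assumed to be ISO 8601, e.g. "2018-02-07T09:42:00"
        ts = datetime.fromisoformat(row["timestamp"])
        counts[ts.hour] += 1
    return counts.most_common()

# Toy export (hypothetical column names): three morning views, one late-evening view
sample_csv = """timestamp,participant,page
2018-02-07T09:42:00,a,learning-objectives-intro
2018-02-07T09:55:00,b,learning-objectives-intro
2018-02-07T10:15:00,c,self-check-quiz
2018-02-07T22:30:00,a,discussion-forum
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))
print(peak_hours(rows))  # → [(9, 2), (10, 1), (22, 1)]
```

The same hour-of-day tally scales to thousands of rows, which is how a finding like “peak working time between 9am and 11am” can be read straight off the counts.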
Once you’ve retrieved data, often the more difficult task is making sense out of it and answering the “so what?” question.
In looking through the huge amount of information provided through page views across multiple sections of the three workshops, we noticed a few things:
- Participant working time is between 8am and 11pm, with peak working time between 9am and 11am.
- Participants spend significantly less time than allotted in the facilitated learning objectives workshop (the first workshop in the table above).
- There are clear divisions between the top page views in the workshop and the less visited resources.
- Facilitation and engagement with instructors in the discussion-based workshop has a big impact on how participants interact in the forums.
- Survey results and participant feedback don’t always match up with how participants engage with the workshops.
- Participants frequently return to the content (both resources and discussion posts) in the resource-based workshop, infrequently to the facilitated workshop, and not at all to the discussion-based workshop.
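Findings like the “clear divisions between the top page views and the less visited resources” come from ranking pages by view count. As a minimal sketch, assuming page-view records have been reduced to a list of page names (one entry per view), the top and bottom of the ranking can be pulled like this:

```python
from collections import Counter

def rank_pages(page_views, top_n=3):
    """Rank pages by view count; return the most- and least-visited pages."""
    ranked = Counter(page_views).most_common()
    return ranked[:top_n], ranked[-top_n:]

# Hypothetical page names, one list entry per recorded view
views = ["intro", "intro", "intro", "objectives", "objectives",
         "rubric-examples", "optional-reading"]

top, bottom = rank_pages(views, top_n=2)
print(top)     # → [('intro', 3), ('objectives', 2)]
print(bottom)  # → [('rubric-examples', 1), ('optional-reading', 1)]
```

The top of the ranking points to content worth expanding into new workshops and resources; the bottom flags pages that are candidates for removal.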
Based on the data, the best strategy was to design for the typical participant experience and accommodate outliers when necessary.
Under this guiding thought, we are now implementing the following revisions in the online workshops:
- Update the stated completion time for each workshop to more accurately reflect the workload.
- Reply to questions, and schedule announcements and new content releases, before 9am each day.
- Create additional workshops and resources around the top page views in all of the workshops.
- Remove the bottom viewed pages in all of the workshops.
- Add more media and examples.
- Remove some of the self-check multiple choice quizzes.
Once the changes are in place for a few semesters, we can rerun the analytics to see how our changes affect participant experiences.
How can you use analytics to help make your events or courses better? Once you have the data, you are empowered to make meaningful updates to these learning experiences and make them better for your attendees.
- Excel Pivot Tables: https://support.office.com/en-us/article/create-a-pivottable-to-analyze-worksheet-data-a9a84538-bfe9-40a9-a8e9-f99134456576
- Canvas Analytics Course: https://learn.canvas.net/courses/1176
UNC Charlotte Center for Teaching and Learning