Jonas Köster recently produced a beautiful and research-rich text entitled Video in the Age of Digital Learning. For those of us in education and developing instructional media, we already know what Köster lays out on the first page—“recent studies overwhelmingly predict the continual rise in the use of instructional video” (xv). Here’s why: “digital video is an extremely powerful method to tell stories, explain complex issues through engaging visuals, offer the learner the ability to work at their own pace, and . . . [it’s] the most efficient and effective method for bringing a teacher and learners together at an incredible scale” (xv).
This shift in teaching and learning requires more than just a camera and an eager instructor, however. For example, student attention spans have shortened to only about 8 seconds, and making a video engaging “requires a thorough examination of the medium to find the best ways to make it as useful as possible” (xvii). Rather than regurgitating the entire text, I’ll outline a few aspects of Köster’s book that stood out most.
It’s important for all of us to get feedback and the timeliness of feedback matters too. Remember how it felt when you submitted something to your doctoral thesis committee to review and they took FOREVER to get back to you? Or when you posted that picture on Facebook and the folks you thought would love it didn’t even give it a like let alone a comment? Timely feedback to students is useful to their learning and could be that thing that helps them feel like they belong at Carleton.
When designing or revising your course, one way to situate the types of feedback you’ll give is by using the classic Backward Design model by Wiggins and McTighe. Specifically, it can be helpful to use their diagram for setting curricular priorities into alignment with the types of assessment you might use. We can imagine that quizzing might best align with the concepts or outcomes that are important for students to know or to have facility with in order to wrestle with the BIG ideas or “enduring understanding” of a course.
Quizzing, and particularly multiple choice quizzing done outside the classroom (such as implemented via Moodle, auto-graded, and reported to the gradebook), can make frequent, meaningful feedback for students not only possible but efficient.
Frequent low stakes “testing” (i.e. the need to retrieve information whether in a quiz or otherwise) promotes learning (Roediger and Butler 2011). Moreover, frequent quizzing, besides promoting memory, increases the likelihood of transfer (Carpenter 2012).
You can also give feedback on these quizzes. The same Roediger and Butler, this time in 2008, showed that while multiple-choice questions improve student performance, feedback to students on their answers provides additional benefit. And if that feedback explains why an answer is wrong, the transfer effect is stronger than with simple right/wrong feedback (Moreno and Mayer 2005). Crafting feedback is decidedly not efficient, though! But it may still be worth your effort in terms of student learning, and if you reuse the quizzes, your time investment will pay off. Moodle can help here too by making it easy to add feedback specific to each of the possible choices students can make in the quiz. And if you’re teaching a course that uses a textbook, be aware that many textbooks provide banks of questions with answers and feedback, which can certainly lighten your load.
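If you do go the Moodle route, per-choice feedback can be written once and imported in bulk rather than typed into the quiz editor question by question. Here is a minimal sketch in Moodle’s GIFT import format; the question text, answer options, and feedback wording are invented for illustration (the syntax is standard GIFT: `=` marks the correct answer, `~` a distractor, and `#` attaches feedback to the choice it follows):

```
::Retrieval practice::Which study strategy best promotes long-term retention? {
  =Self-testing with practice questions #Correct. Retrieving information strengthens memory more than restudying it.
  ~Rereading the chapter #Rereading feels productive, but it produces weaker retention than actively retrieving the material.
  ~Highlighting key passages #Highlighting alone does little for long-term retention because it involves no retrieval.
}
```

Saved as a plain-text file, a bank of questions like this can be brought in through Moodle’s question import tool, so each wrong answer carries its own explanatory feedback of the kind Moreno and Mayer found most effective.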
As always, AT is here to help you if you want to consider this pedagogical move. Please don’t hesitate to reach out to me (firstname.lastname@example.org) or any ATer if you have questions or concerns or would like to work with us!
Butler, A. C., and Roediger, H. L., III (2008). Feedback enhances the positive effects and reduces the negative effects of multiple-choice testing. Memory & Cognition 36, 604–616.
Carpenter, S. K. (2012). Testing enhances the transfer of learning. Current Directions in Psychological Science 21(5).
Moreno, R., and Mayer, R. E. (2005). Role of guidance, reflection, and interactivity in an agent-based multimedia game. Journal of Educational Psychology 97(1).
Roediger, H. L., III, and Butler, A. C. (2011). The critical role of retrieval practice in long-term retention. Trends in Cognitive Sciences 15, 20–27.
Wiggins, G. P., and McTighe, J. (2011). The Understanding by Design Guide to Creating High-Quality Units. Alexandria, VA: ASCD.
Effective instructional videos can vary in style. This short video, inspired by an Arizona State University study, reveals preferences and effectiveness in two different styles:
Should you teach to the camera/viewer or
Should you teach a student who is also on camera and film that interaction?
This video, featuring Dann Hurlbert, Carleton College’s Media & Design Guru, succinctly recaps a 2018 study from ASU’s Katelyn M. Cooper, Lu Ding, Michelle Stephens, Michelene T. H. Chi, and Sara E. Brownell.
I’m already excited to be a part of the team hosting this Instructional Video Workshop at Carleton in late July! Attendees will not only take away a concrete and replicable process for creating instructional videos, but they’ll create [at least] 3 Instructional Videos they can start using right away. The seats filled up so fast, there is no doubt we’ll be doing more of these in the future! More information on the workshop itself is available here. And if you’d like to be notified when we host another one, please complete this short form. — dann
Podcasts are digital audio files that can be streamed or downloaded.
Why use podcasts?
Podcasts are sometimes used in lieu of a paper assignment or to augment a paper assignment. A podcast is devoid of visual material (think NPR), and this can be useful for focusing student attention.
Instructional Videos come in all shapes and sizes, but there are some statistics you should pay attention to in order to maximize student learning. The quick overview is that you should keep your video short (under 6 minutes) and friendly, and connect it directly to an assessment. Here’s a link to an article from The American Society for Cell Biology that offers some more good insight on the topic:
A recent Science Post satirical article titled “I just know” replaces systematic reviews at the top of the evidence pyramid is a pretty funny read with a darker side.
While the article focuses on medical science (“There is no science backing up my claim that the homeopathic pill cured their cold, but in my gut I just know it did.”) it got me thinking about the teaching and learning work we do here at Carleton and our levels of evidence.
What evidence, beyond “I just know,” do we accept for what we have done in Carleton courses or for what we hoped to have done? If evidence more robust than “I just know” were available for our teaching and learning endeavors, would we want to gather it? What if it were easily available? Any instructional technology we use at Carleton can help with the collection of evidence. And with thoughtful design, that collection of evidence can be “easy” while still being meaningful.
Communicating arguments effectively through a visual medium has its own particular set of opportunities, challenges, and logics–and students often lack exposure and practice in these areas. In this session, we’ll showcase different approaches to design-rich assignments including tips for scaffolding, timing, and assessing student work.
Join Doug and Celeste for a fun hour of talking through visual arguments, possible assignments and assessments!
Video is an excellent way to communicate, but watching video is a passive activity. Since learning occurs best through engagement, finding a way to make your videos more engaging is essential. Here’s a great article from Penn State on Interactive Video Assessment Tools. Take a read, then challenge yourself to couple any video you show in class with an associated engaging activity.