The first chance I got, I shared with my colleague John Golden my enthusiasm for monitoring the MPG gauges and how they were affecting my driving. As I recall, he wondered aloud how the current MPG gauge worked. It was interesting to think about, but I did not spend much time on it. I trusted it. After all, I was using less gas.
As so often happens, the conversation morphed into us talking about teaching. I saw the gauges as a metaphor for assessing – gathering data about learning. I said something like, “Wouldn’t it be wonderful if our learners came with displays that would give us immediate feedback on their level of interest or understanding?” I do not remember where the conversation went, but the mental picture of my learners with engagement or understanding gauges prominently displayed stuck with me. While they are not gauges, I have found several approaches that serve as substitutes.
I do not remember where I got the idea for using green, yellow, and red cups as a way to monitor learners’ progress, but a presentation by my colleague Jacque Melin comes to mind. Learners use the cups to represent their current level of progress – green means go (“I have no questions”), yellow means caution (“I have questions but I’m still moving forward”), and red means stopped (“I have questions that have stalled my progress”). During activities, I watch (and listen) for the changing cups and know when I need to offer additional support.
Fisher and Frey (in Checking for Understanding) describe a method where learners hold up fingers, from zero to five, to communicate their current understanding of the topic being discussed. I occasionally use this approach when I want finer detail than the cups can offer. “Great, you are at green. But do you really understand what you are doing?” Using this assessment, I can gather information about whether I need to adjust the lesson to meet my learners’ needs.
I rarely use this final approach, a modification of Fisher and Frey’s, but I find it very powerful. Last semester, I used it in an introductory education course. The learners were not engaging with some of the content and I wanted to find out why. So one day I introduced an Engagement Monitoring Sheet. After each workshop, they reflected on their level of engagement during the different phases of the workshop. At the end of the class, they looked for any patterns in their engagement and wrote a reflection.
I collected their Engagement Monitoring Sheets and found two interesting things that changed the way I taught the course. First, many learners struggled to engage after taking a break. We talked about this, and I offered strategies they might use to re-engage. Throughout the remainder of the class, I noticed many learners using these strategies coming out of breaks. The second observation was that it was difficult for them to engage if they did not know what to do. In other words, if I was too vague, they disengaged. From then on, I concentrated on the clarity of my instructions and asked learners to repeat them to ensure that they had understood. When I altered my teaching, learners’ level of engagement improved.
Yes, I know that these approaches rely on the learners’ ability to self-assess their own understanding or engagement and then report it honestly. Learners may need help in calibrating their gauges and feeling safe sharing their assessments. But if I can trust my car, then I can trust my learners. Anyway, the proof is in the results. And I do like at the end of a trip when my car evaluates my driving as EXCELLENT!