Review Days?


I was having a discussion with a few of my colleagues today about our upcoming quarterly exams. They’re basically our midterms – we just give them a fancier name. We were talking about the number of review days (or class periods) necessary for students to prepare for these types of summative assessments.

It made me think. I don’t really “review” before big exams. I don’t even regularly review before a unit exam. I definitely have days when students aren’t learning new material. Maybe they’re reinforcing things they have learned previously. Or maybe it’s an extension of a previous lesson. Those aren’t the days I’m thinking of. I’m thinking of traditional review days that aim to refresh students’ minds before a significant assessment.

Whenever it’s necessary or if I feel the moment is right, sure, I’ll whip out a game to review or have the students speed date to catch each other up. I do these sorts of activities from time to time, but this is infrequent and rarely happens right before an exam.

So why don’t I review before exams?

I want raw, unfiltered data on my students’ understanding. By reviewing just before an exam, I’m giving my kids a mental cheat sheet to use on the day of the exam. Did they understand the content before we reviewed, or because we reviewed? My data is inherently skewed by the review. And if I test them on a concept we studied two months ago, I want data on their understanding of that concept without refreshing their memories. I want them to forget a step or two. For that matter, I don’t mind if they’ve completely forgotten how to start a problem. This is precisely the information that will drive my future planning, and those misconceptions are exactly the sorts of things that will help my students do better in the future.

But don’t students want review? I mean, they want to increase the odds of performing well on the assessment and therefore increase their overall grade in the class, right? I answer that question in two ways. First, there are plenty of instances where I don’t even count an exam towards their overall grade. Yes, I said that. I treat it as a diagnostic and the kids buy into it. I find out where they’re at and they get a (potentially poor) exam that doesn’t count against them. We all win. Second, students will adapt to the no-review-before-an-exam policy. They will meet our expectations. If they know I don’t review before exams, then over time they will prepare and study accordingly. And if at first they don’t, they may complain, but eventually they will come around.

So what to do with that extra time not spent reviewing? I spend the day or two (or three) after a big exam reteaching and reassessing what my students had trouble with. It just makes sense. Because I’ve looked at the data, I have a pretty good understanding of what they know and don’t know, and I can pinpoint how I reteach a particular set of concepts. Oftentimes, I even immediately reassess them on the concepts they struggled with. This almost always results in improvement, which only helps establish a growth mindset. It also helps them understand the method behind the madness of no review days.

I guess all this may count as “review after the test” and I’m good with that. Reviewing before an exam is overrated. Intuitively, as teachers, it makes sense to review before. But I think the more effective strategy is to do so after.

I’m really curious about what others believe about this. How do you incorporate review into your class structure?

bp

Quick Key

To help me collect data, I’ve been using a tool for the last couple of months. It’s called Quick Key, and it’s used to quickly and easily collect responses to multiple choice questions.

For a long, long time, my school utilized the Apperson Datalink scanner to aid in scoring multiple choice portions of exams. It not only scores exams quickly and efficiently, but its accompanying software provides insightful data analysis that I use to modify my teaching. On the downside, these machines are pricey (almost $1000) and require you to purchase their unique scanning sheets that work only with their machine. Each department in my school had a machine.

Because of my push towards standards-based grading, I find myself giving smaller, bite-size assessments that target fewer concepts. Consequently, I am assessing more frequently and I need the scanning machine at least once a week. The machine was constantly changing hands and I was always running around the building trying to track it down.

I decided that I didn’t want to be a slave to the scanner – and its proprietary sheets. It’s not sustainable, especially when we have mobile technology that can perform the same task and provide similar results.

Enter Quick Key.

Quick Key has allowed me to score MC items and analyze my students’ responses in a much more convenient and cost-effective way. Like, free. Hello. You simply set up your classes, print out sheets, and start scanning with your mobile device. (You don’t even need to have wifi or cellular data when scanning.) The interface is pretty clean and easy to use. Plus, it was created and designed by a teacher. Props there too.

Data is synced between my phone and the web, which allows me to download CSV files to use with my standards-based grading spreadsheets.
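
To give a sense of what that hand-off can look like, here’s a rough Python sketch of one way to roll exported item scores up by standard before they land in an SBG spreadsheet. The column names (Student, Q1, Q2, …), the standards codes, and the file name are all made up for illustration; the actual export and spreadsheet setup will differ.

```python
import csv
from collections import defaultdict

# Hypothetical mapping of exported item columns to the standards they assess.
ITEM_TO_STANDARD = {"Q1": "A.REI.3", "Q2": "A.REI.3", "Q3": "F.IF.4"}

def per_standard_scores(path):
    """Average each student's item scores (1 = correct, 0 = incorrect) by standard."""
    scores = defaultdict(lambda: defaultdict(list))
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for item, standard in ITEM_TO_STANDARD.items():
                scores[row["Student"]][standard].append(int(row[item]))
    return {
        student: {std: sum(vals) / len(vals) for std, vals in standards.items()}
        for student, standards in scores.items()
    }

# One line per student, ready to paste into the SBG tracking spreadsheet.
for student, standards in per_standard_scores("quickkey_scores.csv").items():
    print(student, standards)
```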

[Image: My SBG tracking spreadsheet]

That is the big Quick Key buy-in for me: exporting data for use with SBG. As I have mentioned before, SBG has completely changed my teaching and my approach to student learning. At some point, I hope to write in-depth about the specifics of this process and the structure I use.

Though the Quick Key data analysis isn’t as rigorous as what I would get from Datalink, it suffices for my purposes. I sort of wish Quick Key would improve the analysis they provide, but for now, if I need more detailed analytics, it usually just requires a simple formula that I can quickly insert.
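
As an illustration of the sort of formula I mean, here’s a small, hypothetical Python equivalent of a percent-correct-per-item calculation. The answer key, column names, and file name are invented, and it assumes the export records each student’s chosen letter.

```python
import csv
from collections import Counter

# Hypothetical answer key for a short multiple choice assessment.
KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}

def percent_correct(path):
    """Fraction of students answering each item correctly."""
    correct, total = Counter(), 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            for item, answer in KEY.items():
                correct[item] += row[item].strip().upper() == answer
    return {item: correct[item] / total for item in KEY}

print(percent_correct("quickkey_responses.csv"))
```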

[Image: Sample data analysis from Quick Key]
[Image: Sample data analysis from Datalink]

Through all this, I don’t overlook the obvious: MC questions provide minimal insight into what students actually know, especially in math. That being said, my students’ graduation exams still require them to answer a relatively large number of MC items. For that reason alone, I feel somewhat obligated to use MC questions on unit exams. Also, when assessing student knowledge via MC questions, I do my best to design them as hinge questions. TMC14 (specifically Nik Doran) formally introduced me to the idea of a hinge question: an MC question consciously engineered to categorize and target student misconceptions based on the answer a student chooses. In this way, students’ responses to MC questions, though less powerful than short-response questions, can give me an intuitive understanding of student abilities.
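
To make the hinge-question idea concrete, here’s a small, entirely hypothetical sketch: each wrong choice is written to correspond to one misconception, so a quick tally of responses shows which misconception needs reteaching. The question, choices, and misconception labels below are invented for illustration.

```python
from collections import Counter

# Hypothetical hinge question: solve 2x + 6 = 10.
# Each wrong choice is engineered to match one misconception.
CHOICE_TO_MISCONCEPTION = {
    "A": None,                               # x = 2, correct
    "B": "added 6 instead of subtracting",   # x = 8
    "C": "stopped after isolating 2x",       # x = 4
    "D": "sign error when isolating x",      # x = -2
}

def tally_misconceptions(responses):
    """Count how many students fell into each misconception bucket."""
    tally = Counter()
    for choice in responses:
        label = CHOICE_TO_MISCONCEPTION.get(choice, "other / unrecognized")
        if label:
            tally[label] += 1
    return tally

# Example: responses scanned from one class period.
print(tally_misconceptions(["A", "C", "B", "A", "C", "D"]))
```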

Quick Key recently introduced a Pro plan ($30/year), which means free accounts now come with some limitations. Still, the free plan offers plenty for the average teacher.

Either way, Quick Key still beats a $1000 scanner + cost of sheets.

bp

Traffic Light



I’ve seen and read about many “traffic light” strategies used in the classroom. In most instances, it’s a label we use for a strategy that helps us gauge student understanding or receive feedback. Here’s another twist on it.

I’m using it as a formative assessment strategy that I fittingly call Traffic Light. (Very creative, I know.) I’ve laminated red, yellow, and green pieces of paper and slid them into another laminated piece of paper that I half-taped to the top of each desk.

During any given lesson, I call out “Traffic Light!” and my students hold up the color corresponding to their level of understanding at that moment. Sometimes I see a sea of green, sometimes a mix, and sometimes so much red that I turn red myself. Either way, I have found the cards to be an indispensable tool for keeping a pulse on how things are going and, if need be, changing things up on the fly. There have been plenty of instances where I needed to re-explain something, regroup students, or change the approach to a concept. Without this in-the-moment feedback from the kids, I probably would not have been aware that a change was necessary.

I must put out a disclaimer. When I first started using the cards, I found that some of the quieter students would hold up a green to avoid me eyeing their yellow or red card – essentially making them “stick out” to me. I had a talk with my classes about how their learning is dependent on their integrity. We also discussed honesty as it relates to their understanding and how this is a driving force of everything we do. I did find that all this helped encourage the kids to provide more accurate responses.

Besides the obvious benefit for me, their teacher, the students actually enjoy using Traffic Light. At the end of the first semester, I asked each student to tell me one thing they thought went well in our class and one thing they felt needed improvement. To my surprise, several students mentioned the Traffic Light cards.

[Image: student feedback mentioning the Traffic Light cards]

(“the new grading system for exams” refers to my shift to standards-based grading)

It could be the interactivity. Students get to, essentially, voice their opinion…and teens love to do that. It could also be the message it sends: that I’m willing to adjust any lesson based on how they’re learning – and then to actually adjust it. Who knows. I’m just glad they’ve taken to it.

bp

Exit here


Things have slowed down for me this semester. Not teaching four preps helps (now I only have three). Because of this, I have been able to dig in and get a firmer grip on my classes.

Improving how I assess my students has been a goal of mine for a while, and I have finally gotten around to using exit slips on a consistent basis. I have always given a formative assessment at the beginning of class, but not usually at the end. For a long time, exit slips were papers that would pile up on my desk. Subconsciously, I didn’t see the value in assessing my students’ understanding at the end of class. Somewhere, deep down, I knew they were beneficial; I just didn’t embrace it. I wonder which planet I was teaching on.

Now, I must formatively assess at the end of my class. Not because an administrator told me so or because it’s a district-wide policy, but because I need to know what my students learned (or didn’t learn) each day. I must mention John Scammell and his awesome presentation on formative assessment at TMC14. He caused me to reflect on my assessment practices in a deep way. I’ve now fully realized that exit slips immediately affect my approach the following day and beyond. In other words, I now see value in exit slips.

Well. I say all that to share how I now use exit slips. Like most teachers’ exit slips, mine usually take students only a few minutes to complete. After class, I sit at my desk and go through the slips, categorizing students’ work and sorting it into various piles. I don’t actually use them for a grade, so I’m not worried about recording scores. My only focus is student understanding.

After I identify common mistakes or other trends that need to be addressed, I must communicate them to my students. Telling them is one thing; showing them is another. But how to do this in a quick, efficient way? I simply use my laptop’s webcam to snap photos of a few exit slips and insert them into a few slides that precede my lesson. It literally takes a few minutes to prep. Take photo. Paste. That’s it. A couple of examples:

[Image: Exit slip example 1]

[Image: Exit slip example 2]

When I show the work to the class, great discussion usually follows. I ask which error(s) they see and how we can fix them. It’s incredibly useful and never runs more than a couple of minutes before the lesson. The idea is to show them their mistakes. The whole scene is similar to using Math Mistakes, in that we’re examining real student work – except it’s their own work. (Of course, I remove names so no one is singled out.)

The kids are pretty receptive to seeing their mistakes. And because I use the exit slips to direct their learning and analyze their work, my students have never complained about doing them. They just do them now because they are worthwhile. Sounds like me.

 

bp