A teacher’s dilemma: taking risks beyond eliminating answer choice C


We ask teachers to embrace change, and the pressure on teachers is not to take risks but to march whatever children they can, lockstep, toward higher standardized test scores. – Robert P. Moses, Radical Equations (p. 126)

Thanks to a recent conversation, once again I’m confronted with the heavy hand of high-stakes exams.

How can a teacher, like myself, establish and maintain a classroom centered on inquiry, contemplation, and sense making within a system that rises and falls on the scaled scores of New York State Regents exams? How can a teacher move a classroom of students beyond a no. 2 pencil and bubbles containing A, B, C, and D?

I guess this is nothing new. I’m simply reiterating a concern that most teachers have.

I find myself more entrenched in this battle than ever before. The more I teach, the more I realize how oppressive these exams are. I am forced to get kids “through” by whatever means necessary. Schools get recognized and accolades are given out for producing students who are “college ready,” a label that reflects students’ performance on Regents exams. This sort of verbiage gets everyone on the same page. The result is an unspoken, politically correct pressure placed on me and my students to conform to these narrow measures of mathematical fluency. This pressure results in anxiety and dramatically affects the quality of my instruction.

As someone in the classroom every day doing this work, I’m so wrapped up in these damn exams that I don’t even have time to prepare my students to be “college ready.” Maybe I’m doing something wrong.

I’m essentially a Regents-driven machine whose sole job is to produce other machines who can generate positive results on these exams. Please, forget about the genuine, messy learning of mathematics that I desire.

Furthermore, in a society obsessed with test scores, obtaining a 65 (or 95) can indeed be the ticket to success. Students are only as good as the score they produce. They themselves know this, so their motivations often rise and fall on these exams as well. This is the cherry on top.

Despite this downward spiral, there is hope.

Patrick Honner’s Regents Recaps help me keep things in perspective. His reflections are thoughtful, full of mathematical insight, and shed light on just how much of a joke these exams are. Without knowing it, he compels me to teach beautiful mathematics far beyond the expectations of a Regents exam.

And then there are educators like Jose Luis Vilson, Christopher Emdin, Robert P. Moses, and Monique W. Morris. Through their writing, they’ve cautioned me that earning a 65 on a Regents exam for many of my students is the least of their worries, despite what school and New York State may tell them. They motivate me to bring often-ignored social issues to the fore.

There are many others, whom I have met either in person or online, who have provided similar inspiration. There are far too many to name.

This leaves me torn.

On one hand, I’m fortunate enough to have a fairly high level of autonomy in my classroom. What my students and I accomplish in the 45 minutes we’re allotted each day is up to us. There’s relatively low oversight. Despite the immense pressures to bubble our lives away, I aim to spend time asking big questions, sharing the joy of mathematical discovery and learning, and enjoying the ride. This is empowering. Hell, I don’t even call my class exams “exams” anymore.

On the other, I am confused. And worried. The fear of a low passing rate has left me paralyzed in the midst of students who desperately need me to be fully aligned with their needs. But if I cannot afford to take meaningful risks in my classroom that go beyond eliminating answer choice C, if I can’t be bold in the face of oppression and conformity, what does this mean for my teaching? More importantly, what does this mean for my students?

 

bp

 

The day after


I don’t like review days before exams. I’d much rather spend that day after an exam analyzing mistakes and relearning. I find this to be crucial in promoting a growth mindset in my students. My struggle has been how to structure these post-exam days. Here’s a formative assessment idea that I’ve used a few times this year.

The day after an exam, I set up the room in 3-5 stations. Each serves as a place to study a particular concept that was on the exam.

My bell ringer asks students to check their exam performance on the bulletin board in the back of the room. It lets them know for which concepts they earned proficiency. I also email the kids their performance immediately after assessing the exams, but many don’t check.

I hand back the exams and they move to a concept that they need help with based on their performance. If they have earned credit for every concept on the exam then I ask them to float and help others. At each station they use notes, each other, and the feedback I provided on the exam to analyze and learn from their mistakes. I also have practice problems at each station so they can make sure they understand the concept. I float around the room and help. Of course, the SBG data allows me to sit with students who need me most.

After a student feels they have successfully relearned a concept, and I usually check in to confirm, they can retake that concept. The retakes are in folders in the corner – students grab one and do it anywhere in the room. They submit it and begin working on another concept, if necessary. It doesn’t matter how many concepts a student retakes during the period, but it usually works out to be 1-2.

Before I did this activity, I was concerned that since the stations would be full of students who struggled with a concept, they would all sit together and get nowhere. This hasn’t been the case. The kids are diligent about relearning. This may be because they like retaking exams and earning proficiency during class time, as I usually make them come after school to do this. It helps that the relearning is targeted and individualized to each student. Plus, it’s all formative. They go wherever they feel they need to. They assess themselves, but use one another in the process.

It can look and feel chaotic. But that’s the point. Improvement is messy. It’s also amazing – especially when it happens amongst your students.
bp

Exams: tools for feedback, answers provided, and lagged

I’ve made three fundamental changes to my unit exams this year.

Part One: Exams as tools for feedback

After a student gets an exam back, what’s the first thing they notice? Easy: their overall score. That’s why I’m not putting scores on exams.

All too often a student sees their score and, especially if it’s low or mediocre, views the score as the only thing that matters. Even with standards-based grading, sometimes students will get caught up in whether they’ve earned proficiency on a particular concept (which isn’t a bad thing). What they forget about is how to correct mistakes and improve based on how they did. This attitude is more present in struggling learners than it is in high achievers, but it is present throughout.

This is why this year I’ve decided to not put scores or grades on exams. I am only putting feedback. That feedback comes in the form of highlighting incorrect work, asking clarifying questions, inserting direct how-to, and cheers for correct responses. Never will my students find an overwhelming (or underwhelming) score on their exam. When they critique their performance, I want them to focus on their work – not lament their grade. My next challenge is to get them to actually internalize and grow from the feedback.

Part Two: Exams that focus on why

On exams, I’m providing the answer to every question.

I know this is ridiculous and unheard of, but here’s my thing: I want to build a classroom culture that hinges on questions, not answers. In fact, I fear my kids being answer-driven. I want students to focus on the how and why rather than the what. In addition to simply talking to them and encouraging this frame of mind on an ongoing basis, I wanted to add a structural aspect that can help accomplish this. Providing every answer is what I came up with.

I know this doesn’t simulate standardized exams outside of my room and is fairly impractical, but I hope that I’m helping them see the bigger picture. Besides, I already include answers in classwork and homework assignments, so I figured why not exams too?

Part Three: Exams that lag
After reading much about the power of lagging homework from the MTBoS, this summer I decided to incorporate it. In addition, I’ve decided to lag my unit exams.

It just makes sense to lag both. In fact, when I made the choice to lag my homework, I found lagging unit exams to be a direct corollary. Summative assessments (e.g. exams) should always align with what and how I teach. If I lag homework and 80% of what students are doing every night focuses on review content, how can I administer an exam of 100% new content?

This all may backfire completely. But at least then I’ll be able to add them to the extensive list of things that I’ve failed at implementing.



bp

Review Days?


I was having a discussion with a few of my colleagues today about our upcoming quarterly exams. They’re basically our midterms – we just give them a fancier name. We were talking about the amount of review days (or class periods) that are necessary for students to prepare for these types of summative assessments.

It made me think. I don’t really “review” before big exams. I don’t even regularly review before a unit exam. I definitely have days where students aren’t learning new material. Maybe they’re reinforcing things they have learned previously. Or maybe it’s an extension of a previous lesson. Those aren’t the days I’m thinking of. I’m thinking of traditional review days that aim to refresh students’ minds before a significant assessment.

Whenever it’s necessary or if I feel the moment is right, sure, I’ll whip out a game to review or have the students speed date to catch each other up. I do these sorts of activities from time to time, but this is infrequent and rarely happens right before an exam.

So why don’t I review before exams?

I want raw, unfiltered data on my students’ understanding. By reviewing just before an exam, I am giving my kids a mental cheat sheet that they can use on the day of the exam. Did they understand the content before we reviewed or because we reviewed? My data is inherently skewed because of our review. But if I test them on a concept that we studied two months ago, yes, I want data on their understanding of that concept without refreshing their memories. I want them to forget a step or two. Or for that matter, I don’t mind if they completely forgot how to start a problem. This is precisely the information that will drive my future planning. Also, those misconceptions are exactly the sorts of things that will help my students do better in the future.

But don’t students want review? I mean, they want to increase the odds of performing well on the assessment and therefore increasing their overall grade in the class, right? I answer that question in two ways. First, there are plenty of instances where I don’t even count an exam towards their overall grade. Yes, I said that. I treat it as a diagnostic and the kids buy into it. I find out where they’re at and they get a (potentially poor) exam that doesn’t count against them. We all win. Second, students will adapt to the no-review-before-an-exam policy. They will meet our expectations. If they know I don’t review before exams, then over time they will prepare/study accordingly. And if at first they don’t, they may complain, but eventually they will come around.

So what to do with that extra time not spent reviewing? I spend that day or two (or three) after a big exam to reteach and reassess what my students had trouble with. It just makes sense. Now, because I looked at the data, I have a pretty good understanding of what they know and don’t know. I can pinpoint how I reteach a particular set of concepts. Oftentimes, I even immediately reassess them on concepts they struggled with. This almost always results in improvement, which only helps to establish a growth mindset. It also helps them understand the method behind the madness of no review days.

I guess all this may count as “review after the test” and I’m good with that. Reviewing before an exam is overrated. Intuitively, as teachers, it makes sense to review before. But I think the more effective strategy is to do so after.

I’m really curious about what others believe about this. How do you incorporate review into your class structure?

bp

Quick Key

To help me collect data, I’ve been using a tool for the last couple of months. It’s called Quick Key and it’s used to quickly and easily collect responses from multiple choice questions.

For a long, long time, my school utilized the Apperson Datalink scanner to aid in scoring multiple choice portions of exams. It not only scores exams quickly and efficiently, but its accompanying software provides insightful data analysis that I use to modify my teaching. On the downside, these machines are pricey (almost $1000) and require you to purchase their unique scanning sheets that work only with their machine. Each department in my school had a machine.

Because of my push towards standards-based grading, I find myself giving smaller, bite-size assessments that target fewer concepts. Consequently, I am assessing more frequently and I need the scanning machine at least once a week. The machine was constantly changing hands and I was always running around the building trying to track it down.

I decided that I didn’t want to be a slave to the scanner – and its arbitrary sheets. It’s not sustainable. Especially when we have mobile technology that can perform the same task and provide similar results.

Enter Quick Key.

Quick Key has allowed me to score MC items and analyze my students’ responses in a much more convenient and cost-effective way. Like, free. Hello. You simply set up your classes, print out sheets, and start scanning with your mobile device. (You don’t even need to have wifi or cellular data when scanning.) The interface is pretty clean and easy to use. Plus, it was created and designed by a teacher. Props there too.

Data is synced between my phone and the web, which allows me to download CSV files to use with my standards-based grading spreadsheets.
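As a rough illustration of that CSV-to-spreadsheet step, here’s a minimal sketch of rolling exported responses into per-concept proficiency. The column names and the question-to-concept mapping are my own assumptions for the example, not Quick Key’s actual export format.

```python
import csv
from collections import defaultdict

# Hypothetical mapping from exam questions to the concepts they assess.
QUESTION_CONCEPTS = {
    "Q1": "solving linear equations",
    "Q2": "solving linear equations",
    "Q3": "graphing lines",
}

def concept_scores(csv_path):
    """Return {student: {concept: fraction correct}} from a results CSV.

    Assumes columns: 'Student', then one column per question holding
    1 (correct) or 0 (incorrect) -- adjust to the real export format.
    """
    totals = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # [correct, attempted]
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            student = row["Student"]
            for question, concept in QUESTION_CONCEPTS.items():
                tally = totals[student][concept]
                tally[0] += int(row[question])  # add 1 if correct
                tally[1] += 1                   # one more attempt
    return {
        student: {c: correct / attempted for c, (correct, attempted) in concepts.items()}
        for student, concepts in totals.items()
    }
```

From there, each student/concept fraction can be compared against a proficiency cutoff and written into the tracking spreadsheet.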


My SBG tracking spreadsheet

That is the big Quick Key buy-in for me: exporting data for use with SBG. As I have mentioned before, SBG has completely changed my teaching and my approach to student learning. At some point, I hope to write in-depth about the specifics of this process and the structure I use.

Though the Quick Key data analysis isn’t as rigorous as what I would get from Datalink, it suffices for my purposes. I sort of wish Quick Key would improve the analysis they provide, but for now, if I need more detailed analytics, it usually requires a simple formula that I can quickly insert.


Sample data analysis from Quick Key


Sample data analysis from Datalink

Through all this, I don’t overlook the obvious: MC questions provide minimal insight into what students actually know, especially in math. That being said, my students’ graduation exams still require them to answer a relatively large number of MC items. For that reason alone I feel somewhat obligated to use MC questions on unit exams. Also, when assessing student knowledge via MC questions, I do my best to design them as hinge questions. TMC14 (specifically Nik Doran) formally introduced me to hinge questions: MC questions consciously engineered to categorize and target student misconceptions based on the answer a student chooses. In this way, students’ responses to MC questions, though less powerful than short response questions, can provide me an intuitive understanding of student abilities.
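To make the hinge-question idea concrete, here is a hypothetical sketch (the question, distractors, and diagnoses are my own illustration, not from TMC14): each wrong choice is engineered so that picking it points to one specific misconception.

```python
# A hypothetical hinge question: every distractor flags one specific
# misconception (question and diagnoses are illustrative only).
HINGE_QUESTION = {
    "prompt": "Solve 2x + 6 = 10.",
    "choices": {
        "A": ("x = 2", "correct"),
        "B": ("x = 4", "subtracted 6 from both sides but forgot to divide by 2"),
        "C": ("x = 8", "added 6 instead of subtracting before dividing"),
        "D": ("x = -1", "divided the x term and right side by 2 but not the constant"),
    },
}

def diagnose(choice):
    """Map a student's answer choice to the misconception it suggests."""
    _, diagnosis = HINGE_QUESTION["choices"][choice]
    return diagnosis
```

Grouping students by their chosen letter then shows, at a glance, which misconception each station or small group should target.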

Quick Key recently introduced a Pro plan ($30/year) that now places limitations on those who sign up for free accounts. Their free plan still offers plenty for the average teacher.

Either way, Quick Key still beats a $1000 scanner + cost of sheets.

bp