Anticipating student responses 

A colleague and I recently planned a lesson on developing a need for factoring when solving quadratic equations. It was totally unexpected. He walked into my room after school and we started talking. Two hours later, we had the framework of a lesson.

Two teachers. Two hours. One lesson (sort of).

What struck me was that the bulk of the time was spent anticipating student responses to questions we wanted to ask during the course of the lesson. We went back and forth about the roles of the distributive and zero product properties and how students might interpret these ideas in context. How could we use their responses to bridge an understanding of solving linear equations to solving quadratic equations? We didn’t want to shape how they answered, but simply craft questions that would naturally guide them to worthwhile discussions and new understandings. Whatever question we toyed with throughout the two hours, it always came back to the same criteria.

How might the kids answer? How will their response draw them nearer to the goal? How can their thinking help the lesson tell a story? 

Subconsciously, I think I do this. Just not enough. This experience connected well with the principles of my current book and served as a nice reminder to plan the critical points of a lesson so that they pivot on student thinking.

 

bp

“You need to have your aim posted”

What impact does posting the aim, or central question of a lesson, have on teaching and learning? What purpose does it serve?

I’ve heard throughout my career that “you need to have your aim posted” at the start of every lesson. @stoodle got at this idea recently and made me realize that I myself have been pondering this for quite some time.

A year ago someone at a PD mentioned that they never post the day’s aim. Nor do they “announce” it at the beginning of class. Instead, the aim is elicited from students during the learning process. The essential question is built on their prerequisite knowledge and drawn out of what they come to understand during the lesson. It is never given, but rather discovered by the students.

When I heard this, I had an ah-ha moment. It made complete sense. Other than in the classroom, how often are we informed of what we’re going to learn before we actually learn it? Sure, you may have a goal you want to accomplish (e.g. complete the yard work before 1 pm), but what you actually learn in the process (e.g. how to mow the lawn as efficiently as possible) is often unknown at the outset. We notice, strategize, experiment, learn, and then realize what we’ve learned.

Recently, I didn’t post the aim of a lesson on arithmetic sequences. Instead, as part of their exit slip, I required my students to write what they thought the aim of the lesson was. Not only did 90% of the kids nail it, but one student’s version was even better, and more creative, than what I originally intended for the lesson.

[Image: the aim, which is directly related to the overarching problem from the lesson]

This made me think: whatever a student feels the aim is (during or at the end of a lesson) provides remarkable feedback as to the effectiveness of the lesson.

Another thing. I’m a firm believer that lessons should be based purely on questions. One question should lead to another, and then another, and then another. Ultimately, the central question – the heart of any lesson – should be provoked. Because of this, I want my students to need the central question of a lesson to accomplish a task or goal. They can’t need it if I openly post it.

I’m left with many questions about this widely adopted practice of aim-posting. What are the consequences of openly telling students the aim of a lesson? Conversely, what are the consequences of structured learning that promotes the discovery of the aim? If I don’t tell my students the aim, how do I frame a lesson from the outset? Does explicitly stating the aim perpetuate a top-down approach to learning? How can we use student-generated aims to inform our teaching?

 

bp

Questions.

Question More

I’ve realized that improving my questioning has become a goal of mine. Here’s some of what I’ve been pondering (and doing) as of late.

1. Asking “what if…” questions. These usually come into play after we finish a problem. I try to change the conceptual nature of the problem, which provokes students to examine relationships and see the problem in a new context. I also really like giving the students a minute or two to generate their own “what if…” questions about a problem after we’ve found a solution.

2. Asking students to find errors within student work samples. I really started focusing on this last year with my exit tickets, but now I’m doing it just about every class. I usually pick up someone’s paper and slide it under the document camera for the class to assess. Quick, easy, authentic. Plus, it creates a culture of identifying and accepting mistakes on a regular basis.

3. I’ve also begun asking students to identify potential errors within problems before examining any sample work. The result is always rich classroom discussion over creatively wrong solutions. The goal is for them to identify both subtle and more serious mistakes that could occur.

4. Having students construct their own (good) questions. I really need to get better at this. I’ve had some success in the past, but usually when I least expect it. I’m thinking of looking into RQI to find some useful strategies.

5. The other day, out of the blue, I posed a “convince me” statement to a student during a class discussion. We were factoring and I proposed a (wrong) solution to him. I essentially asked “why is my solution wrong,” but in a way that felt more like a challenge than a question. I felt the power when I uttered it. It probably bled through from a workshop by Chris Luzniak a couple of years ago on using debate in math class. He has great stuff.

6. Using questions as a foundation of my class. I want my classroom culture to be one that emphasizes the why behind the answer instead of the answer itself. As a math teacher, I’ve always emphasized showing work and how critical it is. But I’ve never lived out that creed in how I teach my kids. Trying to change that this year. More to come on this.

7. If student X makes a statement about something we’re studying, I’ll sometimes turn to student Y and ask them to “interpret what X just said…”

8. During an intervisitation, the teacher I was visiting posed a question to the class and no one responded or seemed to have a clue. He said “Alright, take 30 seconds and brainstorm with a neighbor about the question.” He waited and asked the question again and there were several responses. This was awesome.

9. The questioning doesn’t begin and end while I’m teaching. I’ve started questioning more of what I plan and structure for my students, including things that I’ve done for years. I’ve put my teaching philosophy under a microscope too. It’s changing. This will have repercussions far greater than any question I could ever pose to a student.

 

bp

Quick Key

To help me collect data, I’ve been using a tool for the last couple of months. It’s called Quick Key, and it’s used to quickly and easily collect responses to multiple-choice questions.

For a long, long time, my school utilized the Apperson Datalink scanner to aid in scoring the multiple-choice portions of exams. It not only scores exams quickly and efficiently, but its accompanying software provides insightful data analysis that I use to modify my teaching. On the downside, these machines are pricey (almost $1000) and require you to purchase proprietary scanning sheets that work only with their machine. Each department in my school had a machine.

Because of my push towards standards-based grading, I find myself giving smaller, bite-size assessments that target fewer concepts. Consequently, I assess more frequently and need the scanning machine at least once a week. The machine was constantly changing hands, and I was always running around the building trying to track it down.

I decided that I didn’t want to be a slave to the scanner – and its arbitrary sheets. It’s not sustainable. Especially when we have mobile technology that can perform the same task and provide similar results.

Enter Quick Key.

Quick Key has allowed me to score MC items and analyze my students’ responses in a much more convenient and cost-effective way. Like, free. Hello. You simply set up your classes, print out sheets, and start scanning with your mobile device. (You don’t even need to have wifi or cellular data when scanning.) The interface is pretty clean and easy to use. Plus, it was created and designed by a teacher. Props there too.

Data is synced between my phone and the web, which allows me to download CSV files to use with my standards-based grading spreadsheets.
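
To give a sense of what that hand-off can look like, here’s a minimal sketch of the kind of processing involved, assuming a per-question CSV export. The column names and the question-to-standard mapping are invented for illustration; they aren’t Quick Key’s actual export format or my exact spreadsheet setup.

```python
import csv
from collections import defaultdict

# Hypothetical mapping from exam questions to the standards they assess.
QUESTION_TO_STANDARD = {"Q1": "A.REI.4", "Q2": "A.REI.4", "Q3": "A.SSE.3"}

def standard_scores(csv_path):
    """Convert a per-question results CSV (1 = correct, 0 = incorrect)
    into a percent score on each standard for every student."""
    # student -> standard -> [number correct, number of questions]
    tallies = defaultdict(lambda: defaultdict(lambda: [0, 0]))
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            student = row["Student ID"]  # hypothetical column name
            for question, standard in QUESTION_TO_STANDARD.items():
                tallies[student][standard][0] += int(row[question])
                tallies[student][standard][1] += 1
    return {
        student: {std: round(100 * right / total) for std, (right, total) in stds.items()}
        for student, stds in tallies.items()
    }

# Example: feed the result into an SBG tracking spreadsheet.
# print(standard_scores("quickkey_export.csv"))
```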

[Image: my SBG tracking spreadsheet]

That is the big Quick Key buy-in for me: exporting data for use with SBG. As I have mentioned before, SBG has completely changed my teaching and my approach to student learning. At some point, I hope to write in-depth about the specifics of this process and the structure I use.

Though the Quick Key data analysis isn’t as rigorous as what I would get from Datalink, it suffices for my purposes. I sort of wish Quick Key would improve the analysis they provide, but for now, if I need more detailed analytics, it usually just requires a simple formula that I can quickly insert.

[Image: sample data analysis from Quick Key]

[Image: sample data analysis from Datalink]

Through all this, I don’t overlook the obvious: MC questions provide minimal insight into what students actually know, especially in math. That being said, my students’ graduation exams still require them to answer a relatively large number of MC items. For that reason alone I feel somewhat obligated to use MC questions on unit exams. Also, when assessing student knowledge via MC questions, I do my best to design them as hinge questions. TMC14 (specifically Nik Doran) formally introduced me to the idea of hinge questions: MC questions consciously engineered so that each answer choice categorizes and targets a specific student misconception. In this way, students’ responses to MC questions, though less powerful than short-response questions, can still give me an intuitive understanding of their abilities.
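
As a rough illustration of how that diagnostic reading might be tallied, here’s a sketch with invented answer-to-misconception labels; nothing below comes from an actual exam or from Quick Key’s reports.

```python
from collections import Counter

# Hypothetical hinge question on solving x^2 - 5x = 0: each wrong choice
# is engineered to correspond to a specific misconception.
CHOICE_MEANING = {
    "A": "correct",
    "B": "sign error when factoring",
    "C": "forgot to set each factor equal to zero",
    "D": "divided by x and lost the x = 0 solution",
}

def diagnose(responses):
    """Tally how many students land in each misconception bucket."""
    return Counter(CHOICE_MEANING.get(choice, "other") for choice in responses)

# Example with a made-up set of scanned responses:
print(diagnose(["A", "C", "C", "B", "A", "D", "A", "C"]))
# -> correct: 3, forgot to set each factor equal to zero: 3,
#    sign error when factoring: 1, divided by x and lost the x = 0 solution: 1
```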

Quick Key recently introduced a Pro plan ($30/year) that places some limitations on free accounts. Their free plan still offers plenty for the average teacher.

Either way, Quick Key still beats a $1000 scanner + cost of sheets.

bp