Year in Review: 2015

I always find it a bit odd reflecting at the end of a calendar year. My real year runs from September to June. That’s how I’m trained. Nonetheless, here are some reflections from my 2015.

Professional Development

  •  I helped facilitate an Arduino PLT with MfA in the spring. Although I helped lead the group, I was truly learning as I went. It was an uplifting testament to being a novice who’s willing to bring knowledgeable folks together for the greater good.
  • The spring introduced me to Video Clubs, and the fall allowed me to bring this rich experience to the math department at my school. I’m eager to see where this goes. I even got invited to speak at a conference! Of all people, me!
  • The Research Experience for Teachers (RET) with NYU provided the platform to develop and learn through actual, hands-on research. I’ll never again look at human hair or cement the same way. I even got to use an electron microscope. Talk about a wow experience.
  • The Learning Partners Program helped me in unpredictable ways. I was expecting intervisitations, but the experience proved to be far more comprehensive. Some of the resulting transparency caused pain, but in a healthy way.
  • I was all about my colleagues. Our relationships grew stronger, more interdependent. I thrust myself into a leadership role and invested in their success. I wanted my opportunities to become their opportunities.
  • I took up juggling. It’s not directly related to my professional growth, but I think it’s pretty cool. Plus, it’s fairly therapeutic, which helped me grow. I’m looking forward to delving into the mathematics of juggling, which will be fun.
  • I read, and enjoyed reading, more than ever.

Teaching

  • I began assessing my students by means of standards-based grading and then improved it. I should’ve begun this a long, long time ago.
  • It seems simple, but exit slips played a significant role in my class for the first time. And, in general, assessment was a running theme for me all year.
  • I once held the belief that if I didn’t grade an assignment, the kids wouldn’t do it. I learned that it’s the classroom culture and growth mindset that make kids want to (or not want to) work.
  • I wrote to my students more than ever. And loved it.
  • I incorporated writing and discussion techniques regularly. I also thought about my questioning in more sophisticated ways. Simply put, I learned to value these worthwhile activities.
  • After years of waiting for the opportunity, I was able to kickstart an after-school bicycle club in 2015.
  • I realized that I now look at every day, every class period, differently. In fact, I feel different about teaching. I’m more aware, more dedicated, more creative. My perspective matured greatly in 2015. This has pushed me to capitalize on moments with students and colleagues like never before.

This was my first full calendar year maintaining a blog. Is it a coincidence that 2015 was such a game changer for my career?

bp

Knowledge Audits

Audit 1

How can I know what my kids know?

I’ve been asking myself that question for a long time. In my Regents-obsessed school, I’m forced to ensure my students can perform well on end-of-year state exams. The typical learning flow in my class usually looks like this:

  1. Student learns X.
  2. Student demonstrates understanding of X.
  3. Student learns Y and forgets X.
  4. Student demonstrates understanding of Y and has no idea what X is.

Compile this over the course of a school year and you have students who understand nothing other than what they just learned. What does this mean for a comprehensive standardized exam? Disaster!

Sure, a lot of this has to do with pacing and students not diving deep into things they learn to make connections. That is a sad reality for too many teachers, including me. So given these constraints, how can I help kids build long-lasting understanding of what they learn, instead of forgetting everything other than what we’re doing right now?

In the past, I’ve “spiraled” homework and even put review questions on exams, but this never helped. There was no system to it and I never followed up. This year, I’m lagging both homework and exams, which does seem to be making a difference. But with the ginormous number of standards that students are supposed to learn each year, I still feel this isn’t enough.

So, last week I began implementing Audits. These are exams that do not assess concepts from the current unit. The plan is to administer about one a month and because I lag my unit exams, I should have no trouble fitting them into the regular flow of things.

I’m choosing not to call them “Review Exams” or some other straightforward name in order to put a fresh spin on them and increase buy-in. So far, so good.

The hope is to continually and systematically revisit older content to keep students actively recalling these standards. This should reinforce their learning and help to make it stick. On the teacher side of things, I get an updated snapshot of where they are and can plan accordingly. The SBG aspect is simple: the results from the Audit supersede any previous level of understanding.

  • If a student has not previously earned proficiency on a standard assessed on an Audit, they can earn proficiency now. This alleviates the need to retest on their own.
  • If a student has previously earned proficiency on a standard, they must earn proficiency again or else lose credit for that standard. They would then need to retest.
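The supersede rule above can be sketched as a tiny function. This is a minimal sketch, not my actual gradebook: the data model (a dict mapping student–standard pairs to the most recent score) and the proficiency `threshold` are assumptions for illustration.

```python
def update_standard(gradebook, student, standard, audit_score, threshold=3):
    """Apply the 'Audit supersedes' rule: the newest score replaces
    whatever level the student previously held on that standard.
    (Hypothetical data model: gradebook maps (student, standard) -> score.)
    """
    previous = gradebook.get((student, standard))
    gradebook[(student, standard)] = audit_score  # newest evidence wins

    if previous is not None and previous >= threshold and audit_score < threshold:
        return "lost credit -- retest needed"
    if (previous is None or previous < threshold) and audit_score >= threshold:
        return "earned proficiency"
    return "no change in status"
```

The key design point is that the Audit score always overwrites the old one before any status is reported, so the gradebook never preserves a stale level of understanding.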

The first Audit resulted in a mix of students earning credit and losing credit for a set of standards. It was great. The proof is in the pudding. Knowledge isn’t static and my assessment practices must reflect this.


bp

Speed dating, revised & reloaded

I love speed dating.

With students, of course. The type of speed dating that serves to reinforce, promote peer tutoring, and review.

But I always found the initial assigning of problems to be tricky and inefficient. Desks would be set up facing each other. I would give kids handouts with the practice problems on their way into class, then walk around and assign each student a specific problem from the handout. I always did this strategically: I would assign higher-level questions to higher-level students so that the harder concepts would get facilitated effectively to classmates who might not initially understand them. But because the kids needed to wait for me to “give” them their question, this caused some off-task downtime, which was not good. They would also spend less time speed dating (and doing math) because they were dependent on me at the beginning. Bottom line: I wasted a bunch of class time assigning questions.

Today, I tried another technique and I liked it. I placed numbers on the desks and wrote numbers on the handouts. I gave the students the handouts as they entered (randomly), and instead of wandering around assigning questions to students, the kids just sat at the desk whose number matched the number on their handout. That number was also their assigned question, the one they needed to master. They didn’t need me to assign them a problem because they already had one. Here are the desks with the numbers.

[Image: Numbered Seats for Speed Dating]

The result: a much more fluid, efficient start to the activity. The students came in and knew exactly where to sit and which question they were responsible for. They ultimately spent more time working on math – which is the whole point.

[Image: Speed Dating Revised]

I randomly assigned the questions as they walked in – so I got away from assigning higher level questions to certain students. It was fine. I just walked around and helped students with their problems – which I would have done anyway. Even if I wanted a specific student (or students) to be in charge of a specific problem, I could simply ensure that I gave them the number of that problem when they walked into the room. I would only need to do this for one or two questions, so it wouldn’t be terribly difficult. It is a lot better than students sitting around while I manually assign problems at the beginning of class.
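The dealing routine described above is simple enough to sketch in a few lines of Python. The function name and the `pinned` parameter are hypothetical; `pinned` just illustrates how one or two questions could still be steered to specific students by their position in line at the door.

```python
import random

def deal_handouts(num_desks, pinned=None):
    """Return the order in which numbered handouts are dealt at the door.

    Each handout number doubles as a desk number and an assigned question.
    `pinned` (hypothetical) maps a position in line -> a specific question
    number, for the one or two problems you still want to steer to
    particular students. Everything else is shuffled.
    """
    pinned = pinned or {}
    remaining = [n for n in range(1, num_desks + 1) if n not in pinned.values()]
    random.shuffle(remaining)

    order = []
    for position in range(num_desks):
        if position in pinned:
            order.append(pinned[position])  # steered handout
        else:
            order.append(remaining.pop())   # random handout
    return order
```

For example, `deal_handouts(10, pinned={0: 7})` hands question 7 to the first student in line and deals the other nine numbers randomly.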

3/18/15 UPDATE:

Love it.

bp

Review Days?

[Image: Math Test Easy or Wrong]

I was having a discussion with a few of my colleagues today about our upcoming quarterly exams. They’re basically our midterms – we just give them a fancier name. We were talking about the amount of review days (or class periods) that are necessary for students to prepare for these types of summative assessments.

It made me think. I don’t really “review” before big exams. I don’t even regularly review before a unit exam. I definitely have days where students aren’t learning new material. Maybe they’re reinforcing things they learned previously. Or maybe it’s an extension of a previous lesson. Those aren’t the days I’m thinking of. I’m thinking of traditional review days that aim to refresh students’ minds before a significant assessment.

Whenever it’s necessary or if I feel the moment is right, sure, I’ll whip out a game to review or have the students speed date to catch each other up. I do these sorts of activities from time to time, but this is infrequent and rarely happens right before an exam.

So why don’t I review before exams?

I want raw, unfiltered data on my students’ understanding. By reviewing just before an exam, I am giving my kids a mental cheat sheet that they can use on the day of the exam. Did they understand the content before we reviewed or because we reviewed? My data is inherently skewed because of our review. But if I test them on a concept that we studied two months ago, yes, I want data on their understanding of that concept without refreshing their memories. I want them to forget a step or two. Or for that matter, I don’t mind if they completely forgot how to start a problem. This is precisely the information that will drive my future planning. Also, those misconceptions are exactly the sorts of things that will help my students do better in the future.

But don’t students want review? I mean, they want to increase the odds of performing well on the assessment and therefore increase their overall grade in the class, right? I answer that question in two ways. First, there are plenty of instances where I don’t even count an exam towards their overall grade. Yes, I said that. I treat it as a diagnostic and the kids buy into it. I find out where they’re at and they get a (potentially poor) exam that doesn’t count against them. We all win. Second, students will adapt to the no-review-before-an-exam policy. They will meet our expectations. If they know I don’t review before exams, then over time they will prepare and study accordingly. And if at first they don’t, they may complain, but eventually they will come around.

So what to do with that extra time not spent reviewing? I spend the day or two (or three) after a big exam reteaching and reassessing what my students had trouble with. It just makes sense. Now, because I looked at the data, I have a pretty good understanding of what they know and don’t know. I can pinpoint how I reteach a particular set of concepts. Oftentimes, I even immediately reassess them on concepts they struggled with. This almost always results in improvement, which only helps to establish a growth mindset. It also helps them understand the method behind the madness of no review days.

I guess all this may count as “review after the test” and I’m good with that. Reviewing before an exam is overrated. Intuitively, as teachers, it makes sense to review before. But I think the more effective strategy is to do so after.

I’m really curious about what others believe about this. How do you incorporate review into your class structure?

bp