Exams: SBG-style

The goal of any exam, ideally, is to assess how much students have learned over the course of a semester or school year. When I changed the focus of grading in my classes from counting points to tracking progress toward specific learning goals, I knew my exams needed to reflect that change as well.

This summer I had initially thought I might design some sort of alternate, performance-based exam that would mesh well with the tenets of standards-based grading. However, this year all exams for a given course were required to be identical regardless of teacher. Since I'm currently one of four teachers who teach the 9th grade Integrated Science course and the only one using standards-based grading, I knew I had to take our common exam and make the best of it.

So, the exams had to have the same questions, but they didn't need to be in the exact same order, right? I reordered all the questions on the exam, grouping them by the learning goal each one assessed.

Multiple choice section, SBG exam

This process uncovered several questions that didn't address any of the learning goals, so these "others" were grouped together into their own section.

Overall, I wasn't thrilled with the exam, but I think it was about as good as it could be given the requirements it had to meet.

Assessment

Breaking the exam down into its component learning goals allowed me to assess each learning goal individually. Grading the exams this way took considerably longer, but it also provided me and my students with a wealth of information about their learning throughout the first semester.

I created a Google Spreadsheet that automatically calculated the individual scores for each learning goal and the overall exam grade. Once the grading was done, I shared each student's spreadsheet with them through Google Docs.
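
The sheet's structure isn't anything fancy. As a rough sketch (the cell ranges below are placeholders, not necessarily my actual layout), the score for a single learning goal is just points earned over points possible:

    Score for one learning goal, with points earned in B2:B6
    and points possible in C2:C6:
    =SUM(B2:B6)/SUM(C2:C6)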

Below is an example of a filled-out scoresheet (and here's a blank calculation sheet if you're interested):

Example Exam Calculation Spreadsheet

Details

Overall grades. You may notice I calculated two "overall" grades. I told students their overall grade on the exam would be the average of their scores on each learning goal (giving each learning goal equal weight), but I wasn't sure whether that might have some odd effects on the overall grade due to some flaw I hadn't planned for. As a check, I also calculated the exam score "traditionally," by simply dividing the total points earned by the total points possible. Interestingly, these two scores were almost always remarkably close to each other (for most students they differed by less than 1%). I'm not sure exactly what that means, but it was interesting nonetheless.
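
In spreadsheet terms, with placeholder ranges again (per-goal scores in F2:F13, all points earned in B2:B40, all points possible in C2:C40), the two versions are:

    SBG overall grade: each learning goal weighted equally
    =AVERAGE(F2:F13)

    Traditional overall grade: total points earned over total points possible
    =SUM(B2:B40)/SUM(C2:C40)

My best guess as to why they track so closely: when each learning goal carries roughly the same number of points, the equal-weight average and the points ratio are nearly the same calculation by construction.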

Unfinished long answer questions. The exam had 6 long answer questions, and students were required to complete at least 4 of them. I had a few students who either skipped the long answer questions entirely or did fewer than were required. It didn't make sense to penalize any one learning goal for not doing all the long answer questions (after all, skipping them didn't necessarily mean a student didn't understand the content of those learning goals). However, I felt there should be some penalty for doing fewer than required.[1] As a result, I calculated what percentage one long answer question was of the entire exam and divided that by 2, which gave me 1.84% in this case. For each required long answer question that was not completed, I took 1.84% off the student's overall exam grade.
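
That deduction is easy to automate, too. Something like the following would do it, where the cell references are placeholders I've made up for illustration (one long answer question's point value in E2, the exam's total points in E3, and the count of completed long answer questions in H2):

    Percentage of the exam represented by one long answer question, halved:
    =E2/E3/2

    Deduction from the overall exam grade (no penalty if at least 4 done):
    =IF(H2>=4, 0, (4-H2)*E2/E3/2)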

Spreadsheet-fu. I honed some serious "if-then" formula skills in the process, an area of spreadsheet-fu weakness before this project. Despite the time it took me to figure out how to make the spreadsheet do what I wanted, I'm still pretty sure using it instead of calculating everything by hand saved me several hours. Plus, now I have another formula type under my belt.
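
For instance, an if-then formula along these lines (placeholder ranges once more) keeps a learning goal's score blank until its questions have actually been graded, instead of showing a zero or a division error:

    =IF(COUNTA(B2:B6)=0, "", SUM(B2:B6)/SUM(C2:C6))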

Final thoughts

Perhaps unsurprisingly, my predictions about which learning goals would be problematic for students on the exam were dead-on. They were the same learning goals students had struggled with most during the semester. There really weren't any surprises on the mid-term.

What, then, is the purpose of an exam in an SBG classroom? Exams are meant to assess how well students know the material that has been presented throughout the semester. However, if I am regularly assessing students' understanding of learning goals throughout the semester, is there any benefit to a final, summative exam? Most students' exam grades were eerily close to their grades for the rest of the semester.[2]

If we're doing SBG well, it seems to me the final exam is unnecessary. We should already have a good understanding of exactly what students know, so why bother with a big test at the end of the semester?

Should the exam in an SBG classroom be something entirely different from what we've traditionally thought of as an exam? Or should exams just be done away with?

_____

  1. At first I really balked at penalizing students for not completing the required long answer questions. However, after thinking about it for a bit, I came to the conclusion that a student's decision to skip one or more of the long answer questions was, at least to some degree, indicative of a lack of understanding of the content.
  2. On average, the exam grades were just a bit lower than grades for the rest of the semester. I can rationalize that in several ways: additional anxiety because it was an exam, a less-than-perfect exam design, and so on.