The goal of any exam, ideally, is to assess how much students have learned over the course of a semester or school year. When I changed the focus of grading in my classes from counting points to counting progress toward specific learning goals, I knew my exams needed to reflect that change as well.
This summer I had initially thought I might design some sort of alternate, performance-based exam that would mesh well with the tenets of standards-based grading. However, this year all exams for the same class were required to be exactly the same regardless of teacher. Since I'm currently one of four teachers who teach the 9th grade Integrated Science course and the only one using standards-based grading, I knew I had to take our common exam and make the best of it.
So, the exams had to have the same questions, but they didn't need to be in the exact same order, right? I reordered all the questions on the exam based on the learning goal they assessed.
This process uncovered several questions that didn't address any of the learning goals, so these "others" were grouped together to make their own section.
Overall, I wasn't thrilled with the exam, but I think it was about as good as it could be given the requirements it had to meet.
Assessment
Breaking the exam down into its component learning goals allowed me to assess each learning goal individually. Grading the exams this way took considerably longer, but it also provided me and my students with a wealth of information about their learning throughout the first semester.
I created a Google Spreadsheet that automatically calculated the individual scores for each learning goal and the overall exam grade. Once the grading was done, I shared each student's spreadsheet with them through Google Docs.
Below is an example of a filled-out scoresheet (and here's a blank calculation sheet if you're interested):
Details
Overall grades. You may notice I calculated two "overall" grades. I told students their overall grade on the exam would be the average of their scores on each learning goal (giving each learning goal equal weight), but I wasn't sure if that might result in some odd effects on the overall grade due to some flaw I hadn't planned for. As a check, I also calculated the exam's score "traditionally," or simply by dividing the total points earned by the total points possible. Interestingly, these two scores were almost always remarkably close to each other (for most students the difference was <1%). I'm not sure exactly what that means, but it was interesting nonetheless.
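The two calculations can be sketched in a few lines of Python. The learning goals and point values below are made-up placeholders, not the actual exam's; the sketch just illustrates the difference between the two methods.

```python
# Hypothetical scores: (points earned, points possible) per learning goal.
# These numbers are illustrative only, not from the actual exam.
scores = {
    "LG1": (18, 20),
    "LG2": (7, 10),
    "LG3": (12, 16),
    "LG4": (22, 24),
}

# Method 1: average of per-goal percentages (each goal weighted equally).
equal_weight = sum(e / p for e, p in scores.values()) / len(scores)

# Method 2: "traditional" grade, total points earned over total points possible.
total_earned = sum(e for e, _ in scores.values())
total_possible = sum(p for _, p in scores.values())
traditional = total_earned / total_possible

print(f"equal-weight: {equal_weight:.1%}, traditional: {traditional:.1%}")
```

The two grades diverge only when learning goals carry very different point totals: the traditional grade effectively weights each goal by the points assigned to it, while the averaged grade weights every goal equally. If each goal's share of the exam's points is roughly similar, the two numbers will land close together.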
Unfinished long answer questions. The exam had 6 long answer questions, and students were required to complete at least 4 of them. I had a few students who either skipped the long answer questions entirely or completed fewer than were required. It didn't make sense to penalize any one learning goal for not doing all the long answer questions (since skipping a long answer question didn't necessarily mean a student didn't understand the content it assessed). However, I felt that there should be some penalty for doing fewer than required1. As a result, I calculated what percentage one long answer question was of the entire exam and divided that by 2, which gave me 1.84% in this case. For each required long answer question that was not completed, I took 1.84% off the student's overall exam grade.
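That penalty rule can be sketched as follows. The point values here are hypothetical placeholders; only the resulting ~1.84% figure comes from the post.

```python
# Hypothetical point values; only the resulting ~1.84% penalty is from the post.
LONG_ANSWER_POINTS = 8     # assumed points per long answer question
EXAM_TOTAL_POINTS = 217    # assumed total points on the exam
REQUIRED_LONG_ANSWERS = 4

# One long answer question's share of the exam, halved, as a percentage.
PENALTY_PER_SKIP = (LONG_ANSWER_POINTS / EXAM_TOTAL_POINTS) / 2 * 100  # ~1.84

def apply_skip_penalty(overall_pct: float, completed: int) -> float:
    """Subtract the penalty once for each required long answer not completed."""
    skipped = max(0, REQUIRED_LONG_ANSWERS - completed)
    return overall_pct - skipped * PENALTY_PER_SKIP
```

For example, a student with an 85% who completed only 2 of the 4 required long answer questions would lose the penalty twice, landing a bit above 81%.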
Spreadsheet-fu. I honed some serious "if-then" formula skills in the process, an area of serious spreadsheet-fu weakness before this project. Despite the time it took to figure out how to make the spreadsheet do what I wanted, I'm still pretty sure using it instead of calculating everything by hand saved me several hours. Plus, now I have another formula type under my belt.
Final thoughts
Perhaps unsurprisingly, my predictions about what learning goals would be problematic for students on the exam were dead-on. They were the same learning goals that more students struggled with during the course of the semester. There really weren't any surprises on the mid-term.
What, then, is the purpose of an exam in an SBG classroom? Exams are meant to assess how well students know the material that has been presented throughout the semester. However, if I am regularly assessing students' understanding of learning goals throughout the semester, is there any benefit to a final, summative exam? Most students' exam grades were eerily close to their grades for the rest of the semester2.
If we're doing SBG well, it seems to me the final exam is unnecessary. We should already have a good understanding of exactly what students know, so why bother with a big test at the end of the semester?
Should the exam in an SBG classroom be something entirely different from what we've traditionally thought of as an exam? Or should exams just be done away with?
_____
- At first I really balked at penalizing students for not completing the required long answer questions. However, after thinking about it for a bit, I concluded that a student's decision to skip one or more of the long answer questions was, at least to some degree, indicative of a lack of understanding of the content. [↩]
- On average, the exam grades were just a bit lower than grades for the rest of the semester. I can rationalize that in several ways: additional anxiety because it was an exam, a less-than-perfect exam design, etc. [↩]
This really interests me. I have a couple of questions. How long was this test? Were MC questions worth more than one point each? Is this a physical science class (with an earth and space science component)?
It has always been my experience that longer exams (such as midterms and finals) tend to have lower scores than previous exams. Part of that is on the students and part of that I think is on us. We tend to want to make the exam as comprehensive as possible and we also don't want it to be a cakewalk so we make it a bit more rigorous than usual.
I had a personal crisis about finals a couple of years ago. I chucked them in favour of having each student identify the standards they were still working toward and come up with a plan for review focused on their specific deficiencies. (You can see the intervention materials and mini-lessons by following each link.) I did not require students who had met all the standards to take a final in my class.
At the bottom of all of this is "How long does a student have to retain something for us to say that they've learned it?" Not such an easy question to answer...and not one that necessarily makes the connection between a big test and learning.
@Steve: The exam was designed to fill the 2 hour exam period, and has 53 multiple choice questions, a large short answer section, and the long answer questions mentioned in the post. I made the MC questions worth 2 points each, since that seemed to make them a little more equitable with the value of the rest of the test.
The Integrated Science class is a mix of chemistry, astronomy, environmental science, and other physical sciences. You can see the district mandated standards for the course here.
In the past students have scored much lower on exams than they did throughout the rest of the semester, though this year the drop was much less. I'm hoping that's a positive data point for SBG, but at this point it's a data point of one, so it's hard to say.
@The Science Goddess: Whoa. Missed this comment in the queue somehow. I really think your "plan for review" makes a lot more sense than some cumulative final exam for all students regardless of their progress toward mastering the standards. It really hit me while writing this post that having a final exam doesn't make sense if we are consistently measuring mastery of the goals of the course. Of course, I know my SBG skills aren't yet where they should be, but I'm not sure a final exam is giving me any additional info I didn't have from other assessments.
Your question about "How long does a student have to retain something for us to say they've learned it?" is pretty tricky. When I first started SBG, students didn't feel they should be assessed on a standard more than once. They don't complain about it as much now, but I still get groans when we assess topics they feel we should be done with. Not sure what the answer is for this one. Definitely food for thought...
This is genius! I just found it and I'm using it with my AP Computer Science students. I will be using it with all my classes next semester - Thanks for sharing!!