Dear Skeptics' Guide: Standards aren't the solution

There's a widespread narrative regarding science education in the United States: It stinks. As a science educator, whenever I hear this two things happen. First, my hackles go up. Second, I realize that despite my hackles I generally agree. My hackles go up because I've spent a lot of time thinking about, planning, designing, and implementing a science curriculum that I feel has been pretty darn good. However, I recognize that the School System (I'm not picking on any one school district here, but on the entire system of schooling in this country) has not done a very good job of helping students think and act like scientists.

Recently, while listening to the Skeptics' Guide to the Universe podcast (#343),1 I had my hackles raised. They discussed a recent article on io9 titled "Your State Sucks at Science." This article discussed a report by the Thomas B. Fordham Institute that graded each state's science standards on their "Content & Rigor" and "Clarity and Specificity." The results, summarized on the map below, showed that the vast majority of states didn't do so well. In fact, they did terribly.

Grades for States on Science Standards.

OK, that information doesn't shock, surprise, or upset me. Connecticut earned a not-so-respectable "C." I'd probably give the standards I've worked with (9th grade Integrated Science) a lower grade. Many standards are overly broad. Others are ambiguous. I agree with the Skeptics' Guide, io9, and the Thomas B. Fordham Institute that improving these standards would be good for science education.

So, why are my hackles still raised? Well...during the Skeptics' Guide to the Universe (SGU) discussion on the sorry state of science education, the general view was that poor standards are the crux of the problem (followed by poor teachers- more on this later). It was stated that poor standards will cause teachers to fail their students more often than they would if states had good standards. As anecdotal evidence of this, Dr. Steven Novella noted that his daughter is receiving a sub-par science education at the Connecticut public school she attends. Dr. Novella specifically described his two big problems with his daughter's science instruction:

  1. Inquiry and scientific thinking are not taught well at all.
  2. Real science education doesn't even really begin until the 7th grade. In grade school, students get virtually nothing.

I generally agree with these assertions. What really bothered me, however, was the discussion of why these problems exist. Here are some quotes from the discussion:

  • "Teachers don't quite grasp how science works."
  • "When the standards fail the teachers, the teachers will more likely fail the students."

Can you see where this is going? They never come right out and say science education stinks because our science teachers stink, but that idea is hovering just beneath the surface. I readily admit there are science educators who don't quite grasp how science works and who don't do a great job of designing science instruction. However, I believe this is more of a systemic issue than an individual teacher issue. Let's look at Dr. Novella's two assertions again:

  1. Inquiry and scientific thinking are not taught well at all.
    • Education today is a high-stakes testing world. What's valued by our current schooling system is good scores on standardized tests, so effective teachers are labeled as those who help students earn good scores on standardized tests. However, it can be tricky to assess inquiry and scientific thinking. The best way to assess these skills is to observe students performing scientific inquiry (or at least look at a portfolio of student work) to gauge the level of sophistication in scientific inquiry and thinking the student possesses. So, let's look at how Connecticut assesses science: The Connecticut Mastery Test (given in grades 3-8) and the Connecticut Academic Performance Test (given to 10th graders) both assess "experimentation and the ability to use scientific reasoning to solve problems."2 The CAPT science test includes 60 multiple-choice and 5 open-response questions. In 5th grade, the CMT science test includes 36 multiple-choice and 3 open-response questions, and in the 8th grade edition there are 55 multiple-choice and 3 open-response questions.3 Multiple-choice questions- even well-designed items- are a shoddy way to measure inquiry. Even the open-response questions that require several sentences to answer aren't a very good measure. Yet this is the system of assessment we value, and it doesn't value inquiry, so why are we surprised when inquiry and scientific thinking take a backseat in the classroom? The problem doesn't start with the teachers; it starts with our method of assessment.
  2. Real science education doesn't even really begin until the 7th grade. In grade school, students get virtually nothing.
    • Again, let's look at what our schools value by looking at what they assess: The CMT is given to every 3rd through 8th grader attending Connecticut public schools. Every year from the third grade and on, students are assessed in mathematics and language arts. Only 8th graders took a science CMT through 2007. Starting in 2008 the state added a science CMT to the 5th grade as well. Why is science instruction getting the short end of the stick? Because we're not assessing it. The focus on math and language arts isn't a bad thing, but it means that subjects not being assessed are being pushed to the side. This isn't the fault of the teachers stuck in this system- it's the fault of the system itself.

What's the solution?

I am not advocating for giving more standardized science tests. I have no problem with improving our science standards. However, unless we change the current methods of assessment, I wouldn't expect to see much change. To learn scientific thinking and inquiry, students must be given time in class to explore ideas, rethink assumptions, and test their hypotheses. These things take a lot of class time- furthermore, they deserve a lot of class time. Having lots of well-written standards is generally a good thing, but it also means teachers are pressured to "cover" all the standards to the detriment of depth of understanding and student exploration.

Dear SGU, you are science educators yourselves, and I love most of what you do. However, I'd like you to think and talk more deeply about what good science education in schools looks like and whether that vision is being supported by the assessment methods employed by the states. A wise person once said, "What we assess defines what we value."4 I'd add, "How we assess defines what we value," as well. If we value inquiry and scientific thinking, our assessments should be more sophisticated- requiring students to actively demonstrate their understanding of how science works. These assessments would be expensive to design and implement, but they would more accurately reflect students' actual scientific knowledge and skills. It's not that I think the SGU hates teachers, but you do seem to be jumping on the political narrative that places undue blame for poor education practices on the shoulders of teachers instead of including the systemic forces that shape how and why teachers deliver instruction in the classroom.


  1. The discussion starts about 27 minutes into the episode and runs for 10 minutes on this topic. (back)
  2. See 2011 CAPT Interpretive Guide, p. 5. (back)
  3. Question information from the CAPT Program Overview 2012, p. 11, and Science CMT Handbook, p. 8. (back)
  4. This was a Grant Wiggins quote, I believe. (back)

3+ Quick- Birthday, (grading) scale matters, exposing climate fraud, debunking handbook

These aren't brand new items- they're things I came across a while ago and am just getting around to posting now. In addition, I realized that the anniversary of this blog just passed. My first post was published January 12, 2008. As I look back at my first posts, it's clear that I've come a long way (hopefully for the better)- in my location, in my career, and in my thinking. So, in celebration of the 4th anniversary of this blog, let me present the following interesting tidbits:

Scale matters (Rick Wormelli)

Thanks to the ActiveGrade blog for bringing this to my attention. I don't know how many times I've had discussions with other teachers about what constitutes fair and effective grading. Often the most heated topic (and one where I never made any headway) involved giving out zeroes for missing or poorly done classwork. Rick Wormelli gives a great explanation of why grading scales matter- and specifically why zeroes are no good. It's long for YouTube at 8+ minutes, but it's worth it:
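To see the arithmetic behind the argument, here's a quick back-of-the-envelope sketch (my own made-up numbers, not Wormelli's): on a 0-100 scale, a zero sits 60+ points below passing, so a single zero can drag an otherwise strong average down by a full letter grade or more, while a scale with a floor (say, 50 for missing work) keeps one bad data point from swamping everything else.

```python
# Illustrative only: how one zero distorts an average on a 0-100 scale.
scores = [85, 90, 88, 92]            # a solid B+/A- student so far

with_zero = scores + [0]             # one missing assignment scored as 0
with_floor = scores + [50]           # same assignment with a 50-point floor

def average(xs):
    return sum(xs) / len(xs)

print(average(scores))       # 88.75 -> B+
print(average(with_zero))    # 71.0  -> drops roughly two letter grades
print(average(with_floor))   # 81.0  -> still reflects mostly strong work
```

Four good grades and one zero average out to a C-ish 71, which is the heart of the "zeroes are no good" argument: the zero stops measuring what the student knows and starts dominating the scale.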

Exposing a climate science fraud (Ethan Siegel)

The post is ostensibly a takedown of Judith Curry's claims that recent studies and reports on climate change are "hiding the decline."1 However, the real appeal of this post (for me) is how effectively it describes how science and scientists work. Siegel goes through the data and the uncertainties in measurement, and explains exactly how scientists determine that some effect is real and not just a statistical fluke.

The Debunking Handbook (Skeptical Science)

Somewhat related, the Skeptical Science blog (one of the best places to find science-based information about climate science) released The Debunking Handbook a while ago and just recently updated it. The Handbook provides guidelines for communicating about misinformation and gives tips to avoid falling into common pitfalls. In their own words, "The Handbook explores the surprising fact that debunking myths can sometimes reinforce the myth in people's minds. Communicators need to be aware of the various backfire effects and how to avoid them..." The Handbook is a free PDF download available at their website.


  1. "Hiding the decline" is the (totally false) idea that climate scientists are tweaking their graphs to make it seem like the Earth is getting warmer, when it really has been cooling the last decade (which it hasn't). Read the full article for more details. (back)

Teachable Moments, for all of us

On Friday, when discussing the earthquake and tsunami that had just struck Japan, I remember saying to students, "It looks like the death toll will be in the hundreds, which is horrible but, considering the size of the earthquake, pretty low." As I write this,1 the official death toll is at 2,414 and expected to rise to perhaps as high as 10,000.2

Still image from a 1st person view of the tsunami

We've been discussing the earthquake and tsunami in class, though I haven't done much "educationalizing" of the disaster at this point. So far my M.O. has been to show some videos or pictures, give news updates of what's going on, and then have time for students to ask questions or just talk about what's going on. At some level I feel like trying to craft organized lessons about subduction zones, Moment Magnitude scales, tsunami generation, or nuclear power generation would be taking advantage of the disaster.

I want students to know what's going on in Japan. I want students to understand the details. That's why I show the videos, and why I spent a big chunk of time searching for videos and images that seemed to capture the disaster. And the fact is, students want to know about the earthquake and tsunami and the potential meltdowns at nuclear power plants. They want to know why tsunamis are so dangerous ("I don't get it, it's just water, right?"), what causes earthquakes ("I heard it was caused by the 'super moon.'3"), and how nuclear power plants work ("If there's an explosion at a nuclear power plant, how can it not be a nuclear explosion?").

The general public wants to know what's happening and why. Our students want to know what's happening and why. I want to know what's happening and why. However, I want student interest to drive our classroom learning about the disaster. I don't want to use the disaster to drag out a month of earthquake & tsunami lessons if the students aren't interested in learning more.4

I have been pleasantly surprised by the number of more "mainstream" media outlets doing some exemplary explaining about earthquakes, tsunamis, and nuclear reactors. I've been especially impressed with the time given to explaining how nuclear reactors work and what's going on at the Fukushima nuclear plants. Boing Boing did an excellent job describing how nuclear power plants work, and NHK World explained simply yet thoroughly what was happening at Fukushima.

These are the times when it seems very clear to me that a little scientific literacy (or at least a healthy dose of skepticism) is an extremely useful skill. There are quite a few bits of misinformation out there, but there are also a lot of quality explanations of the science behind the disaster.


  1. at 10:10pm EDT, March 14, 2011 (back)
  2. via (back)
  3. FYI, it wasn't. See here for an in-depth takedown of the super moon myth. (back)
  4. Yes, I get that following the state curriculum means I'm essentially forcing this same thing on students for most of the school year. My especially guilty feeling on these topics most likely derives from the fact that I'd feel like I'd be taking advantage of people's suffering simply as an educational hook. (back)