Worksheet labs aren't that great: Hooke's Law

In a recent post, I strongly suggested that a physics class should be a place where students are actively involved in exploring the relationships that exist between different variables (force and mass, for example)- not a place where students are simply handed a list of equations and told that these explain how the world works. Let's continue down this line with an example.

Example: Simple Harmonic Motion and Hooke's Law

This is a lab from a college class I took last semester:

Analysis

This lab isn't terrible. I mean, who doesn't like bouncing springs?


In the first part, we were required to find the spring constant by examining the relationship between the force applied to the spring and the spring's elongation using a graph. That's not too shabby, right? Well...no...but...

What the lab doesn't require is any thinking about the relationship between force and elongation. You make a nice graph, but are told right in the instructions that the slope of the graph is this thing called the "spring constant." We aren't expected to understand anything more about how force, elongation, and the spring constant are related.

In part two, we varied the mass on the spring and measured the period of the spring's oscillation, which we then compared to the expected period based upon our calculations and a formula we were given ahead of time:

T = 2 \pi \sqrt{ \dfrac{m}{k}}

I didn't need to know much to write up the lab report:

  1. The period of a spring's oscillation depends on the mass attached to the spring.
  2. The formula we were given to find the period of a spring's oscillation works.

That's it. If I were an astute student I might've realized that the slope of a Force-Elongation graph will give you the spring constant- but we were walked through that step in such a way that it would have been easy to miss that tidbit. Never mind understanding what having a larger or smaller spring constant would mean in real life.

Rethinking the lab

So now you're thinking that I'm just a cranky-pants who likes pointing out the failings of other people's labs. Let me try to improve your perception of me by explaining how I'd like to run a lab covering the same content.

First, I think it's important to identify what I want students to understand as a result of completing this activity. I'd like them to understand:

  1. The nature of the relationship between the force applied to a spring and the spring's elongation.
  2. The slope of a Force-Elongation plot is the "spring constant."
  3. The nature of the relationship between the mass hanging on a spring and the spring's oscillation period.

Second, I want the students to be the primary investigators. I'm not going to give them a sheet explaining step by step exactly what they have to do. I want the students to handle that part. Maybe I give each group of students a few springs and a set of masses and simply set them free to play around and make observations for 10 minutes or so- after which we discuss as a class the observations they've made and decide upon a path for further investigation. Maybe I give some guidance right away and tell them to investigate the relationship between the mass on the spring and the elongation of the spring.

Third, we draw some Force-Elongation graphs. We discuss the relationship between force and spring elongation (it should be pretty obvious it's a direct linear proportionality- i.e., if you double the force on the spring, you double its elongation). So now we know that F \propto x. Next, we look at the difference in the graphs for each spring. Why are some lines steeper than others? What is the difference between a spring with a steep slope and a spring with a more gradual slope? Then I'd explain that the slope of a Force-Elongation graph is called the "spring constant." So now we've figured out that if we know the force acting on a spring and that spring's spring constant, we can figure out how much the spring will stretch: F=kx. Hey...that looks an awful lot like Hooke's Law...
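To make the slope-finding step concrete, here's a minimal sketch of the fit a group would be doing. The measurements are made up for illustration- real numbers would come from hanging known masses on the spring and measuring the stretch:

```python
# Sketch: recovering a spring constant from force-elongation data.
# The data below are invented illustration values, not real lab data.
import numpy as np

force = np.array([0.98, 1.96, 2.94, 3.92, 4.90])            # N (100 g increments)
elongation = np.array([0.040, 0.080, 0.121, 0.159, 0.201])  # m

# Linear fit through the data; the slope of F vs. x is the spring constant k.
k, intercept = np.polyfit(elongation, force, 1)
print(f"spring constant k = {k:.1f} N/m")
```

A near-zero intercept is a nice sanity check, too- an unstretched spring should feel no force.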

Fourth, I'd play this video clip:

Fifth, I'd tell students to investigate the relationship between the amount of mass on a spring and the period of the spring's oscillation. We'd collect data, make some graphs, and hopefully come to the conclusion that T \propto \sqrt{m}.
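For anyone who wants to check that conclusion numerically instead of eyeballing graphs: if T \propto \sqrt{m}, a log-log plot of period vs. mass should be a line with slope 1/2. The periods below were generated from the formula itself with a hypothetical k = 24 N/m, standing in for a group's real measurements:

```python
# Sketch: testing the form of the mass-period relationship.
# Periods generated from T = 2*pi*sqrt(m/k) with an assumed k = 24 N/m.
import numpy as np

mass = np.array([0.10, 0.20, 0.30, 0.40, 0.50])          # kg
period = np.array([0.406, 0.574, 0.703, 0.812, 0.907])   # s

# Slope of log(T) vs. log(m) reveals the power in T ~ m^p.
slope, _ = np.polyfit(np.log(mass), np.log(period), 1)
print(f"log-log slope = {slope:.2f}")  # near 0.5 suggests T proportional to sqrt(m)
```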

If we stop here, we've already done a lot. We've discovered Hooke's Law. We understand a stiffer spring has a bigger spring constant. We know how doubling the mass on a spring will affect the spring's oscillation. At this point I could introduce the equation T = 2 \pi \sqrt{ \frac{m}{k}}. Maybe we could then do the second part of the lab posted above and see how closely the observed periods of the springs match the values calculated with that formula. We'd probably notice all of our observed periods were off by a little bit. This opens up a discussion of why we all have this systematic error. Why are we all off? What could be off? Looking at the formula, there are really only two places we could have error: the spring constant or the mass. Maybe we draw a free-body diagram for the mass on the spring. At this point a student will probably suggest we need to draw a free-body diagram for the spring as well. Hmm...you know...this spring has mass too...could the mass of the spring itself be affecting the spring's period? Now we've independently figured out we need to consider the spring's mass as well. From there we could figure out a test to determine how much of the spring's mass we need to include.
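As a sketch of where that discussion could land: a common textbook result (for a uniform spring) is that roughly one-third of the spring's own mass effectively oscillates along with the load. The numbers below are made-up illustration values, not data from the lab:

```python
# Sketch: how including the spring's own mass nudges the predicted period.
# Textbook result for a uniform spring: effective mass ~ m + m_spring/3.
# All values below are assumed for illustration.
import math

k = 24.0          # N/m, spring constant (assumed)
m = 0.200         # kg, hanging mass
m_spring = 0.030  # kg, mass of the spring itself

t_naive = 2 * math.pi * math.sqrt(m / k)
t_corrected = 2 * math.pi * math.sqrt((m + m_spring / 3) / k)
print(f"naive:     {t_naive:.3f} s")
print(f"corrected: {t_corrected:.3f} s")  # slightly longer, matching the observed offset
```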

Overcoming the traditional lab format

If you randomly visited physics courses in high schools and colleges across the nation, you'd most likely see a lot of labs similar to the first lab. Traditionally, physics labs have been designed so you're given a formula and asked to make observations that fit the formula. This is despite the fact that the student-led investigation requires deeper thinking and encourages greater engagement with the concepts, a better understanding of how the world works, and an understanding of what an equation actually means.

Why should this be so? I believe it's because traditional labs are easy. Print out a sheet with a step by step procedure. Hand out the supplies. Make some measurements. Maybe make a graph. Answer a couple quick questions. Done. The student-led investigation is trickier to share and explain. The entire process I described in the student-led investigation could be performed without any worksheets whatsoever. It's harder for teachers looking for a new lab to stumble on a description of this type of lab. It's really easy to hit up The Google and find a lab handout, save it, print it, and pass it out. Student-led investigations also lead to potential student errors. Students may struggle. It may take more class time. Sometimes you'll get data that doesn't turn out as well as you'd like. This can be scary and frustrating for teachers. And yet...

Struggling with what this or that graph is telling us, or being forced to think about where errors came from, or having to defend your results and process requires a lot of thinking. Critical thinking. And helping students learn to think critically is worth the extra time and effort. As a bonus, they'll also actually understand the physics better, which is also a good thing in a physics class. 🙂

3+ Quick- Birthday, (grading) scale matters, exposing climate fraud, debunking handbook

These aren't brand new items, as they're things I came across a while ago and am just getting around to posting now. In addition, I realized that the anniversary of this blog just passed. My first post was published January 12, 2008. As I look back at my first posts, it's clear that I've come a long way (hopefully for the better)- in my location, in my career, and in my thinking. So, in celebration of the 4th anniversary of this blog, let me present you with the following interesting tidbits:

Scale matters (Rick Wormeli)


Thanks to the ActiveGrade blog for bringing this to my attention. I don't know how many times I've had discussions with other teachers on the topic of what constitutes fair and effective grading. Often the most heated topic (where I never made any headway) involved the giving out of zeroes for either missing or poorly done classwork. Rick Wormeli gives a great explanation of why grading scales matter- and specifically why zeroes are no good. It's long for YouTube at 8+ minutes, but it's worth it:

Exposing a climate science fraud (Ethan Siegel)


The post is ostensibly a takedown of Judith Curry's claims that recent studies and reports on the topic of climate change are "hiding the decline1." However, the real appeal of this post (for me) is how effectively it describes how science and scientists work. He goes through the data and the uncertainties in measurement, and explains exactly how scientists determine that some effect is real and not just a statistical fluke.

The Debunking Handbook (Skeptical Science)


Somewhat related, the Skeptical Science blog (one of the best places to find science-based information about climate science) released The Debunking Handbook a while ago and just recently updated it. The Handbook provides guidelines for communicating about misinformation and gives tips to avoid falling into common pitfalls. In their own words, "The Handbook explores the surprising fact that debunking myths can sometimes reinforce the myth in people's minds. Communicators need to be aware of the various backfire effects and how to avoid them..." The handbook is a free PDF download available at their website.

______________________________

  1. "Hiding the decline" is the (totally false) idea that climate scientists are tweaking their graphs to make it seem like the Earth is getting warmer, when it really has been cooling the last decade (which it hasn't). Read the full article for more details. (back)

What is the purpose of Physics class?

I took three physics classes through a local community college last semester. From how the content was presented in each class, it would be fair to say Physics is primarily concerned with learning a set of equations and then figuring out which equation you need to use in order to find the right answer.

This is not a very useful skill. People wiser than I have pointed out similar things. So why do high school and introductory college physics classes lean so heavily on "learning the formulas"? Here are the two arguments I've heard most often:

They'll need it in college/their careers

It could be argued, perhaps, that it is good preparation for students who will be pursuing engineering or scientific careers- after all, they'll be taking college and graduate classes and will probably use a couple equations during their careers. However, there's a big problem with this line of thinking. Are all the students in a high school physics class there because they're planning on becoming scientists and engineers? A few, maybe. Most of them will not- and that's OK, but this realization should cause us to rethink how we present the material.

The equations explain the relationship between variables

I'm sympathetic towards this line of thinking (more on this later)- but not enough to think it's valid. Whenever I hear this argument the first question that comes to mind is "Is this the best way to explore those relationships?" In my experience, students who struggled to understand physics often did so because they couldn't make sense of what the equations actually describe. Given an equation and all the variables but one, they'd be able to work through a problem, but they didn't understand why the answer made sense, and any further obfuscation of the problem quickly threw them off track. I agree that the relationship between variables is an important bit. I don't believe that equations clarify that relationship for the vast majority of students.

How I'd like to teach physics

Understanding the relationship between variables, in my mind, is the key to a useful understanding of physics. If I push twice as hard on this shopping cart, what happens to the cart's acceleration? That's a tangible situation that is easier to understand than simply throwing out F = ma and hoping students figure out that relationship on their own. Further, students should discover these relationships. Give students some equipment and tools and have them measure what happens to an object's acceleration as they apply more or less force on the object (some tracking software would be really handy for this). Then have them apply the same force but change up the mass. Chances are pretty good they'll be able to discover F = ma on their own. Chances are they'll have a much better conceptual understanding of what F = ma means at this point than if you simply gave them the equation and had them do some problems. Or if you simply had them prove the formula is correct in a lab.
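As a rough sketch of what that analysis could look like with numbers (the forces and accelerations below are invented, standing in for tracking-software measurements): with the mass held fixed, a graph of acceleration vs. applied force should be a line whose slope is 1/m.

```python
# Sketch: "discovering" F = ma from cart data. With mass fixed,
# acceleration is proportional to force, and the slope of a vs. F is 1/m.
# All measurements below are hypothetical illustration values.
import numpy as np

force = np.array([0.5, 1.0, 1.5, 2.0, 2.5])       # N, applied force
accel = np.array([0.99, 2.02, 2.97, 4.05, 4.98])  # m/s^2, measured acceleration

slope, _ = np.polyfit(force, accel, 1)
print(f"inferred cart mass = {1 / slope:.2f} kg")  # a = F/m, so slope = 1/m
```

Repeating the fit with different cart masses would let students see the slope change in exactly the way F = ma predicts.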

Why it matters

  1. I believe the focus on relationships promotes a better conceptual understanding of physics- the students can more effectively internalize the way the world around them works. A populace with a healthy baseline of physics knowledge could prevent silly and potentially harmful pseudoscience such as magnet therapy from becoming an issue.
  2. There's been a focus on increasing interest in STEM careers- and a special focus on recruiting women and minorities into STEM fields (see this White House press release). An equation-focused physics curriculum can seem intimidating to students. A collaborative, constructivist approach can be perceived as less intimidating and more welcoming (I'd recommend giving Episode 32 of the Shifted Learning podcast a listen for some interesting bits on gender issues in STEM).

Modeling Instruction

I don't have much experience with Modeling Instruction, but from my reading it seems that the instruction I've been describing is essentially what it is. As a bonus, it's a well-developed, well-researched, and widely used instructional method for improving students' ability to construct a better understanding of the physical world around them.

If you're interested, I've found both Kelly O'Shea's series on model building and Frank Noschese's primer to modeling instruction to be great resources. Check their blogrolls for even more good stuff from teachers using modeling.

As I look forward to potentially teaching physics next year I want students who take my classes to come out with a lasting understanding of the topic. I don't want them to half-heartedly memorize equations that they'll forget two weeks after we finish a unit. I'd like to teach for all the students, not just the future scientists and engineers.

Book Review: The Students Are Watching

The Students Are Watching- Schools and the Moral Contract

Somewhere along my journey of teaching, I realized I had started paying an awful lot of attention to more than just the content of what I was saying and having students do in class. I was paying attention to the messages and values communicated through the classroom rules, routines, and activities I was designing for the students. I started to purposefully promote particular values in my classroom.

There was no singular moment (that I can recall) where I decided to align the happenings of my classroom with values that I felt were important. Yet now I find myself thinking (perhaps too much) about what the classroom structures and routines are really telling students. Do they emphasize fairness? Do they treat students as valuable individuals?

"What does it tell students when we make them sign in and out to use the bathroom? That we feel they're trustworthy? That we assume they're going to abuse the privilege? Do the safety and security benefits from having a record of students out of the classroom outweigh the implicit message to students that we don't trust them? How does a school community decide upon these routines?"

Similar issues are discussed with greater clarity, insight, and detail by Nancy and Ted Sizer in their book, The Students Are Watching: Schools and the Moral Contract. They each look at the routines and rituals of a school through the lens of common verbs that happen in all schools: Modeling, Grappling, Bluffing, etc. Throughout the book it is argued that we (as individual educators as well as school communities) need to think through how we model or grapple or bluff. We are teaching students about what things we value- whether we've taken the effort as a community to design our routines to closely match our values or not.

In my own experience, I've found schools will pay lip service to values- such as treating every student as an individual- while sadly failing to provide structures that allow students to be known as individuals. The book doesn't condemn these schools and those that work in them as hypocrites or incompetents. Instead, it points out that rules and routines are often well intended, but without specifically thinking through the procedures (and including students and parents in the decision making process) we often fall to a default mode of that which is easiest. However, the easiest routines usually put the adults' needs ahead of the students' or allow a subgroup of students to get lost in the system.

I found The Students Are Watching a challenging read. I often would stop part way through a passage and think through my own practices and how I might improve them. It doesn't purport to provide a silver bullet to solve all of a school's problems, but it does provide a tangible framework for thinking more carefully about what values our schools are actually promoting- and whether those values are the values we really want to be promoting.

Go read this book. If you need a copy, I have one I'm no longer reading. I'd be happy to send it along if you're interested- as long as you don't mind some of my messy writing in the margins (see below).

[Update]: This article on The Slacktivist is a sad example of a school's decision making process teaching the students and community about the real moral values of the school. In this case it's further exaggerated due to the school being a religious school.

Critiquing the CAPSS Recommendations for School Reform

I want to make my classroom the best learning environment possible. Most of my posts on this site focus on lessons, assessments, or ideas on how to improve the learning environment inside my classroom. Improving our individual teaching craft is one of the easiest places (not to say it's necessarily easy) as a teacher to effect change.

However, as I've worked towards improving what happens in my classroom I've frequently run into obstacles. These obstacles were primarily exterior to my classroom. Sometimes they were school or district policies, sometimes national or state requirements, and sometimes they were the result of how we, as a culture, have historically structured this thing we call "school." Most of these policies and structures were created with good intentions in an attempt to improve our schools and our children's education.

Given my generally negative experiences with "traditional1" instructional models and structures, I've found myself more and more interested in systemic school reforms. How can we create modern schools and structures that leverage the advancements in technology and access to information to provide students with an education that prepares them to be active participants in our nation's democracy, economy, and society?

It was no surprise when an editorial in our local paper titled Major Restructuring Recommended for Schools caught my eye. In it, the author briefly describes the Connecticut Association of Public School Superintendents' (CAPSS) new report, "Recommendations for Transformation," a list of recommendations to transform the state education system "so it is able to meet the needs of students in the future." Naturally, I downloaded, read, and critiqued the full 36-page report (here's the official download link [pdf file], here's a version with my commentary [pdf file]).

My critique of the CAPSS recommendations

The report includes 134 individual recommendations for action across ten broad categories. I won't go into them all. Instead I'll give a brief breakdown of each broad category and get more specific around recommendations of particular interest.

The tl;dr version

This is a long article. For those of you thinking, "I can't read this whole thing. There is too much," let me sum up. Speaking in sweeping generalities, I applaud the CAPSS recommendations. In many ways the recommendations are progressive, forward-thinking, and focus on the best interests of students instead of on things that would be easy to implement or get through the political process. Recommendations such as competency-based advancement, standards-based assessments, and integrating out-of-school learning experiences into the formal education process suggest that CAPSS is interested in totally reworking what we mean by "school." This makes me happy. Too often reform movements are limited by the inertia of history and that-which-already-exists. CAPSS is clearly trying to overcome this inertia. Schools that followed the recommendations in the report could be student-centered environments that have a laser-like focus on student learning, support and integrate learning experiences that occur outside the classroom, remove conventions of little educational value (e.g. letter grades, traditional homework, and adult-friendly-but-child-poor assessments), and make schools an intrinsic part of their community.

And yet CAPSS puzzlingly makes recommendations that would make schools larger, less personal, and less a part of their community. Consolidating districts might save some money- which is an important consideration- but this seems to fly in the face of entire other sections of this report (for example, Section 2: Make it Personal; Section 4: Retool Assessments & Accountability; Section 8: Involve Students & Parents). Creating fiscally sustainable school districts is important, but eliminating small community schools in favor of large regional schools fosters a disconnect between schools and their communities, lets students skate through school unknown by their teachers, and makes for an overall less personalized educational experience. Furthermore, many recommendations are so general that they're simply platitudes without any real meat to them (i.e. "Engage parents as partners in their children's education."). More detail and explanation are needed as to exactly what many recommendations are actually recommending. Lastly, how about some references? Surely (hopefully) the CAPSS group that created the report relied on more than the four citations included in this report- three of which are statistics on current educational practices. Nowhere do they cite sources to support their positions- either in this report, on their website, or in any other report provided at their website.

I think CAPSS took a step in a positive direction by making many forward-thinking recommendations for the future of education in Connecticut. While none of these recommendations are binding, it heartens me to see an organization of this sort making progressive recommendations. It gives me hope there will be enough momentum to effect some real and positive educational reform in the near term. However, portions of the report conflict with the overall progressive theme- pointing toward a deep hesitation about the large- and, in my opinion, needed- education reforms.

If you'd like a more detailed breakdown of the 10 categories of recommendations made in the CAPSS report, read on!

1. Raise the Bar

There are essentially two recommendations here: (1) Create "ambitious, focused, and cohesive" education standards, and (2) provide a system that measures student learning and promotes students through school based on content mastery instead of seat time.

  1. Standards. Question: There already are state education standards- how are these standards different? Are they different from the Common Core Standards? Further, the recommendations specifically focus on standards for "college and career readiness." Those are important goals, but I'd also like them to focus on helping students become effective participants in a democracy. On the whole I'm skeptical of the standardization movement. The report spends a lot of time recommending greater flexibility. In my experience standards tend to inhibit flexibility. Have students who are really interested in a topic not included in the standards? Sorry, no time for that- it's not in the standards.
  2. Content mastery. This is one of those bold recommendations that I love. Essentially, they support the idea that as soon as a student shows mastery of a topic they can move on to a new topic. 13 years in a classroom does not necessarily make an education. In this model, students would be able to advance more quickly or more slowly depending on their individual content mastery- they wouldn't have to wait until the end of the year to move on to the next topic. This is essentially standards-based grading on systemic steroids. However, they fall short on proposing what school would look like under this system. How would mastery be determined? How does it impact the organization of classes at schools? These are big questions that need serious answers for these recommendations to be taken seriously.

2. Make it Personal

This thread focuses on creating student-centric learning environments. Of any of the 10 sections, I like these recommendations the most. The two main ideas in this section:

  1. Advance students based on mastery. This restates some ideas from the last section. I still like it. They're still vague on details, offering only, "Establish flexible work schedules," and "Allow credits to be awarded based on mastery." I have a hard time visualizing how this would work in reality, but perhaps that's because I've spent the last 27 years in the existing system. I'm worried by the recommendation to develop a variety of assessments and projects to allow students to demonstrate mastery. This sounds like they'd be state-standardized affairs, which if they're anything like existing state-standardized activities, would be horrible. These should be developed locally (while being shared publicly for other educators) based on individual student needs.
  2. Flexible learning environments. Yes. Please recognize that plenty of valuable learning takes place outside school. The integration of this informal learning with our formal education is much needed. This should go beyond counting a family trip to the Grand Canyon as an educational experience. If a student can diagnose and fix a car's electrical system, spending three weeks in a classroom learning about basic series and parallel circuits is a waste of their time. Schools should partner with and validate our students' out of school educational experiences.

3. Start with Early Childhood

This isn't my area of expertise, but I think the proposal to provide quality preschool for all children starting at the age of three is one of the biggest no-brainers in education reform. The payoff to society doesn't manifest for nearly two decades, but there is a wealth of research suggesting preschool is a very good thing. I have some concerns with recommendations like "Develop a system of accountability for providing language-rich, challenging, developmentally appropriate and engaging reading and mathematics curricula." The focus on reading and math smacks of No Child Left Behind, and suggests an emphasis on tightly structured learning environments. In the words of Alfie Kohn:

...the results are striking for their consistent message that a tightly structured, traditionally academic model for young children provides virtually no lasting benefits and proves to be potentially harmful in many respects.

4. Retool Assessments and Accountability

Now we're getting into some meat. The CAPSS report suggests standardized testing should be de-emphasized. I'd be willing to bet they'd suggest eliminating standardized tests as we know them were it not for the current national education environment. Props to them for that.

Here's a selected summary of their suggestions: (1) Provide a variety of assessment formats, (2) Assess students as they're ready to be assessed (instead of everyone at the same time), (3) Get assessment results back to students & teachers quickly so they inform instruction, and (4) Make the goals of all assessment transparent. It seems like they're saying one thing here. Yup, it's Standards-Based Grading.

In fact, they do mention SBG by name in this section, but they recommend making it "part of assessments." I'm a fan of SBG (as evidenced by previous posts), and I think this is a stellar recommendation.

I do have some hesitations with their recommendations, despite their SBG-like nature. For one, it's pretty clear from the language used they're not discussing day-to-day classroom assessment. They're discussing a new form for state standardized2 tests. I'm unclear on what this would look like, but it does sound like an improvement over the current system, though I'm skeptical it would come to pass in this improved manner. Another hesitation rests on the description of incentives for high performing schools. The report clearly recommends moving away from punitive measures, yet in my mind, providing incentives to high-performing schools is nearly indistinguishable from punitive measures against low-performing schools. Finally, the report lists subject areas for "base academic accountability." I take that to mean, "These are the subjects that will be assessed," or perhaps more clearly, "These are the subjects we think are important (things that are valued are assessed)." Notably absent are the arts and physical education- meaning the cuts to art and phys. ed. programs we see happening today are likely to continue were these measures put into place.

5. Offer More Options and Choices

Or, the section with the title that most poorly represents its contents. A better section title? "Consolidate School Districts." Their basic argument seems to be that having the current (supposedly high number of) 165 Connecticut districts makes it difficult to align state and local initiatives, is economically inefficient, and fosters racial and ethnic isolation. While I agree that you can save some money by consolidating services like busing or food service, you also lose a connection with the community when the district encompasses many, many communities. Having worked in both small and large districts, I found the small district was much more connected to and valued by the community3. It may be more expensive to have small community districts- and that's not a small obstacle- but it would be worth it. It should be noted that reworking the state education system in the manner recommended by this report would also be expensive. In addition, smaller districts would help schools be more flexible, personal, and transparent. Those adjectives would be a fair summary of the recommendations of this entire report, so why include this section?4

6. Reform Leadership

This section makes a lot of recommendations about the relationship between the State Department of Education and the Commissioner of Education as well as the roles of school boards and superintendents. That's a little bit outside my area of expertise, but I do like this statement from the introduction to the section:

Currently, organization and policy making for education are based on bureaucratic assumptions of hierarchy, centralized decision making, standardization and inspection. These characteristics limit individual discretion, depress creativity and foster stasis, not change.

That certainly describes my experience teaching in Connecticut. Despite completing my Master's in Secondary Education project by designing and implementing a student-centric, student-driven project5, I was told I couldn't continue the project unless all the science teachers wanted to use it. That's not exactly how one fosters innovation and creativity...

7. Boost Quality

This is a huge section with 26 recommendations for action ranging from incentives for attracting quality teachers, to improving teacher education and professional development, to revamping teacher tenure as we know it. I'm going to limit my analysis to the recommendations for professional development and teacher evaluation. I think restructuring the current tenure system is a major issue that deserves discussion, but that'll have to happen in another post so it doesn't turn this already lengthy review into a ridiculously long review.

  1. Professional development for teachers.
    • The report (rightly, in my opinion) makes many recommendations related to preparing pre-service teachers and helping new teachers grow as educators. One of my favorite recommendations suggests structuring a teacher's first year in the classroom as an internship with regular coaching and mentoring by master teachers. If it were up to me, I'd have new teachers carry half of a teaching load, giving them plenty of time during the day to observe other teachers, review and revamp instruction and assessment with a mentor, and generally work to improve their craft. Likewise, the mentors should have a reduced teaching load so they have time to both observe and meet with their mentees during the school day. The current system where exactly zero time is allocated for new teachers to review and reflect on their time in the classroom is a horrible model if we want new teachers to show improvement.
    • A second recommendation states that districts should provide professional learning opportunities for teachers as a part of their regular job- and schedules should be configured to give teachers time to collaborate with their peers. Again, I agree. If you value professional learning and improvement, you should schedule time for it- not make it only something teachers do on their own time (which most do, but it's such a valuable thing schools should be purposefully providing opportunities for their teachers). However, a word of warning: I've taught in a school where the schedule was changed to provide teachers with 70 minutes of "collaboration time" each week. Teachers (including myself) were genuinely excited for this time to share lessons, have quick professional development sessions, and critique instruction and assessment. Instead, it was mandated from above that the "collaboration time" be used solely to analyze student standardized test-prep results. While I understand the importance of standardized tests in our current system, the cost was the loss of time for teachers to share their expertise with each other, learn how to effectively integrate technology, and design cross-curricular projects- all things teachers were excited to use that time to do. The moral of the story is that simply having collaboration time in the schedule doesn't mean it's being used effectively.
  2. Teacher evaluation. As it is, the teacher evaluation system as I've known it is in need of reform. Last year I was observed by an administrator three times- each observation lasting approximately 70 minutes. Outside these official observations, administrators spent about 30 minutes in my classroom throughout the year. Okay, so that's a total of 240 minutes of observation for the entire school year by those who evaluate my performance. For some perspective, I taught four 70 minute classes each school day, and there are 180 school days per school year. That works out to 50,400 minutes of instructional time each school year. My evaluations were based on 240 out of those 50,400 minutes, or 0.48% of the total instructional time. It makes me nervous to think I'm being evaluated from such a position of ignorance6. The recommendations by the CAPSS include creating a standards-based evaluation system with regular performance reviews and including peer review as part of the performance review. As long as "regular performance reviews" includes frequent, informal observations by evaluators and "including peer review" can be expanded to provide students and parents a voice in the evaluation, then I think the recommendations are on track.
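The percentage above is easy to double-check- here's the arithmetic from my own schedule reproduced in a few lines of Python:

```python
# Observed minutes: three 70-minute formal observations plus ~30 informal minutes.
observed = 3 * 70 + 30          # 240 minutes

# Total instructional minutes: four 70-minute classes, 180 school days.
instructional = 4 * 70 * 180    # 50,400 minutes

fraction = observed / instructional
print(f"{observed} of {instructional} minutes = {fraction:.2%}")
# → 240 of 50400 minutes = 0.48%
```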

8. Involve Students and Parents

Schools give a lot of lip service to including parents and students in the education process. I've never been part of a school that has done a good job of this. I've known teachers who were really good individually at involving parents in their classrooms and other teachers who provided students a large voice in their own education. Beyond the classroom level, the furthest extent I've seen a district or (high) school involve parents is to invite them to serve on committees with little influence that meet at times untenable for most working adults' schedules.

I have no problems with the recommendations in the CAPSS report... other, of course, than the fact that they're so non-specific that they're just platitudes: "Engage parents as partners in their children's education," or "Create structures that encourage family involvement." Yes, those are good things- but what suggestions do you have for how to do these things?

Let me offer a few quick suggestions.

  1. Use technology to make learning and school happenings more transparent. How? Have administrators start a blog or create an online newsletter that is updated regularly to share the goings-on at the school. Share a photo a day. Invite teachers and students to do the same. Let students share their learning and reflections through student blogs (or evening events where students show off projects, etc.). In my mind, these things are the low-hanging fruit- they're easy to implement and can cost nothing (depending on the tools used).
  2. Form collaborations with people in the community. Examples?
    • Maybe you have an assisted living community near the school. That's a community with a huge amount of knowledge, skill, and disposable time. Provide transportation to retirees so they can read, mentor, advise, or provide academic support to students.
    • Create a community garden on school grounds that "rents" plots to community members. Have students run the administration and marketing of the community garden. Sell the fruits (& vegetables) of the garden's labor at a farmer's market in the school parking lot on the weekends.
    • Start a hackerspace in the school for the community. Students in classes such as design, computer science, engineering, or any other class where they need to build stuff could be given free memberships, and all other students could become members at discounted rates. Hackerspace members could access the space all day. Let advanced students lead workshops for community members.

    Ideas like these take more effort and money- but in the end the rewards may pay for themselves. In essence, make the school a community learning center and let the community share its skills and knowledge with the students and vice versa.

9. Leverage Technology

This section is surprisingly short (considering the topic), and the recommendations focus on two main ideas:

  • Students and educators should have access to educational resources at any time. While they don't quite recommend making broadband internet access a universal right, they do hint at it. I'd agree- though I'm not sure how that gets implemented. The inexpensive computers available today make computer ownership possible for even quite poor families. Paying $30-$50/month for internet access is much less likely to fit into tiny budgets. I also like the recommendation to "leverage online environments [...] for two-way communication, feedback, and collaboration..." Those environments are widely used today (in the form of social network sites), but more often than not they're blocked by the schools themselves. It'd be nice to see schools embracing the power of these tools instead of hiding from them.
  • Keep the technology infrastructure up to date. Of course I agree with this, but it's a matter of money. Even though reasonably powerful computers are becoming less and less expensive, it's still a major cost. I'd like to see schools use free and open source software (Open Office instead of Microsoft Office, for instance) or free resources such as Google Apps for Education. These would help keep software costs down and allow for money to be allocated more wisely elsewhere.

10. Continue the Transformation Process

The report makes suggestions on how to avoid reform stagnation at both the state and district level. Several of the recommendations focus on items like changing statutes or education budgets. I don't have too much of an opinion on these items (due to my own relative ignorance on the topics more than anything else). However, two of the recommendations contain a similar idea that I find extremely attractive. Essentially, they say: Let innovators innovate. One suggests districts can receive waivers for state statutes and regulations to experiment with new ideas to improve student learning. The second recommends providing systems for teachers and principals to experiment with innovative practices.

If you let smart people do creative things- even if those things are outside the state's or school's "mandates"- you'll end up with a ton of great ideas that help everyone in the end (see: Google's 20% time). Instead of alienating smart people and ultimately driving them out of the education sector, you'd be empowering them and attracting more innovation.

______________________________

  1. There isn't a single good definition for what I mean here, but think of the stereotypical adult-centric school or classroom. (back)
  2. Clearly the assessments would be less standardized than the existing Connecticut Academic Performance Test or Connecticut Mastery Test, but they'd still be the state standard. (back)
  3. I admit this could simply be due to specific situations in each respective district, but after hearing and reading about other people's similar experiences, it seems to be a fair generalization. (back)
  4. For a smart person's perspective on this matter, let me recommend Deborah Meier's article, As Though They Owned the Place: Small Schools as Membership Communities (pdf alert). (back)
  5. That, to toot my own horn, was nominated for a Scholar of Excellence award by my advisor. (back)
  6. I readily admit any administrator worth their salt talks to students regularly and knows more about what goes on inside the classroom than simply what they see when they're personally in the classroom. I still think 0.48% is a pretty sorry basis for an evaluation. (back)

Missing school

This morning I volunteered to help out one of my physics instructors with an activity on fiber optics at a local high school. I had to skip one of my classes this morning in order to volunteer, but I'm killing it in that class and I haven't been in a high school classroom in a long time. I've been out of the classroom since the middle of June, 2011. Okay, okay, so that's not even 6 months, but the 60ish minutes I spent in a high school today reminded me how much I miss it.

I miss the mental gymnastics of devising solid lessons and activities. I really miss the relationships with students. After playing a minor part in a classroom for an hour this morning I wanted to stay the rest of the day observing teachers, tweaking lessons, and talking to students.

My life is much less stressful this year (despite the lack of a salary). I go to class. I do my homework. I come home to more free time than I've had at any point in the last 9 years.

And yet, I miss teaching1. It's a good gig.

______________________________

  1. I'll be back teaching next year (assuming I get a job). In fact if you know anyone hiring a physics/earth science/chemistry teacher next year...(back)

Pipe Insulation Roller Coaster Assessment

Welcome back. If you haven't joined us for the last two posts, let me recommend that you first read about determining rolling friction on the coaster and the project overview.

On to the assessment...

Assessment is extremely important. It explicitly tells students what things we value (and thus the things they should value). If we assess the wrong things, students will focus on the wrong things. This can turn an otherwise excellent project into a mediocre one. For this post, I'll share two methods of assessment: First, the "old" method I used when I last taught physics (in 2008). Second, my updated assessment scheme that I'd use if I did this project again.

The old assessment strategy

Embedded below is the document I gave to students at the beginning of the pipe insulation roller coaster project. Most noticeably it includes a description of the assessment scheme I used way back in January of 2008.


As you can see, I split the assessment of this project into two equal parts:

An assessment of the finished roller coaster

I wanted students to think carefully about the design, construction, and "marketing" of their coasters. I wanted them to design coasters that not only met the requirements, but coasters that were beautiful and interesting. Individual items being assessed under this rubric were weighted differently. For example, "Appropriate name of the coaster" was only worth 5%, while "Creativity, originality, and aesthetics" was worth 20%. Here's a link to the sheet I used when assessing this aspect of the coaster project.

An assessment of the physics concepts

In the embedded document above, you can see the breakdown of what items were being assessed. In my last post on pipe insulation roller coasters, you can see how students labeled their coasters with information on the marble's energy, velocity, and such along the track. Groups were required to turn in a sheet with the calculations they performed to arrive at these numbers. These sheets were the primary basis for determining whether students understood the physics concepts.

Problems

There are a lot of problems with the assessment scheme as described above. I'm not going to try to address them all, so here are a couple of the biggest issues:

  • Assessing coaster design
    • I'm a fan of elegant design. For this project I'm a fan of finished coasters that look well designed and exciting. That's why I included the first part of the assessment. I wanted to incentivize students to think about the design and construction of their coasters. In retrospect this was probably unnecessary. Students generally came into this project with plenty of intrinsic motivation to make their coaster the best in the history of the class. While I'd still stress the importance of quality design in the future, I'd completely cut this half of the assessment. Students already cared about the design of their coaster. If anything, awarding points for coaster design had a net negative effect, especially because it doesn't assess anything related to the understanding of physics.
  • Assessing student understanding of physics concepts
    • As a normal part of working in a group while attempting to complete a large project in a limited time, students split up the work. Students are generally pretty smart about this in their own way. While I stressed that everyone in the group should contribute equally towards the calculations, most groups would have the student who had the best understanding of the physics do most of the calculations. Why? Because it was faster. They needed to finish their coaster, and having the fastest person do the calculations meant more time for construction. While I generally knew when students in a group were adding very little to the calculations (and would assess them accordingly), on the whole this method didn't give me a good picture of each individual student's level of understanding. There were certainly students who skated through the project while minimally demonstrating their understanding of the energy and friction concepts involved.

The new assessment strategy

You've probably already picked up on a few of the improvements I'd make for this project.

  1. Use standards-based assessment. Standards-based assessment is an integral part of the classroom throughout the year- not just for projects. If you're unfamiliar with what this "standards-based" business is all about, click the little number at the end of this sentence for plenty of links in the footnotes1. Here is a list of standards that would be assessed through this project:

    Content standards assessed

    • Energy
      • Understand and apply the law of conservation of energy.
      • Explain and calculate the kinetic energy and potential energy of an object.
      • Explain and calculate the amount of work done on and by an object.
      • Solve basic conservation of energy problems involving kinetic energy and potential energy.
      • Solve conservation of energy problems involving work and thermal energy.
    • Circular Motion
      • Solve basic circular motion problems using formulas.
    • Habits of Mind
      • Collaborate and communicate with others to meet specific goals.
      • Handle and overcome hurdles creatively and productively.

    The specific standards used will vary based on your implementation.

  2. No points for coaster requirements. As I mentioned earlier, it proved unnecessary to incentivize coaster design or meeting the basic requirements of the project. This decision also comes out of standards-based grading, which focuses assessment around "Do you know physics?" instead of "Can you jump through the right hoops?" That isn't to say we don't talk about what makes a coaster "exciting" or "aesthetically pleasing" or whatever. It just means a student needs to demonstrate their understanding of the physics to earn their grade.
  3. A focus on informal assessment. Rather than relying heavily on a sheet of calculations turned in at the end of the project (and probably done lopsidedly by one or two group members) to determine if the group understands the physics, I'd assess their understanding as I walked around the classroom discussing the coasters and their designs with the students as they worked on them. Questions like "Why did you make that loop smaller?" or "Where are you having trouble staying within the requirements?" can be used to probe into student thinking and understanding. The final calculations would still be a part of the assessment, but no longer the single key piece of information in the assessment.

On the whole I was very happy with this project as I used it in the past. As I've learned and grown as a teacher I've found several ways I can tweak the old project to keep up with the type of student learning I want to support in my classroom. If you have other suggestions for improvement, I'd be happy to hear them.

As a bonus, here's a student produced video of the roller coaster project made for the daily announcements. The video was made by a student who wasn't in the physics class, so there's a little more emphasis on the destruction of the roller coasters at the end of the project than I'd like. Kids. What can ya do?

______________________________

  1. Here are posts I've written about my experience implementing standards-based assessment. I'm not an expert, so let me also direct you to my bookmarks related to standards-based grading, and some resources written by a couple of people who are more expert: Shawn Cornally and Frank Noschese (who offers blog posts, a shared Google Doc folder, and a collection of bookmarked links). There are certainly other great resources out there, but these are a great starting point. (back)

Pipe Insulation Roller Coasters

The Hazard Zone team with their coaster

I like projects. I really liked this project. The pipe insulation roller coaster project is one of the most enjoyable projects I've ever used in class.

History

It was my second year teaching physics. During the unit on energy, the book we were using frequently used roller coasters in its problems. We even had a little "roller coaster" to use with photo gates. I thought we could do better.

My original idea was to get some flexible Hot Wheels tracks and make some loop-de-loops and hills. Turns out a class set of Hot Wheels track is pretty expensive. On an unrelated yet serendipitous visit to my local big box hardware store, I ran across the perfect (and cheap!) substitute: pipe insulation! For $1.30 or so you can get six feet of pipe insulation- which doubles nicely as a marble track1 when you split the pipe insulation into two equal halves. It's really easy to cut pipe insulation with a sharp pair of scissors. Just be sure you don't buy the "self-sealing" pipe insulation, which has glue pre-applied- it's more expensive and it'd turn into a sticky mess.

At first I was planning to simply design a one-period-long investigation using the pipe insulation (my original ideas morphed into the pre-activity for this project). As I started to think through the project more and more, I realized we could go way bigger. And thus, the pipe insulation roller coaster project was born.

Building the Coasters

In groups of three, students were given 24 feet of pipe insulation (4 pieces), a roll of duct tape2, and access to a large pile of cardboard boxes3. All groups had to adhere to a few standard requirements:

  • Construction requirements
    1. The entire roller coaster must fit within a 1.0m x 2.0m rectangle4.
    2. There must be at least two inversions (loops, corkscrews, etc.).
    3. All 24 feet of pipe insulation must be used.
    4. The track must end 50 cm above the ground.

    The coaster is under construction

  • Physics requirements
    In addition to meeting the above requirements, students were required to utilize their understanding of the work-energy theorem, circular motion, and friction to do the following:
    1. Determine the average rolling friction, kinetic energy, and potential energy at 8 locations on their roller coaster.
    2. Determine the minimum velocities required for the marble to stay on the track at the top of all the inversions.
    3. Determine the g-forces the marble experiences through the inversions and at least five additional corners, hills, or valleys.
    4. The g-forces must be kept at "safe" levels5.
Labeling the Physics Data
The labels included information about the energy, velocity, and acceleration of the marble at specific points (Note: there are some sig fig issues here).
Calculations
  1. Rolling friction, kinetic energy, and potential energy
    • The potential energy (U_g = mgh) is easy enough to find after measuring the height of the track and finding the mass of the marble. The kinetic energy is trickier and can be done by filming the marble and doing some analysis with Tracker, but since the speed of the marble is likely to be a little too fast for most cameras to pick up clearly, it's probably easier (and much faster) to simply measure the time it takes the marble to travel a certain length of track. I describe how this can be done in a previous post, so check that out for more info. That post also includes how to calculate the coefficient of friction by finding how much work was done on the marble due to friction- so I'll keep things shorter here by not re-explaining that process.
      • Pro-tip: Have students mark every 10 cm or so on their track before they start putting together their coasters (note the tape marks in this pic). Since d in W=F\cdot d is in this case the length of track the marble has rolled so far, marking the track ahead of time makes finding the value for d much easier than trying to measure a twisting, looping roller coaster track.
    
    
  2. Minimum marble velocities through the inversions.
    • This is also called the critical velocity. That's fitting. If you're riding a roller coaster it's pretty critical that you make it around each loop. Also, you might be in critical condition if you don't. While falling to our death would be exciting, it also limits the ability to ride roller coasters in the future (and I like roller coasters). Since we're primarily concerned with what is happening to the marble at the top of the loop, here's a diagram of the vertical forces on the marble at the very top of the loop:

      The red curve is the roller coaster track. Just so you know.

      So just normal force (the track pushing on the marble) and gravitational force (the earth pulling on the marble). Since these forces are both acting towards the center of the loop, together they're equal to the radial force:

      F_N + F_g = \dfrac{mv^2}{r}

      When the marble is just barely making it around the loop (at the critical velocity), the normal force goes to zero. That is, the track stops pushing on the marble for just an instant at the top of the loop. If the normal force stays zero for any longer than that it means the marble is in free fall, and that's just not safe. So:

      F_g = \dfrac{mv^2}{r}

      Then when you substitute in masses and accelerations for the forces and do some rearranging:

      mg = \dfrac{mv^2}{r} \quad \Rightarrow \quad v = \sqrt{gr}

      There you go. All you need to know is the radius of the loop, and that's easy enough to measure. Of course, you'd want a little cushion above the critical velocity, especially because we're ignoring the friction that is constantly slowing down the marble as it makes its way down the track.

  3. Finding g-forces
    • An exciting roller coaster will make you weightless and in the next instant squish you into your seat. A really bad roller coaster squishes you until you pass out. This is awesomely known as G-LOC (G-force Induced Loss of Consciousness). With the proper training and gear, fighter pilots can make it to about 9g's before G-LOC. Mere mortals like myself usually experience G-LOC between 4 and 6g's.

      As I mentioned, I set the limit for pipe insulation roller coasters at 30g's simply because it allowed more creative and exciting coaster designs. While this would kill most humans, it turns out marbles have a very high tolerance before reaching G-LOC.

      To find the g-forces being pulled on corners, loops, or hills you just need to find the radial acceleration (keeping in mind that 1g = 9.8 m/s^2):

      a_r = \dfrac{v^2}{r}, \qquad \text{g-force} = \dfrac{a_r}{9.8\ \text{m/s}^2}
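Pulling the three calculations together, here's a rough sketch in Python of the number-crunching a group does. All the specific values (marble mass, heights, speed, radii) are invented for illustration, and the friction estimate assumes the normal force is roughly mg, which is only fair on gentle stretches of track:

```python
import math

g = 9.8  # m/s^2

# --- Invented example values ---
m = 0.005    # marble mass, kg (~5 g)
h0 = 1.5     # release height, m
h = 1.2      # height at some later point on the track, m
v = 2.0      # speed at that point, m/s (from timing over a known track length)
d = 3.0      # length of track traveled so far, m

# 1. Energies and rolling friction
U = m * g * h                      # potential energy, J
K = 0.5 * m * v**2                 # kinetic energy, J
W_friction = m * g * h0 - (U + K)  # energy lost to friction since release, J
mu = W_friction / (m * g * d)      # rough rolling-friction coefficient

# 2. Critical (minimum) velocity at the top of a loop: mg = m*v^2/r
def critical_velocity(radius):
    return math.sqrt(g * radius)

v_c = critical_velocity(0.15)      # 15 cm loop radius → about 1.21 m/s

# 3. g-forces through a curve: a_r = v^2/r, expressed in g's
def g_force(speed, radius):
    return speed**2 / radius / g

print(f"mu ≈ {mu:.3f}, v_c ≈ {v_c:.2f} m/s, "
      f"5 cm valley at 2.5 m/s: {g_force(2.5, 0.05):.1f} g")
```

Note the sanity checks fall right out of the numbers: the g-force in that last line comes out near 13g's, which is why the original under-10g rule proved so restrictive.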

Pipe Insulation Roller Coaster construction underway

Raise the stakes

Students become fiercely proud of their roller coasters. They'll name them. Brag about them. Drag their friends in during lunch to show them off. Seeing this, I had students show off their creations to any teachers, parents, or administrators that I was able to cajole into stopping by for the official testing of the coasters. I even made up a fun little rubric (.doc file) for any observers to fill out for each coaster. This introduces some level of competition into the project, which gives me pause- though from day one students generally start some friendly smack talk about how their coaster is akin to the Millennium Force while all other coasters are more like the Woodstock Express. The students love to show off their coasters, and it seems the people being shown enjoy the experience as well.

Coaster judging in progress.

Assessment

Assessment is massively important. However, this post is already long. The exciting conclusion of this post will feature the assessment piece in: Part 2: Pipe Insulation Roller Coaster Assessment.

The Pipe Insulation Roller Coaster Series

  1. Pipe Insulation Roller Coasters and Rolling Friction
  2. Pipe Insulation Roller Coasters
  3. Pipe Insulation Roller Coaster Assessment

______________________________

  1. The first day we played with pipe insulation in class I had students use some marble-sized steel balls. Unfortunately because the steel balls are so much heavier and the pipe insulation is spongy and flexible, there was just too much friction. When we switched to marbles the next day everything worked like a charm. (back)
  2. Most groups typically use more than one roll of duct tape. My first couple years I bought the colored duct tape and gave each group a different color. That was a nice touch, but also a bit more expensive than using the standard silver. Whatever you decide, I highly recommend avoiding the cut-rate duct tape. The cheap stuff just didn't stick as well, which caused students to waste a lot of time fixing places where the duct tape fell off- and in the end they used a lot more duct tape. (back)
  3. I had an arrangement with our school's kitchen manager to set broken down boxes aside for me for a few weeks before we started the project. If that's not an option, I've also found if you talk to a manager of a local grocery store they're usually more than willing to donate boxes. (back)
  4. I made it a requirement for groups to start by building a cardboard rectangle with the maximum dimensions. This served two functions: (1) It made it easy for the groups to see what space they had to work with, and (2) it allows the roller coasters to be moved around a little by sliding them across the floor. (back)
  5. Originally I wanted students to keep g-forces below 10. Very quickly it became apparent that under 10g's was overly restrictive and I upped it to 30g's. That's not really safe for living creatures, but it would certainly make it more "exciting." (back)

3 Quick- Subjectives, grading stinks, and modeling with Kelly O'Shea

Fairly often I find things online that I think are either terribly interesting, awesome, or thought-provoking, but don't have either the time or the will to write anything in depth about how or why they're interesting, awesome, or thought-provoking. I'd still like to share these items, so I've decided to make the 3 Quick a semi-irregular feature1 here at Re:Thinking. Offered with little to no editorialization. Feel free to kick off a conversation in the comments.

Subjects or Subjectives (Michael Wesch)


Dr. Wesch says:

As an alternative to the idea that we teach “subjects,” I’ve been playing with the idea that what we really teach are “subjectivities”: ways of approaching, understanding, and interacting with the world. Subjectivities cannot be “taught” – only practiced. They involve an introspective intellectual throw-down in the minds of students.

I agree. I think this is something discussed fairly often in the scientific spectrum (though through different terminology). For me it boils down to the statement, "I'm not concerned if my students can't remember specific scientific content if they have learned to think scientifically."



The Case Against Grades (Alfie Kohn)


Alfie Kohn has never been a supporter of giving out grades, and this article goes into detail on the three big effects of grading:

  • Grades tend to diminish students’ interest in whatever they’re learning.
  • Grades create a preference for the easiest possible task.
  • Grades tend to reduce the quality of students’ thinking.

Even better, he gives some examples of assessment done right. My only beef with the article is the seeming lack of acknowledgement that many (probably most) teachers are working in situations where they would simply not be able to get rid of grades due to the requirements of their schools and districts.



Model Building (Kelly O'Shea)


If you're interested in using the well-researched and effective Modeling Instruction with physics, let me recommend Kelly O'Shea's series explaining how her classes build the models. It gives an excellent peek inside a classroom using modeling to those (like myself) who are interested in implementing it in the near future. As of this posting, she's written up explanations for the Balanced Force Particle Model and Constant Acceleration Particle Model, but it looks like she'll have six in total when she's done.

______________________________

  1. "Semi-irregular" as in however long it takes me to come across 3 items to share and have the time to write up a post. (back)

Lightbulb Challenge: LED vs. Halogen

[Update: See the bottom of the post for a quick update thanks to some issues pointed out by Nicolas Marmet in the comments.]

[Update 2 (1/26/2014): I added graphs for the cost per hour instead of the cost per day assuming 3 hours of use per day. I s'pose it would've made sense to just start with these graphs. Oh well.]

In the 1960s Walter Mischel performed studies involving preschoolers and marshmallows. The "Marshmallow Experiment" involved sitting kids in a bare room and setting a marshmallow1 in front of them, then telling the preschoolers they could either eat the marshmallow now or wait 15 minutes. If they successfully waited 15 minutes then they'd get a second marshmallow to enjoy in addition to the first.

In addition to telling us something about deferred gratification, it's also immensely fun to watch preschoolers in agony attempting to defer their gratification:

This past week I needed to pick up two new lightbulbs for our oven hood. I noticed our local big box home supply store had a fancy new LED bulb that would fit into the outlet. On the downside the LED cost $32.98 compared to $7.98 for the same type of halogen bulb that just blew out. Wow. $32.98 feels really expensive for a lightbulb. I decided to get one of each then do a little cost analysis when I got home.

LED vs Halogen

Here are the vitals for each bulb (according to the packaging):

|          | Halogen Bulb | LED Bulb     |
|----------|--------------|--------------|
| Power    | 50 Watts     | 7 Watts      |
| Lifetime | 2,500 hours  | 25,000 hours |

Let me assure you, our old halogen bulbs didn't last anywhere near 2,500 hours. I'd roughly estimate those bulbs are on an average of 3 hours a day, so the halogen bulbs should have lasted about 2.25 years at that usage. This is the second time I've had to replace these bulbs, and we've only been in our house 3 years. Does this mean the LED bulb will have a shortened lifespan as well? I don't know. I'll cut the bulb makers some slack this post and assume their numbers for the lifetime of the bulbs are accurate.

How long before the LED pays for itself?

The LED bulb uses about one-seventh the power of the halogen bulb and has 10 times its lifespan. It seems pretty clear that the LED will eventually pay for itself. But how long will that take? Days? Years? Decades?

I hunted down a bill from the electric company and added up all the government surcharges, distribution rates, and so on. Hmm...it seems like they should print the total rate you actually pay instead of only listing the seven different surcharges individually. It'd certainly make it quicker to see what you're paying. Anyway, I pay $0.15373 per kilowatt-hour. What does that mean? It means that if I leave a 1000 watt lightbulb on for 1 hour, it'd cost me $0.15373. Knowing that, I can figure out how much it costs me to keep the LED and Halogen lights on for 1 hour:
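As a quick sketch of that per-hour arithmetic (using the $0.15373/kWh rate from my bill and the wattages from the packaging), in Python:

```python
RATE = 0.15373  # dollars per kilowatt-hour, from my electric bill

def cost_per_hour(watts, rate=RATE):
    """Cost in dollars to run a bulb of the given wattage for one hour."""
    return watts / 1000 * rate  # convert watts to kilowatts; one hour of use

print(f"Halogen (50 W): ${cost_per_hour(50):.4f}/hr")  # about $0.0077
print(f"LED (7 W):      ${cost_per_hour(7):.4f}/hr")   # about $0.0011
```

So each hour of halogen use costs roughly seven times what an hour of LED use does, before even counting replacement bulbs.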

That's nice to know, but the time has come to make a chart:

You might be wondering why the data points jump every so often on the graph. Let me explain: The halogen bulb line jumps $7.98 after every 2,500 hours of use. Why? That's the bulb burning out and me running out to the store buying a new bulb for $7.98. You'll notice the LED cost jumps $32.98 after 25,000 hours of use. Same deal.
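Those jumps can be modeled directly. Here's a small sketch (using the prices and lifetimes above, plus my assumed $0.15373/kWh rate) of total money spent as a function of hours of use, where each burnout adds the price of a replacement bulb:

```python
import math

RATE = 0.15373  # dollars per kWh, assumed from my bill

def total_cost(hours, bulb_price, watts, lifetime_hours, rate=RATE):
    """Money spent after `hours` of use: bulbs purchased so far plus energy."""
    bulbs_bought = math.floor(hours / lifetime_hours) + 1  # +1 bulb at each burnout
    energy_cost = watts / 1000 * hours * rate
    return bulbs_bought * bulb_price + energy_cost

# Halogen jumps $7.98 at each 2,500-hour multiple; LED jumps $32.98 at 25,000.
print(total_cost(3000, 7.98, 50, 2500))   # two halogen bulbs plus energy
print(total_cost(3000, 32.98, 7, 25000))  # still on the first LED
```

Note that at zero hours you're already out the price of one bulb, which is exactly why the y-intercepts below sit at the purchase prices.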

I added in best-fit2 lines and had Excel whip up the equations for those lines. Each trendline's slope is the cost per day (assuming the bulbs are on 3 hours a day). I'm really interested in the intersection of the two trendlines: that's the point where the LED bulb starts actually saving me money. We can see they cross just before 1000 days, but we can do better than that. If I subtract the two equations from each other, I should get an equation that gives the difference in cost between the two bulbs:

\text{Savings}=\$ 25.00 - 0.0277x

What does this tell us? Well, it says the difference in up-front cost between the two bulbs is $25.00, and for each day of use (x), the LED is $0.0277 cheaper to operate than the halogen bulb. So how many days until the difference in cost between the LED and halogen is $0.00?

902.5 days
or
2 years, 5 months, and 20 days
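The break-even arithmetic is simple enough to check directly (using the $25.00 price gap and the $0.0277/day savings from the trendlines):

```python
price_gap = 32.98 - 7.98   # extra up-front cost of the LED: $25.00
daily_savings = 0.0277     # trendline slope difference, dollars per day

break_even_days = price_gap / daily_savings
print(round(break_even_days, 1))  # ~902.5 days
```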


Deferred gratification

LEDs save money. They're more efficient. They last longer. But...paying $32.98 for one lightbulb and then waiting nearly two and a half years before it pays itself off can be nearly as painful as a 4 year old waiting 15 minutes for a second marshmallow. In both situations the end result is desirable- but it involves subduing that part of your brain that says, "Mmmm...marshmallow...so tasty...must. eat. it. now." or "HOLY #$*&@! $33.00 for ONE lightbulb!" You just have to keep telling yourself that second marshmallow's on its way and that after 25,000 hours that LED bulb will have saved you over $200.
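That "over $200" figure can be sanity-checked from the packaging numbers (and my assumed $0.15373/kWh rate): over 25,000 hours you'd buy ten halogen bulbs versus one LED, plus the energy for each:

```python
RATE = 0.15373  # dollars per kWh, assumed from my bill
HOURS = 25000   # one LED lifetime, or ten halogen lifetimes

led_total = 32.98 + 7 / 1000 * HOURS * RATE           # one LED bulb plus energy
halogen_total = 10 * 7.98 + 50 / 1000 * HOURS * RATE  # ten bulbs plus energy

print(round(halogen_total - led_total, 2))  # about $212 saved
```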

A related student activity

It seems that many sustainable technologies (LED bulbs, electric cars, photovoltaic cells) require more money up front. Over time, just like the LED bulb, they generally pay for themselves. Have students investigate whether these extra costs pan out by having them pick a sustainable technology (like buying an electric car) and do some research into, for example, their family's driving habits. After a few calculations they can determine how long it would take for the electric car to pay for itself compared to an equivalent gas-powered car. Open up the topic to students- let them pick topics that interest them. Have them do additional research into the costs and benefits beyond just dollars and cents. Encourage them to interview people who have already implemented these changes. Most of all, let them take the time to puzzle over how to figure this stuff out. Be there to help and guide them, but please, please, don't just give them a worksheet that has them plug in some numbers to get some type of answer.

UPDATE

Nicolas Marmet (in the comments) pointed out a few things that I think are worth addressing, primarily the issue with not continuing the data in the chart above for several lifetimes of the LED bulb. I didn't think there would be much effect, but just because I was curious, I made a new chart (below) that goes through 11 lifetimes of the LED bulb:
LED bulb chart, through 11 lifetimes
As you can see, the slope of the LED's best-fit line jumps from 0.0042 dollars/day to 0.0070 dollars/day once the replacement purchases are included. While that's still significantly cheaper than the halogen bulb (at 0.0319 dollars/day), it's different enough that some new calculations are in order. After updating the Savings equation (from above), the new equation looks like this:
\text{Savings}=\$ 25.00 - 0.0249x

This tells us it'll take 102 days longer than the calculation above for the LED to become the cheaper choice: 1004 days, or about 2 years and 9 months, until the break-even point. Not too much longer, but still longer.
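With the updated slope, the same arithmetic as before gives the new break-even point (again just a sketch using the post's numbers):

```python
price_gap = 25.00   # LED premium: $32.98 - $7.98
new_slope = 0.0249  # dollars/day saved, from the 11-lifetime trendlines

new_break_even = price_gap / new_slope
print(round(new_break_even))  # ~1004 days
```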

Also, a quick run through Nicolas Marmet's other points:

  • As far as the difference in lumens between the two bulbs: I no longer have the boxes pictured above and can't make out any lumens listed on them in the picture. Qualitatively, looking at the lights side-by-side in my house, I don't notice much difference in brightness between the two; if anything, I'd pick the LED as the brighter bulb- though perhaps that's a trick of the whiter quality of its light.
  • I can't say much about the premature failure of LED bulbs other than to point out that my LED bulb hasn't failed yet- though as of this update it's only been 327 days since it was installed, which, assuming 3 hours of use per day, is only 3.9% of the bulb's 25,000-hour lifespan, and I'm still 2 years, 10 months, and 8 days away from the break-even point. I'll update this post again once either bulb has failed.
  • LED buzzing: The LED I installed doesn't buzz at all. It has made no noises at all that I've been able to perceive. My cats don't act weird (or I guess I should say, weirder than normal) when it's on, so that's not an issue.

While there's certainly a lot of time left where I could be proven wrong, it still seems that over the long term the LED bulb is the smarter investment, though you're probably not going to retire early on the savings from using LEDs.

UPDATE 2 (1/26/2014)

Here are two views of the same data using Cost per Hour instead of Cost per Day (assuming 3 hours of use per day):

Halogen v LED 30900 hrs

Below I zoomed in to include only the costs up to 3,200 hours so the break-even point is easier to see.

Halogen v LED 3200 hrs

So, after 2906.98 hours of continuous use (about 121 days), the 7W LED bulb becomes the cheaper purchase over the 50W halogen.

______________________________

  1. According to the Wikipedia, Mischel let kids choose whether they wanted a marshmallow, Oreo cookie, or pretzel stick. I guess the "Marshmallow, Oreo, or Pretzel Experiment" doesn't quite roll of the tongue as nicely. (back)
  2. I set the y-intercept for the halogen and LED bulbs at 7.98 and 32.98, respectively. Since you're paying that money up front, it follows that at time zero (when you've purchased the bulbs but haven't used them yet) you're already out the cost of the bulb. (back)