While it's not exactly news at this point, I'm happy to announce that I'll be teaching Physics at the Connecticut River Academy, a public magnet school located in East Hartford, CT. I've been subbing and helping out at the school quite a bit since I was hired, and I'm pretty dern excited to teach there next year. While I haven't been around the school community much as of yet, I think it's safe to say there are a lot of good things happening at this school and I'm excited to be a part of those things in the years to come.
Here's where you can help: The CT River Academy is about to wrap up only its second year as a school this month. As a result of the school's newness, there's no Physics curriculum yet put together. While this means it'll be a lot of work for me this summer, I'm excited to help build the class with my colleagues from the bottom up. Earlier via Twitter, I shared this Google Doc that lists some ideas and thoughts I have for designing the instruction and assessment for Physics classes. If possible, I'd greatly appreciate some additional help from any teachers using Modeling Instruction to teach Physics. Namely, I'm interested in (1) what units you go through and in what order, and (2) what textbook (if any) you use with Modeling Instruction. If you could complete this really short survey on these topics, I'd greatly appreciate it.
I have a love/hate relationship with professional development. I like getting better at teaching. I like hearing from people who are smarter and/or more experienced than I am. Unfortunately, my experience with "official" school-district provided professional development is too often...um...less than stellar.
This less than stellar PD is one of the main reasons Twitter has been an amazing resource. It allows my professional development to be self-directed: focusing on what I want help with when I want help with it. As great as the Twitter was (and continues to be) for this, I missed the face-to-face interactions that can't happen over the Twitter.
Fortunately, EdCamp is a thing. It combines the just-in-time, self-directed professional development I enjoy from Twitter with the great face-to-face conversations I value so much from traditional professional development. It also happens to be free (which is a bonus, because it's worth my hard-earned money).
What makes EdCamp different?
Well, quite a lot, actually.
It's democratic. At the beginning of the day the participants propose sessions and design the schedule for the day.
It's participatory. Sessions at EdCamps are encouraged to be conversations between the session leaders and participants. No hour-long, terrible Comic Sans slide decks with one person droning on. I promise.
It's organized and run entirely by volunteer educators. Sure there are sponsors to help pay for lunch, prizes, and so on, but there's no exhibitor hall with salespeople hawking their wares or sessions that are just sales pitches. In fact, if such a sales-pitch session did happen you would be encouraged to...
...vote with your feet. If you find yourself in a session that just isn't the topic you had hoped it would be, you can leave. It's not just okay to walk out, it's encouraged. We don't want you to waste your time sitting through a session you don't find applicable to your needs. In fact, you can wander in and out as you please. Or skip a session if you simply need time to organize your thoughts or even take a break.
If you're free August 10, 2012 and in the New England region, you should attend EdCamp CT. It'll be a great day of professional development with passionate educators from all over the region1. You can register for EdCamp CT here. I hope to see you there!
While attending EdCamp Boston this past weekend, I found myself thinking a lot about what schools are doing- and what they should be doing- to prepare students for their lives after formal education. During a session where Katrina Kennett and her students shared how they create a learning environment based on the EdCamp model1, I found myself wondering what it was like for Katrina's students to hear their teacher discussing how she designed the system and has addressed specific issues.
My mind wandered back to an earlier session about ways the training/education of pre-service teachers could be improved. During that conversation, the idea of encouraging pre-service teachers to employ a "growth mindset2" came up- mainly because we thought a growth mindset was something we desired for our students, and as a result it's something desirable for teachers, so they can encourage it in their students.
Teaching students to be resilient, creative, and independent thinkers is hard. It's not something that can be done with a "good" textbook or curriculum, and it's essentially impossible to assess using the current regime of standardized testing. It's not simply about having students take lots of fine arts classes (though that's not a bad thing)- it's something that should be an integral part of the school culture. But how do you do that?
Personally, I think we should model it for our students in our classrooms. When Katrina was discussing how she designed and implemented EdCafés in front of her students, they were able to get a behind-the-scenes view of how she addresses problems that come up and how the process was changed and tweaked over time. This behind-the-scenes view of the teaching process can model how problems and failures can be jumping-off points to future success. Often classrooms are places where both teachers and students are afraid of failing. Instead, we need to model how failures today can lead to some of the best learning opportunities tomorrow. I've heard it said that the biggest challenge for science graduate students is the transition from undergrad- where information is taught like we know everything- to research- where the best place to work is in the unknown3.
The most exciting phrase to hear in science, the one that heralds new discoveries, is not 'Eureka!' (I found it!) but 'That's funny ...' -Isaac Asimov
In today's manic "Ed Reform" environment, there's plenty of talk about preparing kids for the future. But the future is uncertain and what knowledge they'll need in the future is uncertain. What we do know is that students will need to be flexible. They'll need to be able to adapt and change to new situations. While a good background of knowledge in science, math, history, etc. is important, it's more important we help students lose their fear of failure and help them learn how to be resilient. These are things I'd like schools to be doing explicitly.
It was a good session- largely (for me) because students led the discussion most of the time, modeling how the EdCafé system works in the classroom. It was a nice example of good professional development design. If you're interested, you can read about EdCafés at the EdCafé posterous. (back)
What is a growth mindset? "In a growth mindset, people believe that their most basic abilities can be developed through dedication and hard work—brains and talent are just the starting point. This view creates a love of learning and a resilience that is essential for great accomplishment." via Mindset Online For (much) more, check out John Burk's numerous posts on encouraging a growth mindset in his students. (back)
I can't remember where I heard this. If you know drop me a line and I'll update it. (back)
In no particular order. And I reserve the right to be driven crazy by things excluded from this list.
My example "bad" slide deck (from this post) has been viewed on SlideShare over twice as often and downloaded 4+ times as often as the new, improved, better version.
The number one route people on the internetz take to get to this post in which I lament the poor quality of worksheet labs is by searching the Google for, "Worksheet for Hooke's Law," or some variation thereof.
Grade grubbing. A couple weeks ago we got back the scores from the second exam in my Organic Chemistry class. I lost 10 points for making a small silly mistake in a reaction's mechanism. I wasn't very happy about receiving 0/10 points when I clearly showed more than 0% understanding of the topic (I'd've given myself a 7/10- proficient, but with room for improvement). After passing back the exam, the professor was overwhelmed with grade grubbers who were quite clearly just looking for extra points to improve their grades. I couldn't bring myself to ask for partial credit because I didn't want to be associated with the grade grubbers.
The first two especially bother me- most notably because they have this ironic quality of juxtaposing things I've posted about moving away from "traditional" instructional models and people looking for resources to use teacher-centrically. Today I changed the description of the poor slide deck in SlideShare to, "Please don’t use these slides to teach. Really. I only posted this as an example of how I used to (poorly) use PowerPoint." Let's see if that helps.
Hackerspaces are "community-operated physical places, where people can meet and work on their projects1." Essentially, it's a community workshop: Some have wood or metal-working equipment and community tools, others have welding equipment, others focus on computing and programming. Each hackerspace is different. To become a member, you'll generally pay an annual fee which gives you access to the equipment and the space. Many hackerspaces will offer classes given by members to the general public and have drop-in days when non-members can pay a small fee to use the hackerspace.
A hackerspace is a large, self-directed learning environment. Maybe you want to make your own Geiger counter or build a sidecar for your bicycle. A hackerspace would provide you the space and tools to get it done. On top of the space, the best thing about hackerspaces is that they encourage collaboration. It's a place where you can walk around and see what other people are working on, ask questions, and get some help from smart people if you need it.
Why Hackerspaces in Schools?
I first started thinking about how schools and hackerspaces fit together after listening to CBC Spark's segment on Hacking the Library, featuring several libraries that are teaming up with hackerspaces to provide additional learning experiences for their patrons. That piqued my interest. Libraries are community learning spaces. Schools are community learning spaces. If hackerspaces are popping up at libraries, why not in schools?
What are the benefits of having a hackerspace at a school?
Real-world application of content. I recently took the Praxis II Physics exam. I spent a lot of time studying content related to electricity & magnetism. Why? Besides not having taken a class on the topic since 1999, I lacked a deep understanding of the topics- primarily because you can't see magnetism and electricity the way you can see a ball flying through the air. Inductance? Capacitance? These are tricky concepts that I know I struggled to understand deeply. However, if you provide the time and space for students to build things like USB chargers for their iPods, or super-capacitor flashlights- where students can harness inductors and capacitors to build useful objects- then there's a much better chance they'll gain a deeper understanding of what capacitors and inductors are and how they're used.
Student choice. There are amazing communities like Instructables or Make Projects where students can find ideas for projects. Even if you wanted students to all build something related to a specific topic (electronics with capacitors, for instance), there is such a huge variety of projects available online that students could still pick something that interested them personally.
Giving students agency over their "stuff." Making stuff is empowering. Taking apart and restoring a trashed bike gives you a sense of pride about the bike that wouldn't exist if you had just bought it from the store. If your remote control for your TV breaks, you might just go buy a new one- but if you recently built your own solar battery charger, maybe instead you'd take apart the remote control and fix it yourself.
Connecting the school to the community. Ideally, I see schools as centers of community- a place open to all community members as a place of learning beyond just the school day for students. I envision a school hackerspace run very much like any hackerspace: open to anyone in the community who would like to become a member, available to community members during school hours and students after school hours, providing classes for the community (ideally some classes being taught by students), and providing a place for the community to share their expertise with students and students to share their expertise with the community.
Not just a wood shop class on steroids. I wouldn't want to see the hackerspace used as its own class- like wood shop classes might have been in the past. I think it'd be much more powerful if the school day were arranged so students had independent time set aside to work on self-directed projects (à la Think Thank Thunked). Not just for hackerspace projects, mind you, since not all topics and projects would be hackerspace appropriate, but certainly the hackerspace would be available.
If I was in charge of building a new school2, I'd work my butt off to try to get a hackerspace as part of my school. I realize there would be a lot of potential details and issues to work through to get it done, but I think the learning and community that would result from such a space would be well worth the effort.
Note: I've never actually been to a real hackerspace. Unfortunately there don't appear to be any hackerspaces in Connecticut (according to Hackerspaces.org). If someone would like to get on that as well, I'd be on board.
Well...to be precise, it's titled "Implementation of a technology-rich self-directed learning environment in a ninth grade Integrated Science classroom." Catchy, I know.
To be honest, this is a bit old. I thought I had posted this a long time ago, but recently realized I never had despite always meaning to do so. I implemented this project in the spring of 2010 and officially submitted my project in June of the same year. It won me a "Scholar of Excellence" award, so it must be at least somewhat decent.
Though the full paper may not be of interest to you, let me recommend the Lit Review. I went through many, many papers on constructivist environments and instructional technology's impact on student learning. It'd make me very happy if anybody found this even remotely useful.
Simply put, students worked in teams of four to five and shared a team blog. Students investigated any topic that interested them around the general theme of climate change. Students were tasked with researching the topic and sharing their learning and questions on their blog. There were no due dates (other than the end of the school year), though students were all required to write a certain number of posts and comments on their classmates' posts (for more details, check out the Project Design section of the paper). For a bit on the rationale, here's an excerpt from the Introduction and Rationale:
The purpose of the educational system in the United States has been described in many different ways depending on the viewpoint of the individual doing the describing. Creating individuals able to become positive members of society, providing skills for the future workforce, or preparing individuals for an uncertain future have all been cited by various people and organizations as the purpose of schooling- each relying on their own value set and particular social and political biases. While there is no doubt that these various beliefs about the purpose of the American educational system have been true, and may continue to be true in various times and places, it is this author's belief that one of the more important goals of the educational system is to create life-long learners who will be able to actively and knowledgeably engage in whatever ideas and issues may cross their paths. As specific information and skill-sets are quickly changing due to the rapid increases in knowledge and improvements in technology the importance of teaching students specific content information decreases while the importance of teaching students how to locate, evaluate, and interact with knowledge increases. As what it means to be productive members of society or effective members of the workforce changes, the ability for individuals to understand how to learn new knowledge when they need it is more valuable than simply falling back on information learned through formal schooling.
If schools are to become a place where students learn how to interact with, challenge, and develop new knowledge, then the traditional classroom structure- that of the teacher as the primary source of knowledge and assessment- needs to change as well. Students should be given a chance to work out the solutions to problems that do not have predefined answers. In doing so, students lose their status as passive recipients of information and instead become active creators of knowledge. A method of implementing this might be built on the problem-based learning (PBL) model that has been used for many years in many content areas with various age levels. The incarnation of PBL envisioned here provides students with real-world problems to solve that do not already have easy or "neat" answers, gives students the freedom to explore down side canyons as part of the problem solving process, allows time for students to share their ideas and work with others, and provides support and time for students to document and reflect on their learning and problem solving process.
Let me know what you think or if you found anything useful for your own purposes.
There's a widespread narrative regarding science education in the United States: It stinks. As a science educator, whenever I hear this two things happen. First, I get my hackles all up. Second, I realize that despite my hackles I generally agree. I get my hackles up because I've spent a lot of time thinking about, planning, designing, and implementing a science curriculum that I feel has been pretty darn good. However, I recognize that the School System (I'm not picking at any one school district here, but instead at the entire system of schooling in this country) has not done a very good job of helping students to think and act like scientists.
Recently, while listening to the Skeptics' Guide to the Universe podcast (#343), I had my hackles raised. They discussed a recent article on io9 titled, "Your State Sucks at Science.1" This article discussed a report by the Thomas B. Fordham Institute that analyzed each state's standards on their "Content & Rigor" and "Clarity and Specificity." The results (summarized on the map below) showed that the vast majority of states didn't do so well. In fact, they did terribly.
OK, that information doesn't shock, surprise, or upset me. Connecticut earned a not-so-respectable "C." I'd probably give the standards I've worked with (9th grade Integrated Science) a lower grade. Many standards are overly broad. Others are ambiguous. I agree with the Skeptics' Guide, io9, and the Thomas B. Fordham Institute that improving these standards would be a good thing for science education.
So, why are my hackles still raised? Well...during the Skeptics' Guide to the Universe (SGU) discussion on the sorry state of science education, the general view was that poor standards are the crux of the problem (followed by poor teachers- more on this later). It was stated that poor standards will cause teachers to fail their students more often than would be the case if states had good standards. As anecdotal evidence of this, Dr. Steven Novella noted his daughter is receiving a sub-par science education at the Connecticut public school she is attending. Dr. Novella specifically described his two big problems with his daughter's science instruction:
Inquiry and scientific thinking is not taught well at all.
Real science education doesn't even really begin until the 7th grade. In grade schools they get virtually nothing.
I generally agree with these assertions. What really bothered me, however, was the discussion of why these problems exist. Here are some quotes from the discussion:
"Teachers don't quite grasp how science works."
"When the standards fail the teachers, the teachers will more likely fail the students."
Can you see where this is going? They never come right out and say science education stinks because our science teachers stink, but that idea is hovering just beneath the surface. I readily admit there are science educators who don't quite grasp how science works and who don't do a great job of designing science instruction. However, I believe this is more of a systemic issue than an individual teacher issue. Let's look at Dr. Novella's two assertions again:
Inquiry and scientific thinking is not taught well at all.
Education is a high-stakes testing world these days. What's valued by our current schooling system are good scores on standardized tests, so effective teachers are labeled as those who help students earn good scores on standardized tests. However, it can be tricky to assess inquiry and scientific thinking. The best way to assess these skills is to observe students performing scientific inquiry (or at least look at a portfolio of student work) to gauge the level of sophistication in scientific inquiry and thinking the student possesses. So, let's look at how Connecticut assesses science: The Connecticut Mastery Test (given in grades 3-8) and the Connecticut Academic Performance Test (given to 10th graders) both assess "experimentation and the ability to use scientific reasoning to solve problems2." The CAPT science test includes 60 multiple choice and 5 open-response questions. In 5th grade, the CMT science test includes 36 multiple choice and 3 open-response questions, and in the 8th grade edition there are 55 multiple choice and 3 open-response questions3. Multiple choice questions- even well designed items- are a shoddy way to measure inquiry. Even the open-response questions that require several sentences to answer aren't a very good measure. Yet this is the system of assessment we value, and this system of assessment doesn't value inquiry- so why are we surprised when inquiry and scientific thinking take a backseat in the classroom? The problem doesn't start with the teachers, it starts with our method of assessment.
Real science education doesn't even really begin until the 7th grade. In grade schools they get virtually nothing.
Again, let's look at what our schools value by looking at what they assess: The CMT is given to every 3rd through 8th grader attending Connecticut public schools. Every year from the third grade and on, students are assessed in mathematics and language arts. Only 8th graders took a science CMT through 2007. Starting in 2008 the state added a science CMT to the 5th grade as well. Why is science instruction getting the short end of the stick? Because we're not assessing it. The focus on math and language arts isn't a bad thing, but it means that subjects not being assessed are being pushed to the side. This isn't the fault of the teachers stuck in this system- it's the fault of the system itself.
What's the solution?
I am not advocating for giving more science standardized tests. I have no problem with improving our science standards. However, unless we change the current methods of assessment I wouldn't expect to see much change. To learn scientific thinking and inquiry, students must be given time in class to explore ideas, rethink assumptions, and test their hypotheses. These things take a lot of class time- furthermore they deserve a lot of class time. Having lots of well written standards is generally a good thing, but it also means teachers are pressured to "cover" all the standards to the detriment of depth of understanding and student exploration.
Dear SGU, you are science educators yourselves, and I love most of what you do. However, I'd like you to think and talk more deeply about what good science education in schools looks like and whether that vision is being supported by the assessment methods employed by the states. A wise person once said, "What we assess defines what we value.4" I'd add, "How we assess defines what we value," as well. If we value inquiry and scientific thinking, our assessments should be more sophisticated- requiring students to actively demonstrate their understanding of how science works. These assessments would be expensive to design and implement but would more accurately reflect students' actual scientific knowledge and skills. It's not that I think the SGU hates teachers, but you do seem to be jumping on the political narrative that places undue blame for poor educational practices on the shoulders of teachers instead of acknowledging the systemic forces that shape how and why teachers deliver instruction in the classroom.
The discussion starts about 27 minutes into the episode and runs for 10 minutes on this topic. (back)
This will be the most scientific and precise post regarding Winston Churchill's belly you'll read today. Maybe all week.
Today, we'll be analyzing the following video:
After randomly embedding the preceding video while thinking about Hooke's Law and the spring constant in my last post, what I, and I'm sure you as well, immediately wondered was, of course: "What kind of spring constant did Winston Churchill's belly have?" This seems like something worthy of my time.
Here we go!
If we're going to figure this out, we need some data. First, we need some sense of scale. Since I have no idea how tall the Animaniacs are, let's focus on the historical figures. I'm going to go with Winston Churchill's height to give the video some scale since he's pretty stretched out whilst his belly is being jumped upon1. It's surprisingly hard to find Churchill's height online with any sort of citation. I found what seems like a pretty solid source (via Wikipedia) for the height of Harry S. Truman (1.75 m). Using that information along with the following picture, I can figure out Churchill's height after throwing the image into Tracker:
Churchill and Truman were nearly the same height. I got 1.76 m (5 ft, 9 in) for Churchill. That seems pretty close to most of the unsourced figures for his height I found online.
I think the best way to go about finding the spring constant for Winston Churchill's belly is to use gravitational potential energy and elastic potential energy. If we can find the gravitational potential energy Stalin has at the top of his bounce and the maximum compression of Churchill's belly, we should be able to do the following:

$$mg\Delta y = \frac{1}{2}kx^2$$

where m is Stalin's mass, Δy is Stalin's maximum height above Churchill's belly, and x is the maximum compression of Churchill's belly.
I can fairly easily find Δy and x using Tracker to analyze the video.
I used 1.70 m for Churchill's height in the video instead of the 1.76 m figure above since his knees are bent slightly. Using that information to scale the video, Stalin's maximum height (Δy) is 0.65 meters and the maximum compression of Churchill's belly (x) is 0.28 m.
Finding Stalin's mass will require another long and probably fruitless internet search. Instead, I'm going to assume from the above picture Stalin is approximately the same height as Harry S. Truman and then assume Stalin's BMI is slightly above average (he was a dictator- which means he had access to lots of food). I'm going to say Stalin's BMI is 26. According to this BMI calculator, that would give Stalin a weight of 175 lbs, or 79.4 kg.
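Since BMI is just mass divided by height squared, that last step is easy to check without the calculator website. A quick sketch (using the assumed 1.75 m height and BMI of 26; the slight difference from the calculator's 175 lbs is presumably rounding somewhere along the way):

```python
# Estimate a mass from an assumed height and BMI.
# BMI = mass (kg) / height (m)^2, so mass = BMI * height^2.
height_m = 1.75  # assumed: same height as Harry S. Truman
bmi = 26         # assumed: slightly above average, dictator diet

mass_kg = bmi * height_m ** 2
mass_lbs = mass_kg * 2.205  # rough kg-to-pounds conversion
print(f"mass ≈ {mass_kg:.1f} kg ({mass_lbs:.0f} lbs)")
```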
Now we've precisely (ha.) figured out all our variables, so we can go ahead and solve the equation for the spring constant (k):

$$k = \frac{2mg\Delta y}{x^2} = \frac{2(79.4\ \mathrm{kg})(9.8\ \mathrm{m/s^2})(0.65\ \mathrm{m})}{(0.28\ \mathrm{m})^2} \approx 12{,}900\ \mathrm{N/m}$$
OK, so what's that mean? It means that if you could compress Winston Churchill's belly by a full meter, it would require 12,900 newtons of force. On the surface of the Earth, it would take a mass of 1,315 kg (2,900 lbs) sitting on his belly to compress it by a full meter2. WolframAlpha helpfully notes that this is approximately the mass of 2 "typical dairy cows."
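If you'd like to rerun the numbers yourself, here's the energy-balance arithmetic as a quick script (values taken from the Tracker measurements above):

```python
# Spring constant of Churchill's belly via energy conservation:
# Stalin's gravitational PE at the top of his bounce equals the
# elastic PE at maximum belly compression, m*g*dy = (1/2)*k*x**2.
g = 9.8     # m/s^2, acceleration due to gravity
m = 79.4    # kg, Stalin's estimated mass
dy = 0.65   # m, Stalin's max height above the belly
x = 0.28    # m, max compression of the belly

k = 2 * m * g * dy / x ** 2
print(f"k ≈ {k:,.0f} N/m")

# Mass needed to compress the belly a full meter under gravity:
print(f"equivalent mass: {k / g:,.0f} kg")
```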
We can also learn something about the Animaniacs' collective mass now that we know the spring constant. If we rearrange the previous equation to solve for the mass, we get:

$$m = \frac{kx^2}{2g\Delta y}$$
It looks like the maximum height the Animaniacs attain is 0.77 m with a maximum belly compression of 0.16 m. Now solving for the mass we find:

$$m = \frac{(12{,}900\ \mathrm{N/m})(0.16\ \mathrm{m})^2}{2(9.8\ \mathrm{m/s^2})(0.77\ \mathrm{m})} \approx 21.9\ \mathrm{kg}$$
Collectively the three Animaniacs have a mass of 21.9 kg (48.3 lbs). Wow. They're lighter than I anticipated. If you divide that figure evenly by three, the average Animaniac weight is 16.1 lbs. Clearly Dot and Wakko are smaller than Yakko. This may, in fact, prove Dot's hypothesis that in addition to being cute, she's a cat:
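For completeness, the rearranged calculation as a script (again using the Tracker measurements):

```python
# Collective mass of the Animaniacs, from the same energy balance
# rearranged for the bouncing mass: m = k*x**2 / (2*g*dy).
g = 9.8     # m/s^2
k = 12900   # N/m, spring constant of Churchill's belly (found above)
dy = 0.77   # m, Animaniacs' max height above the belly
x = 0.16    # m, max belly compression under the Animaniacs

m = k * x ** 2 / (2 * g * dy)
print(f"m ≈ {m:.1f} kg ({m * 2.205:.1f} lbs)")
```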
Also, I came across a few places that speculated that Stalin may have used elevator shoes to make himself seem taller, so it might be harder to get an accurate figure for him. However, this isn't exactly going to be a super-accuracy fest anyway, so maybe I shouldn't let that bother me. (back)
I'm not sure if Churchill actually has a meter of stomach to depress, but you get the idea. (back)
In a recent post, I strongly suggested that a physics class should be a place where students are actively involved in the exploration of the relationships that exist between different variables (force and mass, for example)- not a place where students are simply given a list of equations they are told explain how the world works. Let's continue down this line with an example.
Example: Simple Harmonic Motion and Hooke's Law
This is a lab from a college class I took last semester:
This lab isn't terrible. I mean, who doesn't like bouncing springs?
In the first part, we were required to find the spring constant by examining the relationship between the force applied to the spring and the spring's elongation using a graph. That's not too shabby, right? Well...no...but...
What the lab doesn't require is any thinking about the relationship between force and elongation. You make a nice graph, but are told right in the instructions that the slope of the graph is this thing called the "spring constant." We aren't expected to understand anything more about how force, elongation, and the spring constant are related.
In part two, we varied the mass on the spring and measured the period of the spring's oscillation, which we then compared to the expected period based upon our calculations and a formula we were given ahead of time:

$$T = 2\pi\sqrt{\frac{m}{k}}$$
I didn't need to know much to write up the lab report:
The period of a spring's oscillation depends on the mass attached to the spring.
The formula we were given to find the period of a spring's oscillation works.
That's it. If I was an astute student I might've realized that the slope of a Force-Elongation graph will give you the spring constant- but we were walked through that step in such a way that it would have been easy to miss that tidbit. Never mind understanding what having a larger or smaller spring constant would mean in real life.
Rethinking the lab
So now you're thinking that I'm just a cranky-pants who likes pointing out the failings of other people's labs. Let me try to improve your perception of me by explaining how I'd like to run a lab covering the same content.
First, I think it's important to identify what I want students to understand as a result of completing this activity. I'd like them to understand:
The nature of the relationship between the force applied to a spring and the spring's elongation.
The slope of a Force-Elongation plot is the "spring constant."
The nature of the relationship between the mass hanging on a spring and the spring's oscillation period.
Second, I want the students to be the primary investigators. I'm not going to give them a sheet explaining step by step exactly what they have to do. I want the students to handle that part. Maybe I give each group of students a few springs and a set of masses and simply set them free to play around and make observations for 10 minutes or so- after which we discuss as a class observations they have made and decide upon a path for further investigation. Maybe I give some guidance right away and tell them to investigate the relationship between the mass on the spring and the elongation of the spring.
Third, we draw some Force-Elongation graphs. We discuss the relationship between force and spring elongation (it should be pretty obvious it's a direct linear proportionality- i.e., if you double the force on the spring, you double its elongation). So now we know that F ∝ x. Next, we look at the difference in the graphs for each spring. Why are some lines steeper than others? What is the difference between a spring with a steep slope and a spring with a more gradual slope? Then I'd explain that the slope on a Force-Elongation graph is called the "spring constant." So now we've figured out that if we know the force acting on a spring and that spring's spring constant, we can figure out how much the spring will stretch: F = kx. Hey...that looks an awful lot like Hooke's Law...
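Extracting the spring constant from the students' data is just a linear fit of force against elongation. Here's a minimal sketch of that calculation- all the measurements below are made-up illustrative numbers, not real lab data:

```python
import numpy as np

# Hypothetical measurements for one spring: hanging masses (kg)
# and the resulting elongations (m). Illustrative values only.
masses = np.array([0.05, 0.10, 0.15, 0.20, 0.25])
elongations = np.array([0.049, 0.098, 0.147, 0.196, 0.245])

g = 9.81                 # m/s^2
forces = masses * g      # weight of each hanging mass (N)

# Linear fit of force vs. elongation; the slope is the spring constant k.
k, intercept = np.polyfit(elongations, forces, 1)
print(f"spring constant k = {k:.1f} N/m")
```

A steeper line (bigger k) means a stiffer spring- which is exactly the comparison the students make when they look at their different springs' graphs side by side.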
Fourth, I'd play this video clip:
Fifth, I'd tell students to investigate the relationship between the amount of mass on a spring and the period of the spring's oscillation. We'd collect data, make some graphs, and hopefully come to the conclusion that T ∝ √m.
If we stop here, we've already done a lot. We've discovered Hooke's Law. We understand a stiffer spring has a bigger spring constant. We know how doubling the mass on a spring will affect the spring's oscillation. At this point I could introduce the equation T = 2π√(m/k). Maybe we could then do the second part of the lab posted above and see how closely the observed periods of the springs match the values calculated with that formula. We'd probably notice all of our observed periods were off by a little bit. This opens up a discussion of why we all have this systematic error. Why are we all off? What could be off? Looking at the formula, there are really only two places we could have error: the spring constant or the mass. Maybe we draw a free-body diagram for the mass on the spring. At this point a student will probably suggest we need to draw a free-body diagram for the spring as well. Hmm...you know...this spring has mass too...could the mass of the spring itself be affecting the spring's period? Now we've independently figured out we need to consider the spring's mass as well. From there we could figure out a test to determine how much of the spring's mass we need to include.
Overcoming the traditional lab format
If you randomly visited physics courses in high schools and colleges across the nation, you'd most likely see a lot of labs similar to the first lab. Traditionally physics labs have been designed so you're given a formula and are asked to make observations that fit with the formula. This is despite the fact that the student-led investigation requires deeper thinking and encourages greater engagement with the concepts, a better understanding of how the world works, and an understanding of what an equation actually means.
Why should this be so? I believe it's because traditional labs are easy. Print out a sheet with a step by step procedure. Hand out the supplies. Make some measurements. Maybe make a graph. Answer a couple quick questions. Done. The student-led investigation is trickier to share and explain. The entire process I described in the student-led investigation could be performed without any worksheets whatsoever. It's harder for teachers looking for a new lab to stumble on a description of this type of lab. It's really easy to hit up The Google and find a lab handout, save it, print it, and pass it out. Student-led investigations also open the door to student errors. Students may struggle. It may take more class time. Sometimes you'll get data that doesn't turn out as well as you'd like. This can be scary and frustrating for teachers. And yet...
Struggling with what this or that graph is telling us, or being forced to think about where errors came from, or having to defend your results and process requires a lot of thinking. Critical thinking. And helping students learn to think critically is worth the extra time and effort. As a bonus, they'll also actually understand the physics better, which is also a good thing in a physics class.
These aren't brand new items, as they're things I came across a while ago and am just getting around to posting now. In addition, I realized that the anniversary of this blog just passed. My first post was published January 12, 2008. As I look back at my first posts, it's clear that I've come a long way (hopefully for the better)- in my location, in my career, and in my thinking. So, in celebration of the 4th anniversary of this blog, let me present you with the following interesting tidbits:
Thanks to the ActiveGrade blog for bringing this to my attention. I don't know how many times I've had discussions with other teachers on the topic of what constitutes fair and effective grading. Often the most heated topic (where I never made any headway) involved the giving out of zeroes for either missing or poorly done classwork. Rick Wormelli gives a great explanation of why grading scales matter- and specifically why zeroes are no good. It's long for YouTube at 8+ minutes, but it's worth it:
The post is ostensibly a takedown of Judith Curry's claims that recent studies and reports on the topic of climate change are "hiding the decline1." However, the real appeal of this post (for me) is how it so effectively describes how science and scientists work. He goes through the data, the uncertainties in measurement, and explains how exactly it is that scientists determine that some effect is real and not just a statistical fluke.
Somewhat related, the Skeptical Science blog (one of the best places to find science-based information about climate science) released The Debunking Handbook a while ago and just recently updated it. The Handbook provides guidelines for communicating about misinformation and gives tips to avoid falling into common pitfalls. In their own words, "The Handbook explores the surprising fact that debunking myths can sometimes reinforce the myth in peoples' minds. Communicators need to be aware of the various backfire effects and how to avoid them..." The handbook is a free PDF download available at their website.
"Hiding the decline" is the (totally false) idea that climate scientists are tweaking their graphs to make it seem like the Earth is getting warmer, when it really has been cooling the last decade (which it hasn't). Read the full article for more details. (back)