## Master's Project: Self-directed learning in the science classroom

Well...to be precise, it's titled "Implementation of a technology-rich self-directed learning environment in a ninth grade Integrated Science classroom." Catchy, I know.

To be honest, this is a bit old. I thought I had posted this a long time ago, but recently realized I never had despite always meaning to do so. I implemented this project in the spring of 2010 and officially submitted my project in June of the same year. It won me a "Scholar of Excellence" award, so it must be at least somewhat decent.

#### The Goods

Though the full paper may not be of interest to you, let me recommend the Lit Review. I went through many, many papers on constructivist environments and instructional technology's impact on student learning. It'd make me very happy if anybody found this even remotely useful.

I've decided to release it under a Creative Commons Attribution license, so have at it. Here's the full paper in a variety of formats for any of your consumption needs:

• Implementation of a technology-rich self-directed learning environment in a ninth grade Integrated Science classroom

#### Description

Simply put, students worked in teams of four to five and shared a team blog. Students investigated any topic that interested them around the general theme of climate change. Students were tasked with researching the topic and sharing their learning and questions on their blog. There were no due dates (other than the end of the school year), though students were all required to write a certain number of posts and comments on their classmates' posts (for more details, check out the Project Design section of the paper). For a bit on the rationale, here's an excerpt from the Introduction and Rationale:

The purpose of the educational system in the United States has been described in many different ways depending on the viewpoint of the individual doing the describing. Creating individuals able to become positive members of society, providing skills for the future workforce, or preparing individuals for an uncertain future have all been cited by various people and organizations as the purpose of schooling- each relying on their own value set and particular social and political biases. While there is no doubt that these various beliefs about the purpose of the American educational system have been true, and may continue to be true in various times and places, it is this author's belief that one of the more important goals of the educational system is to create life-long learners who will be able to actively and knowledgeably engage in whatever ideas and issues may cross their paths. As specific information and skill-sets are quickly changing due to the rapid increases in knowledge and improvements in technology, the importance of teaching students specific content information decreases while the importance of teaching students how to locate, evaluate, and interact with knowledge increases. As what it means to be a productive member of society or an effective member of the workforce changes, the ability of individuals to learn new knowledge when they need it is more valuable than simply falling back on information learned through formal schooling.

If schools are to become a place where students learn how to interact with, challenge, and develop new knowledge, then the traditional classroom structure- that of the teacher as the primary source of knowledge and assessment- needs to change as well. Students should be given a chance to work out the solutions to problems that do not have predefined answers. In doing so, students lose their status as passive recipients of information and instead become active creators of knowledge. A method of implementing this might be built on the problem-based learning (PBL) model that has been used for many years in many content areas with various age levels. The incarnation of PBL envisioned here provides students with real-world problems to solve that do not already have easy or "neat" answers, gives students the freedom to explore down side canyons as part of the problem solving process, allows time for students to share their ideas and work with others, and provides support and time for students to document and reflect on their learning and problem solving process.

Let me know what you think or if you found anything useful for your own purposes.

## Dear Skeptics' Guide: Standards aren't the solution

There's a widespread narrative regarding science education in the United States: It stinks. As a science educator, whenever I hear this, two things happen. First, I get my hackles all up. Second, I realize that despite my hackles I generally agree. I get my hackles up because I've spent a lot of time thinking about, planning, designing, and implementing a science curriculum that I feel has been pretty darn good. However, I recognize that the School System (I'm not picking at any one school district here, but instead at the entire system of schooling in this country) has not done a very good job of helping students to think and act like scientists.

Recently, while listening to the Skeptics' Guide to the Universe podcast (#343), I had my hackles raised. They discussed a recent article on io9 titled, "Your State Sucks at Science.1" This article discussed a report by the Thomas B. Fordham Institute that graded each state's standards on their "Content & Rigor" and "Clarity & Specificity." The results (summarized on the map below) showed that the vast majority of states didn't do so well. In fact, they did terribly.

OK, that information doesn't shock, surprise, or upset me. Connecticut earned a not-so-respectable "C." I'd probably give the standards I've worked with (9th grade Integrated Science) a lower grade. Many standards are overly broad. Others are ambiguous. I agree with the Skeptics' Guide, io9, and the Thomas B. Fordham Institute that improving these standards would be a good thing for science education.

So, why are my hackles still raised? Well...during the Skeptics' Guide to the Universe (SGU) discussion on the sorry state of science education, the general view was that poor standards are the crux of the problem (followed by poor teachers- more on this later). It was stated that poor standards will cause teachers to fail their students more often than they would if states had good standards. As anecdotal evidence of this, Dr. Steven Novella noted his daughter is receiving a sub-par science education at the Connecticut public school she attends. Dr. Novella specifically described his two big problems with his daughter's science instruction:

1. Inquiry and scientific thinking are not taught well at all.
2. Real science education doesn't even really begin until the 7th grade. In grade schools they get virtually nothing.

I generally agree with these assertions. What really bothered me, however, was the discussion of why these problems exist. Here are some quotes from the discussion:

• "Teachers don't quite grasp how science works."
• "When the standards fail the teachers, the teachers will more likely fail the students."

Can you see where this is going? They never come right out and say science education stinks because our science teachers stink, but that idea is hovering just beneath the surface. I readily admit there are science educators who don't quite grasp how science works and who don't do a great job of designing science instruction. However, I believe this is more of a systemic issue than an individual teacher issue. Let's look at Dr. Novella's two assertions again:

1. Inquiry and scientific thinking are not taught well at all.
• Education is a high-stakes testing world these days. What's valued by our current schooling system are good scores on standardized tests, so effective teachers are labeled as those who help students earn good scores on standardized tests. However, it can be tricky to assess inquiry and scientific thinking. The best way to assess these skills is to observe students performing scientific inquiry (or at least look at a portfolio of student work) to gauge the level of sophistication in scientific inquiry and thinking the student possesses. So, let's look at how Connecticut assesses science: The Connecticut Mastery Test (given in grades 3-8) and the Connecticut Academic Performance Test (given to 10th graders) both assess "experimentation and the ability to use scientific reasoning to solve problems2." The CAPT science test includes 60 multiple choice and 5 open-response questions. The 5th grade CMT science test includes 36 multiple choice and 3 open-response questions, and the 8th grade edition has 55 multiple choice and 3 open-response questions3. Multiple choice questions- even well designed items- are a shoddy way to measure inquiry. Even the open-response questions that require several sentences to answer aren't a very good measure. Yet this is the system of assessment we value, and this system of assessment doesn't value inquiry, so why are we surprised when inquiry and scientific thinking take a backseat in the classroom? The problem doesn't start with the teachers; it starts with our method of assessment.
2. Real science education doesn't even really begin until the 7th grade. In grade schools they get virtually nothing.
• Again, let's look at what our schools value by looking at what they assess: The CMT is given to every 3rd through 8th grader attending Connecticut public schools. Every year from the third grade and on, students are assessed in mathematics and language arts. Only 8th graders took a science CMT through 2007. Starting in 2008 the state added a science CMT to the 5th grade as well. Why is science instruction getting the short end of the stick? Because we're not assessing it. The focus on math and language arts isn't a bad thing, but it means that subjects not being assessed are being pushed to the side. This isn't the fault of the teachers stuck in this system- it's the fault of the system itself.

#### What's the solution?

I am not advocating for giving more science standardized tests. I have no problem with improving our science standards. However, unless we change the current methods of assessment, I wouldn't expect to see much change. To learn scientific thinking and inquiry, students must be given time in class to explore ideas, rethink assumptions, and test their hypotheses. These things take a lot of class time- and furthermore, they deserve a lot of class time. Having lots of well-written standards is generally a good thing, but it also means teachers are pressured to "cover" all the standards to the detriment of depth of understanding and student exploration.

Dear SGU, you are science educators yourselves, and I love most of what you do. However, I'd like you to think and talk more deeply about what good science education in schools looks like and whether that vision is being supported by the assessment methods employed by the states. A wise person once said, "What we assess defines what we value.4" I'd add "How we assess defines what we value," as well. If we value inquiry and scientific thinking, our assessments should be more sophisticated- requiring students to actively demonstrate their understanding of how science works. These assessments would be expensive to design and implement but would more accurately reflect students' actual scientific knowledge and skills. It's not that I think the SGU hates teachers, but you do seem to be jumping on the political narrative that has been placing undue blame for poor education practices on the shoulders of teachers instead of including the systemic forces that shape how and why teachers deliver instruction in the classroom.

______________________________

1. The discussion starts about 27 minutes into the episode and runs for 10 minutes on this topic. (back)
2. See 2011 CAPT Interpretive Guide, p. 5. http://www.csde.state.ct.us/public/cedar/assessment/capt/resources/misc_capt/2011%20CAPT%20Interpretive%20Guide.pdf (back)
3. Question information from the CAPT Program Overview 2012, p. 11, http://www.csde.state.ct.us/public/cedar/assessment/capt/resources/misc_capt/CAPT%20program%20overview%202012.pdf and Science CMT Handbook, p. 8, http://www.sde.ct.gov/sde/lib/sde/pdf/curriculum/science/science_cmt_handbook.pdf (back)
4. This was a Grant Wiggins quote, I believe. (back)

## The spring constant of Winston Churchill's belly

This will be the most scientific and precise post regarding Winston Churchill's belly you'll read today. Maybe all week.

Today, we'll be analyzing the following video:

After randomly embedding the preceding video while thinking about Hooke's Law and the spring constant in my last post, the question that I, and I'm sure you as well, immediately wondered was, of course: what kind of spring constant did Winston Churchill's belly have? This seems like something worthy of my time.

#### Here we go!

If we're going to figure this out, we need some data. First, we need some sense of scale. Since I have no idea how tall the Animaniacs are, let's focus on the historical figures. I'm going to go with Winston Churchill's height to give the video some scale since he's pretty stretched out whilst his belly is being jumped upon1. It's surprisingly hard to find Churchill's height online with any sort of citation. I found what seems like a pretty solid source (via Wikipedia) for the height of Harry S. Truman (1.75 m). Using that information along with the following picture, I can figure out Churchill's height after throwing the image into Tracker:

Churchill and Truman were nearly the same height. I got 1.76 m (5 ft, 9 in) for Churchill. That seems pretty close to most of the unsourced figures for his height I found online.

I think the best way to go about finding the spring constant for Winston Churchill's belly is to use gravitational potential energy and elastic potential energy. If we can find the gravitational potential energy Stalin has at the top of his bounce and the maximum compression of Churchill's belly, we should be able to do the following:

$mg\Delta y = \frac{1}{2}kx^2 \\ \\ k = \dfrac{2mg\Delta y}{x^2}$

Where m is Stalin's mass, Δy is Stalin's maximum height above Churchill's belly, and x is the maximum compression of Churchill's belly.

I can fairly easily find Δy and x using Tracker to analyze the video.

I used 1.70 m for Churchill's height in the video instead of the 1.76 m figure above since his knees are bent slightly. Using that information to scale the video, Stalin's maximum height (Δy) is 0.65 meters and the maximum compression of Churchill's belly (x) is 0.28 m.

Finding Stalin's mass will require another long and probably fruitless internet search. Instead, I'm going to assume from the above picture that Stalin is approximately the same height as Harry S. Truman and then assume Stalin's BMI is slightly above average (he was a dictator- which means he had access to lots of food). I'm going to say Stalin's BMI is 26. According to this BMI calculator, that would give Stalin a weight of 175 lbs, or 79.4 kg.

Now we've precisely (ha.) figured out all our variables, so we can go ahead and solve the equation for the spring constant (k):

$k = \dfrac{2mg\Delta y}{x^2} \\ \\ \\ k = \dfrac{2(79.4\text{ kg})(9.8\text{ m/s}^2)(0.65\text{ m})}{(0.28\text{ m})^2} \\ \\ \\ k = 12,900\text{ N/m}$

OK, so what does that mean? It means that compressing Winston Churchill's belly by a full meter would require 12,900 newtons of force. On the surface of the Earth, it would take a mass of 1,315 kg (2,900 lbs) sitting on his belly to compress it by a full meter2. WolframAlpha helpfully notes that this is approximately the mass of 2 "typical dairy cows."
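If you'd like to check my arithmetic (or swap in your own estimates for Stalin's mass or bounce height), here's a quick sketch of the calculation in Python, using the rough figures from above:

```python
# Rough estimates from the video analysis above
g = 9.8           # gravitational acceleration (m/s^2)
m_stalin = 79.4   # Stalin's estimated mass (kg), from the BMI guess
dy = 0.65         # Stalin's max height above the belly (m)
x = 0.28          # max compression of Churchill's belly (m)

# Energy balance: m*g*dy = (1/2)*k*x^2, solved for k
k = 2 * m_stalin * g * dy / x**2
print(f"k ~ {k:.0f} N/m")  # about 12,900 N/m
```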

We can also learn something about the Animaniacs' collective mass now that we know the spring constant. If we rearrange the previous equation to solve for the mass, we get:

$m = \dfrac{kx^2}{2g\Delta y}$

It looks like the maximum height the Animaniacs attain is 0.77 m with a maximum belly compression of 0.16 m. Now solving for the mass we find:

$m = \dfrac{(12900\text{ N/m})(0.16\text{ m})^2}{2(9.8\text{ m/s}^2)(0.77\text{ m})} \\ \\ \\ m = 21.9\text{ kg}$

Collectively the three Animaniacs have a mass of 21.9 kg (48.3 lbs). Wow. They're lighter than I anticipated. If you divide that figure evenly by three, the average Animaniac weight is 16.1 lbs. Clearly Dot and Wakko are smaller than Yakko. This may, in fact, prove Dot's hypothesis that in addition to being cute, she's a cat:
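The same sanity check works for the Animaniacs' collective mass, reusing the spring constant found above:

```python
g = 9.8      # gravitational acceleration (m/s^2)
k = 12900    # spring constant of Churchill's belly (N/m), from above
dy = 0.77    # Animaniacs' maximum height (m)
x = 0.16     # maximum belly compression (m)

# Rearranged energy balance: m = k*x^2 / (2*g*dy)
m = k * x**2 / (2 * g * dy)
print(f"m ~ {m:.1f} kg")  # about 21.9 kg
```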


______________________________

1. Also, I came across a few places that speculated that Stalin may have used elevator shoes to make himself seem taller, so it might be harder to get an accurate figure for him. However, this isn't exactly going to be a super-accuracy fest anyway, so maybe I shouldn't let that bother me. (back)
2. I'm not sure if Churchill actually has a meter of stomach to depress, but you get the idea. (back)

## Worksheet labs aren't that great: Hooke's Law

In a recent post, I strongly suggested that a physics class should be a place where students are actively involved in the exploration of the relationships that exist between different variables (force and mass, for example)- not a place where students are simply given a list of equations they are told explain how the world works. Let's continue down this line with an example.

#### Example: Simple Harmonic Motion and Hooke's Law

This is a lab from a college class I took last semester:

#### Analysis

This lab isn't terrible. I mean, who doesn't like bouncing springs?

In the first part, we were required to find the spring constant by examining the relationship between the force applied to the spring and the spring's elongation using a graph. That's not too shabby, right? Well...no...but...

What the lab doesn't require is any thinking about the relationship between force and elongation. You make a nice graph, but are told right in the instructions that the slope of the graph is this thing called the "spring constant." We aren't expected to understand anything more about how force, elongation, and the spring constant are related.

In part two, we varied the mass on the spring and measured the period of the spring's oscillation, which we then compared to the expected period based upon our calculations and a formula we were given ahead of time:

$T = 2 \pi \sqrt{ \dfrac{m}{k}}$

I didn't need to know much to write up the lab report:

1. The period of a spring's oscillation depends on the mass attached to the spring.
2. The formula we were given to find the period of a spring's oscillation works.

That's it. If I were an astute student I might've realized that the slope of a Force-Elongation graph gives you the spring constant- but we were walked through that step in such a way that it would have been easy to miss that tidbit. Never mind understanding what having a larger or smaller spring constant would mean in real life.

#### Rethinking the lab

So now you're thinking that I'm just a cranky-pants who likes pointing out the failings of other people's labs. Let me try to improve your perception of me by explaining how I'd like to run a lab covering the same content.

First, I think it's important to identify what I want students to understand as a result of completing this activity. I'd like them to understand:

1. The nature of the relationship between the force applied to a spring and the spring's elongation.
2. The slope of a Force-Elongation plot is the "spring constant."
3. The nature of the relationship between the mass hanging on a spring and the spring's oscillation period.

Second, I want the students to be the primary investigators. I'm not going to give them a sheet explaining step by step exactly what they have to do. I want the students to handle that part. Maybe I give each group of students a few springs and a set of masses and simply set them free to play around and make observations for 10 minutes or so- after which we discuss as a class the observations they have made and decide upon a path for further investigation. Maybe I give some guidance right away and tell them to investigate the relationship between the mass on the spring and the elongation of the spring.

Third, we draw some Force-Elongation graphs. We discuss the relationship between force and spring elongation (it should be pretty obvious it's a direct linear proportionality- i.e., if you double the force on the spring, you double its elongation). So now we know that $F \propto x$. Next, we look at the difference in the graphs for each spring. Why are some lines steeper than others? What is the difference between a spring with a steep slope and a spring with a more gradual slope? Then I'd explain that the slope of a Force-Elongation graph is called the "spring constant." So now we've figured out that if we know the force acting on a spring and that spring's spring constant, we can figure out how much the spring will stretch: $F=kx$. Hey...that looks an awful lot like Hooke's Law...
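To make the graphing step concrete, here's a sketch of how a group might pull the spring constant out of their own measurements. The force and elongation numbers are made up for illustration; any real data set would do:

```python
# Hypothetical measurements for one spring
elongation = [0.02, 0.04, 0.06, 0.08, 0.10]   # meters
force = [0.50, 0.98, 1.52, 1.99, 2.51]        # newtons

def slope(xs, ys):
    """Least-squares slope of ys vs. xs -- the spring constant k in F = k*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

k = slope(elongation, force)
print(f"spring constant ~ {k:.1f} N/m")  # roughly 25 N/m for this made-up data
```

A steeper slope means a stiffer spring- which is exactly the comparison between springs I'd want students arguing about.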

Fourth, I'd play this video clip:

Fifth, I'd tell students to investigate the relationship between the amount of mass on a spring and the period of the spring's oscillation. We'd collect data, make some graphs, and hopefully come to the conclusion that $T \propto \sqrt{m}$.

If we stop here, we've already done a lot. We've discovered Hooke's Law. We understand a stiffer spring has a bigger spring constant. We know how doubling the mass on a spring will affect the spring's oscillation. At this point I could introduce the equation $T = 2 \pi \sqrt{ \frac{m}{k}}$. Maybe we could then do the second part of the lab posted above and see how closely the observed periods of the springs match the values calculated with that formula. We'd probably notice all of our observed periods were off by a little bit. This opens up a discussion of why we all have this systematic error. Why are we all off? What could be off? Looking at the formula, there are really only two places we could have error: the spring constant or the mass. Maybe we draw a free-body diagram for the mass on the spring. At this point a student will probably suggest we need to draw a free-body diagram for the spring as well. Hmm...you know...this spring has mass too...could the mass of the spring itself be affecting the spring's period? Now we've independently figured out we need to consider the spring's mass as well. From there we could figure out a test to determine how much of the spring's mass we need to include.
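That end-of-lab discussion can be sketched numerically too. A common correction for a uniform spring (not part of the original lab handout- it's the standard "effective mass" result) is to add one third of the spring's own mass to the hanging mass. The values below are purely illustrative:

```python
import math

k = 25.0          # spring constant (N/m), illustrative
m = 0.200         # hanging mass (kg)
m_spring = 0.060  # mass of the spring itself (kg)

# Ideal formula from the lab handout
T_ideal = 2 * math.pi * math.sqrt(m / k)

# Common correction: include one third of the spring's mass
T_corrected = 2 * math.pi * math.sqrt((m + m_spring / 3) / k)

print(f"ideal:     {T_ideal:.3f} s")      # about 0.562 s
print(f"corrected: {T_corrected:.3f} s")  # slightly longer, as students would observe
```

The corrected period is always a bit longer than the ideal one- the same direction as the systematic error students would see in their data.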

#### Overcoming the traditional lab format

If you randomly visited physics courses in high schools and colleges across the nation, you'd most likely see a lot of labs similar to the first lab. Traditionally, physics labs have been designed so you're given a formula and are asked to make observations that fit with the formula. This is despite the fact that the student-led investigation requires deeper thinking, encourages greater engagement with the concepts, and builds a better understanding of how the world works and of what an equation actually means.

Why should this be so? I believe it's because traditional labs are easy. Print out a sheet with a step-by-step procedure. Hand out the supplies. Make some measurements. Maybe make a graph. Answer a couple quick questions. Done. The student-led investigation is trickier to share and explain. The entire process I described in the student-led investigation could be performed without any worksheets whatsoever. It's harder for teachers looking for a new lab to stumble on a description of this type of lab. It's really easy to hit up The Google and find a lab handout, save it, print it, and pass it out. Student-led investigations also open the door to student errors. Students may struggle. It may take more class time. Sometimes you'll get data that doesn't turn out as well as you'd like. This can be scary and frustrating for teachers. And yet...

Struggling with what this or that graph is telling us, or being forced to think about where errors came from, or having to defend your results and process requires a lot of thinking. Critical thinking. And helping students learn to think critically is worth the extra time and effort. As a bonus, they'll also actually understand the physics better, which is also a good thing in a physics class.

## 3+ Quick- Birthday, (grading) scale matters, exposing climate fraud, debunking handbook

These aren't brand new items, as they're things I came across a while ago and am just getting around to posting now. In addition, I realized that the anniversary of this blog just passed. My first post was published January 12, 2008. As I look back at my first posts, it's clear that I've come a long way (hopefully for the better)- in my location, in my career, and in my thinking. So, in celebration of the 4th anniversary of this blog, let me present you with the following interesting tidbits:

#### Scale matters (Rick Wormelli)

Thanks to the ActiveGrade blog for bringing this to my attention. I don't know how many times I've had discussions with other teachers on the topic of what constitutes fair and effective grading. Often the most heated topic (where I never made any headway) involved the giving out of zeroes for either missing or poorly done classwork. Rick Wormelli gives a great explanation of why grading scales matter- and specifically why zeroes are no good. It's long for YouTube at 8+ minutes, but it's worth it:

#### Exposing a climate science fraud (Ethan Siegel)

The post is ostensibly a takedown of Judith Curry's claims that recent studies and reports on the topic of climate change are "hiding the decline1." However, the real appeal of this post (for me) is how effectively it describes how science and scientists work. He goes through the data and the uncertainties in measurement, and explains exactly how scientists determine that some effect is real and not just a statistical fluke.

#### The Debunking Handbook (Skeptical Science)

Somewhat related, the Skeptical Science blog (one of the best places to find science-based information about climate science) released The Debunking Handbook a while ago and just recently updated it. The Handbook provides guidelines for communicating about misinformation and gives tips to avoid falling into common pitfalls. In their own words, "The Handbook explores the surprising fact that debunking myths can sometimes reinforce the myth in peoples' minds. Communicators need to be aware of the various backfire effects and how to avoid them..." The handbook is a free PDF download available at their website.

______________________________

1. "Hiding the decline" is the (totally false) idea that climate scientists are tweaking their graphs to make it seem like the Earth is getting warmer, when it really has been cooling the last decade (which it hasn't). Read the full article for more details. (back)

## What is the purpose of Physics class?

I took three physics classes through a local community college last semester. From how the content was presented in each class, it would be fair to say Physics is primarily concerned with learning a set of equations and then figuring out which equation you need to use in order to find the right answer.

This is not a very useful skill. People wiser than I have pointed out similar things. So why do high school and introductory college physics classes lean so heavily on "learning the formulas?" Here are the two arguments I've heard most often:

#### They'll need it in college/their careers

It could be argued, perhaps, that it is good preparation for students who will be pursuing engineering or scientific careers- after all, they'll be taking college classes and graduate classes and probably use a couple equations during their careers. However, there's a big problem with this line of thinking. Are all the students in a high school physics class there because they're planning on becoming scientists and engineers? A few, maybe. Most of them will not- and that's OK, but this realization should cause us to rethink how we present the material.

#### The equations explain the relationship between variables

I'm sympathetic towards this line of thinking (more on this later)- but not enough to think it's valid. Whenever I hear this argument, the first question that comes to mind is "Is this the best way to explore those relationships?" In my experience, students who struggled to understand physics often did so because they couldn't make sense of what the equations actually describe. Given an equation and all the variables but one, they'd be able to work through a problem, but they didn't understand why the answer made sense, and any further obfuscation of the problem quickly threw them off track. I agree that the relationship between variables is an important bit. I don't believe that equations clarify that relationship for the vast majority of students.

#### How I'd like to teach physics

Understanding the relationship between variables, in my mind, is the key to a useful understanding of physics. If I push twice as hard on this shopping cart, what happens to the cart's acceleration? That's a tangible situation that is easier to understand than simply throwing out F = ma and hoping students figure out that relationship on their own. Further, students should discover these relationships. Give students some equipment and tools and have them measure what happens to an object's acceleration as they apply more or less force to the object (some tracking software would be really handy for this). Then have them apply the same force but change up the mass. Chances are pretty good they'll discover F = ma on their own- and they'll have a much better conceptual understanding of what F = ma means than if you simply gave them the equation and had them do some problems. Or if you simply had them prove the formula is correct in a lab.
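The pattern students should discover is simple enough to sketch in a few lines. This isn't a substitute for the hands-on measurement; it just shows the relationships in question (the numbers are arbitrary):

```python
def acceleration(force, mass):
    """Newton's second law rearranged: a = F / m."""
    return force / mass

a_base = acceleration(10.0, 2.0)           # baseline: 5.0 m/s^2
a_double_force = acceleration(20.0, 2.0)   # doubling the force...
a_double_mass = acceleration(10.0, 4.0)    # doubling the mass...

print(a_double_force / a_base)  # 2.0 -- twice the force, twice the acceleration
print(a_double_mass / a_base)   # 0.5 -- twice the mass, half the acceleration
```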

#### Why it matters

1. I believe the focus on relationships promotes a better conceptual understanding of physics- the students can more effectively internalize the way the world around them works. A populace with a healthy baseline of physics knowledge could prevent silly and potentially harmful pseudoscience such as magnet therapy from becoming an issue.
2. There's been a focus on increasing interest in STEM careers- and a special focus on recruiting women and minorities into STEM fields (see this White House press release). An equation-focused physics curriculum can seem intimidating to students. A collaborative, constructivist approach can be perceived as less intimidating and more welcoming (I'd recommend giving Episode 32 of the Shifted Learning podcast a listen for some interesting bits on gender issues in STEM).

#### Modeling Instruction

I don't have much experience with Modeling Instruction, but from my reading it seems that the instruction I've been describing is essentially what it is. As a bonus, it's a well-developed, well-researched, and widely used instructional method for improving students' ability to construct a better understanding of the physical world around them.

If you're interested, I've found both Kelly O'Shea's series on model building and Frank Noschese's primer to modeling instruction to be great resources. Check their blogrolls for even more good stuff from teachers using modeling.

As I look forward to potentially teaching physics next year I want students who take my classes to come out with a lasting understanding of the topic. I don't want them to half-heartedly memorize equations that they'll forget two weeks after we finish a unit. I'd like to teach for all the students, not just the future scientists and engineers.

## Book Review: The Students Are Watching

Somewhere along my journey of teaching, I realized I had started paying an awful lot of attention to more than just the content of what I was saying and having students do in class. I was paying attention to the messages and values communicated through the classroom rules, routines, and activities I was designing for the students. I started to purposefully promote particular values in my classroom.

There was no singular moment (that I can recall) where I decided to align the happenings of my classroom with values that I felt were important. Yet now I find myself thinking (perhaps too much) about what the classroom structures and routines are really telling students. Do they emphasize fairness? Do they treat students as valuable individuals?

"What does it tell students when we make them sign in and out to use the bathroom? That we feel they're trustworthy? That we assume they're going to abuse the privilege? Do the safety and security benefits from having a record of students out of the classroom outweigh the implicit message to students that we don't trust them? How does a school community decide upon these routines?"

Similar issues are discussed with greater clarity, insight, and detail by Nancy and Ted Sizer in their book, The Students Are Watching: Schools and the Moral Contract. They each look at the routines and rituals of a school through the lens of common verbs that happen in all schools (Modeling, Grappling, Bluffing, etc.). Throughout the book it is argued that we (as individual educators as well as school communities) need to think through how we model or grapple or bluff. We are teaching students about what things we value- whether we've taken the effort as a community to design our routines to closely match our values or not.

In my own experience, I've found schools will pay lip service to values- such as treating every student as an individual- while sadly failing to provide structures that allow students to be known as individuals. The book doesn't condemn these schools and those that work in them as hypocrites or incompetents. Instead, it points out that rules and routines are often well intended, but without specifically thinking through the procedures (and including students and parents in the decision making process) we often fall to a default mode of that which is easiest. However, the easiest routines usually put the adults' needs ahead of the students' or allow a subgroup of students to get lost in the system.

I found The Students Are Watching a challenging read. I often would stop part way through a passage and think through my own practices and how I might improve them. It doesn't purport to provide a silver bullet to solve all of a school's problems, but it does provide a tangible framework for thinking more carefully about what values our schools are actually promoting- and whether those values are the values we really want to be promoting.

Go read this book. If you need a copy, I have one I'm no longer reading. I'd be happy to send it along if you're interested- as long as you don't mind some of my messy writing in the margins (see below).

[Update]: This article on The Slacktivist is a sad example of a school's decision making process teaching the students and community about the real moral values of the school. In this case it's further exaggerated due to the school being a religious school.

## Critiquing the CAPSS Recommendations for School Reform

I want to make my classroom the best learning environment possible. Most of my posts on this site focus on lessons, assessments, or ideas on how to improve the learning environment inside my classroom. Improving our individual teaching craft is one of the easiest places (not to say it's necessarily easy) as a teacher to effect change.

However, as I've worked towards improving what happens in my classroom I've frequently run into obstacles. These obstacles were primarily exterior to my classroom. Sometimes they were school or district policies, sometimes national or state requirements, and sometimes they were the result of how we, as a culture, have historically structured this thing we call "school." Most of these policies and structures were created with good intentions in an attempt to improve our schools and our children's education.

Given my generally negative experiences with "traditional1" instructional models and structures, I've found myself more and more interested in systemic school reforms. How can we create modern schools and structures that leverage the advancements in technology and access to information to provide students with an education that prepares them to be active participants in our nation's democracy, economy, and society?

It was no surprise when an editorial in our local paper titled Major Restructuring Recommended for Schools caught my eye. In it, the author briefly describes the Connecticut Association of Public School Superintendents (CAPSS) new report, "Recommendations for Transformation," a list of recommendations to transform the state education system "so it is able to meet the needs of students in the future." Naturally, I downloaded, read, and critiqued the full 36 page report (here's the official download link [pdf file], here's a version with my commentary [pdf file]).

## My critique of the CAPSS recommendations

The report includes 134 individual recommendations for action across ten broad categories. I won't go into them all. Instead I'll give a brief breakdown of each broad category and get more specific around recommendations of particular interest.

### The tl;dr version

This is a long article. For those of you thinking, "I can't read this whole thing. There is too much," let me sum up. Speaking in sweeping generalities, I applaud the CAPSS recommendations. In many ways the recommendations are progressive, forward-thinking, and focus on the best interests of students instead of on things that would be easy to implement or get through the political process. Recommendations such as competency-based advancement, standards-based assessments, and integrating out-of-school learning experiences into the formal education process suggest that CAPSS is interested in totally reworking what we mean by "school." This makes me happy. Too often reform movements are limited by the inertia of history and that-which-already-exists. CAPSS is clearly trying to overcome this inertia. Schools that followed the recommendations in the report could be student-centered environments that have a laser-like focus on student learning, support and integrate learning experiences that occur outside the classroom, remove conventions of little educational value (e.g. letter grades, traditional homework, and adult-friendly-but-child-poor assessments), and make schools an intrinsic part of their community.

And yet CAPSS puzzlingly makes recommendations that would make schools larger, less personal, and less a part of their community. Consolidating districts might save some money- which is an important consideration- but this seems to fly in the face of entire other sections of this report (For example, Section 2: Make it Personal; Section 4: Retool Assessments & Accountability; Section 8: Involve Students & Parents). Creating fiscally sustainable school districts is important, but eliminating small community schools in favor of large regional schools fosters a disconnect between schools and their communities, lets students skate through school unknown by their teachers, and makes for an overall less personalized educational experience. Furthermore, many recommendations are so general that they're simply platitudes without any real meat to them (e.g. "Engage parents as partners in their children's education."). More detail and explanation is needed as to exactly what many recommendations are actually recommending. Lastly, how about some references? Surely (hopefully) the CAPSS group that created the report relied on more than the four citations included in this report- three of which are statistics on current educational practices. Nowhere do they cite sources to support their positions- either in this report, on their website, or any other report provided at their website.

I think CAPSS took a step in a positive direction by making many forward-thinking recommendations for the future of education in Connecticut. While none of these recommendations are binding, it heartens me to see an organization of this sort making progressive recommendations. It gives me hope there will be enough momentum to effect some real and positive educational reform in the near term. However, portions of the report conflict with the overall progressive theme- pointing towards deep elements of hesitation toward the large- and in my opinion needed- education reforms.

If you'd like a more detailed breakdown of the 10 categories of recommendations made in the CAPSS report, read on!

### 1. Raise the Bar

There are essentially two recommendations here: (1) Create "ambitious, focused, and cohesive" education standards, and (2) provide a system that measures student learning and promotes students through school based on content mastery instead of seat time.

1. Standards. Question: there already are state education standards- how are these standards different? Are they different from the Common Core Standards? Further, the recommendations specifically focus on standards for "college and career readiness." Those are important goals, but I'd also like them to focus on helping students become effective participants in a democracy. On the whole I'm skeptical of the standardization movement. The report spends a lot of time recommending greater flexibility. In my experience standards tend to inhibit flexibility. Have students who are really interested in a topic not included in the standards? Sorry, no time for that- it's not in the standards.
2. Content mastery. This is one of those bold recommendations that I love. Essentially, they support the idea that as soon as a student shows mastery of a topic they can move on to a new topic. 13 years in a classroom does not necessarily make an education. In this model, students would be able to advance more quickly or more slowly depending on their individual content mastery- they wouldn't have to wait until the end of the year to move on to the next topic. This is essentially standards-based grading on systemic steroids. However, they fall short on proposing what School would look like under this system. How would mastery be determined? How does it impact the organization of classes at schools? These are big questions that need some serious thought for this to be taken seriously.

### 2. Make it Personal

This thread focuses on creating student-centric learning environments. Of any of the 10 sections, I like these recommendations the most. The two main ideas in this section:

1. Advance students based on mastery. This restates some ideas from the last section. I still like it. They're still vague on details, offering only, "Establish flexible work schedules," and "Allow credits to be awarded based on mastery." I have a hard time visualizing how this would work in reality, but perhaps that's because I've spent the last 27 years in the existing system. I'm worried by the recommendation to develop a variety of assessments and projects to allow students to demonstrate mastery. This sounds like they'd be state-standardized affairs, which, if they're anything like existing state-standardized activities, would be horrible. These should be developed locally (while being shared publicly for other educators) based on individual student needs.
2. Flexible learning environments. Yes. Please recognize that plenty of valuable learning takes place outside school. The integration of this informal learning with our formal education is much needed. This should go beyond counting a family trip to the Grand Canyon as an educational experience. If a student can diagnose and fix a car's electrical system, spending three weeks in a classroom learning about basic series and parallel circuits is a waste of their time. Schools should partner with and validate our students' out of school educational experiences.

This isn't my area of expertise, but I think the proposal to provide quality preschool for all children starting at the age of three is one of the biggest no-brainers in education reform. The payoffs to society don't manifest for nearly two decades, but there is a wealth of research suggesting preschool is a very good thing. I have some concerns with recommendations similar to "Develop a system of accountability for providing language-rich, challenging, developmentally appropriate and engaging reading and mathematics curricula." The focus on reading and math smacks of No Child Left Behind, and suggests an emphasis on tightly structured learning environments. In the words of Alfie Kohn:

...the results are striking for their consistent message that a tightly structured, traditionally academic model for young children provides virtually no lasting benefits and proves to be potentially harmful in many respects.

### 4. Retool Assessments and Accountability

Now we're getting into some meat. The CAPSS report suggests standardized testing should be de-emphasized. I'd be willing to bet they'd suggest eliminating standardized tests as we know them were it not for the current national education environment. Props to them for that.

Here's a selected summary of their suggestions: (1) Provide a variety of assessment formats, (2) Assess students as they're ready to be assessed (instead of everyone at the same time), (3) Get assessment results back to students & teachers quickly so they inform instruction, and (4) Make the goals of all assessment transparent. It seems like they're saying one thing here. Yup, it's Standards-Based Grading.

In fact, they do mention SBG by name in this section, but they recommend making it "part of assessments." I'm a fan of SBG (as evidenced by previous posts), and I think this is a stellar recommendation.

I do have some hesitations with their recommendations, despite their SBG-like nature. For one, it's pretty clear from the language used they're not discussing day-to-day classroom assessment. They're discussing a new form for state standardized2 tests. I'm unclear on what this would look like, but it does sound like an improvement over the current system, though I'm skeptical it would come to pass in this improved manner. Another hesitation rests on the description of incentives for high performing schools. The report clearly recommends moving away from punitive measures, yet in my mind, providing incentives to high-performing schools is nearly indistinguishable from punitive measures against low-performing schools. Finally, the report lists subject areas for "base academic accountability." I take that to mean, "These are the subjects that will be assessed," or perhaps more clearly, "These are the subjects we think are important (things that are valued are assessed)." Notably absent are the arts and physical education- meaning the cuts to art and phys. ed. programs we see happening today are likely to continue were these measures put into place.

### 5. Offer More Options and Choices

Or, the section with the title that most poorly represents its contents. A better section title? "Consolidate School Districts." Their basic argument seems to be that having the current (supposedly high number of) 165 Connecticut districts creates an environment where it is difficult to align state and local initiatives, is economically inefficient, and fosters racial and ethnic isolation. While I agree that you can save some money by consolidating services like busing or food service, you also lose a connection with the community when the district encompasses many, many communities. Having worked in both small and large districts, the small district was much more connected to and valued by the community3. It may be more expensive to have small community districts- and that's not a small obstacle- but it would be worth it. It should be noted that reworking the state education system in the manner recommended by this report would also be expensive. In addition smaller districts would help schools be more flexible, personal, and transparent. Those adjectives would be a fair summary of the recommendations of this entire report, so why include this section?4

This section makes a lot of recommendations about the relationship between the State Department of Education and the Commissioner of Education as well as the roles of school boards and superintendents. That's a little bit outside my area of expertise, but I do like this statement from the introduction to the section:

Currently, organization and policy making for education are based on bureaucratic assumptions of hierarchy, centralized decision making, standardization and inspection. These characteristics limit individual discretion, depress creativity and foster stasis, not change.

That certainly describes my experience teaching in Connecticut. Despite completing my Master's in Secondary Education project by designing and implementing a student-centric, student-driven project5, I was told I couldn't continue the project unless all the science teachers wanted to use it. That's not exactly how one fosters innovation and creativity...

### 7. Boost Quality

This is a huge section with 26 recommendations for action ranging from incentives for attracting quality teachers, to improving teacher education and professional development, to revamping teacher tenure as we know it. I'm going to limit my analysis to the recommendations for professional development and teacher evaluation. I think restructuring the current tenure system is a major issue that deserves discussion, but that'll have to happen in another post so it doesn't turn this already lengthy review into a ridiculously long review.

1. Professional development for teachers.
• The report (rightly, in my opinion) makes many recommendations related to preparing pre-service teachers and helping new teachers grow as educators. One of my favorite recommendations suggests structuring a teacher's first year in the classroom as an internship with regular coaching and mentoring by master teachers. If it were up to me, I'd have new teachers carry half of a teaching load, giving them plenty of time during the day to observe other teachers, review and revamp instruction and assessment with a mentor, and generally work to improve their craft. Likewise, the mentors should have a reduced teaching load so they have time to both observe and meet with their mentees during the school day. The current system where exactly zero time is allocated for new teachers to review and reflect on their time in the classroom is a horrible model if we want new teachers to show improvement.
• A second recommendation states that districts should provide professional learning opportunities for teachers as a part of their regular job- and schedules should be configured to give teachers time to collaborate with their peers. Again, I agree. If you value professional learning and improvement, you should schedule time for it- not make it only something teachers do on their own time (which most do, but it's such a valuable thing schools should be purposefully providing opportunities for their teachers). However, a word of warning: I've taught in a school where the schedule was changed to provide teachers with 70 minutes of "collaboration time" each week. Teachers (including myself) were genuinely excited for this time to share lessons, have quick professional development sessions, and critique instruction and assessment. Instead, it was mandated from above that the "collaboration time" be used solely to analyze student standardized test-prep results. While I understand the importance of standardized tests in our current system, the cost was the loss of time for teachers to share their expertise with each other, learn how to effectively integrate technology, and design cross-curricular projects- all things teachers were excited to use that time to do. The moral of the story is that simply having collaboration time in the schedule doesn't mean it's being used effectively.
2. Teacher evaluation. As it is, the teacher evaluation system as I've known it is in need of reform. Last year I was observed by an administrator three times- each observation lasting approximately 70 minutes. Outside these official observations, administrators spent about 30 minutes in my classroom throughout the year. Okay, so that's a total of 240 minutes of observation for the entire school year by those who evaluate my performance. For some perspective, I taught four 70 minute classes each school day, and there are 180 school days per school year. That works out to 50,400 minutes of instructional time each school year. My evaluations were based on 240 out of those 50,400 minutes, or 0.48% of the total instructional time. It makes me nervous to think I'm being evaluated from such a position of ignorance6. The recommendations by the CAPSS include creating a standards-based evaluation system with regular performance reviews and including peer review as part of the performance review. As long as "regular performance reviews" includes frequent, informal observations by evaluators and "including peer review" can be expanded to provide students and parents a voice in the evaluation, then I think the recommendations are on track.
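The observation-time arithmetic above checks out; here's the same calculation as a quick sketch:

```python
# Checking the numbers from the teacher evaluation example above.
observed = 3 * 70 + 30          # three ~70-minute formal observations plus ~30 informal minutes
instructional = 4 * 70 * 180    # four 70-minute classes/day across 180 school days

print(observed)        # 240 minutes observed
print(instructional)   # 50400 minutes of instruction per year

pct = round(100 * observed / instructional, 2)
print(pct)             # 0.48 (percent of instructional time actually observed)
```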

### 8. Involve Students and Parents

Schools give a lot of lip service to including parents and students in the education process. I've never been part of a school that has done a good job at this. I've known teachers who were really good individually at involving parents in their classrooms and other teachers who provided students a large voice in their own education. Beyond the classroom level, the furthest extent I've seen a district or (high) school involve parents is to invite them to serve on committees with little influence that meet at times untenable for most working adults' schedules.

I have no problems with the recommendations in the CAPSS report...other of course than the fact that they're so non-specific that they're just platitudes: "Engage parents as partners in their children's education," or "Create structures that encourage family involvement." Yes, those are good things- but what suggestions do you have for how to do these things?

Let me offer a few quick suggestions.

1. Use technology to make learning and school happenings more transparent. How? Have administrators start a blog or create an online newsletter that is updated regularly sharing goings on at the school. Share a photo a day. Invite teachers and students to do the same. Let students share their learning and reflections through student blogs (or evening events where students show off projects, etc.). In my mind, these things are the low hanging fruit- they're easy to implement and can cost nothing (depending on the tools used).
2. Form collaborations with people in the community. Examples?
• Maybe you have an assisted living community near the school. That's a community with a huge amount of knowledge, skill, and disposable time. Provide transportation to retirees so they can read, mentor, advise, or provide academic support to students.
• Create a community garden on school grounds that "rents" plots to community members. Have students run the administration and marketing of the community garden. Sell the fruits (& vegetables) of the gardens' labor at a farmer's market in the school parking lot on the weekends.
• Start a hackerspace in the school for the community. Students in classes such as design, computer science, engineering, or any other class where they need to build stuff could be given free memberships, and all other students could become members at discounted rates. Hackerspace members can access it all day. Let advanced students lead workshops for community members.

Ideas like these take more effort and money- but in the end the rewards may pay for themselves. In essence, make the school a community learning center and let the community share its skills and knowledge with the students and vice versa.

### 9. Leverage Technology

This section is surprisingly short (considering the topic), and the recommendations focus around two main ideas:

• Students and educators should have access to educational resources at any time. They don't quite recommend making broadband internet access a universal right, but they do hint at it. I'd agree- though I'm not sure how that gets implemented. The inexpensive computers available today make computer ownership possible for even quite poor families. Paying $30-$50/month for internet access is much less likely to fit into tiny budgets. I also like the recommendation to "leverage online environments [...] for two-way communication, feedback, and collaboration..." Those environments are widely used today (in the form of social network sites), but more often than not are blocked by the schools themselves. It'd be nice to see schools embracing the power of these tools instead of hiding from them.
• Keep the technology infrastructure up to date. Of course I agree with this, but it's a matter of money. Even though reasonably powerful computers are becoming less and less expensive, it's still a major cost. I'd like to see schools use free and open source software (Open Office instead of Microsoft Office, for instance) or free resources such as Google Apps for Education. These would help keep software costs down and allow for money to be allocated more wisely elsewhere.

### 10. Continue the Transformation Process

The report makes suggestions on how to avoid reform stagnation at both the state and district level. Several of the recommendations focus on items like changing statutes or education budgets. I don't have too much of an opinion on these items (due to my own relative ignorance on the topics more than anything else). However, two of the recommendations contain a similar idea that I find extremely attractive. Essentially, they say: let innovators innovate. One suggests districts can receive waivers from state statutes and regulations to experiment with new ideas to improve student learning. The second recommends providing systems for teachers and principals to experiment with innovative practices.

If you let smart people do creative things- even if those things are outside the state's or school's "mandates"- you'll end up with a ton of great ideas that help everyone in the end (see: Google's 20% time). Instead of alienating smart people and ultimately driving them out of the education sector, you'd be empowering them and attracting more innovation.

______________________________

1. There isn't a single good definition for what I mean here, but think of the stereotypical adult-centric school or classroom. (back)
2. Clearly the assessments would be less standardized than the existing Connecticut Academic Performance Test or Connecticut Mastery Test, but they'd still be the state standard. (back)
3. I admit this could simply be due to specific situations in each respective district, but after hearing and reading about other people's similar experience, it seems to be a fair generalization. (back)
4. For a smart person's perspective on this matter, let me recommend Deborah Meier's article, As Though They Owned the Place: Small Schools as Membership Communities (pdf alert). (back)
5. That, to toot my own horn, was nominated for a Scholar of Excellence award by my advisor. (back)
6. I readily admit any administrator worth their salt talks to students regularly and knows more about what goes on inside the classroom than simply what they see when they're personally in the classroom. I still think 0.48% is a pretty sorry basis for an evaluation. (back)

## Missing school

This morning I volunteered to help out one of my physics instructors with an activity on fiber optics at a local high school. I had to skip one of my classes this morning in order to volunteer, but I'm killing it in that class and I haven't been in a high school classroom in a long time. I've been out of the classroom since the middle of June, 2011. Okay, okay, so that's not even 6 months, but the 60ish minutes I spent in a high school today reminded me how much I miss it.

I miss the mental gymnastics of devising solid lessons and activities. I really miss the relationships with students. After playing a minor part in a classroom for an hour this morning I wanted to stay the rest of the day observing teachers, tweaking lessons, and talking to students.

My life is much less stressful this year (despite the lack of a salary). I go to class. I do my homework. I come home to more free time than I've ever had in the last 9 years.

And yet, I miss teaching1. It's a good gig.

______________________________

1. I'll be back teaching next year (assuming I get a job). In fact if you know anyone hiring a physics/earth science/chemistry teacher next year...(back)

## Pipe Insulation Roller Coaster Assessment

Welcome back. If you haven't joined us for the last two posts, let me recommend that you first read about determining rolling friction on the coaster and the project overview.

On to the assessment...

Assessment is extremely important. It explicitly tells students what things we value (and thus what they should value). If we assess the wrong things, students will focus on the wrong things. This can turn an otherwise excellent project into a mediocre one. For this post, I'll share two methods of assessment: first, the "old" method I used when I last taught physics (in 2008); second, my updated assessment scheme that I'd use if I did this project again.

### The old assessment strategy

Embedded below is the document I gave to students at the beginning of the pipe insulation roller coaster project. Most noticeably it includes a description of the assessment scheme I used way back in January of 2008.

As you can see, I split the assessment of this project into two equal parts:

#### An assessment of the finished roller coaster

I wanted students to think carefully about the design, construction, and "marketing" of their coasters. I wanted them to design coasters that not only met the requirements, but coasters that were beautiful and interesting. Individual items being assessed under this rubric were weighted differently. For example, "Appropriate name of the coaster" was only worth 5%, while "Creativity, originality, and aesthetics" was worth 20%. Here's a link to the sheet I used when assessing this aspect of the coaster project.
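The weighted-rubric idea can be shown in a few lines of code. Only the 5% ("appropriate name") and 20% ("creativity, originality, and aesthetics") weights come from the post- the remaining categories, their weights, and the 0-10 score scale below are all made up for illustration:

```python
# Hypothetical weighted rubric for the coaster-design half of the assessment.
# Only the first two weights are from the post; the rest are assumptions.
weights = {
    "appropriate name": 0.05,
    "creativity, originality, aesthetics": 0.20,
    "meets track requirements": 0.40,   # assumed category and weight
    "construction quality": 0.35,       # assumed category and weight
}

scores = {  # each category scored out of 10 (assumed scale)
    "appropriate name": 8,
    "creativity, originality, aesthetics": 9,
    "meets track requirements": 10,
    "construction quality": 7,
}

# Weighted average out of 10: each category's score scaled by its weight.
total = sum(weights[k] * scores[k] for k in weights)
print(round(total, 2))  # 8.65
```

The weights sum to 1.0, so the total stays on the same 0-10 scale as the individual scores- which keeps a heavily weighted category like creativity from being drowned out by a 5% item like the coaster's name.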

#### An assessment of the physics concepts

In the embedded document above, you can see the breakdown of what items were being assessed. In my last post on pipe insulation roller coasters, you can see how students labeled their coasters with information on the marble's energy, velocity, and such along the track. Groups were required to turn in a sheet with the calculations they performed to arrive at these numbers. These sheets were the primary basis for determining whether students understood the physics concepts.

#### Problems

There are a lot of problems with the assessment scheme as described above. I'm not going to try to address them all, so here are a couple of the biggest issues:

• Assessing coaster design
• I'm a fan of elegant design. For this project I'm a fan of finished coasters that look well designed and exciting. That's why I included the first part of the assessment: I wanted to incentivize students to think about the design and construction of their coasters. In retrospect this was probably unnecessary. Students generally came into this project with plenty of intrinsic motivation to make their coaster the best in the history of the class. While I'd still stress the importance of quality design in the future, I'd completely cut this half of the assessment. Students already cared about the design of their coaster. If anything, awarding points for coaster design had a net negative effect, especially because it doesn't assess anything related to the understanding of physics.
• Assessing student understanding of physics concepts
• As a normal part of working in a group while attempting to complete a large project in a limited time, students split up the work. Students are generally pretty smart about this in their own way. While I stressed that everyone in the group should contribute equally towards the calculations, most groups would have the student who had the best understanding of the physics do most of them. Why? Because it was faster. They needed to finish their coaster, and having the fastest person do the calculations meant more time for construction. While I generally knew when students in a group were adding very little to the calculations (and would assess them accordingly), on the whole this method didn't give me a good picture of each individual student's level of understanding. There were certainly students who skated through the project while minimally demonstrating their understanding of the energy and friction concepts involved.

### The new assessment strategy

You've probably already picked up on a few of the improvements I'd make for this project.

1. Use standards-based assessment. Standards-based assessment is an integral part of the classroom throughout the year, not just for projects. If you're unfamiliar with what this "standards-based" business is all about, click the little number at the end of this sentence for plenty of links in the footnotes1. Here is a list of standards that would be assessed through this project:

#### Content standards assessed

• Energy
• Understand and apply the law of conservation of energy.
• Explain and calculate the kinetic energy and potential energy of an object.
• Explain and calculate the amount of work done on and by an object.
• Solve basic conservation of energy problems involving kinetic energy and potential energy.
• Solve conservation of energy problems involving work and thermal energy.
• Circular Motion
• Solve basic circular motion problems using formulas.
• Habits of Mind
• Collaborate and communicate with others to meet specific goals.
• Handle and overcome hurdles creatively and productively.

The specific standards used can vary based on your specific implementation.
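As an illustration of the circular motion standard, here's a small sketch of the classic loop calculation that comes up in this project: at the critical (slowest) speed at the top of a vertical loop, gravity alone supplies the centripetal force, so m·g = m·v²/r. The loop radius below is an assumed value, not from any particular student coaster:

```python
# Minimum marble speed at the top of a vertical loop. At the critical
# speed, gravity supplies all the centripetal force: m*g = m*v^2 / r.
import math

g = 9.8  # m/s^2, gravitational acceleration

def min_speed_at_loop_top(radius):
    """Slowest speed (m/s) at which the marble keeps contact with the
    track at the top of a loop of the given radius (meters)."""
    return math.sqrt(g * radius)

def min_drop_height(radius):
    """Frictionless drop height (m) above the loop top needed to reach
    that speed: m*g*h = 0.5*m*v^2  =>  h = v^2 / (2*g)  =  r/2."""
    return min_speed_at_loop_top(radius) ** 2 / (2 * g)

r = 0.10  # m, a plausible pipe-insulation loop radius (assumed)
print(f"v_min = {min_speed_at_loop_top(r):.2f} m/s, "
      f"extra drop needed = {min_drop_height(r) * 100:.1f} cm")
```

Real coasters have friction, of course, so students need noticeably more than that frictionless minimum, which makes for a nice "why didn't the prediction work?" conversation.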

2. No points for coaster requirements. As I mentioned earlier, it proved unnecessary to incentivize students to polish their coaster designs or to meet the basic requirements of the project. This decision also comes out of standards-based grading, which focuses assessment around "Do you know physics?" instead of "Can you jump through the right hoops?" That isn't to say we don't talk about what makes a coaster "exciting" or "aesthetically pleasing" or whatever. It just means a student needs to demonstrate their understanding of the physics to earn their grade.
3. A focus on informal assessment. Rather than heavily relying on a sheet of calculations turned in at the end of the project (and probably done lopsidedly by one or two group members) to determine if the group understands the physics, I'd assess their understanding as I walked around the classroom discussing the coasters and their designs with the students as they work on them. Questions like "Why did you make that loop smaller?" or "Where are you having trouble staying within the requirements?" can be used to probe into student thinking and understanding. The final calculations would still be a part of the assessment, but no longer the single key piece of information in the assessment.

On the whole I was very happy with this project as I used it in the past. As I've learned and grown as a teacher I've found several ways I can tweak the old project to keep up with the type of student learning I want to support in my classroom. If you have other suggestions for improvement, I'd be happy to hear them.

As a bonus, here's a student-produced video of the roller coaster project made for the daily announcements. The video was made by a student who wasn't in the physics class, so there's a little more emphasis on the destruction of the roller coasters at the end of the project than I'd like. Kids. What can ya do?

______________________________

1. Here are posts I've written about my experience implementing standards-based assessment. I'm not an expert, so let me also direct you to my bookmarks related to standards-based grading, and some resources written by a couple of people who are more expert: Shawn Cornally and Frank Noschese (who offers blog posts, a shared google doc folder, and a collection of bookmarked links). There are certainly other great resources out there, but these are a great starting point. (back)