Book Review: The Monsters of Education Technology

As background, here’s a brief summary of my relationship with Educational Technology, from c. 2003 to the present:
My EdTech Arc
Thanks to my Twitter Archive and apps like TimeHop, I’ve been frequently exposed to my EdTech Pollyanna stage. There sure were lots of excited exclamation points.

In the more recent past, I’ve been feeling a bit more like the EdTech Curmudgeon:

  • “Hey, have you heard about this tool? Students can make digital flashcards!” Me: “Ummm…”
  • “Look at this tool that automatically grades ScanTron tests!” Me: “Ummm…”
  • “Check out these Khan Academy videos! Students can totally get extra help!” Me: “Ummm…”

Usually I don’t say much when friends and colleagues bring up questionable tools. I mean, that’s pretty much exactly where I was c. 2008. When there’s an opportunity I do try to bring up issues with the tools, or their parent companies, or (most often) the type of teaching and learning these tools reinforce, but usually I just end up feeling like Debbie Downer in a room full of excited cheerleaders.

Perhaps that’s why I enjoyed reading Audrey Watters’ The Monsters of Education Technology so very, very much.
The Monsters of Education Technology
Audrey Watters self-describes as the Cassandra1 of EdTech. As she notes about herself in the book:

"I don’t tend to talk about ed-tech revolution and disruptive innovation unless it’s to critique and challenge those phrases. I don’t give ed-tech pep talks, where you leave the room with a list of 300 new apps you can use in your classroom."

You're not going to walk away from Watters' writing feeling like a world beater, ready to revolutionize education in your classroom tomorrow. Instead she stares deep into the soul of so many Silicon Valley edtech startups and finds them empty.

But this is an important thing. Education technology has been, and- apropos the Cassandra comparison- will continue to be lauded as “the way” to “fix” what’s “wrong” with teaching and learning today. It’s flashy. It’s shiny. It looks futuristic and fancy to administrators and politicians. Unfortunately, so many of the tools simply replicate or reinforce questionable practices from our past. Instead, Watters challenges us (emphasis mine):

“To transform education and education technology to more progressive and less programmed ends means we do have to address what exactly we think education should look like now and in the future. Do we want programmed instruction? Do we want teaching machines? Do we want videotaped lectures? Do we want content delivery systems? Or do we want education that is more student-centered, more networked-focused. [...] And instead of acting as though ed-tech is free of ideology, we need to recognize that it is very much enmeshed in it.”

When schools adopt new technologies there often isn’t much thought given to the pedagogical or social implications the new technologies bring to the classroom. Very little consideration is given to what the tools really add to the learning process over already existing tools (besides being shinier and newer), and even less consideration is given to how this might affect who we are as humans, as Watters describes in her critique of algorithms designed to grade student essays:

"We have no laws of ‘ed-tech robotics.’ We rarely ask, ‘What are ethical implications of educational technologies?’ Mostly, we want to know ‘will this raise test scores?’ ‘Will this raise graduation rates?’ We rarely ask, ‘Are we building and adopting tools that might harm us? That might destroy our humanity?’"

Despite the picture I’ve painted so far, Watters goes beyond simply prophesying doom upon humanity at the hands of educational technology. Though deeply skeptical and harshly critical of most educational technology, she isn’t anti-technology.

She champions technologies that give students agency. Technologies that place students in control of their own learning. Technologies that promote learning outside the artificial structures society has created in schools and universities. Clearly one of her favorite ideas is an initiative started at the University of Mary Washington called a Domain of One’s Own. UMW gives each student their own domain- not just space on a university server- but a domain they can take with them after they graduate:

“Their own domain. Again, the word matters here. Students have their own space on the Web. A space for a blog or multiple blogs. A digital portfolio for their academic work that can become a professional portfolio as well. A place to store their digital stuff in the cloud. Moreover, a lesson on the technologies that underpin the Web. HTML. CSS. RSS. It's not quite “hosted lifebits,” but it’s a solid step in that direction. The initiative represents a kind of open learning – learning on the Web and with the Web, learning that is of the Web. ‘Domain of One’s Own’ offers a resistance to the silos of the learning management system and to the student as a data mine. It highlights the importance of learner agency, of learning in public, of learning together, of control over one’s digital identity and over one’s educational data, and the increasing importance of digital literacies."

This is an idea that seems so simple, yet so powerful. It’s something that could be done even at the high school level- maybe students don’t receive an actual domain of their own, but they could be given a space somewhere online that 1) they control and 2) they can take with them once they’ve graduated.

Audrey Watters’ work is simultaneously deeply informative, insightful, troubling, and hopeful. I highly recommend The Monsters of Education Technology and pretty much everything she’s written online at Hack Education to anyone working in or near an educational setting.

The book is available directly through Audrey Watters’ website in a variety of print and e-book formats. You should definitely buy it.

"Indeed humanity and learning are deeply intertwined. They are intertwined with love, not with algorithms." - Audrey Watters

______________________________

  1. Cassandra of classical mythology was given the gift of prophecy by Apollo, but when she spurned his love she was cursed to always prophesy truthfully and never be believed.     (back)

Google Drive Lab Report Workflow

This year I've rolled out using Google Drive for all Physics lab reports. Several people have asked me what this looks like, so I thought I'd share. Feel free to suggest a better or easier methodology- this workflow came together based on what I know of Google Drive, and I certainly don't know everything it can do.

A big debt is owed to Katrina Kennett, whose posts and EdCamp Boston sessions on using Google Drive for paperless grading inspired my use, and to Frank Noschese, whose lab rubric I borrowed from heavily.

The setup

1- Create shared folders. As soon as I get a finalized class list and my students' email addresses, I set up shared assignment folders in Google Drive for each student in my Physics class. This is a folder that is only shared between the individual student and myself, so anything I put into the folder they can see and vice versa.

Here's what it looks like for me in Google Drive:

Shared Assignment Folders in Google Drive

It can be a tedious process to create individual folders for each student. Fortunately, you don't have to- there's a Google Script called gClassFolders that will automatically create folders for all of your students from a spreadsheet with your students' information. I won't go into detail here about how to set up gClassFolders, as the official site does an excellent job walking you through the process.

2- Share the rubric. I created the lab report rubric in a Google Spreadsheet, then make a copy of the rubric for each student, share it with them, and place it in their individual folder. Again, this could be a tedious process. Fortunately it isn't, thanks to Doctopus. Doctopus will make a copy of the rubric for each student, share it with that student, and put it into their GDrive assignment folder. Super easy.

To use Doctopus, you'll just need a spreadsheet with students' names and email addresses (which you probably already have from using gClassFolders in step 1), and then it'll walk you through your sharing and naming options. Again, I'll forgo the lengthy explanation of using Doctopus, because the official site has you covered.

At this point, when each student signs into GDrive, they'll see their shared folder, with a spreadsheet titled, "Josh- Lab Report Rubrics," for example.

A student's view of the assignment folder

Now we're ready for some student lab reports.

Google Drive in Action

3- Students write lab reports. In lab, students record their data in lab notebooks, graph their data using LinReg, and discuss their results in a post-lab Whiteboard Meeting.  For their formal lab report, they create a Google Doc and type up their lab report. For graphs, they take screenshots of the graphs, and add them to the lab report as an image.

When they have finished the lab report, they drop it into their Physics Assignment folders, where I can then see it and have permissions to edit the lab report.

4- Scoring. Since I am able to edit their lab reports, I leave comments directly on their lab report, as shown below.

Comments on a Google Doc lab report.

A nice feature of Google Docs is that students receive notifications when I leave a comment, so they know right away when I've commented on their lab report.

At the same time I'm commenting on a student's lab report, I'm filling out the Lab Report Rubric & Checklist for their lab report. An important note: for each student, I'm filling out the lab checklist on my copy of the lab report rubric, not the copy that I've already individually shared with that student. This may seem odd, but in the end it means that students will have one spreadsheet that contains the rubrics for every lab that they've done. Below I'll explain how to make that happen.

5- Copying the rubric to students. After I've finished filling out the lab report rubric and checklist for a student's lab report, I select the "Copy to..." option on the tab of the spreadsheet:

The "Copy to..." location

A window then pops up asking me what Google Spreadsheet I'd like to copy it to. Since I've already created a lab report rubric spreadsheet for every student (in step 2), I just search for the student's first name, and select their lab report rubric spreadsheet:

Searching for student lab report spreadsheets

Once selected, the sheet is copied to that student's spreadsheet, where they can see it. On a student's spreadsheet, it'll show up as "Copy of [tab name]," as shown below:

Copied tab- Student view

Voila! Each student has one document that will contain every lab report rubric we do all year. This makes it easier for students to look back at previous lab reports and see where they made mistakes or needed more depth. It will hopefully also easily document their growth over time.

Once I've copied a lab report rubric to the student's spreadsheet, I revert my copy of the rubric back to its original state so it's ready for me to start on the next lab report.

6- Rewrites. When a student turns in a less-than-stellar lab report, they're required to do a rewrite. A nice (and new) feature of Google Drive is the Activity Pane, which shows all the changes that are being made to documents in a specific Google Drive folder. As students work on their rewrites, I can check the activity pane for the folder with the students' shared folders and quickly see who has been updating their documents (and who hasn't).

Activity view in GDrive

Wrap Up

This is the first year I've used such a system, and it's definitely a work in progress. So far I've been quite happy with how the process has worked, and being able to create one document that contains the rubric for every lab report we do all year is a major plus.

Again, if you have any comments, questions, or suggestions for improvement, let me know. I'd definitely be open to suggestions that make the process even more streamlined.

Looking Back on EdCampCT 2012

Welp...a second EdCampCT has come and gone. EdCamps are always a great time for learning and meeting people you've only interacted with online. This EdCamp was special: as a co-organizer, it still amazes me that I had a part in bringing 100ish educators together to learn from each other. The day of EdCampCT was a bit hectic, but I was still able to attend several sessions, talk to lots of people, and think a little about who attended and what was going on at EdCamp.

Attendees

It's great that EdCamps are conferences where 100% of the attendees actually want to be there. I've been to several education conferences where the majority of attendees were required to go by their administration. Some of those conferences were good, though most weren't. When the attendees want to be there and have a personal stake in the content of the conference it makes for a much happier conference culture and more involved attendees.

We don't have exact counts, but based on the number of hands raised when we asked participants if this was their first EdCamp, it looks like about 50% of EdCampCT 2012 attendees were first-timers. Though I have zero actual data to support the following assertion, it seems like many newcomers heard about EdCampCT through word-of-mouth recommendations from participants of previous EdCamps.

Things to consider for future EdCamps

  • First, there seems to be demand for future EdCampCT events. Let me lay everyone's worries to rest and let you know we are planning on holding EdCampCT 3.0 in 2013. Keep an eye on the official website and twitter feed (@EdCampCT), though we probably won't be announcing the date for the next EdCampCT until early 2013. If you'd be interested in helping organize next year's EdCampCT, drop us a line1.
  • It's exciting to introduce so many educators to the EdCamp movement. It does make me worry that we might not be meeting the needs of first time EdCampers as well as we could, however. In general, I think the whole ideology behind EdCamps helps include newcomers, but there's always room for improvement. Did we explain the EdCamp ideology/format in a way that made it clear to those who are unfamiliar with EdCamps? What more could we do to encourage first time EdCampers to lead sessions? If you have any ideas or insights, I'd love to hear them in the comments.
  • EdCamps are definitely becoming a thing that happens more and more frequently. The only EdCamps that were held in New England prior to the first EdCampCT in 2011 were EdCamp Keene, EdCamp Boston, and EdCamp NYC. In between EdCampCT 2011 and 2012 there were eight EdCamp events in New England- and that's only counting the recurring EdCamp BHS & RSD6 events as one each. Before 2012 is over another four EdCamps will be held in New England2. I love that there's such a high demand for EdCamp-style professional development. I wonder, though, what effect the increasing ubiquity of EdCamps will have on attendance at any one EdCamp:
    • Will average attendance decrease because educators can attend EdCamps closer to home?
    • Will attendance increase because more people will be exposed to EdCamps (and obviously love it) and thus want to attend more events?
    • If more and more schools adopt EdCamp-style professional development as a regular part of the school year, will the demand for "special event" EdCamps (like most EdCamps held to date) decrease?

    While I'd miss the "special event" EdCamps when they're gone, I think it'd be a major feather in the cap of the EdCamp movement to have had a major effect on professional development all over the world. In this hypothetical future, I'd bet there'd still be room for a few "special event" EdCamps, if for no other reason than it's always fun to meet with people from outside your school and district. I'm sure EdCampCT would be one of those that'd still go on even after we've totally revolutionized PD across the world- after all, as the EdCamp Foundation Chairman of the Board says:


    🙂

Possible Improvements

  • Overly technology focused? Personally, I'd like to see a little less of a focus on technology and a greater focus on effective teaching & learning in general. Maybe this is just a somewhat selfish hope from someone who has been paying attention to the EdTech world for several years now. The conversations & sessions I've really enjoyed at EdCamps have focused primarily on some aspect of teaching other than explicitly on technology (on Standards-Based Grading, for example). That said, there's no doubt that the technology-centric sessions are extremely popular- and I recognize that these sessions are great for teachers who are getting started with technology in the classroom.
  • Better outreach & publicity. We (the organizers of EdCampCT) tried pretty hard to spread the word about EdCampCT to as many educators as possible. There's no doubt, however, that Twitter is how a lot of people hear about EdCampCT. This likely means attendance is biased towards educators who are already at least somewhat tech-savvy. I wonder what else we might do to spread the word about EdCampCT to those who might not use (or have even heard of) the Tweeter. Certainly these teachers could benefit from the EdCamp PD model as well.

Other Items of Note

  • EdCamp Food. It seems we've become known as the EdCamp of tasty food. This is not a bad thing. We're pretty lucky that our host, The Ethel Walker School, has a food service crew that is also used for special events held at the school (weddings, alumni events, etc.). They know how to make super tasty food. I'd have to say that although the potato chips- which earned international acclaim last year- were still super delicious, the rest of the food was also wa-a-a-a-ay above average. I realize the food isn't what makes an EdCamp great (it's the learning & sharing, natch), but if we're lucky enough to be in a position to also provide tasty food it ain't gonna hurt the learning that happens. 🙂
  • The second time around. Last year I can remember being seriously worried that nobody would sign up for the first ever EdCampCT. I remember worrying that we wouldn't have enough people who would be willing to lead sessions. This year- the second time around- I wasn't nearly as worried. In fact, the whole planning & preparing for EdCampCT 2012 involved much less all-around anxiety- not because it was necessarily less work the second time- but rather because we already had the experience of organizing one EdCamp under our belts. Something I need to work on is taking time to talk and connect with people a little more at EdCampCT. As an organizer I wanted the event to go smoothly for everyone, so I found myself leaving conversations to go check on this or that. While there are a lot of things that I do need to help with as an organizer, it's probably well worth taking a little extra time to make connections and have conversations.
  • Session/Conversation Trends.
    • iPads were again a hot topic: There were five individual sessions that focused specifically on iPads. That seemed a big increase from last year, but it turns out there were four iPad sessions last year. So, the trend continues.
    • Evernote and Symbaloo seemed to be hot topics on Twitter. Each tool had its own session, but it definitely seemed that the sharing went beyond just the participants in those sessions (unless the people who attended those sessions were just tweeting like crazy). While I've been using Evernote for a while now (mostly for recipes, actually), Symbaloo was new to me. It's now on my short list of things to check out before school starts.
    • A few tools from the SmackDown (see the full list of tools shared here) that I really like and fully endorse:
      • DarkSky App: An iPhone/iPad app that gives very detailed forecasts one hour out. For example, it'll tell you something like, "Moderate rain will start in 10 minutes and last 35 minutes." It's already been useful in helping me decide when to go out for a run or mow the lawn.
      • Caffeine: An app for Macs that does one simple thing- it keeps your computer from going to sleep. If you ever use your computer to present or watch longer form videos, it's a great thing to have. It's also free.
      • Waze: A mobile GPS navigation app (available for most smart phones) that uses community information to determine the best routes. What's great is that it uses information from Waze users to update traffic conditions. If there's a slow-down on the highway that will automatically show up on the map with the average speed of traffic. It'll also look for faster alternative routes. I've been using Waze for a couple years and it's saved me from getting caught in nasty traffic many times.

    And not least

    Finally, it was great to work with such a great group of co-organizers to help put this event together. Thanks Sarah, Jen, and Dan! It takes a good bit of work to pull off EdCampCT, but everything always goes smoothly because of the dedicated work of all my co-organizers. I also want to give a special shout out to Sarah- who as a result of working at The Ethel Walker School (in addition to being amazingly awesome) always gets stuck with putting in more work than any of the other organizers.

    I look forward to helping plan EdCampCT events for many more years! 🙂

    ______________________________

    1. The best way to do that would be by either: posting a comment to this blog, sending a tweet to @EdCampCT or any of the organizers (Sarah- @sedson, Jen- @jweeks21, Dan- @DanAgins, or Ben- @WillyB). (back)
    2. EdCamps Hudson Valley, New Hampshire, Rhode Island, and SeaCoast (NH). You should check them out. (back)

The spring constant of Winston Churchill's belly

This will be the most scientific and precise post regarding Winston Churchill's belly you'll read today. Maybe all week.

Today, we'll be analyzing the following video:

After randomly embedding the preceding video while thinking about Hooke's Law and the spring constant in my last post, the question that immediately comes to mind (for me, and I'm sure for you as well) is, of course: what was the spring constant of Winston Churchill's belly? This seems like something worthy of my time.

Here we go!

If we're going to figure this out, we need some data. First, we need some sense of scale. Since I have no idea how tall the Animaniacs are, let's focus on the historical figures. I'm going to go with Winston Churchill's height to give the video some scale, since he's pretty stretched out whilst his belly is being jumped upon1. It's surprisingly hard to find Churchill's height online with any sort of citation. I found what seems like a pretty solid source (via Wikipedia) for the height of Harry S. Truman (1.75 m). Using that information along with the following picture, I can figure out Churchill's height after throwing the image into Tracker:

Churchill and Truman were nearly the same height. I got 1.76 m (5 ft, 9 in) for Churchill. That seems pretty close to most of the unsourced figures for his height I found online.

I think the best way to go about finding the spring constant for Winston Churchill's belly is to use gravitational potential energy and elastic potential energy. If we can find the gravitational potential energy Stalin has at the top of his bounce and the maximum compression of Churchill's belly, we should be able to do the following:

mg\Delta y = \frac{1}{2}kx^2 \\ \\ k = \dfrac{2mg\Delta y}{x^2}

Where m is Stalin's mass, Δy is Stalin's maximum height above Churchill's belly, and x is the maximum compression of Churchill's belly.

I can fairly easily find Δy and x using Tracker to analyze the video.

I used 1.70 m for Churchill's height in the video instead of the 1.76 m figure above since his knees are bent slightly. Using that information to scale the video, Stalin's maximum height (Δy) is 0.65 meters and the maximum compression of Churchill's belly (x) is 0.28 m.

Finding Stalin's mass will require another long and probably fruitless internet search. Instead, I'm going to assume from the above picture Stalin is approximately the same height as Harry S. Truman and then assume Stalin's BMI is slightly above average (he was a dictator- which means he has access to lots of food). I'm going to say Stalin's BMI is 26. According to this BMI calculator, that would give Stalin a weight of 175 lbs, or 79.4 kg.
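As a quick check on that figure: BMI is defined as mass in kilograms divided by height in meters squared, so the mass follows directly from the two assumptions above (Truman's 1.75 m height and a BMI of 26). A minimal sketch in Python:

```python
# BMI = mass / height**2, so mass = BMI * height**2
height = 1.75  # m, assuming Stalin is about Truman's height
bmi = 26       # assumed: slightly above-average build

mass = bmi * height ** 2
print(f"{mass:.1f} kg ({mass * 2.205:.0f} lbs)")  # roughly 80 kg / 175 lbs
```

That lands within rounding distance of the 79.4 kg figure from the BMI calculator.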

Now we've precisely (ha.) figured out all our variables, so we can go ahead and solve the equation for the spring constant (k):

k = \dfrac{2mg\Delta y}{x^2} \\ \\ \\ k = \dfrac{2(79.4\text{ kg})(9.8\text{ m/s}^2)(0.65\text{ m})}{(0.28\text{ m})^2} \\ \\ \\ k = 12,900\text{ N/m}

OK, so what's that mean? It means that if you could compress Winston Churchill's belly by a full meter it would require 12,900 newtons of force. On the surface of the Earth, that would take a mass of 1,315 kg (2,900 lbs) sitting on his belly to compress it by a full meter2. WolframAlpha helpfully notes that this is a mass approximately equivalent to 2 "typical dairy cows."
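The energy-balance arithmetic above is easy to sanity-check in a few lines of Python. This is just a sketch using the rough estimates from the video analysis, not measured values:

```python
g = 9.8    # m/s^2, gravitational acceleration
m = 79.4   # kg, estimated mass of Stalin
dy = 0.65  # m, Stalin's maximum height above the belly
x = 0.28   # m, maximum compression of the belly

# Energy balance: m*g*dy = (1/2)*k*x**2  =>  k = 2*m*g*dy / x**2
k = 2 * m * g * dy / x ** 2
print(f"k = {k:,.0f} N/m")                   # roughly 12,900 N/m

# Mass whose weight would compress the belly a full meter: k*(1 m)/g
print(f"equivalent mass = {k / g:,.0f} kg")  # roughly 1,300 kg
```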

We can also learn something about the Animaniacs' collective mass now that we know the spring constant. If we rearrange the previous equation to solve for the mass, we get:

m = \dfrac{kx^2}{2g\Delta y}

It looks like the maximum height the Animaniacs attain is 0.77 m with a maximum belly compression of 0.16 m. Now solving for the mass we find:
m = \dfrac{(12900\text{ N/m})(0.16\text{ m})^2}{2(9.8\text{ m/s}^2)(0.77\text{ m})} \\ \\ \\ m = 21.9\text{ kg}
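The same energy balance, rearranged for mass, can be sketched in a few lines of Python (again using the rough estimates pulled from the video):

```python
g = 9.8    # m/s^2, gravitational acceleration
k = 12900  # N/m, spring constant of the belly found earlier
dy = 0.77  # m, maximum height of the Animaniacs
x = 0.16   # m, maximum belly compression

# (1/2)*k*x**2 = m*g*dy  =>  m = k*x**2 / (2*g*dy)
m = k * x ** 2 / (2 * g * dy)
print(f"total mass = {m:.1f} kg")                # roughly 21.9 kg
print(f"average each = {m / 3 * 2.205:.1f} lbs") # roughly 16 lbs
```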

Collectively the three Animaniacs have a mass of 21.9 kg (48.3 lbs). Wow. They're lighter than I anticipated. If you divide that figure evenly by three, the average Animaniac weight is 16.1 lbs. Clearly Dot and Wakko are smaller than Yakko. This may, in fact, prove Dot's hypothesis that in addition to being cute, she's a cat:

Animaniacs- "What Are We?" (video via Veoh)

______________________________

  1. Also, I came across a few places that speculated that Stalin may have used elevator shoes to make himself seem taller, so it might be harder to get an accurate figure for him. However, this isn't exactly going to be a super-accuracy fest anyway, so maybe I shouldn't let that bother me. (back)
  2. I'm not sure if Churchill actually has a meter of stomach to depress, but you get the idea. (back)

Critiquing the CAPSS Recommendations for School Reform

I want to make my classroom the best learning environment possible. Most of my posts on this site focus on lessons, assessments, or ideas on how to improve the learning environment inside my classroom. Improving our individual teaching craft is one of the easiest places (not to say it's necessarily easy) for a teacher to effect change.

However, as I've worked towards improving what happens in my classroom I've frequently run into obstacles. These obstacles were primarily exterior to my classroom. Sometimes they were school or district policies, sometimes national or state requirements, and sometimes they were the result of how we, as a culture, have historically structured this thing we call "school." Most of these policies and structures were created with good intentions in an attempt to improve our schools and our children's education.

Given my generally negative experiences with "traditional1" instructional models and structures, I've found myself more and more interested in systemic school reforms. How can we create modern schools and structures that leverage the advancements in technology and access to information to provide students with an education that prepares them to be active participants in our nation's democracy, economy, and society?

It was no surprise when an editorial in our local paper titled Major Restructuring Recommended for Schools caught my eye. In it, the author briefly describes the Connecticut Association of Public School Superintendents' (CAPSS) new report, "Recommendations for Transformation," a list of recommendations to transform the state education system "so it is able to meet the needs of students in the future." Naturally, I downloaded, read, and critiqued the full 36-page report (here's the official download link [pdf file], here's a version with my commentary [pdf file]).

My critique of the CAPSS recommendations

The report includes 134 individual recommendations for action across ten broad categories. I won't go into them all. Instead I'll give a brief breakdown of each broad category and get more specific around recommendations of particular interest.

The tl;dr version

This is a long article. For those of you thinking, "I can't read this whole thing. There is too much," let me sum up. Speaking in sweeping generalities, I applaud the CAPSS recommendations. In many ways the recommendations are progressive, forward-thinking, and focused on the best interests of students instead of on things that would be easy to implement or get through the political process. Recommendations such as competency-based advancement, standards-based assessments, and integrating out-of-school learning experiences into the formal education process suggest that CAPSS is interested in totally reworking what we mean by "school." This makes me happy. Too often reform movements are limited by the inertia of history and that-which-already-exists. CAPSS is clearly trying to overcome this inertia. Schools that followed the recommendations in the report could be student-centered environments that have a laser-like focus on student learning, support and integrate learning experiences that occur outside the classroom, remove conventions of little educational value (e.g. letter grades, traditional homework, and adult-friendly-but-child-poor assessments), and make schools an intrinsic part of their community.

And yet CAPSS puzzlingly makes recommendations that would make schools larger, less personal, and less a part of their community. Consolidating districts might save some money- which is an important consideration- but this seems to fly in the face of entire other sections of this report (for example, Section 2: Make it Personal; Section 4: Retool Assessments & Accountability; Section 8: Involve Students & Parents). Creating fiscally sustainable school districts is important, but eliminating small community schools in favor of large regional schools fosters disconnect between schools and their communities, students skating through school unknown by their teachers, and an overall less personalized educational experience. Furthermore, many recommendations are so general that they're simply platitudes without any real meat to them (e.g. "Engage parents as partners in their children's education."). More detail and explanation is needed as to exactly what many recommendations are actually recommending. Lastly, how about some references? Surely (hopefully) the CAPSS group that created the report relied on more than the four citations included in it- three of which are statistics on current educational practices. Nowhere do they cite sources to support their positions- either in this report, on their website, or in any other report provided at their website.

I think CAPSS took a step in a positive direction by making many forward-thinking recommendations for the future of education in Connecticut. While none of these recommendations are binding, it heartens me to see an organization of this sort making progressive recommendations. It gives me hope there will be enough momentum to effect some real and positive educational reform in the near term. However, portions of the report conflict with the overall progressive theme, pointing toward a deep hesitation about the large- and, in my opinion, needed- education reforms.

If you'd like a more detailed breakdown of the 10 categories of recommendations made in the CAPSS report, read on!

1. Raise the Bar

There are essentially two recommendations here: (1) Create "ambitious, focused, and cohesive" education standards, and (2) provide a system that measures student learning and promotes students through school based on content mastery instead of seat time.

  1. Standards. Question: There already are state education standards; how are these different? Are these different from the Common Core Standards? Further, the recommendations specifically focus on standards for "college and career readiness." Those are important goals, but I'd also like them to focus on helping students become effective participants in a democracy. On the whole I'm skeptical of the standardization movement. The report spends a lot of time recommending greater flexibility, but in my experience standards tend to inhibit flexibility. Have students who are really interested in a topic not included in the standards? Sorry, no time for that- it's not in the standards.
  2. Content mastery. This is one of those bold recommendations that I love. Essentially, they support the idea that as soon as a student shows mastery of a topic they can move on to a new topic. Thirteen years in a classroom does not necessarily make an education. In this model, students would be able to advance more quickly or more slowly depending on their individual content mastery- they wouldn't have to wait until the end of the year to move on to the next topic. This is essentially standards-based grading on systemic steroids. However, the report falls short on proposing what school would look like under this system. How would mastery be determined? How does it impact the organization of classes at schools? These are big questions that need some serious thought for the proposal to be taken seriously.

2. Make it Personal

This thread focuses on creating student-centric learning environments. Of any of the 10 sections, I like these recommendations the most. The two main ideas in this section:

  1. Advance students based on mastery. This restates some ideas from the last section. I still like it. They're still vague on details, offering only, "Establish flexible work schedules," and "Allow credits to be awarded based on mastery." I have a hard time visualizing how this would work in reality, but perhaps that's because I've spent the last 27 years in the existing system. I'm worried by the recommendation to develop a variety of assessments and projects to allow students to demonstrate mastery. This sounds like they'd be state-standardized affairs, which, if they're anything like existing state-standardized activities, would be horrible. These should be developed locally (while being shared publicly for other educators) based on individual student needs.
  2. Flexible learning environments. Yes. Please recognize that plenty of valuable learning takes place outside school. The integration of this informal learning with our formal education is much needed. This should go beyond counting a family trip to the Grand Canyon as an educational experience. If a student can diagnose and fix a car's electrical system, spending three weeks in a classroom learning about basic series and parallel circuits is a waste of their time. Schools should partner with and validate our students' out of school educational experiences.

3. Start with Early Childhood

This isn't my area of expertise, but I think the proposal to provide quality preschool for all children starting at the age of three is one of the biggest no-brainers in education reform. The payoffs to society don't manifest for nearly two decades, but there is a seeming wealth of research that suggests preschool is a very good thing. I have some concerns with recommendations like "Develop a system of accountability for providing language-rich, challenging, developmentally appropriate and engaging reading and mathematics curricula." The focus on reading and math smacks of No Child Left Behind, and suggests an emphasis on tightly structured learning environments. In the words of Alfie Kohn:

...the results are striking for their consistent message that a tightly structured, traditionally academic model for young children provides virtually no lasting benefits and proves to be potentially harmful in many respects.

4. Retool Assessments and Accountability

Now we're getting into some meat. The CAPSS report suggests standardized testing should be de-emphasized. I'd be willing to bet they'd suggest eliminating standardized tests as we know them were it not for the current national education environment. Props to them for that.

Here's a selected summary of their suggestions: (1) Provide a variety of assessment formats, (2) Assess students as they're ready to be assessed (instead of everyone at the same time), (3) Get assessment results back to students & teachers quickly so they inform instruction, and (4) Make the goals of all assessment transparent. It seems like they're saying one thing here. Yup, it's Standards-Based Grading.

In fact, they do mention SBG by name in this section, but they recommend making it "part of assessments." I'm a fan of SBG (as evidenced by previous posts), and I think this is a stellar recommendation.

I do have some hesitations with their recommendations, despite their SBG-like nature. For one, it's pretty clear from the language used they're not discussing day-to-day classroom assessment. They're discussing a new form for state standardized2 tests. I'm unclear on what this would look like, but it does sound like an improvement over the current system, though I'm skeptical it would come to pass in this improved manner. Another hesitation rests on the description of incentives for high performing schools. The report clearly recommends moving away from punitive measures, yet in my mind, providing incentives to high-performing schools is nearly indistinguishable from punitive measures against low-performing schools. Finally, the report lists subject areas for "base academic accountability." I take that to mean, "These are the subjects that will be assessed," or perhaps more clearly, "These are the subjects we think are important (things that are valued are assessed)." Notably absent are the arts and physical education- meaning the cuts to art and phys. ed. programs we see happening today are likely to continue were these measures put into place.

5. Offer More Options and Choices

Or, the section with the title that most poorly represents its contents. A better section title? "Consolidate School Districts." Their basic argument seems to be that having the current (supposedly high number of) 165 Connecticut districts creates an environment where it is difficult to align state and local initiatives, is economically inefficient, and fosters racial and ethnic isolation. While I agree that you can save some money by consolidating services like busing or food service, you also lose a connection with the community when the district encompasses many, many communities. Having worked in both small and large districts, I found the small district was much more connected to and valued by the community3. It may be more expensive to have small community districts- and that's not a small obstacle- but it would be worth it. It should be noted that reworking the state education system in the manner recommended by this report would also be expensive. In addition, smaller districts would help schools be more flexible, personal, and transparent. Those adjectives would be a fair summary of the recommendations of this entire report, so why include this section?4

6. Reform Leadership

This section makes a lot of recommendations about the relationship between the State Department of Education and the Commissioner of Education as well as the roles of school boards and superintendents. That's a little bit outside my area of expertise, but I do like this statement from the introduction to the section:

Currently, organization and policy making for education are based on bureaucratic assumptions of hierarchy, centralized decision making, standardization and inspection. These characteristics limit individual discretion, depress creativity and foster stasis, not change.

That certainly describes my experience teaching in Connecticut. Despite completing my Master's in Secondary Education project by designing and implementing a student-centric, student-driven project5, I was told I couldn't continue the project unless all the science teachers wanted to use it. That's not exactly how one fosters innovation and creativity...

7. Boost Quality

This is a huge section with 26 recommendations for action ranging from incentives for attracting quality teachers, to improving teacher education and professional development, to revamping teacher tenure as we know it. I'm going to limit my analysis to the recommendations for professional development and teacher evaluation. I think restructuring the current tenure system is a major issue that deserves discussion, but that'll have to happen in another post so it doesn't turn this already lengthy review into a ridiculously long review.

  1. Professional development for teachers.
    • The report (rightly, in my opinion) makes many recommendations related to preparing pre-service teachers and helping new teachers grow as educators. One of my favorite recommendations suggests structuring a teacher's first year in the classroom as an internship with regular coaching and mentoring by master teachers. If it were up to me, I'd have new teachers carry half of a teaching load, giving them plenty of time during the day to observe other teachers, review and revamp instruction and assessment with a mentor, and generally work to improve their craft. Likewise, the mentors should have a reduced teaching load so they have time to both observe and meet with their mentees during the school day. The current system where exactly zero time is allocated for new teachers to review and reflect on their time in the classroom is a horrible model if we want new teachers to show improvement.
    • A second recommendation states that districts should provide professional learning opportunities for teachers as a part of their regular job- and schedules should be configured to give teachers time to collaborate with their peers. Again, I agree. If you value professional learning and improvement, you should schedule time for it- not make it only something teachers do on their own time (which most do, but it's such a valuable thing schools should be purposefully providing opportunities for their teachers). However, a word of warning: I've taught in a school where the schedule was changed to provide teachers with 70 minutes of "collaboration time" each week. Teachers (including myself) were genuinely excited for this time to share lessons, have quick professional development sessions, and critique instruction and assessment. Instead, it was mandated from above that the "collaboration time" be used solely to analyze student standardized test-prep results. While I understand the importance of standardized tests in our current system, the cost was the loss of time for teachers to share their expertise with each other, learn how to effectively integrate technology, and design cross-curricular projects- all things teachers were excited to use that time to do. The moral of the story is that simply having collaboration time in the schedule doesn't mean it's being used effectively.
  2. Teacher evaluation. As it is, the teacher evaluation system as I've known it is in need of reform. Last year I was observed by an administrator three times- each observation lasting approximately 70 minutes. Outside these official observations, administrators spent about 30 minutes in my classroom throughout the year. Okay, so that's a total of 240 minutes of observation for the entire school year by those who evaluate my performance. For some perspective, I taught four 70 minute classes each school day, and there are 180 school days per school year. That works out to 50,400 minutes of instructional time each school year. My evaluations were based on 240 out of those 50,400 minutes, or 0.48% of the total instructional time. It makes me nervous to think I'm being evaluated from such a position of ignorance6. The recommendations by the CAPSS include creating a standards-based evaluation system with regular performance reviews and including peer review as part of the performance review. As long as "regular performance reviews" includes frequent, informal observations by evaluators and "including peer review" can be expanded to provide students and parents a voice in the evaluation, then I think the recommendations are on track.

8. Involve Students and Parents

Schools give a lot of lip service to including parents and students in the education process. I've never been part of a school that has done a good job of it. I've known teachers who were really good individually at involving parents in their classrooms and other teachers who gave students a large voice in their own education. Beyond the classroom level, the furthest I've seen a district or (high) school go to involve parents is to invite them to serve on committees with little influence that meet at times untenable for most working adults' schedules.

I have no problems with the recommendations in the CAPSS report...other of course than the fact that they're so non-specific that they're just platitudes: "Engage parents as partners in their children's education," or "Create structures that encourage family involvement." Yes, those are good things- but what suggestions do you have for how to do these things?

Let me offer a few quick suggestions.

  1. Use technology to make learning and school happenings more transparent. How? Have administrators start a blog or create an online newsletter that is updated regularly sharing goings-on at the school. Share a photo a day. Invite teachers and students to do the same. Let students share their learning and reflections through student blogs (or evening events where students show off projects, etc.). In my mind, these things are the low-hanging fruit- they're easy to implement and can cost nothing (depending on the tools used).
  2. Form collaborations with people in the community. Examples?
    • Maybe you have an assisted living community near the school. That's a community with a huge amount of knowledge, skill, and disposable time. Provide transportation to retirees so they can read, mentor, advise, or provide academic support to students.
    • Create a community garden on school grounds that "rents" plots to community members. Have students run the administration and marketing of the community garden. Sell the fruits (& vegetables) of the gardens' labor at a farmer's market in the school parking lot on the weekends.
    • Start a hackerspace in the school for the community. Students in classes such as design, computer science, engineering, or any other class where they need to build stuff could be given free memberships, and all other students could become members at discounted rates. Hackerspace members can access it all day. Let advanced students lead workshops for community members.

    Ideas like these take more effort and money- but in the end the rewards may pay for themselves. In essence, make the school a community learning center and let the community share its skills and knowledge with the students and vice versa.

9. Leverage Technology

This section is surprisingly short (considering the topic), and the recommendations focus on two main ideas:

  • Students and educators should have access to educational resources at any time. They don't quite recommend making broadband internet access a universal right, but they do hint at it. I'd agree- though I'm not sure how that gets implemented. The inexpensive computers available today make computer ownership possible for even quite poor families. Paying $30-$50/month for internet access is much less likely to fit into tiny budgets. I also like the recommendation to "leverage online environments [...] for two-way communication, feedback, and collaboration..." Those environments are widely used today (in the form of social network sites), but more often than not are blocked by the schools themselves. It'd be nice to see schools embracing the power of these tools instead of hiding from them.
  • Keep the technology infrastructure up to date. Of course I agree with this, but it's a matter of money. Even though reasonably powerful computers are becoming less and less expensive, it's still a major cost. I'd like to see schools use free and open source software (Open Office instead of Microsoft Office, for instance) or free resources such as Google Apps for Education. These would help keep software costs down and allow for money to be allocated more wisely elsewhere.

10. Continue the Transformation Process

The report makes suggestions on how to avoid reform stagnation at both the state and district level. Several of the recommendations focus on items like changing statutes or education budgets. I don't have too much of an opinion on these items (due to my own relative ignorance on the topics more than anything else). However, two of the recommendations contain a similar idea that I find extremely attractive. Essentially, they say: let innovators innovate. One suggests districts can receive waivers from state statutes and regulations to experiment with new ideas to improve student learning. The second recommends providing systems for teachers and principals to experiment with innovative practices.

If you let smart people do creative things- even if those things are outside the state's or school's "mandates"- you'll end up with a ton of great ideas that help everyone in the end (see: Google's 20% time). Instead of alienating smart people and ultimately driving them out of the education sector, you'd be empowering them and attracting more innovation.

______________________________

  1. There isn't a single good definition for what I mean here, but think of the stereotypical adult-centric school or classroom. (back)
  2. Clearly the assessments would be less standardized than the existing Connecticut Academic Performance Test or Connecticut Mastery Test, but they'd still be the state standard. (back)
  3. I admit this could simply be due to specific situations in each respective district, but after hearing and reading about other people's similar experiences, it seems to be a fair generalization. (back)
  4. For a smart person's perspective on this matter, let me recommend Deborah Meier's article, As Though They Owned the Place: Small Schools as Membership Communities (pdf alert). (back)
  5. That, to toot my own horn, was nominated for a Scholar of Excellence award by my advisor. (back)
  6. I readily admit any administrator worth their salt talks to students regularly and knows more about what goes on inside the classroom than simply what they see when they're personally in the classroom. I still think 0.48% is a pretty sorry basis for an evaluation. (back)

Pipe Insulation Roller Coasters: Rolling Friction

Fair warning: This isn't a description of the pipe insulation roller coaster (a.k.a. PI Coaster) project. It is the activity we did immediately before starting on the roller coasters.

The PI coaster project was one of those quality projects that students enjoyed while still requiring solid content knowledge. I last used this project in 2008- the last year I taught physics. I'd like to think that I've grown as a teacher since then, so I decided I should update it to be what I'd expect of a project from myself today. You know. SBG-it up. Throw in some video analysis. Etc. Suddenly I found myself driving to the local hardware store to pick up some pipe insulation at 9:30 at night.

The Goal

The goal of this activity is to find the coefficient of friction acting between the marble and the track. By the time we start this project, we've already covered kinematics, F=ma, friction, and uniform circular motion in class, and we're right in the middle of the Work & Energy unit.

Specifically, the following concepts are needed for this investigation:

  • Energy may change forms, but is conserved (minus any work done by friction):

    [latex, size=2]\Sigma E_{first} = \Sigma E_{last} + W_{fr}

  • The amount of work done on an object depends on the size of the net force acting on the object and the distance the force is applied:

    [latex, size=2]W=F\cdot d

  • The size of the frictional force depends on the coefficient of friction between the two surfaces and the weight of the object:

    [latex, size=2]F_{fr}=\mu F_N

Here's the setup:

Students set up 12 feet of track as shown in the picture above and measure the height from which the marble is dropped (on the left of this image). In order to find the coefficient of friction, you first need to find the amount of work done by friction on the marble as it rolls through the track. To do this students use the following formula:

[latex, size=2]PE_g = E_k + W_{fr}

Solving for work done by friction and doing a little substitution for the energies:

[latex, size=2]W_{fr}=mgh - \frac{1}{2}mv^2

Looking at the right side of the equation, we need to find the mass of the marble, the height from which the marble is dropped, and the velocity of the marble at the end of the track. The first two are easy enough to measure.

Finding the final velocity of the marble isn't terribly tricky, but the method I used in 2008 had a lot of error. Students would measure out the final 50 cm of the track (as seen below). Then they'd send the marble through the track 10 times- each trial they would use a stopwatch to time how long it took the marble to travel the final 50 cm.

Timing the marble was hard. Depending on the height of the track, the marble takes less than half a second to whip through the final 50 cm. Using a handheld stopwatch often led to large differences between one trial and the next. Not so great for accurate data.
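To see how much that matters, here's a rough sketch of the spread the stopwatch method gives you. The 0.29 s crossing time and the ±0.05 s reaction-time uncertainty are illustrative numbers of my choosing, not actual class data:

```python
# Rough error estimate for the stopwatch method. The timing and the
# reaction-time uncertainty below are illustrative assumptions.
distance = 0.50            # m, measured final section of track
t_mid, t_err = 0.29, 0.05  # s, assumed crossing time and timing uncertainty

v_low = distance / (t_mid + t_err)   # slowest speed consistent with the data
v_mid = distance / t_mid             # nominal speed
v_high = distance / (t_mid - t_err)  # fastest speed consistent with the data

print(f"v = {v_mid:.2f} m/s (range {v_low:.2f} to {v_high:.2f} m/s)")
```

With those assumptions the measured speed could be anywhere from roughly 1.5 to 2.1 m/s- a spread of more than 20% from reaction time alone.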

Using Tracker to find velocity

In rethinking this activity, it struck me that Tracker Video Analysis might be great to cut down on these timing errors. Only one way to find out: Break out the tripod.

After fiddling with the setup of the tripod and camera for a bit, I realized two things.

  1. The marbles were too dark to stand out in the video. Not easily deterred, I took a few marbles out to the garage and spray painted them orange. I'd have used hunter's orange or neon green, but I didn't have any of that lying around.
  2. My "video camera" (a.k.a. an iPhone) only films at ~24 frames per second. When I started the marbles on the track 1 meter above the ground, they showed up as a long, faint blur on an individual frame. I lowered the track to 0.75 m. The marbles still showed up as a blur, but they were much more distinct blurs1.

Once I troubleshot my way through those issues, I filmed this amazing & exciting clip for analysis:

I did six trials to get a good set of data I could average. You could easily get away with 3 trials and still get good data. I also measured the velocity of each marble during the final five data points to use as a final velocity.

The average final velocity from the trials above: 1.720 m/s

Calcumalations

Using the same energy-loss method detailed above, I calculated the coefficient of rolling friction (\mu_r) for the marble over the entire length of the track:

[latex, size=2]W_{fr}=mgh - \frac{1}{2}mv^2

[latex, size=2]W_{fr}=(0.0045 \text{ kg})(9.8 \text{ m/s}^2)(0.75\text{ m})- \frac{1}{2}(0.0045\text{ kg})(1.720\text{ m/s})^2

[latex, size=2]W_{fr}=0.026\text{ J}

Then solving for the friction force:

[latex, size=2]W_{fr}=F_{fr}\cdot d

[latex, size=2]F_{fr}=\dfrac{W_{fr}}{d}

[latex, size=2]F_{fr}=\dfrac{0.026\text{ J}}{3.66\text{ m}}

[latex, size=2]F_{fr}=0.0072\text{ N}

Solving for the average coefficient of friction:

[latex, size=2]F_{fr}=\mu_rF_N

There's no vertical acceleration, so F_N = F_g.

[latex, size=2]\mu_r=\dfrac{0.0072\text{ N}}{(0.0045\text{ kg}\cdot 9.8\text{ m/s}^2)}

[latex, size=3]\mu_r=0.16
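If you'd rather let a computer grind through the plug-and-chug, the whole energy-loss method fits in a few lines. This is just a sketch of the calculation with my own variable names, using the measured values (m = 0.0045 kg, h = 0.75 m, v = 1.720 m/s, d = 3.66 m):

```python
# Energy-loss method sketched as code; values are the measured ones.
m = 0.0045   # kg, marble mass
g = 9.8      # m/s^2
h = 0.75     # m, drop height
v = 1.720    # m/s, average final velocity from Tracker
d = 3.66     # m, total track length (12 ft)

W_fr = m * g * h - 0.5 * m * v**2   # energy lost to friction over the track
F_fr = W_fr / d                     # average friction force
mu_r = F_fr / (m * g)               # F_N = F_g for the average, so mu = F_fr/mg
print(f"W_fr = {W_fr:.3f} J, F_fr = {F_fr:.4f} N, mu_r = {mu_r:.2f}")
```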

Is that a reasonable figure? According to EngineersHandbook.com, the coefficient of friction for wet wood on wood is 0.2. From my vast experience slipping and falling on wet decks, I know wet wood is dern slippery, and I would've expected \mu_r for the marble to be pretty low as well.

Alternate method

Using Tracker, I can find the acceleration of the marble as it rolls along at the end of the track. Using some F=ma magic I can find \mu_r using acceleration instead of velocity.

I created velocity-time charts for each marble and added best-fit lines to find the average velocity and acceleration of the marble. I found the average acceleration of the marble to be -0.065 m/s^2.

[latex, size=2]F_{fr}=ma=(0.0045\text{ kg})(-0.065\text{ m/s}^2)= -0.00029\text{ N}

Then finding the coefficient of friction:

[latex, size=2]F_{fr}=\mu_rF_N

[latex, size=2]\mu_r=\dfrac{0.00029\text{ N}}{(0.0045\text{ kg}\cdot 9.8\text{ m/s}^2)}

[latex, size=3]\mu_r=0.0066

"Wait, what? That's two orders of magnitude smaller!" That's what I said when I first got that number. Then I realized this method was calculating \mu_r only for a straight and level section of the track. You'd expect the friction to be much less along a straight track than when the marble's being forced to do loops and turns.
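One nice thing about the straight-and-level case: since F_N = mg and F_fr = m|a|, the marble's mass cancels entirely and \mu_r is just |a|/g. A quick sketch:

```python
# On straight, level track: F_fr = m*|a| and F_N = m*g, so mu_r = |a|/g.
g = 9.8      # m/s^2
a = -0.065   # m/s^2, average deceleration from the Tracker best-fit lines

mu_r = abs(a) / g   # mass cancels out of the ratio
print(f"mu_r = {mu_r:.4f}")
```

That means you don't even need to weigh the marble for this method- a handy sanity check against the energy-loss approach.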

Is it worth it?

Using video analysis is more time-consuming, but I also think it helps students see more clearly that the coefficient of friction between the marble and the track is constantly changing. I think I'd have to try this out with students once or twice before deciding whether it's an effective use of class time. The basic concepts are covered sufficiently using my old method, though they're fleshed out in more detail using video analysis.

Additionally, I think I'd have each group of students use a different track configuration- one with two loops, one with S-curves, etc. That'd give us an even better idea of how the track layout affects the friction between the marble and track.

The Pipe Insulation Roller Coaster Series

  1. Pipe Insulation Roller Coasters: Rolling Friction
  2. Pipe Insulation Roller Coasters
  3. Pipe Insulation Roller Coaster Assessment


 ______________________________

  1. If anyone would like to chip in for the Buy Ben a High Speed Camera Fund, let me know. 🙂     (back)

Learning Tracker Video Analysis with Napoleon Dynamite

I know I'm late to the game. Rhett Allain, John Burk, and Frank Noschese, among many others, have been sharing how they use Tracker (or a similar tool) to analyze the physics of videos. Since I'm working on picking up my teaching certification in Physics this year, I figured this would be a nice addition to the teaching toolbox1.

So, what is Tracker? It's a free and open-source video analysis and modeling tool designed to be used in physics education. It works on Macs, PCs, and Linux boxes. Logger Pro is a similar tool, but it's not free or open-source2.

Getting going

To begin, I watched Rhett Allain's video tutorial, but it includes a few more complicated pieces that I wasn't quite ready for. Luckily sitting in the Related Videos sidebar on YouTube was this tutorial, which went over the super-basics for n00bs like myself. Alright. Tracker downloaded & installed. Basic tutorial viewed. Now I need me a video to analyze.

I wanted something pretty easy to break myself in: a fixed camera angle, no panning, with an object moving directly perpendicular to the camera. I figured YouTube must be full of videos of people jumping bikes, and I went out to find my first video analysis victim. Amazingly, one of the first videos I found was interesting and funny, and had the perfect still camera and perpendicularly-moving object:

Perfect! OK, now I needed to calibrate Tracker so it can accurately determine scale. Hmm...well, Napoleon is standing fairly close to the sidewalk. I wonder if Jon Heder's height is online? Well, of course it is. In fact, Google gives me an estimated height right on top of the search results when I type in height Jon Heder. However, I think I'll use IMDb's data, which lists his height at 185 cm (sans 'fro).

Calibrating size with Napoleon Dynamite

There might be a small error there since he is standing a few feet back from the ramp, but it should be OK.

Did Pedro get, like, 3 feet of air that time?

It took me a while to realize that I needed to shift-click to track an object...once I figured that out, things went smoothly. I tracked the back tire of Pedro's bike. Here's a graph of the back tire's height vs. time:

There are a couple hitches in the graph. A few times the video would advance a frame without the screen image changing at all. Must be some artifact of the video. I added a best-fit parabola to the points after the back tire left the ramp. Hmm...the acceleration due to gravity is -8.477 m/s^2. That's a bit off the expected -9.8 m/s^2. That could be a result of the hitches in the data, my poor clicking skills, or my use of Napoleon Dynamite's height as my calibration. We'll go with it, since it's not crazy bad.

Coming up to the ramp the back tire sits at 0.038 m and reaches a maximum height of 0.472 m. How much air does Pedro get? ~0.43 m, or 1.4 ft. Napoleon's estimate is a little high.
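For anyone curious what that best-fit parabola is actually doing, here's a sketch using numpy.polyfit. The data points are synthetic- generated from the fitted acceleration above (-8.477 m/s^2) with a launch speed and height I made up- not my actual Tracker clicks:

```python
import numpy as np

# Synthetic flight path: y = y0 + v0*t + (1/2)*g*t^2, using the fitted g
# from the video analysis. The y0 and v0 values are illustrative.
g_fit = -8.477                                # m/s^2, from the Tracker fit
t = np.linspace(0.0, 0.6, 15)                 # s, time after leaving the ramp
y = 0.038 + 2.5 * t + 0.5 * g_fit * t**2      # m, tire height

# Fit a parabola: y = a2*t^2 + a1*t + a0 (highest power first).
a2, a1, a0 = np.polyfit(t, y, 2)
g_est = 2 * a2   # acceleration is twice the quadratic coefficient
print(f"g = {g_est:.3f} m/s^2")
```

Tracker does essentially this fit for you; reading off the quadratic coefficient and doubling it is how you recover the acceleration.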

Maybe Napoleon meant Pedro's bike traveled forward three feet in the air? Let's check the table.

I highlighted the points of interest. We can look at the change in x-values from when the tire left the ramp (at 0 meters) until the tire lands back on the sidewalk (at y = 0). The bike traveled 1.3 meters while airborne; about 4.25 feet. So maybe that's what Napoleon meant.

Who was faster?

Let's check the position-time graphs for Pedro and Napoleon.

I added best fit lines to both sets of data. We can easily compare their velocities by checking the slope of their best fit lines.

  • Pedro's velocity: 5.47 m/s (12.24 mph)
  • Napoleon's: 5.44 m/s (12.16 mph)
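The "best fit line" is just a least-squares fit to the (t, x) pairs in the Tracker table, and its slope is the velocity. Here's that computation by hand, with made-up data points in the right ballpark rather than the real table:

```python
# Least-squares slope of position vs. time = velocity.
# These (t, x) points are illustrative, not the actual Tracker data.
pts = [(0.0, 0.00), (0.1, 0.55), (0.2, 1.09), (0.3, 1.65), (0.4, 2.19)]

n = len(pts)
sum_t = sum(t for t, _ in pts)
sum_x = sum(x for _, x in pts)
sum_tt = sum(t * t for t, _ in pts)
sum_tx = sum(t * x for t, x in pts)

# Standard least-squares slope formula.
slope = (n * sum_tx - sum_t * sum_x) / (n * sum_tt - sum_t ** 2)
print(f"velocity = {slope:.2f} m/s ({slope * 2.23694:.2f} mph)")
```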

If I account for potential errors in measurement, their velocities are basically the same. Though if forced to pick a winner, I'd Vote for Pedro.

How tall is Pedro?

It should be fairly straightforward to find Pedro's height using the data in the video. The first thing I need to do is verify that the camera angle is exactly the same when Pedro is standing behind the sidewalk as it was earlier. After switching back and forth between the two parts, it's pretty clear that the camera angle is a little different. Nuts.

So, I need to find and measure an object that is visible in both parts of the video. I chose the left window on the (Pedro's?) house. Going to the first part of the video where I'm pretty sure the calibration is accurate, I used the measuring tape to measure the height of the window. I got 1.25 meters.

Jumping to the second part, I calibrated the video by setting the height of the window to 1.25 meters. Then I used the measuring tape to determine Pedro's height. I got 1.67 meters, or about 5' 6". Seems like a reasonable result. Let's compare it to what the Internet says about Pedro's height. IMDb gives Efren Ramirez's (a.k.a. Pedro) height as 1.70 meters (5' 7").
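The window trick is just a proportion: use an object visible in both shots to carry the scale across the camera move. A sketch with hypothetical pixel measurements- the 150 px and 200 px numbers are made up for illustration; only the 1.25 m window height comes from the video:

```python
# Transfer calibration between two camera angles via a shared object.
window_m = 1.25     # m, window height measured in the calibrated first shot
window_px = 150.0   # px, same window in the second shot (hypothetical)
pedro_px = 200.0    # px, Pedro's height in the second shot (hypothetical)

meters_per_px = window_m / window_px   # scale factor for the second shot
pedro_m = pedro_px * meters_per_px
print(f"Pedro = {pedro_m:.2f} m")
```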

Not too shabby for my first time using Tracker.

Bonus

______________________________

  1. You might notice this post is pretty similar in style to Rhett Allain's video analyses on Dot Physics. Well, it is. When just learning how to do something, it's always best to start by imitating the masters, right? Oh, if you haven't yet, you should definitely check out his many, many amazing examples using video analysis to learn all sorts of crazy things. The guy's a Tracker ninja.     (back)
  2. To be fair, it's only $189 for a site license of Logger Pro, which ain't too shabby. According to Frank Noschese, Logger Pro is a little more user-friendly. Tracker has a bit of a learning curve.     (back)

How I use LaTeX

In the last installment, I described what LaTeX is and my adventures in learning to use it. Today, I'll explain how, as a teacher still figuring out all this LaTeX craziness, I get things done using it.

As I mentioned, I've been using LaTeX to write up lab reports in the classes I'm taking this semester. LaTeX works great (for me) for formal documents, especially ones that need to include symbols, fractions, and other exciting calculations. It has easy commands to create headings and sub-headings, bulleted and numbered lists, and (of course) it makes including formulas and symbols easy peasy.

That being said, I've been working for quite a while to make any handouts or slides for students more visually appealing. Lots of graphics. Design elements. And so forth. You can make slides and handouts using LaTeX. I don't think you should. Here's a slide deck I've used to introduce the basics of chemical reactions. In Keynote or PowerPoint it didn't take much effort to create. In LaTeX I think it'd take for-ev-er. Does that mean you can't get the awesome formula making of LaTeX in anything other than formal documents?

LaTeXiT

Lucky for you, there's LaTeXiT [update: Mac only]. It comes automatically with the full version of LaTeX. Basically, it lets you type in the commands to create the great-looking formulas & symbols you'd expect from LaTeX, then allows you to drag & drop them into your slide decks or handouts.

Commands typed in the text box. Output appears up top.
Dragging from LaTeXiT to Keynote

One of the great things LaTeXiT does is allow you to export the formula in a variety of image formats, including vector-based PDF files. While that sounds like geekily unnecessary information, it means you can make your formula as huge as you'd like and it'll never get pixelated.

Starting out

Since at first I didn't know any of the LaTeX symbols, I kept a couple of pdfs that explained all the commands for different symbols open while I was using LaTeX. If I needed to know how to add, say, absolute value symbols, I just used the "find" function in my pdf viewer to locate where it described that command. At this point, I rarely need to look up new commands, since I've memorized all the usual ones simply through repetition. I've included below links to the mandi LaTeX package and its documentation, which was made specifically for physics classes. Also included is a link to a guide for all sorts of math symbols. Both have been super-useful for me while learning LaTeX.
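For reference, here are a few of the math-mode commands I found myself looking up constantly early on (standard LaTeX, nothing mandi-specific; each goes between $...$ signs):

```latex
\left| v \right|          % absolute value bars that grow to fit
\Delta y                  % a capital Greek delta
\dfrac{\Delta y}{t^2}     % a display-style ("big") fraction
\vec{F}_{net}             % an arrow over a vector quantity
```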

[Update] LaTeXiT History & Library

Thanks to John Burk via twitter, I've discovered that LaTeXiT saves every formula you enter. That means you can pull up the history panel and drag & drop any of the formulas you've entered without having to re-type the commands. That's a major time saver.
The history. Drag & drop to your heart's delight.
Further, you can save equations in the "Library" and organize them into folders. Being the super-organized person I am1, I'll probably create folders like Kinematics, Newton's 2nd, and Heat, then dump equations I create into them as I go. Eventually I'll have an extensive library of equations and symbols ready to go.

______________________________

  1. not so much.   (back)

Learning new things: LaTeX

I can usually get programs like Microsoft Word to format my documents so the way I envision the document in my head matches up pretty closely with what I end up with on the screen. You know, however, that getting the document to look right can often take as much time as typing the document in the first place. If you add the hassle of trying to get equations for physics or chemistry to show up correctly, it's pretty easy to suck down a lot of time simply knocking out a short and simple handout.

Last July, I caught John Burk's post on a new LaTeX1 package that makes writing physics equations much easier. Although I had been peripherally aware of LaTeX in the past, I really didn't know much. Since I had some extra time in the summer (and since I'm not teaching this year, freeing up more time), I decided to jump in and try to figure LaTeX out.

What is LaTeX?

Don't be fooled. LaTeX is not a word processor. It took me awhile to figure that one out. While you type in the text that you want to show up in your final document, you're also adding code that tells LaTeX exactly how you want that document to look. Want a new section in your document? Type \section{Section Title}. This automatically creates a section heading in a larger bold font, and automatically adds it to your table of contents (if you have one).
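To make that concrete, here's a minimal sketch of what a LaTeX source file looks like (standard article class; the content is obviously made up):

```latex
% A minimal LaTeX document; compile with pdflatex to see the result.
\documentclass{article}

\begin{document}

\section{Kinematics Lab}   % big bold heading, numbered automatically
Body text goes here, typed like any other text.

\subsection{Procedure}     % smaller sub-heading
\begin{itemize}
  \item Drop the ball.
  \item Time the fall.
\end{itemize}

\end{document}
```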

Why bother?

Since I'm sciencey (is that how you spell sciencey?), I tend to use more formulas, symbols, and other weird notations in my documents than the average bear. As previously mentioned, getting these to work in pretty much any standard word processing software sucks. It's a major pain. Especially if there are special characters all over it. Even more so if you want the formulas to actually look right. LaTeX provides simple codes that allow you to make equations and symbols look exactly how you envisioned them in your head.

For example, typing
a=\dfrac{2(\Delta y)}{t^2}

will tell LaTeX to do this:

[the same equation, typeset: a proper built-up fraction over t², with a real Greek Δ]

If you'd like to see a full document in LaTeX, here's a plain text file that I wrote in a LaTeX editor. Here's the finished typeset product (pdf warning).

What I've learned

  • There's a learning curve. It takes awhile to figure out (and remember) how to write in LaTeX, as well as the different codes for symbols, parentheses, etc. If you're writing a document on a tight deadline, it's not a good time to decide to experiment with LaTeX. When I started, I sat down for a couple hours on a lazy Saturday afternoon and tried to figure it out. I've also committed to writing up all the lab reports I have to do this semester in LaTeX so I'll get the hang of things.
  • There's a lot of information online about LaTeX. If you don't know a command, you'll be able to find it by searching. As a bonus, you occasionally get some "interesting" search results due to LaTeX (the program) being spelled the same as latex (the rubbery material).
  • Once you get the hang of it, it's faster than messing about with Word. I've only been using LaTeX for a month and I'm already past the break-even point. As a bonus, my documents have beautiful formulas that display correctly. I can only imagine things will get faster from here.
  • I doubt I'll use LaTeX as a teacher to create entire documents. I will use LaTeX as a teacher to insert formulas and symbols into documents and slides. I'll do a follow up post explaining specifically how I envision I'll use LaTeX as a teacher.
  • The "official" way to write it is \LaTeX, which of course, requires \LaTeX to make.

Resources

______________________________

  1. pronounced "lay-tech," which of course makes total sense.     (back)

Exams: SBG-style

The goal of any exam, ideally, is to assess how much students have learned over the course of a semester or school year. When I changed the focus of grading in my classes from counting points to counting progress towards specific learning goals, I knew my exams needed to reflect that change as well.

This summer I had initially thought I might design some sort of alternate, performance-based exam that would mesh well with the tenets of standards-based grading. However, this year all exams for the same class were required to be exactly the same regardless of teacher. Since I'm currently one of four teachers who teach the 9th grade Integrated Science course and the only one using standards-based grading, I knew I had to take our common exam and make the best of it.

So, the exams had to have the same questions, but they didn't need to be in the exact same order, right? I reordered all the questions on the exam based on the learning goal they assessed.

Multiple choice section, SBG exam

This process uncovered several questions which didn't address any of the learning goals, so these "others" were grouped together to make their own section.

Overall, I wasn't thrilled with the exam, but I think it was quite good given the requirements it had to meet.

Assessment

Breaking down the exam into its component learning goals allowed me to assess each learning goal individually. It took considerably longer to grade the exams this way, but it also provided me and my students with a wealth of information about their learning throughout the first semester.

I created a Google Spreadsheet that automatically calculated the individual scores for each learning goal and the overall exam grade. Once the grading was done, I shared each student's spreadsheet with them through Google Docs.

Below is an example of a filled out scoresheet (and here's a blank calculation sheet if you're interested):

Example Exam Calculation Spreadsheet

Details

Overall grades. You may notice I calculated two "overall" grades. I told students their overall grade on the exam would be the average of their scores on each learning goal (giving each learning goal equal weight), but I wasn't sure if that might result in some odd effects on the overall grade due to some flaw I hadn't planned for. As a check, I also calculated the exam's score "traditionally," by simply dividing the total points earned by the total points possible. Interestingly, these two scores were almost always ridiculously close to each other (for most students the difference was less than 1%). I'm not sure exactly what that means, but it was interesting nonetheless.
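The two overall-grade calculations can be sketched in a few lines of Python. The per-goal scores below are hypothetical (earned, possible) pairs for an imaginary student; as on the real exams, the two methods land within about a percent of each other:

```python
# Two ways to compute an "overall" exam grade from per-goal scores.

def goal_average(scores):
    """Average of per-goal percentages: every goal weighs the same."""
    return sum(e / p for e, p in scores) / len(scores) * 100

def traditional(scores):
    """Total points earned divided by total points possible."""
    earned = sum(e for e, p in scores)
    possible = sum(p for e, p in scores)
    return earned / possible * 100

# Hypothetical student: (points earned, points possible) per goal
scores = [(8, 10), (4, 5), (9, 10), (4, 5)]

sbg_grade = goal_average(scores)    # 82.5
trad_grade = traditional(scores)    # ~83.3
```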

Unfinished long answer questions. The exam had 6 long answer questions, and students were required to complete at least 4 of them. I had a few students who either skipped the long answer questions entirely or completed fewer than were required. It didn't make sense to penalize any one learning goal for the missing answers (after all, simply not doing the long answer questions didn't necessarily mean a student didn't understand the content of the learning goals). However, I felt there should be some penalty for doing fewer than required1. As a result, I calculated what percentage one long answer question was of the entire exam and divided that by 2, which gave me 1.84% in this case. For each required long answer question that was not completed, I took 1.84% off the student's overall exam grade.
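The penalty arithmetic, sketched in Python. Only the 1.84% figure comes from the actual exam; the point totals below are made-up values chosen to reproduce it:

```python
# Penalty for each skipped (but required) long-answer question:
# half of one question's percentage share of the whole exam.

def penalty_per_skip(question_points, exam_points):
    """Half of one long-answer question's share of the exam, in %."""
    share = question_points / exam_points * 100  # percent of exam
    return share / 2

# Hypothetical point values: a 5-point question on a 136-point exam
p = penalty_per_skip(5, 136)           # ~1.84 percentage points
grade_after_two_skips = 85.0 - 2 * p   # hypothetical 85% exam score
```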

Spreadsheet-fu. I honed some serious "if-then" formula skills in the process, an area of serious spreadsheet-fu weakness before this project. Despite the time it took to figure out how to make the spreadsheet do what I wanted, I'm still pretty sure using it instead of calculating everything by hand saved me several hours. Plus, now I have another formula type under my belt.

Final thoughts

Perhaps unsurprisingly, my predictions about what learning goals would be problematic for students on the exam were dead-on. They were the same learning goals that more students struggled with during the course of the semester. There really weren't any surprises on the mid-term.

What then, is the purpose of an exam in a SBG classroom? Exams are meant to assess how well students know the material that has been presented throughout the semester. However, if I am regularly assessing students' understanding of learning goals throughout the semester is there any benefit to a final, summative exam? Most students' exam grades were eerily close to their grades for the rest of the semester2.

If we're doing SBG well, it seems to me the final exam is unnecessary. We should already have a good understanding of exactly what students know, so why bother with a big test at the end of the semester?

Should the exam in an SBG classroom be something totally different than what we've traditionally come to think of exams as being? Or should they just be done away with?

___________________________________

  1. At first I really balked at penalizing students for not completing the required long answer questions. However, after thinking about it for a bit, I came to the conclusion that a student's decision to skip one or more of the long answer questions was, at least to some degree, indicative of a lack of understanding of the content.     (back)
  2. On average, the exam grades were just a bit lower than grades for the rest of the semester. I can rationalize that in several ways: additional anxiety due to it being an exam, or a less than perfect exam design, etc.     (back)