Peer instruction is worth the effort

Most blog posts, articles or books with a title like this would go on to describe the positive impact of peer instruction on student learning. I even write those kinds of posts myself.

This one is different, though, because it’s not about peer instruction being worth the effort by (and for) the students. This one is about how it’s worth the effort by (and for) the instructor.

In my job with the Carl Wieman Science Education Initiative, I sometimes work closely with one instructor for an entire 4-month term, helping to transform a traditional (read, “lecture”) science class into a productive, learner-centered environment. One of the common features of these transformations is the introduction, and then effective implementation, of peer instruction. At UBC, we happen to use i>clickers to facilitate this but the technology does not define the pedagogy.

Early in the transformation, my CWSEI colleagues and I have to convince the instructor that they should be using peer instruction. A common response is,

I hear that good clicker questions take soooo much time to prepare. I just don’t have that time to spend.

So, is that true, or is it a common misconception that we need to dispel?

Here’s my honest answer: Yes, transforming your instructor-centered lectures into interactive, student-centered classes takes considerable effort. It feels just like teaching a new course using the previous instructor’s deck of ppt slides.

What about the second time you teach it, though?

A year ago, in September 2010, I was embedded in an introductory astronomy course. The instructor and I put in the effort, she a lot more than I, to transform the course. By December, we were exhausted. Today, one year later, she’s teaching the same course.

My, what a difference a year can make.

This morning I asked her how much time she spends preparing her classes this term, compared to last year. We’re not talking about making up homework assignments or exams or answering email or debugging the course management system or… Just the time spent getting ready for class. This year she spends about 1 hour preparing for each 1-hour class. That prep time consists of

  • a lot of re-paginating last year’s ppt decks because they’re not quite in sync. Today’s Class_6 is the end of last year’s Class_5 plus the beginning of last year’s Class_6, so it needs new intro, reminder and learning goals slides.
  • tweaking the peer instruction questions, perhaps based on feedback we got last time (students didn’t understand the question, no one chose a particular choice so it needs a better distractor, and so on). The “Astro 101” community is lucky to have a great collection of peer instruction questions at ClassAction. Many of these have options where you can select bigger, longer, faster, cooler to create isomorphic questions. It takes time to review those options and pick the ones that best match the concept being covered.
  • looking ahead, like every instructor, to the next couple of classes to see what needs to be emphasized to prepare the students.

“And how,” I asked, “does that compare to last year?”

Between the two of us (I was part of the instructional team, recall) we probably spent 4-5 hours preparing each hour of class. In case you’ve lost the thread, let me repeat that:

Last year: 4-5 hours per hour in class.
This year: 1 hour.

“And do you spend those 3-4 hours working on other parts of the course?”

Nope. Those 3-4 hours per class, times 3 classes per week, add up to about 10 hours a week that are now used for the other parts of being a professor.

Is incorporating peer instruction into classes worth the effort? Yes, absolutely. For both the students and the instructors.


Click it up a notch with i>clicker2

As some of you may have heard, i>clicker is coming out with new hardware. UBC Classroom Services is already installing the new i>clicker2 receiver in many classrooms. I’ve been working with them to design a holder that mounts the receiver on the desktop so the receiver is 1) secure and 2) visible.


New i>clicker2 receiver mounted on classroom desktop with a base designed at UBC. The base swivels 359 degrees so the instructor can see the distribution from either side of the podium. That's a USB port on the base, where you plug in the chip with the i>clicker software and your class data. (Photo: Peter Newbury)

The i>clicker2 has more options, allowing for alpha-numeric responses in addition to the usual A thru E choices. (Image from iclicker.com)

This new receiver is fully compatible with the current i>clicker clickers, the simple, white A-E clickers we know and love.

No surprise, along with the new receiver comes a new i>clicker2 clicker.

Hold it, hold it! Don’t have a fit! Yes, there are more buttons and that seems to explicitly contradict i>clicker’s advertised simplicity. The first time I saw it, yes, I, er, had a fit.

However, I’ve since had a long chat with my colleague Roger Freedman (@RogerFreedman on Twitter) at UCSB. He’s a great educator, textbook author, avid clicker user, and i>clicker2 guinea pig. In his opinion, which I sincerely trust, the i>clicker2 opens up new and powerful avenues for peer instruction. His favourite is ranking tasks, which can now be implemented without those awkward clicker questions with choices like A) 1>2=3>4 B) 1=2>3=4 …

Here’s the thing(s):

  • instructors could use the i>clicker2 to revert to ineffective peer instruction questions
  • i>clicker2 opens up new options for peer instruction
  • they’re coming (though UBC has not declared when)

Conclusion: let’s be proactive and prepared to train instructors when the i>clicker2 arrives.

The first step (after finishing your fit) is figuring out what the new clicker can do. And that’s the reason for this post. In 30 minutes – er, make that 11 minutes – I’ll be heading to a demo. The rest of this post will be written shortly…

(Image CC Pedro Moura Pinheiro on flickr)

It’s 3 hours later. I’m E X C I T E D! The demo with Roberto and Shannon was, well, mixed: they had a wide spectrum of audience members, from people who had never held a clicker before to experienced users. I had a great chat with them afterwards, though. Details below, but first, some nice features of the i>clicker2 unit:

(Images: Peter Newbury)

  • (left) When you turn on the i>clicker2, it flashes the ID number. No more problems with the sticker getting rubbed off (though Roberto assures us they have better stickers now).
  • (center) There are only 2 batteries (but still 200 hrs of use). See those 2 little sticky-outty things at the top? They’re rubber feet to stop the clicker from sliding off the desk. Nice touch.
  • (right) There’s a metal post for a lanyard. Good idea.

I won’t go into all the details about the features of the software. There are lots. You can take the tour at iclicker.com.

The hard part

Those of us who have been using i>clickers for peer instruction have gotten pretty ingenious about asking good, discussion-promoting questions even though we’re limited to choices A–E. It’s going to take some thinking and discussion to figure out how to take advantage of the expanded capabilities of the i>clicker2. Ranking tasks are a great start: students can easily enter a string of letters like BCDEA to rank items. It’s going to take some testing. Which leads to…
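To make “students can easily enter a string of letters” concrete, here is a rough sketch, entirely my own and not anything shipped with i>clicker2, of how an instructor might grade exported ranking responses with partial credit:

```python
# Hypothetical sketch -- NOT part of the i>clicker software. Just one
# way an instructor might grade a ranking-task response like "BCDEA"
# after exporting the votes, with partial credit for near-misses.

def ranking_score(response: str, correct: str) -> float:
    """Fraction of item pairs the student ranks in the same relative
    order as the answer key, from 0.0 (complete reversal) to 1.0."""
    if sorted(response) != sorted(correct):
        return 0.0  # missing or repeated items: no credit

    pos = {item: i for i, item in enumerate(response)}
    items = list(correct)
    # every pair (a, b) where the answer key ranks a ahead of b
    pairs = [(a, b) for i, a in enumerate(items) for b in items[i + 1:]]
    agree = sum(1 for a, b in pairs if pos[a] < pos[b])
    return agree / len(pairs)

print(ranking_score("BCDEA", "BCDEA"))  # 1.0 -- perfect
print(ranking_score("BCDAE", "BCDEA"))  # 0.9 -- one transposed pair
print(ranking_score("AEDCB", "BCDEA"))  # 0.0 -- complete reversal
```

Pairwise scoring like this has a nice property for peer instruction: a near-miss earns most of the credit, so a student isn’t punished as if they’d guessed randomly.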

The great part

Roberto and Shannon are going to lend me a class set of i>clicker2’s for the term! Eighty clickers to try out in the classes I work in. Suh-weet!

I was chatting with my friend Warren (@warcode) afterwards. He said, “When you asked Roberto to show you what i>clicker2 can do that i>clicker can’t, his response was, essentially, ‘Here’s a set of clickers. You tell us.’ ”

Challenge accepted! Stay tuned!



Peer instruction workshop: the post-mortem

About a week ago, my colleague Cyn Heiner (@cynheiner) and I ran an all-morning-and-into-the-afternoon workshop on effective peer instruction using clickers. I wrote about preparing for the workshop so it’s only fitting that I write this post-mortem.

If “post-mortem” sounds ominous or negative, well, the workshop was okay but we need to make some significant changes. For all intents and purposes, the workshop we delivered is, indeed, dead.

This was our (in hindsight, ambitious) schedule of events:

Schedule for our workshop, "Effective peer instruction using clickers."

The first part, demonstrating the “choreography” of running an effective peer instruction episode, went pretty well. The participants pretended to be students while I modeled the choreography for 3 questions and Cyn did colour commentary (“Did you notice? Did Peter read the question aloud? No? What did he do instead?”). The plan was, after the model instruction, to go back and run through the steps I took, justifying each one. It turned out, though, that the workshop participants were more than capable of wearing both the student hat and the instructor hat, asking good questions about what I was doing (not about the astronomy and physics in the questions). By the time we got to the end of the 3rd question, they’d asked all the right questions and we’d given all the justification.

We weren’t agile enough, I’m afraid, to then skip the next 15 minutes of ppt slides, where we ran through all the things I’d done and why.

Revised workshop: address justification for steps as they come up, then very briefly list the steps at the end, expanding only on the things no one asked about.

In the second part of the workshop, we divided the participants into groups of 2-3 by discipline — physics, chemistry, earth and ocean sciences — and gave them a topic about which they should make a question.

Topics for peer instruction questions.

We wrote the topics on popsicle sticks and handed them out. This worked really well because there was no time wasted deciding on the concept the group should address.

We’d planned to get all those questions into my laptop by snapping webcam pix of the pages they’d written, and then have each group run an episode of peer instruction using their own question while we gave them feedback on their choreography. That’s where things went to hell in a handcart. Fast. First, the webcam resolution wasn’t good enough so we ended up scanning, importing smart phone pix, and frantically adjusting contrast and brightness. Bleh. Then, the questions probed the concepts so well that the participants were not able to answer them. Almost every clicker vote distribution was flat.

One group created this question about circuits. A good enough question, probably, but we couldn't answer it in the workshop.

These are the votes for choices A-E in the circuits question. People just guessed. They were not prepared to pair-and-share, so the presenter did not have the opportunity to practice doing that with the “students.”

The presenters had no opportunity to react to one overwhelming vote or a split between two choices or any other distribution where they could practice their agility. D’oh! Oh, and they never got feedback on the quality of their questions: were the questions actually that good? We didn’t have an opportunity to discuss them.

We were asking the participants to create questions, present questions, answer their colleagues’ questions AND assess their colleagues’ peer instruction choreography. And it didn’t work. Well, d’uh, what were we thinking? Ahh, 20/20 hindsight.

With lots of fantastic feedback from the workshop participants, and a couple of hours of caffeine-and-scone-fueled brainstorming, Cyn and I have a new plan.

Revised workshop: Participants, still in groups of 2-3, study, prepare and then present a clicker question we created ahead of time.

We’ll create general-enough-knowledge questions that the audience can fully or partially answer, giving us a variety of vote distributions. Maybe we’ll even throw in some crappy questions, like one that’s way too easy, one with an ambiguous stem so it’s unclear what’s being asked, one with all incorrect choices… We’d take advantage of how well we all learn through contrasting cases.

To give the participants feedback on their choreography, we’ll ask part of the audience to not answer the question but to watch the choreography instead. We’re thinking a simple checklist will help the audience remember the episode when the time comes to critique the presentation. And that list will reinforce to everyone what steps they should try to go through when running an effective peer instruction episode.

The participants unanimously agreed they enjoyed the opportunity to sit with their colleagues and create peer instruction questions. Too bad there wasn’t much feedback, though. Which leads to one of the biggest changes in our peer instruction workshop:

2nd peer instruction workshop: Creating questions

We can run another workshop about writing questions, either immediately after the (new) “Effective peer instruction” workshop or stand-alone. We’re still working out the details of that one. My first question to Cyn was, “Are we qualified to lead that workshop? Shouldn’t we get someone from the Faculty of Education to do it?” We decided we are the ones to run it, though:

  • Our workshop will be about creating questions for physics. Or astronomy. Or chemistry. Or whatever science discipline the audience is from. We’ll try to limit it to one, maybe two, so that everyone is familiar enough with the concepts that they can concentrate on the features of the question.
  • We’ve heard from faculty that they’ll listen to one of their own. And they’ll listen to a visitor from another university who’s in the same discipline. That is, our physicists will listen to a physicist from the University of Somewhere Else talking about physics education. But our instructors won’t listen to someone from another faculty who parachutes in as an “expert.” I can sort of sympathize. It’s about the credibility of the speaker.

Not all bad news…

Cyn and I are pretty excited about the new workshop(s). Our bosses have already suggested we should run them in December, targeting the instructors who will start teaching in January. And I got some nice, personal feedback from one of the participants who said he could tell how “passionate I am about this stuff.”

And, most importantly, there’s a physics and astronomy teaching assistants training workshop going on down the hall. It’s for TA’s by TA’s and many of the “by TA’s” were at our workshop. Now they’re training their peers. These people are the future of science education. I’m proud to be a part of that.



Preparing for our peer instruction workshop

It’s Sunday morning. On Tuesday, I’ll be running an all-morning-and-maybe-into-the-afternoon workshop in my department, Physics and Astronomy, at UBC. My science education colleagues and I, all part of the Carl Wieman Science Education Initiative, are working hard to be proactive, rather than reactive, when it comes to transforming the way we (that is, my teaching colleagues, faculty, university, WTH go for it, post-secondary educators) teach science.

The workshop I’m running with my colleague Cynthia Heiner (@cynheiner on Twitter) is about effective peer instruction. Er, think-pair-share. No, clickers. Or…

That’s the first thing I thought carefully about before putting this workshop together (originally for the CWSEI end-of-year conference last April): the title.

This learner-centered instructional technique, where the instructor poses a multiple-choice question and students individually choose an answer and then pair up to discuss with each other why they made those choices, is known to most of the world as think-pair-share (TPS). Eric Mazur branded it, or at least popularized it, as peer instruction (PI). My university, like many others, runs these episodes using clickers. So, what to call this workshop? I made a choice and have diligently stuck with it:

Effective Peer Instruction using Clickers

i>clicker classroom response system

My colleagues are calling this a “clicker workshop” but I don’t want to give it that label. You see, about half of the 20 people who have registered are grad students. I’m thrilled! One way to transform science education is to train the next generation of instructors. And when they head off into the rest of the world after graduation, some will get academic jobs that include teaching. And some won’t have clickers: they’ll be forced to use – gasp! – coloured voting cards.

Many instructors use these coloured ABCD cards instead of clickers.

Like a lot of instructors do. Successfully. I don’t want these eager new faculty members to ever think, “Oh, I can do clickers but you guys don’t have them, so I guess I’ll just lecture.” So, this workshop is about effective peer instruction. Sure, it’s customized to using i>clickers to collect and assess the students’ votes, but the goal of the workshop is to show how to “choreograph” an episode of peer instruction so it maximizes student participation, engagement and learning.

To be honest, I’m pretty confident about the content of the workshop. I’ve spent a lot of time with, and talking to, Ed Prather and his team from the Center for Astronomy Education at the University of Arizona. And I consider myself fortunate to have regular conversations, 140 characters at a time, with @derekbruff, @RogerFreedman, @RobertTalbert, @jossives, @Patrick_M_Len, @etacar11, @astrocarrie and other tweeps using peer instruction and other learner-centered instructional strategies.

If there’s one aspect of the workshop, and peer instruction, that I don’t feel I have a good handle on, it’s clicker points. With i>clickers, the system records who voted, not just how many chose A, B, C, D or E, so it is simple to reward clicks with points that contribute to each student’s marks. There are lots of options: a point for any click, a point for picking the right answer, both, points only if there is a second vote, no points,… (two of these schemes are sketched in code after the list below). It’s an over-constrained problem with too many competing and complementary factors:

  • students will participate if they get marks
  • unless they perceive the marks are simply for attendance
  • giving too many (any?) marks for right answers inhibits students from listening to their own ideas, relying instead on their supposedly “smarter” neighbours
  • if students engage and contribute to the class, shouldn’t they be rewarded?
  • effective peer instruction promotes learning and success on exams – isn’t that reward enough?
  • what about the voting card people? They can’t give points but they’re successful.
  • Or are they? Everyone in the field is well aware of “card fade”, the drop in participation throughout the term as students (and the instructor?) lose their enthusiasm for voting.
  • a million other reasons and arguments…
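To make the trade-off concrete, here is a minimal sketch, mine and not the i>clicker gradebook (every name is invented), of the first two schemes in that list applied to one student’s exported votes:

```python
# Hypothetical sketch -- not the i>clicker gradebook software.
# Compares two common clicker-point schemes on one student's votes.

from dataclasses import dataclass

@dataclass
class Vote:
    question: str  # e.g. "Q1"
    answer: str    # "A" through "E", or "" if the student didn't click

def participation_points(votes):
    """One point for any click, right or wrong: rewards engagement."""
    return sum(1 for v in votes if v.answer)

def correctness_points(votes, answer_key, right=2, wrong=0):
    """Points only for correct answers: the scheme that can push
    students to copy their 'smarter' neighbours instead of trusting
    their own reasoning."""
    return sum(
        right if v.answer == answer_key[v.question] else wrong
        for v in votes if v.answer
    )

# A student who clicked on all 3 questions but got only 1 right:
votes = [Vote("Q1", "A"), Vote("Q2", "C"), Vote("Q3", "E")]
key = {"Q1": "A", "Q2": "B", "Q3": "D"}
print(participation_points(votes))     # 3
print(correctness_points(votes, key))  # 2
```

Even this toy version shows the tension: the same student earns 3 points for engaging but only 2 for being right, and whichever number lands in the gradebook tells students which behaviour you value.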

Yeah, I’m struggling. But I took a big step towards clarity last week because of a post by my friend @jossives, “So long clicker participation points“, and a comment by @brianwfrank:

I think, for an instructor who is new to running discussions among and with students in lecture, it’s pretty much fine to use points for “clicking”, especially as a safety net…. Ultimately, I think the direction an instructor should likely head is away from points for clicking

I really like that, and it’s the approach I’m going to promote at the workshop. What Brian says echoes my conversation with Ed Prather last week when he said, roughly, if you’re really worried about your policy for handing out clicker marks, you’ve already missed the boat. You have to convince your students that peer instruction promotes learning and success, and keep reminding them, and then “walk the walk” by putting nearly-identical assessments on their homework and exams. Ed, never one to mince words, concluded, “If you’re unwilling to do that, then you can worry about points.” I added, “unwilling, or unable…” Ed can get full participation in his 800 (yes, eight zero zero) student astronomy classes because he has incredible “presence” in the room. Some instructors, especially new ones, struggle with keeping their students focused. Throw in a new teaching technique that the new instructor is still learning, and you can’t blame the students for disengaging. So, use clicker points to reward the students’ effort for a few terms, until you’re so confident with peer instruction that you don’t need that “safety net.”

There’s one last component of the workshop that I’m nervous about: getting the participants to authentically participate:

  • veteran clicker users: I don’t want them to just fall back into their usual routine. I want them to genuinely try new things, like not opening the clicker poll until the students are prepared or (and this one has had the biggest backlash already) turning to the screen and modeling how to answer the questions, perhaps by “acting out” some of the concepts.

    Theatre of Dionysus (by nrares on flickr CC)

  • newcomers: effective peer instruction choreography takes some “performance”. You’ve got to put yourself out there and lead the episode. I have to create an environment where the grad students don’t feel like they’re making fools of themselves in front of the faculty.

This will take some gentle yet firm cajoling at the beginning of the workshop. I think I’ll ask the veterans to model our choreography for the benefit of the others, especially the newcomers, so everyone gets a clear picture of what an effective episode looks like.

Alright, T-45 hours until the workshop. Tomorrow will be full of last minute details and working out the choreography of our choreography workshop with my co-presenter, Cynthia. Those of you following me on twitter at @polarisdotca will be the first to hear how it went. The rest of you, 1) why aren’t you on twitter? and 2) you’ll have to wait for a follow-up post.


Sending bottle rockets to new heights (of learning)

My Twitter streams crossed this morning and before I even got to work, a blog post about kids, STEM, learning science, teaching science and rockets was practically spilling out of my head.

It started with a tweet from @physorg_com (h/t to @andrewteacher and @fnoschese) about this column, “Don’t show, don’t tell? Trade-off between direct instruction and independent exploration”. The researchers gave pre-schoolers a new toy with varying amounts of instruction and then watched what they did with the toy. The kids who were shown how one part of the toy worked could replicate that action, usually, but didn’t find all the other cool stuff the toy did. Kids who didn’t receive explicit instruction figured out much more about the toy. It’s a nice article – have a look if you have a minute or two.

The article reminded me of my own experiences with the PhET physics simulations and some research the PhET developers have done (damn, can’t find the ref but I’m sure Wendy would be happy to point you in the right direction). The least effective way to use the sims is to give students a recipe (“Do this. Now click here. Measure this. Now do this. Now this….”) Better but still not terrific is just letting the students play with the sim (“Here’s a cool sim. Play for a while and see what happens.”) The most effective way to use the sims, in their studies anyway, is to give the students a goal or challenge (“Make the light bulb shine the brightest!“)

The other crossing Twitter stream started with @mrsebiology

The ensuing conversation with her and @irasocol reminded me of how I throttled up our UBC Summer Camp bottle rocket activity so it was much more than just something to fill the kids’ time.

Image by richpt on flickr (CC)

Bottle rockets are a popular activity with kids and families. My friends at the H.R. MacMillan Space Centre run Saturn 5 Saturdays where families bring a 2-litre pop bottle and build and launch their rockets. [Update 30 June: the next Saturn 5 Saturday is July 16, 11am – 2 pm. Thx @AskAnAstronomer] The rockets blast into the air, the kids (or leaders!) get soaked. They chase the rockets as they plummet back to the ground. It’s great fun.

But suppose you have the time, manpower and goal to make the activity educational, not just entertaining.  The recipe method (“Build the rocket like this: fins, nose cone, give it a name, now stand back as I launch it. Wheee!”) is fun, yes, quick, yes. Educational, not so much.  There are two ways we turned our rocket activity into a learning experience:

1. A rocket science experiment: What makes the rocket go highest?

How much water do you put in the rocket? More fuel = higher launch, you’d think. And how much pressure is best? Again, bigger is better, right? We made one set of tokens that read “low pressure”, “medium pressure” and “high pressure”. A second set read “empty”, “1/3 full”, “2/3 full” and “full”. One by one, the rocketeers pick one of each, setting the parameters for their launch.

After each launch, the group decides if it was a good one. Once, we tried using inclinometers to measure the maximum height of the rocket but that was waaaay too messy and confusing. Instead, before they start launching, I ask them for 3 adjectives to describe bad, okay and great rocket launches. The group decides on words like “lame!”, “ok”, and “awesome!” Their rockets, their results, their words.
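(For the curious: the inclinometer method is simple trigonometry, which is exactly why it gets messy in a field full of excited rocketeers. You need a measured baseline and a clean angle reading at the instant the rocket peaks. Here is a minimal sketch of the calculation, with d an assumed observer-to-pad distance.)

```latex
% Apex height from an inclinometer reading, ignoring the observer's
% eye height: the observer stands a measured distance d from the pad
% and reads the elevation angle \theta when the rocket peaks.
\[
  h \approx d \tan\theta
  \qquad \text{e.g. } d = 20~\text{m},\ \theta = 60^\circ
  \;\Rightarrow\; h \approx 35~\text{m}
\]
```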

Then it’s on to sending the rockets skyward on a ribbon of water. After each one, we record the result in the matching cell in our results table:

              low pressure    medium pressure    high pressure
empty
1/3 full                                         awesome!
2/3 full
full          lame!

As the table gets filled in, we start making predictions and then testing them. It’s pretty funny to watch the full, low pressure rocket. The rocketeer and the rest of the group know what’s going to happen — when you pull the release on the launcher, you hear a tiny “pop” and the rocket falls over. It’s no surprise that the higher the pressure, the higher the rocket goes. But it is surprising that the 1/3 full rockets go the highest. There’s an interesting compromise between having lots of fuel and getting that fuel off the launch pad. The thrill of discovery is pretty cool.

And none of that occurs in the recipe method where the leader takes the rocket from the rocketeer, fills it 1/3 full (we already know that’s the best volume, you see), and then launches it. Don’t tell them the answer. Perhaps, don’t even shepherd them to the solution. Instead, provide them with tools and feedback so they find their own way. (Oh geez, that was the thread on physlrner this morning in response to this interesting “Socrates = Border collie” post.)

2. Add a parachu–, er, safe return system

After watching that many rocket launches, some kids start to get bored. You’re outside so let them go off and play tag or hide-n-seek for a while. But some rocketeers are aching to launch again. And again. And again. So turn up the challenge.

I usually bring out a box of “stuff”: cardboard, file folders, string, tape, plastic bags, elastics, etc., and tell the kids they can launch again but only after they’ve added a parachute to get their rocket safely back to Earth. They usually form small groups by themselves – two heads are better than one. @mrsebiology tweeted back “the parachute option is part of the ‘final exam’ challenge.”

This morning, though, I had a great conversation with @irasocol about this added challenge. Perhaps saying “parachute” gives too much away and directs them too much. Who knows what they might think up — the space shuttle is a glider, right? Ira tweeted

Yes, I--, er, my son, has this amazing Lego space shuttle set.

Which got me thinking, in the real world, we don’t care about the rocket, just the astronauts. The next time I run one of these rocket activities, here’s what I’m going to do: Give each kid a Lego mini-figure and challenge them to get the astronaut safely back to the ground. Capsule with parachute? Sure. Glider strapped to the side of the rocket? You betcha. Another idea I can’t even imagine? Absolutely!

There you have it, some ideas on how to throttle up your bottle rocket activity into an opportunity to engage in science, problem solving, engineering. Oh, it’s still fun. But now, so much more.

Do you have your own ways to send this activity to new heights? Please add a comment and share them with us!
