“[Y]ou must cut down the mightiest tree in the forest with…a herring.”

28 08 2012

Laura Gibbs deserves some kind of prize for public service. I’ve been tweeting her series of posts about peer grading in Coursera for a while now, but since Audrey Watters has written them up I figured I might as well consider them here too. What you need to know going in is that Laura teaches online for the University of Oklahoma so she’s clearly rooting for Coursera as she takes their course on Science Fiction and Fantasy. I think this makes her indictments of the process all the more damning.

For example, there’s this:

So, what kind of data is Coursera collecting about the efficacy of this process? None. What kind of feedback are people getting on their feedback? None. What kind of guidelines and tips did we get on offering feedback? (Almost) none. Given that this is a skill, and a skill that many people have not had to use in the past, I think we would need a LOT of tips and guidelines to help with that, along with feedback so that people who are just now developing this skill can estimate how well they are doing.

And this:

By far the biggest problem, though, is vague and/or inaccurate feedback… and that’s a much harder problem to solve. It’s much like the problem with the poor quality of the essays overall; yes, there are inappropriate essays (blank essays, essays only a few words long, plagiarized essays, even spam essays) that need to be flagged – but the larger problem is the bewildering number of essays that are of such poor quality that it gets very discouraging to spend time on them. Without some kind of additional instructional component to the class, I am just not convinced that this often unreliable and/or unhelpful anonymous peer feedback can really help people to improve their writing.

Remember, this is just about the peer feedback system. I haven’t even mentioned the plagiarism problem or the lack of writing instruction in general.

“Can peer feedback really work in a setting where there is so little community and where there is little sense of reciprocity?” asks Audrey. Well, that depends upon how you define the term “work.”

If you watched that Daphne Koller TED video, you probably remember the joke about how she tried to convince those terrible humanities professors that multiple choice was a perfectly acceptable way to test for higher order critical thinking and they didn’t buy it. Ha ha ha. Unable to do that, they went with Plan B: peer grading. The impression this story left on me was that Coursera was only interested in doing the absolute minimum in order to make their humanities classes acceptable. Certainly, everything Laura has written suggests that they didn’t exactly put much forethought into some pretty basic problems.

But I want to take this point one step further. I would argue that creating an effective peer review process for grading writing is impossible – like chopping down the mightiest tree in the forest with a herring. Since writing is a skill that you never really stop learning, peer grading is almost always the blind leading the blind.

For example, I am about to go into deep seclusion to polish my book manuscript for the last time before it hits the copy editor. It needs polishing because I have a bad habit of using the passive voice the first time I write anything at all complicated. Usually I turn those sentences around when I catch them during proofing, but I don’t always catch them. If your peers don’t know what passive voice is, or (as seems very likely in a lot of these Coursera classes) your peers don’t even speak English as their first language, learning how to write well solely from them is going to be impossible.

Since I teach history, I am prone to think of learning history as an excellent end in itself. However, if you desire employment when college is over, learning how to write well is the best skill that academic history classes can offer you. No wonder, then, that employers don’t take job applicants with online college degrees seriously.

It appears that Coursera is giving them little reason to think otherwise.



2 responses

30 08 2012
Historiann

We need to make sure the jokers who run central admin at our unis see this stuff, Jonathan.

Meanwhile, a commenter at my blog sent me this link to an op-ed from the Prez of Williams College defending the work we do in old-fashioned and yet time-tested F2F courses: http://online.wsj.com/article_email/SB10000872396390444327204577615592746799900-lMyQjAxMTAyMDIwOTAyODk3Wj.html

It’s called “In Defense of the Living, Breathing Professor.” It’s ironic that Coursera and other online schemes are so unaccountable, given the fact that you and I (and all other faculty, including tenured faculty) are expected to account for every minute of our time, and that every student we work with every year gets to weigh in on our performance as professors. And yet: data on the efficacy of online courses in general? Data on the efficacy of specific online courses offered by specific instructors through specific unis? Bueller? Bueller? Anyone?

1 10 2012
MOOCs and the History Classroom « Jacksonian America: Society, Personality, and Politics

[...] has also emphasized the structural problems of MOOCs, which include a lack of substantive feedback, academic dishonesty (in a free course, [...]
