In 2008, the contrarian tech writer Nicholas Carr wrote an article entitled, “Is Google Making Us Stupid?” Upon recommending it to a roomful of teachers the other night, I noticed that this article is famous enough to have its own Wikipedia page. I think of it as a kind of prequel to Carr’s less-famous book, The Shallows, but since I probably can’t convince you to read that before you get to the end of this post I’ll work off his article instead.
The main point of the article comes near the beginning:
I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
In short, the Internet has a negative effect on everyone’s attention span and Google thrives on that effect.
First, all reading gets chopped down to discrete chunks. Next, all the lectures get chopped down to fifteen minutes. Then students watch those lectures at double-speed so that they can get on to what they really want to do (assuming they’re not Facebooking in another browser window already). You know where I’m going with this, but that would be a far too easy post to write. Therefore, I’ll go in a Carr-inspired rather than Carr-analogous direction.
Carr is more than smart enough to recognize that there are advantages to having the Internet (and by implication, Google) available. “For me, as for others,” he writes (or is this so old now that I should write “wrote?”):
the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded.
This is the reason I’ve changed my teaching methods in recent years. When I was growing up, history used to be all about how many facts you could memorize. In some places, I’m sure it still is. Certainly, students still have to know something about facts. You have no idea how depressing it is to ask a class who Robert Wagner was and get the answer that he used to be on “Hart to Hart.”* But Senator Robert Wagner is important not just for the sake of knowing who Robert Wagner was or what he did, but for knowing what he represented and still represents in America today. You are never going to get that from just a Google search, and, alas, you’ll never get that from a Coursera MOOC.
Read the last eight months of this blog if you want to understand my problems with Coursera’s format, but I’m not just talking about the format here. I’ve learned not to stake my life on a quick reading of anything MOOC. Nevertheless, the overwhelming majority of the courses that they offer seem to be introductory. [Seriously, are there any prerequisites for any MOOCs anywhere? Wouldn’t that mean that they’d no longer be open?]
Granted some of those introductory courses might be very difficult (like machine learning, for instance), but what do you do if you want to take your MOOC education to the next level? At Cal State, you can pay tuition and get on-campus courses, but if MOOCs are really the future of higher education, what’s going to happen to all those less popular upper-level courses that we teach every semester when most schools go all MOOC, all the time (kind of like this blog)?
Unfortunately, specialized classes are very un-MOOCish. After all, fewer people are going to be interested in Agricultural Economics than Introduction to Micro almost by definition. Fewer people means less opportunity to make money from whatever data they’re willing to give you. Perhaps more importantly, the way that upper-level courses tend to be taught (at least in my experience) serves as a stark contrast to the MOOC M.O. These courses are often structured around required reading, that reading tends to be deep reading, and it requires the active participation of a professor in order for students to be able to apply the principles they learned in intro courses to this new material in the most interesting ways. To put it another way, does anyone assign Milton in Intro to Poetry?
That’s why giving the impression that you can get the equivalent of an entire college education by scratching the surface of absolutely everything is a fraud upon the learning public. Yet the public is conditioned to think that way by the way that the WWW is structured, a mile long and an inch deep.
Of course, to blame only Coursera for potentially making us stupid is patently unfair. From their perspective the customer is always right (even when they’re not), so their business plan is a reflection of the values of their best-paying customers, namely university administrators. As Bob Samuels argues:
“[T]he push to base university funding on degree attainment rates applies a factory model of production to the complicated world of instruction. Instead of pushing for innovative creativity, we are re-imagining education as a technological machine that spits out graduates at a faster rate. Yet, students are not widgets, and faculty are not assembly line workers; instead, we need complex solutions to complex systems.”
Unfortunately, we won’t find those solutions to our problems by Googling “MOOCs,” “Higher ed reform” or even “Edtech flavor of the month.” In fact, I don’t think we’ll find those solutions on the Internet at all. Some might say that makes me contrarian too, but that, I would argue, is the whole problem with higher education right there.
* In case you’re wondering, that’s a true story.