Guest post by Carol Keeley
Garrison Keillor kicked the beehive with his recent death-of-publishing op-ed. The reaction was vigorously optimistic, with a little messenger-mocking. The backdrop to this volley was BookExpo America, widely described as funereal. As usual, I agree with everyone. Keillor is right that the era of publishing he grew up with is dead, which is what he actually said, versus what he was said to have said. And he’s right that this merits a toast and some tears. But as my yoga teacher says, “We rage against impermanence.” The people who insist Jazz is Dead are usually trying to nail it to a time and style, thus killing it. Things evolve. So books are–hey, hello? Am I losing you?
Research yells that I have ten seconds or three brief paragraphs–whichever’s first–to deliver my info-pellet. Then you’ll vanish. I need to deliver something provocative or BREAKING NEWS or oozingly cute enough to go viral, or I’ll vanish, too. We’re all itching to check email, skim headlines, see the latest petrol-soaked pelican, go down the hyperlink rabbit hole, surface in an hour or two, dimmed and blinking. I don’t know one honest soul who hasn’t experienced the tug of technology, splintered attention, a dip in concentrated reading. Keillor concedes that people are still reading; they’re “reading for hours off their little screens, surfing around from Henry James to Jesse James to the epistle of James to pajamas to Obama to Alabama to Alanon to non-sequiturs, sequins, penguins, penal institutions,” wherever the bouncing links lead.
“The net seizes our attention only to scatter it,” writes Nicholas Carr in The Shallows: What the Internet Is Doing to Our Brains. Quick question: Did you click on any hyperlinks so far? Was the linkless last paragraph easier to read? Did you sense what neuroscientists have measured–the firing of your prefrontal cortex as you shift from reading to decision-making, which impedes comprehension? Probably not. Our brains have such plasticity that, in one study, they literally rewired after six days of light internet use. The new circuitry then self-enforces in order to fine-tune. This is great news for stroke victims and veterans who’ve lost limbs. The brain, like our bodies, adapts majestically, whether it’s to pacemakers or cocaine.
The internet is rewiring our brains. Laura Miller summarizes the neuroscience at the heart of Carr’s book:
[There’s evidence that] even the microseconds of decision-making attention demanded by hyperlinks saps cognitive power from the reading process, that multiple sensory inputs severely degrade memory retention, that overloading the limited capacity of our short-term memory hampers our ability to lay down long-term memories.
Reading an entire book “doesn’t make sense. It’s not a good use of my time,” says Joe O’Shea, a recent Rhodes scholar. He skims for the gist on Google Book Search instead. I’d have less resistance to this if he weren’t a philosophy major. I was one, too. I can’t fathom how one skims Heidegger or Spinoza. Or anyone else on the wall of books to my left. Okay, fine. I’ve outed myself as a book-fetishist.
And here’s the real schism: those who relish deep reading versus those who don’t, like the Rhodes scholar or Clay Shirky, who claims, “No one reads War and Peace. It’s too long.” Ditto for Proust. Shirky’s pugnacious response to Carr claims his real fear is that “people will stop genuflecting to the idea of reading War and Peace,” proving that “the literary world is now losing its normative hold on culture.” No offense to lit comrades, but that world passed on long ago. As Ursula Le Guin asked in Harper’s Magazine of “the hedonists who read because they want to”: “Were such people ever in the majority?” No. There have always been people who complain that books make them sleepy, Le Guin says in “Staying Awake: Notes on the Alleged Decline of Reading.” She dates the “Century of the Book” in the U.S. from about 1850 to 1950. Literature was still a “major form of social currency,” she writes. “To look at schoolbooks from 1890 or 1910 can be scary; the level of literacy and general cultural knowledge expected of a ten-year-old” was impressive. Now they don’t have to know it; they can just Google it.
Carr isn’t a whisker-eared, irrelevant Luddite, raging against impermanence. Suggesting that we use technology mindfully is not a demand for stasis. His book, and the Atlantic article that seeded it, was sparked by noting his own diminished capacity for sustained attention, his moth-eaten memory. Like Shirky, Carr has made a living writing about technology. Are their views really so oppositional? As a Zen pianist friend says, “Whenever it’s either/or, it’s both.”
Fear of change, especially in the form of technology, isn’t new. In Phaedrus, the argument is actually between Socrates and Plato, not Thamus and Theuth, whom Socrates uses. It was written on the cusp of another Either/Or debate–oral traditions versus writing. Writing will not make people wiser, Socrates argues. It will “implant forgetfulness in their souls,” and fill them “not with wisdom, but with the conceit of wisdom.” The oral tradition retained its hold in the West long after books appeared. There was no word separation or punctuation in early books because they were exclusively read aloud. Carr tracks the evolution of reading with tenderness. It was a revelation to me that the conjoined intimacy and privacy of reading–the aspect I cherish most–is in fact very recent. As readers changed, so did writing. That’s clearly happening again. But we rage against impermanence. Socrates spoke for those who feared that things written in the “water” of ink would merely be read, not engraved in memory. It’s not enough to rub our eyes against something, was the argument. It must wed the folds of our minds to be synthesized as wisdom.
Carr makes a similar case, with equal poetry and compelling neuroscientific evidence. He compiles research showing that “people who read linear text comprehend more, remember more and learn more than those who read text peppered with links.” In another study, text-only viewers scored significantly higher on tests than did text and multimedia users. The more overtaxed our brain is, the more susceptible we are to distraction. And when working memory–the metered parking spot between short-term and long-term memory–gets pressured, it empties. Whatever paused there doesn’t get wired into long-term memory. Hence increased forgetfulness. These are facts.
These facts don’t feed my ego. I don’t feel right or better because of them. I feel alerted. I depend on the Internet. I love links. My first job in publishing was as a fact-checker. Having immediate access to primary sources, being able to contextualize quotes or deepen comprehension of a subject without taking the subway to a library is bliss. But potential doesn’t confer reality. “Dozens of studies by psychologists, neurobiologists, educators and Web designers point to the same conclusions,” writes Carr. “When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking and superficial learning.” And once we adapt, which we swiftly do, this becomes the environment we crave. Distraction is our natural state. The Buddhist expression is monkey mind, a gentle abbreviation of the old yogic view that “the mind is a drunken monkey, stung by a scorpion.” Focusing the mind takes effort and intention. So this new world suits our native state in many ways. Maybe our brief spell of deep literacy was aberrant.
Marshall McLuhan noted that a “new medium is never an addition to an old one, nor does it leave the old one in peace.” Adept multitaskers and people enmeshed with technology often consider text tyrannical, as McLuhan did. A new medium “never ceases to oppress the older media until it finds new shapes and positions for them.” Ridiculing Tolstoy and Proust and skimming Kant is anyone’s privilege. But ridiculing those who value deep reading seems reactive to me. And where there’s reactivity, there’s often attachment. A recent New Yorker profile of Andrew Breitbart describes an info-stream of ten IM conversations and four Web sites, news feeds and constant input juggling–all pretty standard for the multitasker. “There’s just something about knowing information when it happens,” Breitbart says, like “telling somebody, ‘Did you know that Michael Jackson just died?’ It’s just weirdly powerful. It’s fun.”
Weirdly powerful indeed. We’ve all become insta-news surfers and swappers. The New York Times Book Review write-up of Carr’s book complained, politely, that he’d overlooked evidence that “the Internet and related technologies are actually good for the mind.” Example? An “influential study” demonstrating that “after just ten days of playing Medal of Honor, a violent first-person shooter game, subjects showed dramatic increases in visual attention and memory.”
Could just be me, but I hit some speed bumps above. It’s weirdly powerful and fun to deliver bad news to someone? It’s good for the mind to spend ten days playing a violent first-person shooter game? These examples are more persuasive if we skim-read them. If you pause to consider the specifics, it’s discomforting. (Obviously, there’s more dimension to what both men are claiming. Carr and his reviewer, Jonah Lehrer, have had a thoughtful conversation online.) McLuhan also said that our tools numb “whatever part of our body they ‘amplify.’” Deep reading and deep thinking are entwined, and both require “a calm attentive mind,” says Carr, as do “empathy and compassion.” Recent research shows a forty percent plunge in empathy among college students since the Eighties, with the most precipitous decline in the past ten years. The case of the couple who let their unnamed baby starve to death while they doted on their virtual daughter, Anima, is extreme, but it remains scaldingly symbolic and cautionary.
Today’s media imitate the online experience, conceding its dominance. The revolution has already happened. The New York Times has slashed text and added summaries, so we can continue to skim-read, as we do online. That same New York Times recently had four feature stories in one week on how technology and computers are negatively impacting our brains and our families. The science is clear. Our brains are being rewired by the gadgets we’re glued to. Whether this shift is an improvement, an apocalypse, or something to modulate mindfully is an individual judgment.
Books may well go the way of vinyl, as was suggested at BookExpo, where publishers enthused about interactive technologies. Books might become artifacts for quality-snobs or the nostalgic. As reading habits change, writing adapts. Agents and editors increasingly tell writers to keep their books under 220 pages (almost exactly what Carr’s is, minus footnotes), to amass Facebook friends, to build an online presence and website, to compose with an eye toward multimedia apps–to pander, essentially, to the fractured attention span. We’re all urged to merge with what Heidegger called “the frenziedness of technology.”
Let’s be clear. Commerce is at play here. No one cares about your brain but you. Le Guin addresses the for-profit aspects of publishing in her essay. And Carr makes it clear that “Google’s profits are tied directly to the velocity of people’s information intake.” They are in “the business of distraction.” I’m terrified to even type this, as enmeshed as I am with Google. We’re so glutted from overload while online, we don’t think about algorithms or profits, or whether it matters that we’re parading the conceit of wisdom as deep thinking atrophies. Meanwhile, industry, enabled by our tendencies, makes choices for us, such as offshore drilling and new uses for corn.
The internet is now our time-shared brain, our ersatz intimacy and intellect. We’re filled with the conceit of wisdom, but wisdom–like spending hours with a book or having an intimate dinner conversation–is experiential. And far more demanding. The Shallows isn’t for those who think deep reading is itself a conceit. The danger, says Carr, is when those shouters provide the “intellectual cover that allows thoughtful people to slip comfortably into the permanent distractedness of the online life.” If you care about your own ability to think and read deeply, please treat yourself to Carr’s book. Maybe this era has simply passed, like the one Keillor eulogizes. But I don’t believe it needs to be Either/Or. We can use these dazzling tools mindfully. There are things we should rage against, like letting industries exploit our inattention by fostering it.
By now the only people still reading are my editor, two friends, and my brother Mark, who–when I stayed through his college graduation after others left vomiting from the heat–said what my four readers deserve to hear: “Thanks. I’ll pay you later.”
This is Carol’s tenth post for Get Behind the Plough.