“In the intellectual order, the virtue of humility is nothing more nor less than the power of attention.” – Simone Weil
A Distracted World
The May 25th issue of New York Magazine featured this cover story by Sam Anderson: The Attention Crisis--And Why Distraction May Actually Be Good For You.
I've long been fascinated with the effects of multi-tasking, which stems in part from my interest in Marshall McLuhan and in part from watching my own mental world get torn apart by distractions. I thought to myself, "It'll be good to read something positive about the issue."
Thing is, the article isn't positive about our world of distraction. It's the best pop magazine article I've ever seen about the causes and problems of our distracted world, but its title is misleading. At the end of the article, Anderson throws a couple of positive spins on the mess, but that's about it for the "benefits of distraction": a couple of reasons why it might not be so bad after all . . . if we can just survive the dumbing down that is currently taking place at an alarming speed. (If you read the article and have a short attention span, I recommend you just skip the first page of the online version and start at the top of page two.)
Two years ago, I bought my first real cell phone (digital, not analogue). I loved it. I wrote about it at TCS Daily, where I recounted this personal anecdote: "I'll often use my cell phone and return [a client's] call while walking, so I can exercise and earn money at the same time. I did this the third day I had the phone, while walking back to the office after lunch. I called the client at Point A and ended the call a half mile later, at Point B. But after I hung up, I felt like I was waking from a deep daydream. For a moment, I couldn't even remember what route I had taken from Point A to B, though I've walked the route over a hundred times."
My experience wasn't unique, and Anderson explains the phenomenon: "[W]hen forced to multitask, the overloaded brain shifts its processing from the hippocampus (responsible for memory) to the striatum (responsible for rote tasks), making it hard to learn a task or even recall what you've been doing once you're done."
A type of voluntary amnesia. That strikes me as unsettling. As it does the experts. Anderson interviewed David Meyer, "one of the world's reigning experts on multitasking":
I begin, a little sheepishly, with a question that strikes me as sensationalistic, nonscientific, and probably unanswerable by someone who's been professionally trained in the discipline of cautious objectivity: Are we living through a crisis of attention?
Before I even have a chance to apologize, Meyer responds with the air of an Old Testament prophet. “Yes,” he says. “And I think it's going to get a lot worse than people expect.” He sees our distraction as a full-blown epidemic–a cognitive plague that has the potential to wipe out an entire generation of focused and productive thought. He compares it, in fact, to smoking. “People aren't aware what's happening to their mental processes,” he says, “in the same way that people years ago couldn't look into their lungs and see the residual deposits.”
Good and Bad Multitasking
In my TCS Daily article, I wrote, "I like multitasking, if it's the right kind. Reading a book while waiting for laundry to dry: smart multi-tasking. Reading a book while interviewing for a job: dumb multi-tasking. Ordering a Pabst while the head on your Guinness settles: fun multi-tasking." I've also counseled my children to do good multi-tasking: "Don't," I say, "do two mental things at once. Jog and listen to your iPod, good. Watch TV and do homework, bad." I've said it so many times that, when I started to read this next passage from the magazine to my teenagers, Abbie said (trying to hide her annoyance--she's a patient and kind daughter), "I know, Dad, I know . . .":
Meyer says that this is because, to put it simply, the brain processes different kinds of information on a variety of separate “channels”–a language channel, a visual channel, an auditory channel, and so on–each of which can process only one stream of information at a time. If you overburden a channel, the brain becomes inefficient and mistake-prone. The classic example is driving while talking on a cell phone, two tasks that conflict across a range of obvious channels: Steering and dialing are both manual tasks, looking out the windshield and reading a phone screen are both visual, etc. Even talking on a hands-free phone can be dangerous, Meyer says. If the person on the other end of the phone is describing a visual scene–say, the layout of a room full of furniture–that conversation can actually occupy your visual channel enough to impair your ability to see what's around you on the road.
The only time multitasking does work efficiently, Meyer says, is when multiple simple tasks operate on entirely separate channels–for example, folding laundry (a visual-manual task) while listening to a stock report (a verbal task). But real-world scenarios that fit those specifications are very rare.
So rare, in fact, that all the contraptions--BlackBerrys, email, the cell phone--are attacking our minds:
This is troubling news, obviously, for a culture of BlackBerrys and news crawls and Firefox tabs–tools that, critics argue, force us all into a kind of elective ADHD. The tech theorist Linda Stone famously coined the phrase “continuous partial attention” to describe our newly frazzled state of mind. American office workers don't stick with any single task for more than a few minutes at a time; if left uninterrupted, they will most likely interrupt themselves. Since every interruption costs around 25 minutes of productivity, we spend nearly a third of our day recovering from them. We keep an average of eight windows open on our computer screens at one time and skip between them every twenty seconds. When we read online, we hardly even read at all–our eyes run down the page in an F pattern, scanning for keywords. When you add up all the leaks from these constant little switches, soon you're hemorrhaging a dangerous amount of mental power. People who frequently check their e-mail have tested as less intelligent than people who are actually high on marijuana. Meyer guesses that the damage will take decades to understand, let alone fix. If Einstein were alive today, he says, he'd probably be forced to multitask so relentlessly in the Swiss patent office that he'd never get a chance to work out the theory of relativity.
Don't Just Do Something, Sit There
Perhaps the best part of the article--and Anderson's true ray of hope--is that we don't have to be mental sitting ducks of the techno culture. He interviewed Winifred Gallagher, the author of Rapt: Attention and the Focused Life (I've ordered it), and she basically says, "Don't get sucked in. If you do, it's your fault." A person whose mind atrophies from voluntary ADD shouldn't blame the techno culture any more than a wino should blame the liquor industry.
Gallagher stresses that because attention is a limited resource–one psychologist has calculated that we can attend to only 110 bits of information per second, or 173 billion bits in an average lifetime–our moment-by-moment choice of attentional targets determines, in a very real sense, the shape of our lives. Rapt's epigraph comes from the psychologist and philosopher William James: “My experience is what I agree to attend to.” For Gallagher, everything comes down to that one big choice: investing your attention wisely or not. The jackhammers are everywhere–iPhones, e-mail, cancer–and Western culture's attentional crisis is mainly a widespread failure to ignore them.
“Once you understand how attention works and how you can make the most productive use of it,” she says, “if you continue to just jump in the air every time your phone rings or pounce on those buttons every time you get an instant message, that's not the machine's fault. That's your fault.”
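As an aside, that 173-billion figure is easy to sanity-check. Here's a minimal back-of-the-envelope sketch; the 16 waking hours a day and the 75-year lifespan are my assumptions, not numbers Gallagher or Anderson supply, but with them the arithmetic lands almost exactly on 173 billion bits:

```python
# Back-of-the-envelope check of the "173 billion bits in a lifetime" figure.
# 110 bits per second comes from the article; waking hours per day and lifespan
# are assumptions of mine, not numbers given by Gallagher or Anderson.
BITS_PER_SECOND = 110
WAKING_HOURS_PER_DAY = 16   # assumed
LIFESPAN_YEARS = 75         # assumed

seconds_awake = WAKING_HOURS_PER_DAY * 3600 * 365 * LIFESPAN_YEARS
lifetime_bits = BITS_PER_SECOND * seconds_awake

print(f"{lifetime_bits:,} bits in a lifetime")  # 173,448,000,000 -> roughly 173 billion
```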
Encouraging Conclusion?
In his discussion with Gallagher, Anderson writes about recent research involving Buddhist monks:
The most promising solution to our attention problem, in Gallagher's mind, is also the most ancient: meditation. Neuroscientists have become obsessed, in recent years, with Buddhists, whose attentional discipline can apparently confer all kinds of benefits even on non-Buddhists. (Some psychologists predict that, in the same way we go out for a jog now, in the future we'll all do daily 20-to-30-minute “secular attentional workouts.”) Meditation can make your attention less “sticky,” able to notice images flashing by in such quick succession that regular brains would miss them. It has also been shown to elevate your mood, which can then recursively stoke your attention: Research shows that positive emotions cause your visual field to expand. The brains of Buddhist monks asked to meditate on “unconditional loving-kindness and compassion” show instant and remarkable changes: Their left prefrontal cortices (responsible for positive emotions) go into overdrive, they produce gamma waves 30 times more powerful than novice meditators, and their wave activity is coordinated in a way often seen in patients under anesthesia.
He returns to this research in his conclusion, but not without first offering a lame hope, which I think is fairly summarized as follows: "Sure it looks like things are bad, but you never know. Something good might come of it." If I sound like I'm dismissing this part of Anderson's piece, I am. He wrote perhaps the best article I've read all year, but that conclusion is just too lame.
Then, though, he latches onto the Buddhist meditation idea. Meditation, combined with his comments on Gallagher and some scientific facts, offers the best hope, but even here Anderson seems to fall a bit short:
More than any other organ, the brain is designed to change based on experience, a feature called neuroplasticity. London taxi drivers, for instance, have enlarged hippocampi (the brain region for memory and spatial processing)–a neural reward for paying attention to the tangle of the city's streets. As we become more skilled at the 21st-century task Meyer calls “flitting,” the wiring of the brain will inevitably change to deal more efficiently with more information. The neuroscientist Gary Small speculates that the human brain might be changing faster today than it has since the prehistoric discovery of tools. Research suggests we're already picking up new skills: better peripheral vision, the ability to sift information rapidly. We recently elected the first-ever BlackBerry president, able to flit between sixteen national crises while focusing at a world-class level. Kids growing up now might have an associative genius we don't–a sense of the way ten projects all dovetail into something totally new. They might be able to engage in seeming contradictions: mindful web-surfing, mindful Twittering. Maybe, in flights of irresponsible responsibility, they'll even manage to attain the paradoxical, Zenlike state of focused distraction.
The idea of neuroplasticity is great. The reference to Buddhism even greater. And the embrace of paradox is the greatest of all--Christians live that paradox in the truth of the cross.
I don't think it's a coincidence that paradox-rooted Christianity has had an answer to the problem of distraction since St. Paul wrote these words in 1 Thessalonians 5: "Pray without ceasing."
You don't even have to be a Christian to know about this. The whole idea of incessant prayer was a driving theme of Salinger's Franny and Zooey (its Buddhist-like conclusions notwithstanding). Franny Glass was infatuated with the Jesus Prayer: "Lord Jesus Christ, have mercy on me, a sinner" (it has taken different forms over the centuries, but that's the version Franny used). She'd read about it in the Russian spiritual classic, The Way of a Pilgrim.
The Jesus Prayer is nothing less than (but oh so much more than) beating your natural state of distracted existence into a form God can use. As the Pilgrim repeated the Jesus Prayer (he started with 3,000 repetitions a day, using a rosary to count, and quickly escalated to 12,000), the words eventually ensconced themselves in his heart and repeated themselves in his mind without effort and without the use of his lips. That happens in the first 5% of the story. The rest of the story recounts his peripatetic journeys throughout Russia, where he meets many people and influences their lives with the insight given to him through his newfound life of focused distraction: focused on those he's speaking with, but with the Jesus Prayer still running in the background.
It's a beautiful little story, and I highly recommend it. If you're interested in the Jesus Prayer, I also highly recommend this little book. Both have much to say that is edifying.
I have come to distrust contemplative exercises that intentionally seek to eliminate all mental distractions (Thomas Dubay calls them "unnatural," and I think he's right), and much Eastern Orthodox contemplative practice calls for such things, as does Buddhism. But there's much to be said for the Pilgrim's Jesus Prayer automation and for attaining what Anderson calls the "Zenlike state of focused distraction." If Anderson had called it the "God-centered and love-intensive state of focused distraction," I think he'd be onto something supremely important. As it is, he has taken a wrong road at the end of an otherwise most excellent academic and literary journey.