The Myth of Multitasking

How intentional self-distraction hurts us  

In one of the many letters he wrote to his son in the 1740s, Lord Chesterfield offered the following advice: “There is time enough for everything in the course of the day, if you do but one thing at once, but there is not time enough in the year, if you will do two things at a time.” To Chesterfield, singular focus was not merely a practical way to structure one’s time; it was a mark of intelligence. “This steady and undissipated attention to one object, is a sure mark of a superior genius; as hurry, bustle, and agitation, are the never-failing symptoms of a weak and frivolous mind.”

In modern times, hurry, bustle, and agitation have become a regular way of life for many people — so much so that we have embraced a word to describe our efforts to respond to the many pressing demands on our time: multitasking. Used for decades to describe the parallel processing abilities of computers, multitasking is now shorthand for the human attempt to do simultaneously as many things as possible, as quickly as possible, preferably marshalling the power of as many technologies as possible.

In the late 1990s and early 2000s, one sensed a kind of exuberance about the possibilities of multitasking. Advertisements for new electronic gadgets — particularly the first generation of handheld digital devices — celebrated the notion of using technology to accomplish several things at once. The word multitasking began appearing in the “skills” sections of résumés, as office workers restyled themselves as high-tech, high-performing team players. “We have always multitasked — inability to walk and chew gum is a time-honored cause for derision — but never so intensely or self-consciously as now,” James Gleick wrote in his 1999 book Faster. “We are multitasking connoisseurs — experts in crowding, pressing, packing, and overlapping distinct activities in our all-too-finite moments.” An article in the New York Times Magazine in 2001 asked, “Who can remember life before multitasking? These days we all do it.” The article offered advice on “How to Multitask” with suggestions about giving your brain’s “multitasking hot spot” an appropriate workout.

But more recently, challenges to the ethos of multitasking have begun to emerge. Numerous studies have shown the sometimes-fatal danger of using cell phones and other electronic devices while driving, for example, and several states have now made that particular form of multitasking illegal. In the business world, where concerns about time management are perennial, warnings about workplace distractions spawned by a multitasking culture are on the rise. In 2005, the BBC reported on a research study, funded by Hewlett-Packard and conducted by the Institute of Psychiatry at the University of London, which found that “workers distracted by e-mail and phone calls suffer a fall in IQ more than twice that found in marijuana smokers.” The psychologist who led the study called this new “infomania” a serious threat to workplace productivity. One of the Harvard Business Review’s “Breakthrough Ideas” for 2007 was Linda Stone’s notion of “continuous partial attention,” which might be understood as a subspecies of multitasking: using mobile computing power and the Internet, we are “constantly scanning for opportunities and staying on top of contacts, events, and activities in an effort to miss nothing.”

Dr. Edward Hallowell, a Massachusetts-based psychiatrist who specializes in the treatment of attention deficit/hyperactivity disorder and has written a book with the self-explanatory title CrazyBusy, has been offering therapies to combat extreme multitasking for years; in his book he calls multitasking a “mythical activity in which people believe they can perform two or more tasks simultaneously.” In a 2005 article, he described a new condition, “Attention Deficit Trait,” which he claims is rampant in the business world. ADT is “purely a response to the hyperkinetic environment in which we live,” writes Hallowell, and its hallmark symptoms mimic those of ADD. “Never in history has the human brain been asked to track so many data points,” Hallowell argues, and this challenge “can be controlled only by creatively engineering one’s environment and one’s emotional and physical health.” Limiting multitasking is essential. Best-selling business advice author Timothy Ferriss also extols the virtues of “single-tasking” in his book, The 4-Hour Workweek.

Multitasking might also be taking a toll on the economy. One study by researchers at the University of California, Irvine monitored interruptions among office workers; it found that workers took an average of twenty-five minutes to recover from interruptions such as phone calls or e-mail messages and to return to their original task. Discussing multitasking with the New York Times in 2007, Jonathan B. Spira, an analyst at the business research firm Basex, estimated that extreme multitasking — information overload — costs the U.S. economy $650 billion a year in lost productivity.

Changing Our Brains

To better understand the multitasking phenomenon, neurologists and psychologists have studied the workings of the brain. In 1999, Jordan Grafman, chief of cognitive neuroscience at the National Institute of Neurological Disorders and Stroke (part of the National Institutes of Health), used functional magnetic resonance imaging (fMRI) scans to determine that when people engage in “task-switching” — that is, multitasking behavior — the flow of blood increases to a region of the frontal cortex called Brodmann area 10. (The flow of blood to particular regions of the brain is taken as a proxy indication of activity in those regions.) “This is presumably the last part of the brain to evolve, the most mysterious and exciting part,” Grafman told the New York Times in 2001 — adding, with a touch of hyperbole, “It’s what makes us most human.”

It is also what makes multitasking a poor long-term strategy for learning. Other studies, such as those performed by psychologist René Marois of Vanderbilt University, have used fMRI to demonstrate the brain’s response to handling multiple tasks. Marois found evidence of a “response selection bottleneck” that occurs when the brain is forced to respond to several stimuli at once. As a result, task-switching leads to time lost as the brain determines which task to perform. Psychologist David Meyer at the University of Michigan believes that rather than a bottleneck in the brain, a process of “adaptive executive control” takes place, which “schedules task processes appropriately to obey instructions about their relative priorities and serial order,” as he explained to New Scientist. Unlike many other researchers who study multitasking, Meyer is optimistic that, with training, the brain can learn to task-switch more effectively, and there is some evidence that certain simple tasks are amenable to such practice. But his research has also found that multitasking contributes to the release of stress hormones and adrenaline, which can cause long-term health problems if left unchecked, and to the loss of short-term memory.

In one recent study, Russell Poldrack, a psychology professor at the University of California, Los Angeles, found that “multitasking adversely affects how you learn. Even if you learn while multitasking, that learning is less flexible and more specialized, so you cannot retrieve the information as easily.” His research demonstrates that people use different areas of the brain for learning and storing new information when they are distracted: brain scans of people who are distracted or multitasking show activity in the striatum, a region of the brain involved in learning new skills; brain scans of people who are not distracted show activity in the hippocampus, a region involved in storing and recalling information. Discussing his research on National Public Radio recently, Poldrack warned, “We have to be aware that there is a cost to the way that our society is changing, that humans are not built to work this way. We’re really built to focus. And when we sort of force ourselves to multitask, we’re driving ourselves to perhaps be less efficient in the long run even though it sometimes feels like we’re being more efficient.”

If, as Poldrack concluded, “multitasking changes the way people learn,” what might this mean for today’s children and teens, raised with an excess of new entertainment and educational technology, and avidly multitasking at a young age? Poldrack calls this the “million-dollar question.” Media multitasking — that is, the simultaneous use of several different media, such as television, the Internet, video games, text messages, telephones, and e-mail — is clearly on the rise, as a 2006 report from the Kaiser Family Foundation showed: in 1999, only 16 percent of the time people spent using any of those media was spent on multiple media at once; by 2005, 26 percent of media time was spent multitasking. “I multitask every single second I am online,” confessed one study participant. “At this very moment I am watching TV, checking my e-mail every two minutes, reading a newsgroup about who shot JFK, burning some music to a CD, and writing this message.”

The Kaiser report noted several factors that increase the likelihood of media multitasking, including “having a computer and being able to see a television from it.” Also, “sensation-seeking” personality types are more likely to multitask, as are those living in “a highly TV-oriented household.” The picture that emerges of these pubescent multitasking mavens is of a generation of great technical facility and intelligence but of extreme impatience, unsatisfied with slowness and uncomfortable with silence: “I get bored if it’s not all going at once, because everything has gaps — waiting for a website to come up, commercials on TV, etc.,” one participant said. The report concludes on a very peculiar note, perhaps intended to be optimistic: “In this media-heavy world, it is likely that brains that are more adept at media multitasking will be passed along and these changes will be naturally selected. After all, information is power, and if one can process more information all at once, perhaps one can be more powerful.” This is techno-social Darwinism, nature red in pixel and claw.

Other experts aren’t so sure. As neurologist Jordan Grafman told Time magazine: “Kids that are instant messaging while doing homework, playing games online and watching TV, I predict, aren’t going to do well in the long run.” “I think this generation of kids is guinea pigs,” educational psychologist Jane Healy told the San Francisco Chronicle; she worries that they might become adults who engage in “very quick but very shallow thinking.” Or, as the novelist Walter Kirn suggests in a deft essay in The Atlantic, we might be headed for an “Attention-Deficit Recession.”

Paying Attention

When we talk about multitasking, we are really talking about attention: the art of paying attention, the ability to shift our attention, and, more broadly, the judgment to decide which objects are worthy of our attention. People who have achieved great things often credit their success to a finely honed skill for paying attention. When asked about his particular genius, Isaac Newton responded that if he had made any discoveries, it was “owing more to patient attention than to any other talent.”

William James, the great psychologist, wrote at length about the varieties of human attention. In The Principles of Psychology (1890), he outlined the differences among “sensorial attention,” “intellectual attention,” “passive attention,” and the like, and noted the “gray chaotic indiscriminateness” of the minds of people who were incapable of paying attention. James compared our stream of thought to a river, and his observations presaged the cognitive “bottlenecks” described later by neurologists: “On the whole easy simple flowing predominates in it, the drift of things is with the pull of gravity, and effortless attention is the rule,” he wrote. “But at intervals an obstruction, a set-back, a log-jam occurs, stops the current, creates an eddy, and makes things temporarily move the other way.”

To James, steady attention was thus the default condition of a mature mind, an ordinary state undone only by perturbation. To readers a century later, that placid portrayal may seem alien — as though depicting a bygone world. Instead, today’s multitasking adult may find something more familiar in James’s description of the youthful mind: an “extreme mobility of the attention” that “makes the child seem to belong less to himself than to every object which happens to catch his notice.” For some people, James noted, this challenge is never overcome; such people only get their work done “in the interstices of their mind-wandering.” Like Chesterfield, James believed that the transition from youthful distraction to mature attention was in large part the result of personal mastery and discipline — and so was illustrative of character. “The faculty of voluntarily bringing back a wandering attention, over and over again,” he wrote, “is the very root of judgment, character, and will.”

Today, our collective will to pay attention seems fairly weak. We require advice books to teach us how to avoid distraction. In the not-too-distant future we may even employ new devices to help us overcome the unintended attention deficits created by today’s gadgets. As one New York Times article recently suggested, “Further research could help create clever technology, like sensors or smart software that workers could instruct with their preferences and priorities to serve as a high tech ‘time nanny’ to ease the modern multitasker’s plight.” Perhaps we will all accept as a matter of course a computer governor — like the devices placed on engines so that people can’t drive cars beyond a certain speed. Our technological governors might prompt us with reminders to set mental limits when we try to do too much, too quickly, all at once.

Then again, perhaps we will simply adjust and come to accept what James called “acquired inattention.” E-mails pouring in, cell phones ringing, televisions blaring, podcasts streaming — all this may become background noise, like the “din of a foundry or factory” that James observed workers could scarcely avoid at first, but which eventually became just another part of their daily routine. For the younger generation of multitaskers, the great electronic din is an expected part of everyday life. And given what neuroscience and anecdotal evidence have shown us, this state of constant intentional self-distraction could well be of profound detriment to individual and cultural well-being. When people do their work only in the “interstices of their mind-wandering,” with crumbs of attention rationed out among many competing tasks, their culture may gain in information, but it will surely weaken in wisdom.

Christine Rosen, “The Myth of Multitasking,” The New Atlantis, Number 20, Spring 2008, pp. 105–110.
