Maria Bustillos’s review of Nick Carr’s new book The Glass Cage is really, really badly done. Let me illustrate with just one example (it’s a corker):
In the case of aviation, the answer is crystal clear, yet Carr somehow manages to draw the opposite conclusion from the one supported by facts. In a panicky chapter describing fatal plane crashes, Carr suggests that pilots have come to rely so much on computers that they are forgetting how to fly. However, he also notes the “sharp and steady decline in accidents and deaths over the decades. In the U.S. and other Western countries, fatal airline crashes have become exceedingly rare.” So yay, right? Somehow, no: Carr claims that “this sunny story carries a dark footnote,” because pilots with rusty flying skills who take over from autopilot “often make mistakes.” But if airline passengers are far safer now than they were 30 years ago — and it’s certain they are — what on Earth can be “dark” about that?
Note that Bustillos is trying so frantically to refute Carr that she can’t even see what he’s actually saying. (Which might not surprise anyone who notes that in the review’s first sentence she refers to Carr as a “scaredy-cat” — yeah, she actually says that — and in its third refers to his “paranoia.”) She wants us to believe that Carr’s point is that automating the piloting of aircraft is just bad: “the opposite conclusion from the one supported by facts.” But if Carr himself is the one who notes that “fatal airline crashes have become exceedingly rare,” and if Carr himself calls the decline in air fatalities a “sunny story,” then he just might not be saying that the automating of flight is simply a wrong decision. Bustillos quotes the relevant passages, but can’t see the plain meaning that’s right in front of her face.
Carr cites several examples of planes that in recent years have crashed when pilots unaccustomed to taking direct control of planes were faced with the failure of their automated systems. Does Bustillos think these events just didn’t happen? If they did happen, then we have an answer to her incredulous question, “If airline passengers are far safer now than they were 30 years ago … what on Earth can be ‘dark’ about that?” That answer is: If you’re one of the thousands of people whose loved ones have died because pilots couldn’t deal with having to fly planes themselves, then what you’ve had to go through is pretty damned dark.
Again, Bustillos quotes Carr accurately: The automation of piloting is a sunny story with a dark footnote. If Carr says anywhere in his book that we would be better off if we ditched our automated systems and went back to manual flying, I haven’t seen it. I’d like for Bustillos to show it to me. But I don’t think she can.
The point Carr is making in that chapter of The Glass Cage is that flight automation shows us that even wonderful technologies that make us safer and healthier come with a cost of some kind — a “dark footnote” at least. Even photographers who rejoice in the fabulous powers of digital photography know that there were things Cartier-Bresson could do with his Leica and film and darkroom that they struggle to replicate. Very, very few of those photographers will go back to the earlier tools; but thinking about the differences, counting those costs, is a vital intellectual exercise that helps to keep us users of our tools instead of their thoughtless servants. If we don’t take care to think in this way, we’ll have no way of knowing whether the adoption of a new technology gives us a sunny story with no more than a footnote’s worth of darkness — or something far worse.
All Carr is saying, really, is: count the costs. This is counsel Bustillos actively repudiates: “Computers are tools, no different from hammers, blowtorches or bulldozers; history clearly suggests that we will get better at making and using them. With the gifts of intelligence, foresight and sensible leadership, we’ve managed to develop safer factories, more productive agricultural systems and more fuel-efficient cars.” Now I just need her to explain to me how those “gifts of intelligence, foresight and sensible leadership” have also yielded massively armored local police departments and the vast apparatus of a national surveillance state, among other developments.
I suppose “history clearly suggests” that those are either not problems at all or problems that will magically vanish — because if not, then Carr might be correct when he writes, near the end of his book, that “The belief in technology as a benevolent, self-healing, autonomous force is seductive.”
But that’s just what a paranoid scaredy-cat would say, isn’t it?
UPDATE: Evan Selinger has some very useful thoughts — I didn’t see them until after I wrote this post.
I know I'm anything but unbiased here, but thanks, Alan, for your lucid response to this curious review.
In case there are any others who, like Bustillos, question the darkness of flight automation's dark footnote, they need not rely on my assessment. William Langewiesche has a thoroughgoing examination of the problem, using the Air France 447 automation-related disaster as a backdrop, in the new issue of Vanity Fair. (Quote: "A small glitch took Flight 447 down, a brief loss of airspeed indications—the merest blip of an information problem during steady straight-and-level flight. It seems absurd, but the pilots were overwhelmed. … But their incoherence tells us a lot. It seems to have been rooted in the very advances in piloting and aircraft design that have improved airline safety over the past 40 years. To put it briefly, automation has made it more and more unlikely that ordinary airline pilots will ever have to face a raw crisis in flight—but also more and more unlikely that they will be able to cope with such a crisis if one arises.")
There's also Maria Konnikova's recent New Yorker piece on the problem, which uses the Continental Connection 3407 automation-related disaster as a backdrop. (Quote: "The more a procedure is automated, and the more comfortable we become with it, the less conscious attention we feel we need to pay it. … If anyone needs to remain vigilant, it’s an airline pilot. Instead, the cockpit is becoming the experimental ideal of the environment most likely to cause you to drift off.")
Beyond the journalistic treatments, there is at this point a very large body of empirical research indicating that pilots have become too dependent on automation, a situation that, as the FAA itself warns, can "lead to degradation of the pilot's ability to quickly recover the aircraft from an undesired state." I review a great deal of this research in The Glass Cage, with citations, but apparently Bustillos, for whatever reason, chose to ignore it or dismiss it.
I share Bustillos’s hope that we’ll "get better at making and using" the tools of automation. But I don’t see how we’ll be able to accomplish that goal if we don’t examine the tools and their consequences with a critical eye rather than through rose-colored glasses.
Hello, Messrs. Carr and Jacobs. Not sure whether my earlier comment made it through, so I'll try again.
We're quite agreed that the tools of automation ought to be examined with a critical eye. My beef with The Glass Cage is that it examines those tools not with a critical eye, but with a blindly censorious one. It makes no sense to have chosen aviation safety as an example of the dangers of increased automation, when the net result of increased automation is that flying is safer than ever before. I don't doubt that there is value in making sure that pilots keep their manual skills honed, just in case, but the bigger picture (increased safety!) isn't in focus.
"All Carr is saying, really, is: count the costs." I am!! In this instance, the net cost is negative.
"Blindly censorious," huh? Carr writes near the beginning of the book, "The point is not that automation is bad. Automation and its precursor, mechanization, have been marching forward for centuries, and by and large our circumstances have improved greatly as a result." Or, near the end, "By reclaiming our tools as parts of ourselves, as instruments of experience rather than just means of production, we can enjoy the freedom that congenial technology provides when it opens the world more fully to us." (Emphases mine.) I could give you fifty other quotes from the book that are equally positive about technology, or more so.
Come on, man. Don't comment on books you obviously haven't read. At least not on my site.
I haven't yet read the book, but it seems to me that one cost easy to overlook is that as we forfeit skills and action to automation, we become useless ciphers. In the case of airline pilots, someone is bound to ask eventually "why have pilots at all if they can no longer pilot?" Do it all remotely, as with drone warfare, which is getting safer and more efficient than having someone in the cockpit. The ookiness of having no one on board at the controls gives pause, no?
Some of us have already answered this type of question with regard to food preparation. We'd rather cook our own food, for the pleasure of knowing how to do things and then doing them, than rely on frozen dinners prepared who knows where. Not every example of human activity rises to the level of Albert Borgmann's focal practices, but enough do that one might think twice about outsourcing to automation everything that can be.
“Computers are tools, no different from hammers, blowtorches or bulldozers; history clearly suggests that we will get better at making and using them. With the gifts of intelligence, foresight and sensible leadership, we've managed to develop safer factories, more productive agricultural systems and more fuel-efficient cars.” … We've also created a Pacific Garbage Patch, and most likely global warming, and are making a sponge of our land by running pipelines through it that carry toxic fluids and then break because profit-making companies cut corners — and that's just a fact. … If you're going to look at the whole picture, as Bustillos is apparently trying to do, then look at the Whole picture. Ask the question: What is the long-term cost of these developments? What is the cost to the environment? Where does the energy and material come from to manufacture these machines that do more and more things that physical humans in the physical world used to do? What is the social cost of continual distraction from people around us, in favor of people who are far away?