From Cursive to Cursor


First I used crayons and colored pencils to draw on loose sheets of paper, or in coloring books. Then I was given lined paper — solid horizontal lines interspersed with dotted ones that indicated half-heights — and taught to print. I used pencils, mostly. Later I got notebooks, and wrote with fountain pens into whose red or green or blue transparent barrels I inserted ink cartridges. (Throughout much of my education ballpoint pens were forbidden.) I also discovered that a properly tossed pen could be made to stand up with its nib embedded in the worn old wooden floorboards of Gorgas Elementary School.

For some years little changed, except a move to a school with linoleum floors. Then, when I was fourteen, I asked for a typewriter for Christmas. I received a small Olivetti portable — manual of course, which is to say, operated without electricity. This became my treasure. I was so impatient to use it that I couldn’t wait to learn proper touch-typing, and indeed I never have done so. (I’m typing these words with the two index fingers and one right thumb I have always used.) I wrote many stories on it, and I used it to record favorite quotations. But I did no schoolwork with it. That would not have been expected, nor, I think, welcomed.

When I started college I went to a local typewriter shop and purchased an enormous, ancient IBM electric typewriter, a model from the early 1950s. Using it was like writing with a jackhammer: its keys pounded the platen so forcefully that they virtually shook the house. I traded it in for a beautiful old Smith-Corona Silent — a dark gray machine with forest green keys that looked like it contained a hundred novels — and used that throughout college and much of graduate school.

I began writing my dissertation in 1984, and decided that a work of such scope — I had never before thought of writing anything book-length — might demand the transition to a computer, an option that had only recently become available. None of my fellow graduate students in English, as far as I knew, were making this change, but it appealed to me, not least because I knew that significant revisions might at some point be called for, and the idea of re-typing two or three hundred pages of text dismayed me. At about this time I started teaching at Wheaton College, where the English department had been provided with something called a DECmate, from Digital Equipment Corporation, which featured a program called WPS (Word Processing System). I wrote two chapters on that, but found it hard to concentrate while sitting in the busy corridor where the machine had been installed; also, the colleague whose office was closest to the machine disliked the noise. I decided I had to buy my own computer, even if I had to take out a loan to do it. And so in the spring of 1985 I bought the original Apple Macintosh. I have been using Macs ever since, and despite all the developments in personal computing in the quarter-century since — most notably, from my point of view, in how research is done — I cannot say that my habits of writing have undergone major alteration. The changes that have occurred since 1984 are dwarfed in magnitude by the changes that I experienced in the previous decade.

Not often in the history of writing have people gone through as many technological changes in one lifetime as I and most other people of my generation have. Of course, literacy itself has been rare throughout most of humanity’s history; and before the invention of the typewriter, literate people made the transition from printing to cursive writing — and that was it. Such people would have been using the same technologies and methods at age eighty that they had learned as small children. Moreover, not so many decades ago many highly educated people — especially men — never learned to type at all. When Ronald Reagan learned in 1994 that he had Alzheimer’s disease and wished to inform the American people, he took out a page of his stationery and hand-wrote an explanatory letter — just as most men of his generation would have done.

Conversely, my teenage son began to write almost exclusively on computers several years ago; like most people his age, he writes by hand so rarely that his handwriting is poor, and he has never laid hands on a typewriter. It is not certain, though, that he will spend the rest of his life typing on keyboards: voice-recognition software might someday relieve him of that burden, and as an old man he may well look back nostalgically on the days when people interacted with computers primarily through keyboards. (Though some form of manual text entry will probably persist at least for those occasions when we don’t want other people to know what we’re saying.)

The history of changes in the technologies of writing is the nominal subject of Dennis Baron’s book A Better Pencil: Readers, Writers, and the Digital Revolution. Baron, a professor of English and linguistics at the University of Illinois whose scholarly work has focused on English grammar, went through a history similar to mine, though a few years earlier, and as a young man he worked on mainframe computers, something I never did. But he knows, as I do — we have experienced it firsthand, and more than once — that every new development in the technology of writing generates a chorus of utopian celebrations and a counter-chorus of dystopian jeremiads. And much of his book is occupied by the recording of those responses, especially the protests. He rightly notes that “the newest technologies of writing” tend always to be attacked as “impersonal, mechanical, intellectually destructive, and socially disruptive.” When, twenty years ago, Wendell Berry wrote an essay called “Why I Am Not Going to Buy a Computer” — Baron does not cite this essay, though it would have been relevant — he employed arguments of just this kind, and concluded: “When somebody has used a computer to write work that is demonstrably better than Dante’s, and when this better is demonstrably attributable to the use of a computer, then I will speak of computers with a more respectful tone of voice, though I still will not buy one.”

But along the way to this conclusion Berry happened to comment that after he hand-wrote his poems and stories and essays, his wife typed them up on a Royal standard typewriter. One reader — among many who responded to Berry, most of them with outrage — asked him, “Not to be obtuse, but being willing to bare my illiterate soul for all to see, is there indeed a ‘work demonstrably better than Dante’s’ … which was written on a Royal standard typewriter?” Berry later wrote, “I like this retort so well that I am tempted to count it a favorable response,” but, greatly though I admire Berry, I have to say that this is a frivolous comment, given that all of the criticisms Berry makes of the computer had earlier been directed against the typewriter. Baron quotes a long passage from an essay lamenting the rise of the typewriter that appeared in the Atlantic Monthly in 1895, and then a 1938 editorial from the New York Times — well after, one would have thought, the machine had become essential and inevitable — arguing that the typewriter was depriving business memos of, as Baron puts it, “the personal touch that only longhand could provide.”

Having recorded these protests, Baron, it seems to me, should say clearly what he thinks about them all. Yet he never does so. One of his recurrent points is that every new technology of writing has added to the ranks of authors, and this is presumably a good thing, but Baron never makes the case straightforwardly. His is a story largely without a point of view. For instance, at the end of a chapter that details the rise of Wikipedia, he writes,

Many teachers, concerned about the unreliability built into Wikipedia’s structure, refuse to allow their students to use it as a source. But even in its present, imperfect state, Wikipedia has proved so quick and easy to use that most of its readers, including teachers and presidential candidates, are willing to accept what they find as good enough for their purposes.

Well, yes; this is an unexceptionable statement. But that’s just the problem. Are teachers’ concerns about unreliability valid? If people “find” Wikipedia “good enough for their purposes,” is it good enough? Or should teachers and presidential candidates look elsewhere? These are matters Baron should address.

Similarly, later in the book Baron writes,

Impressive as the rapid technological and economic changes associated with the digital word are, what seems to differentiate pixels from pencils most is the speed of digitized communication, their reconfiguration of the public and private spheres, and the ways that the new technologies have dramatically increased both who gets to write, and how much they write.

Again, it would be hard to argue with these points — all of which have been made by many other writers — but they cry out for some kind of cost-benefit analysis. (Baron should also make more clear that he really means “who gets to write, and how much they write” for others, in public. People now post to blogs or on Facebook what several earlier generations indeed would have written, but in diaries, or letters written to one’s most intimate friends or family members.) If the public and private spheres have been reconfigured, what do we gain by the re-drawing of the boundaries? What do we lose? Or, to put the matter still more pointedly: Who gains and who loses, and in what measure?

Baron’s failure to face these essential questions is exacerbated by the odd structure of his book. After an opening overview of writing technologies, there is a chapter on Luddites and other techno-skeptics (including Theodore Kaczynski, the Unabomber); then a chapter largely on pencils, drawing heavily on Henry Petroski’s 1990 history The Pencil; then a chapter focusing on cuneiform (writing on clay); then a chapter on early word-processing systems. The book goes on in this way, treating topics apparently at random. Later chapters deal with varying means of forging texts, the emergence of blogs, the rise of MySpace and Facebook, and so on, in a generally, but not exclusively, historical order. The organization of the book is no easier to understand than its argument.

So I may well be wrong here, but if I had to state what I believe Baron’s chief point is, I would say something like this: “Many people are at least highly nervous about, and often fiercely critical of, today’s new ways of writing and distributing texts. But people were similarly nervous and critical in the face of all previously new writing technologies, and since those critics were wrong, it’s likely that today’s critics are as well.” But any claims along these lines — whether made by Baron or by someone else — beg all the necessary questions.

To think well about these matters we need to become visionaries not of the future but of the past. We tend to think that all those earlier critics were wrong simply because we cannot seriously imagine living in a world that lacks the technologies we now use every day. The thought of having to abandon my MacBook and go back to my old Smith-Corona fills me with dread — but should it? There’s no doubt that without a computer on which to write, and an Internet connection that enables me to do a great deal of research without rising from my seat, I would have written and published far less than I have. I would have had to spend countless more hours in libraries; I would have had to take hundreds or perhaps thousands of pages of notes by hand; I would have had to type and re-type my articles and books, using copious quantities of Wite-Out. All this I know. But what I do not know is whether I would have had a less successful career. I would have published fewer words, but perhaps those words would have been better chosen; perhaps my ideas would have been more carefully thought out. Who knows what might have occurred to me as I was walking to or from the library, or as I typed a particular page for the fourth time?

This kind of exercise can be extended farther into the past, of course. Perhaps pencils or quill pens would have been still better for my thinking and writing — and perhaps, as the New York Times suggested in 1938, the world of business would be improved if memos were written by hand. (It would be less efficient, but efficiency and productivity aren’t everything.) And if we’re going to go this far, we may as well ask, along with Socrates in Plato’s Phaedrus, whether writing itself should be repudiated in favor of face-to-face interlocution.

This kind of speculation may seem pointless: after all, we are simply not going to turn our backs on writing, and computers will not be wheeled out of offices and replaced by fountain pens and reams of foolscap. But some things can be written by hand — it happens every day — and it is possible to discard that e-mail draft, get up, and go looking for the person you were going to send it to. Technologies and practices that have been supplemented, or displaced from a prior centrality, do not thereby become unavailable. Thinking historically helps us to remember that fact, and to practice what Jacques Ellul called “the measuring of technique by other criteria than those of technique itself.”

Anyone who raises such questions will of course be called a skeptic or a Luddite. But asking questions, and seeking to imagine the past as vividly as we may imagine the future, is neither skepticism nor protest: it’s thinking. And thinking leads to an expansion of choice, a renewal of options for human communication. Dennis Baron does not bring to the history he tells the kind of critical scrutiny that history deserves, which makes it all the more important that his readers do.

Alan Jacobs, "From Cursive to Cursor," The New Atlantis, Number 27, Spring 2010, pp. 85-90.