Every few days, it seems, I come across a rueful, even mournful citation of T. S. Eliot: “Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?” But it’s possible to make these distinctions in ways that are not so tendentious. One might think along these lines:
The point is not to see one of these as superior to the others, but to see them as a sequential development: for example, those who lack genuine knowledge (which, mind you, comes in different forms) will necessarily be deficient in wisdom, and their counsel will be correspondingly less valuable.
This is all quite sketchy and needs further development, of course. Let’s start by complicating matters further. In his book The Creation of the Media, Paul Starr writes:
“Information” often refers specifically to data relevant to decisions, while “knowledge” signifies more abstract concepts and judgments. As knowledge provides a basis of understanding, so information affords a basis of action. “Information” carries the connotation of being more precise, yet also more fragmentary, than knowledge. From early in its history, American culture was oriented more to facts than to theory, more to practicality than to literary refinement — more, in short, to information than to knowledge.
Further: near the beginning of his remarkable book Holding On to Reality, Albert Borgmann posits that there are three major kinds of information: natural information (information about reality, as when we read clouds or tracks), cultural information (information for reality, as in plans and musical scores), and technological information (information as reality, as in a digital recording).
One way to explain the deficiency in our narratives of modernity is to say that they have failed to maintain these distinctions and have therefore failed to note the mediating role that specific technologies play in promoting the transfer from data to information to knowledge to wisdom to counsel.
How does information become knowledge and wisdom, and subsequently a memory, for someone? This happens when we connect new information to various aspects of our lived experience: other things we've previously read, seen, or heard.
Most databases are designed to provide quick ways to find what you want to look up. Your mind summons knowledge differently: we don't query our brains and then work through long lists of results to see whether we can find what we're looking for. No.
If someone asks "what's the best high school movie ever made?" you instantly have an answer, drawn from the dozens of such movies you've seen in your life. That movie is connected to so many different memories that it just serves itself up, almost without thinking.
What does all this have to do with technology? A lot, actually. I am about a week out from finishing work on an application that allows Evernote users to create and save these two-way connections between pieces of information. For example, I just connected this article to a passage from Philip Rieff. The moment I connected the two notes, I saw other notes I had previously connected to the Rieff quote that also tie in with this article (e.g., a quote from a piece by Joe Epstein on how the mind is a great wanderer). It is very cool. Think of these personal connections as pieces of string showing how all your notes relate to one another. Instead of searching for a note you can remember, you just pull on a string and see everything that matters, whether you were looking for it or not.
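The two-way linking described above can be sketched in a few lines. This is only an illustration of the idea, not the app's actual implementation (which isn't described here); the class and note names are hypothetical:

```python
from collections import defaultdict


class NoteGraph:
    """A tiny sketch of bidirectional links between notes.

    Linking A to B automatically links B to A, so connecting a new
    note immediately surfaces everything already tied to the old one.
    """

    def __init__(self):
        # Each note maps to the set of notes it is connected to.
        self._links = defaultdict(set)

    def connect(self, a, b):
        # Two-way: each note records the other.
        self._links[a].add(b)
        self._links[b].add(a)

    def related(self, note):
        # "Pull on the string": everything directly connected.
        return sorted(self._links[note])


notes = NoteGraph()
notes.connect("this article", "Rieff quote")
notes.connect("Rieff quote", "Epstein on the wandering mind")

# Connecting the article to the Rieff quote also surfaces the
# quote's other neighbors, one pull of the string away:
print(notes.related("Rieff quote"))
```

Because every link is stored in both directions, there is no distinction between "source" and "target" notes; any note can serve as the thread you pull.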
I'd love to have some New Atlantis readers give it a look during testing. You can reach me at jdouglasj at gmail dot com. You'll need an Evernote account and an iPad.
And if people have little language, particularly vocabulary but also an undeveloped command of grammar, what then? For this, in the technological age in which we reside, is the purpose of schooling (education is dead) from a very young age to one's mid-20s. Australia's national curriculum (especially in its systems of assessment), for one thing, makes that abundantly clear. Universities, with their preoccupation with so-called 'professions', deepen the malaise. Governments' determination to deprofessionalize (ugh!) teaching takes these matters even further (teachers, especially when commanded to have Master's degrees, will know less and less about more and more). Orwell got it pretty right in the first two paragraphs of the essay on 'Newspeak' that follows Winston Smith's realization that he now loved Big Brother. Confine language to the immediate practicalities of daily existence. And nothing more. As some of these same pages of The New Atlantis confirm, the digital mania of the institutions of schooling will deepen the dissociation of person and learning. How many graduates at any level, from high school on, could read, say, P. G. Wodehouse's 'Psmith, Journalist'? Nuff said.
I’ve puzzled over these categories in the past and concluded that although a sequence of cognitive steps seems an inevitable part of the process of gaining intellectual mastery over a subject ranging anywhere from tiny to quite large, sorting the complexity of discrete, overlapping, and interconnected aspects to arrive at some sort of flowchart modeling a neural network is a hopeless and probably pointless task.

For instance, one might distinguish between (raw) data, facts, information, and truth across a number of axes, including, for instance, how organized or universal a unit or grouping is. Complicate that further with whether such units are internal or external to a given human mind, and perhaps even whether they lie beyond human perception but are amenable to instrumental measurement (e.g., infrared light). (This is a significant consideration in the current educational climate, where students and teachers alike increasingly argue that it's no longer necessary to possess interior knowledge so long as it's searchable using Google or some such.)

Repeat this matrix with know-how, knowledge, memory, and understanding, as complicated by qualifiers such as intuitive (vs. reasoned) and quantitative (vs. qualitative). Another matrix could be constructed with instruction, wisdom, and counsel, all having to do with how successfully a unit is copied, approximated, or transmitted (fully analog in an organic context) from one mind to another across expressive and receptive boundaries. How far down the rabbit hole does one go before recognizing that human cognition is not an algorithm that can be cracked with sufficient effort?