I

Most of us have read with private delight Yeats's withering verses about scholars, those "old, learned, respectable bald heads," who "edit and annotate the lines / That young men, tossing on their beds, / Rhymed out in love's despair." Perhaps because we all feel uncomfortably vulnerable to the indictment, we can share a macabre enjoyment at wondering what Yeats would have thought about the electronic computer, the lightning-rapid, passionless, remorseless, soul-less editor and annotator that cannot cough — in ink, or anything else — and wears no shoes to wear the carpet with. What magnificent wrath and scorn would Yeats have let fall upon us for invading his world of symbol and Irish legend to count "gyres" and index the varieties of "love" on an IBM machine!

In thoughts like these, and the fears they represent, lies the first great problem of making computer concordances. For every good humanist feels ambivalent about the intrusion of technology into his domain. While we may, with one part of our minds, accept the fact that electro-mechanical devices must inevitably take over the routine chores of scholarship — collation of texts, for example, and enumerative bibliography — with another part of our minds we warmly commend the Dante Society of America for resisting the help of a computer to complete its monumental, new Dante Concordance now in progress. Members of the Society, it turns out, scattered through the nation, working alone by hand on their assigned blocks of pages, value too highly the sense of community that seals them into one tribe to wish to sacrifice it for the advantages of speed. (What is five years — or twenty-five — in the timeless world of Dante studies?) Here, we like to think, is the embattled humanist courageously holding out against automation, and deserving of our whole-hearted support.

Our psychological resistance to automation in the Humanities is likely to be stiffened by our superb innocence. Delightedly, we indulge ourselves with terrors that are meaningless to people who know anything about computers. If electronic brains can index and edit poetry, we inquire fearfully, how long will it be before they begin to compose poetry? But we rarely stay for an answer, so ardently do we cherish our fancies. We cannot, I suppose, be expected to welcome the arrival of a computer-poet, though it might be extremely interesting to have some of his productions on which to test our critical principles. (Would it be committing the biographical heresy to identify the poet as a computer? If we refuse to take any account of the poet, the better to scrutinize the internal order of the poem, which would prove the more stimulating exercise — the search for irony, or the discrimination of a persona?) It is hardly to our credit, however, that we find "sinister" implications in every technological advance, unshakable in our conviction that literature and technology don't mix — a conviction probably held by the monk in his scriptorium, gloomily contemplating the first moveable type. Could we not be expected to show at least as much maturity and vision as the mathematicians, who see no threat to their own supremacy in the arrival of machines that make thousands of calculations every second? "We can always think of more things to ask the machine to do than it can ever learn to do," they will say confidently, and get on with the business of developing the sensitivity and power of their marvellous tools, knowing that every advance yields them more freedom from drudgery, more opportunity for creative research. We might wonder whether it is these people or the Humanists who are the more dedicated to the human use of human beings.

But our innocence is not the only problem. For even those of us who try to come to terms with automation are likely to be frustrated, owing to our inability to communicate with computer scientists in their own language. When the computer programmer talks of a "word," he means "thirty-six bits," and by "thirty-six bits" he means six "six-bit" elements of the binary number system in which the machine counts. The basic number of bits (an abbreviation of "binary digits") happens to be six not because, as a learned humanist friend of mine conjectured, certain tribes of American Indians developed an effective number system on a base of six, but because six is the smallest number of binary digits that will accommodate the 47 characters on an IBM print wheel.
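
To put the arithmetic in modern terms: five binary digits can distinguish only thirty-two characters (2^5), too few for the 47 on the print wheel, while six can distinguish sixty-four (2^6); and six such six-bit characters fill a thirty-six-bit word exactly. The sketch below, in present-day Python (which of course postdates the machines in question; the character codes used are hypothetical, not the historical IBM BCD assignments), is offered only to make the packing concrete:

    # Sketch of the 36-bit word arithmetic described above.
    # The character codes are hypothetical, not the historical BCD table.
    import math

    PRINT_WHEEL_CHARS = 47                        # characters on the IBM print wheel
    BITS_PER_CHAR = math.ceil(math.log2(PRINT_WHEEL_CHARS))
    assert BITS_PER_CHAR == 6                     # five bits give only 2**5 = 32 codes

    WORD_BITS = 36
    CHARS_PER_WORD = WORD_BITS // BITS_PER_CHAR   # six characters per word

    def pack_word(codes):
        """Pack six 6-bit character codes into one 36-bit word."""
        assert len(codes) == CHARS_PER_WORD
        word = 0
        for code in codes:
            assert 0 <= code < 2 ** BITS_PER_CHAR
            word = (word << BITS_PER_CHAR) | code
        return word

    def unpack_word(word):
        """Recover the six character codes from a 36-bit word."""
        return [(word >> (BITS_PER_CHAR * i)) & (2 ** BITS_PER_CHAR - 1)
                for i in reversed(range(CHARS_PER_WORD))]

    codes = [3, 15, 21, 14, 20, 19]               # six arbitrary 6-bit codes
    word = pack_word(codes)
    assert word < 2 ** WORD_BITS
    assert unpack_word(word) == codes
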
These, of course, are misunderstandings of the simplest kind, involving the transfer of metaphors from one discipline to another. Far more complex and disturbing are the misunderstandings that result when we attempt to parse a technical paper dealing with computer processes. Though the words are clearly English, and often familiar, we are likely to find the concepts beyond our grasp, and the language, somehow, impenetrable.

As a result of our inability to speak the language of computer science, the computer people are obliged to communicate with us in our language, and they have a way of telling us the things we seem so delighted to hear. "This machine," they will say reassuringly, speaking of a new computer, "is fairly stupid. It has only about a second-grade intelligence." "Of course," they add, after a carefully timed pause, "the last machine we had was only in the first grade. . . ." And they will go on thoughtfully to tell us about the "compiler," a new device by means of which the machine can be taught to learn from its own mistakes, and thus in a sense to program itself. The question that immediately rises to haunt our minds — "who confesses the compiler?" or something of the sort — has little meaning for the programmer because he has been using words and metaphors drawn from our world, not his own, and ours is so obviously remote from reality as to be almost a fairy-land.

These two worlds represent, of course, the two cultures so brilliantly portrayed by Sir Charles Snow in his memorable Rede Lecture of 1959, The Two Cultures and the Scientific Revolution. The separation between them is, as Sir Charles declared, one of the critical problems of our age. Its magnitude becomes distressingly clear to anyone who endeavors to apply the processes of computer technology to research in the Humanities.