

From: Humanist Discussion Group <willard.mccarty-AT-mccarty.org.uk>
To: humanist-AT-lists.digitalhumanities.org
Date: Fri, 20 Feb 2009 11:50:40 +0000 (GMT)
Subject: [Humanist] 22.554 getting closer?


                 Humanist Discussion Group, Vol. 22, No. 554.
         Centre for Computing in the Humanities, King's College London
                       www.digitalhumanities.org/humanist
                Submit to: humanist-AT-lists.digitalhumanities.org



        Date: Fri, 20 Feb 2009 11:46:06 +0000
        From: Willard McCarty <willard.mccarty-AT-mccarty.org.uk>
        Subject: getting closer

As far as I know Andrew Brook is quite right: we are nowhere close to
modelling the processes of reading -- if by "modelling" we mean
constructing some kind of device, then flipping the switch and watching
it go as far as it will. But we are a bit closer than that if what we
mean is a modelling that (to use Susan Sontag's language) is erotic
rather than hermeneutic. (I refer to her essay, "Against
Interpretation", which concludes with the sentence, "In place of a
hermeneutics we need an erotics of art." Gillian Beer extends and
exemplifies these erotics marvellously in her book Open Fields.)

Back to hardware and software. I think we need to wake up to the
fact that as the technology of computing has progressed from batch
processing to interaction design, what one might call an erotics of and
with computing has finally become possible at the level of applications
in the humanities. It was certainly a reality at the level of the
machines in the very early days of computing, and I would guess still is
at that level today for the very few who are involved with microcode and
the like. In A History of Computing in the Twentieth Century (1980), N.
Metropolis, after listing those who had hands-on access to the MANIAC at
Los Alamos (Edward Teller, Enrico Fermi, Stanislaw Ulam et al), remarks
as follows: "It is perhaps worthwhile mentioning that the problem
originators interacted directly with the computer. With the eventual
achievement of interactive capabilities and high-level languages, there
may be a return to what, in retrospect, seem like halcyon days" (p.
463).

So, welcome to the halcyon days! The question is, what are we doing with
them? How much of our thinking remains behind in the era when one gave
a computer a job to do, went away for a time and returned to inspect the
result? How *in touch* are we?

In a review of books by Robert Oakman and Susan Hockey in 1980, Lou
Burnard commented that very little of the discussion in the wider
computing literature on information retrieval and database design had
been noticed by literary scholars. He noted that according to the
prevailing mentality, it was as if data were "being held and processed
within a computer as if it were organized in large filing cabinets,
through which efficient electronic nymphs riffle to retrieve punched
cards one at a time." Computing, he declared, "has now moved beyond such
a self-image. Literary computing will not come of age until it
recognizes this fact and adapts the new tools of data analysis and
data modelling to its own ends" (Times Literary Supplement, 9 May 1980,
p. 533).

We've done that, more or less, I'd suppose. But has literary computing
"come of age"? I must say that as I read books such as Rolf Herken, ed.,
The Universal Turing Machine: A Half-Century Survey, 2nd edn (1995), it
is hard not to conclude that all along there was a much deeper problem,
that today as much as yesterday we are dozing away in rote exercises of
implementation without questioning what needs to be questioned. In
Herken's book, the intellectual excitement that we should be feeling, an
excitement that comes out of asking such questions, is perhaps most
clearly expressed by the theoretical biologist Robert Rosen. This is how
his essay begins:

 > One of the most remarkable confluences of ideas in modern scientific
 > history occurred in the few short years between the publication of
 > Gödel's original papers on formal undecidability in 1931, and the
 > work of McCulloch and Pitts on neural networks, which appeared in
 > 1943. During these twelve years, fundamental interrelationships were
 > established between logic, mathematics, the theory of the brain, and
 > the possibilities of digital computation, which still literally takes
 > one's breath away to contemplate in their full scope. It was believed
 > at that time, and still is today, over half a century later, that
 > these ideas presage a revolution as fundamental as that achieved by
 > Newton three centuries earlier. (p. 485)

Yes, of course, at our end of the street revolutions happen much more
slowly. And yes, of course, I am being rather unfair -- but deliberately
in order to provoke someone more knowledgeable into spelling out what
exactly, in these halcyon days we now enjoy, even in the comfort of our
own studies and offices, we are doing about this revolution happening
all around us. For one thing, it seems to me that we could be taking
Susan Sontag's call for an erotics of art (and literature, and
everything else) much more seriously.

Comments?

Yours,
WM

-- 
Willard McCarty, Professor of Humanities Computing,
King's College London, staff.cch.kcl.ac.uk/~wmccarty/;
Editor, Humanist, www.digitalhumanities.org/humanist;
Interdisciplinary Science Reviews, www.isr-journal.org.



_______________________________________________
List posts to: humanist-AT-lists.digitalhumanities.org
List info and archives at: http://digitalhumanities.org/humanist
Listmember interface at: http://digitalhumanities.org/humanist/Restricted/listmember_interface.php
Subscribe at: http://www.digitalhumanities.org/humanist/membership_form.php

