[Pages 1-3 in print version. © The Johns Hopkins University Press, 1992.]
The problem of causality. It is not always easy to determine what has caused a specific change in a science. What made such a discovery possible? Why did this concept appear? Where did this or that theory come from? Questions like these are often highly embarrassing because there are no definite methodological principles on which to base such an analysis. The embarrassment is much greater in the case of those general changes that alter science as a whole. It is greater still in the case of several corresponding changes. But it probably reaches its highest point in the case of the empirical sciences: for the role of instruments, techniques, institutions, events, ideologies, and interests is very much in evidence; but one does not know how an articulation so complex and so diverse in composition actually operates. -- Michel Foucault, The Order of Things, xii-xiii.
When designers of computer software examine the pages of Glas or Of Grammatology, they encounter a digitalized, hypertextual Derrida; and when literary theorists examine Literary Machines, they encounter a deconstructionist or poststructuralist Nelson. These shocks of recognition come about because literary theory and computer hypertext, apparently unconnected areas of inquiry, have increasingly converged. Statements by theorists concerned with literature, like those by theorists concerned with computing, show a remarkable convergence. Working often, but not always, in ignorance of each other, writers in these areas offer evidence that provides us with a way into the contemporary episteme in the midst of major changes. A paradigm shift, I suggest, has begun to take place in the writings of Jacques Derrida and Theodor Nelson, Roland Barthes and Andries van Dam. I expect that one name in each pair will be unknown to most of my readers. Those working in computing will know well the ideas of Nelson and van Dam; those working in literary and cultural theory will know equally well the ideas of Derrida and Barthes.
All four, like many others who write on hypertext and literary theory, argue that we must abandon conceptual systems founded upon ideas of center, margin, hierarchy, and linearity and replace them with ones of multilinearity, nodes, links, and networks. Almost all parties to this paradigm shift, which marks a revolution in human thought, see electronic writing as a direct response to the strengths and weaknesses of the printed book. This response has profound implications for literature, education, and politics.
The many parallels between computer hypertext and critical theory offer numerous points of interest, the most important of which, perhaps, lies in the fact that critical theory promises to theorize hypertext and hypertext promises to embody and thereby test aspects of theory, particularly those concerning textuality, narrative, and the roles or functions of reader and writer. Using hypertext, critical theorists will have, or already have, a laboratory in which to test their ideas. Most important, perhaps, an experience of reading hypertext or reading with hypertext greatly clarifies many of the most significant ideas of critical theory. As J. David Bolter points out in the course of explaining that hypertextuality embodies poststructuralist conceptions of the open text, "what is unnatural in print becomes natural in the electronic medium and will soon no longer need saying at all, because it can be shown" (143).