By Alex Wright
The Intermedia development team pioneered a number of innovative software features that now seem commonplace: e-mail with hyperlinks, collaborative authoring tools, and “live” objects that could be updated dynamically across multiple applications. Most importantly, the system allowed for bidirectional links, so that any document would “know” which documents pointed to it (one of the present-day Web’s notable shortcomings is its lack of such a backtracking feature). The system also allowed links to point to multiple resources simultaneously, using so-called link anchors. Perhaps the most important distinction between Intermedia and the modern Web stemmed from Intermedia’s roots as a scholarly tool: every document in the system was editable; the system was designed as much for writing as for reading. “The tools you used to create documents were the same tools you used to link them together,” recalls Yankelovich.[45] In contrast to Intermedia’s read–write tools, today’s Web browser works only as a reader, designed to consume rather than create.
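To make the contrast concrete, here is a minimal sketch, in Python, of how a bidirectional link store differs from the Web’s one-way model. It is purely illustrative (the document names are invented and nothing here is drawn from Intermedia’s actual code): recording a link once updates two indexes, so any document can answer the backtracking question the one-way Web cannot.

    from collections import defaultdict

    class LinkStore:
        """Toy bidirectional link store: every link is indexed in both
        directions, so any document can list the documents pointing to it."""

        def __init__(self):
            self.outgoing = defaultdict(set)  # document -> documents it links to
            self.incoming = defaultdict(set)  # document -> documents linking to it

        def add_link(self, source, target):
            # Recording a link once updates both indexes; this is what lets
            # a document "know" which documents point to it.
            self.outgoing[source].add(target)
            self.incoming[target].add(source)

        def links_to(self, doc):
            # The backtracking query a one-way link model cannot answer
            # without first crawling every other document.
            return sorted(self.incoming[doc])

    store = LinkStore()
    store.add_link("commentary-on-in-memoriam", "in-memoriam")
    store.add_link("tennyson-biography", "in-memoriam")
    print(store.links_to("in-memoriam"))
    # prints: ['commentary-on-in-memoriam', 'tennyson-biography']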
Intermedia constituted a landmark in the history of hypertext. It also proved to be ahead of its time. Despite a string of early successes, by the early 1990s the IRIS team encountered difficulties securing funding for an ambitious second-phase project to be known as the Continents of Knowledge. The National Endowment for the Humanities turned down the institute’s grant request, questioning why scholars in the humanities should need computers.[46] By the time IRIS closed its doors in 1994, the institute had dwindled to a small team of three headed by director Paul Kahn; many key staffers had already left for higher-paying jobs at private-sector computer companies.
It seems a bitter irony that such a promising hypertext experiment shut its doors on the eve of the Web revolution. While Intermedia failed to survive into the Web era, the project left a trail of important scholarship. One of the earliest and most enthusiastic advocates of Intermedia was Brown University professor George P. Landow, who would go on to write influential theoretical texts like Hypertext and Hyper/Text/Theory that brought a critical eye to the philosophical implications of hypertext. Discussing one IRIS project, a collection of materials about Tennyson’s “In Memoriam,” Landow described how teaching the text in a hyperlinked environment brought out its networked qualities by capturing “the multilinear organization of the poem”: “using the capacities of hypertext, the web permits the reader to trace from section to section several dozen leitmotifs that thread through the poem.”[47]
In his 1992 book Hypertext, Landow articulated a vision that unified the literature of the nascent hypertext movement with the aims of contemporary literary scholarship. Invoking both Ted Nelson and theorist Jacques Derrida, Landow penned an elegant jeremiad arguing for a new model of authorship, insisting that “we must abandon conceptual systems founded upon ideas of center, margin, hierarchy, and linearity and replace them with ones of multi-linearity, nodes, links, and networks.”[48]
He goes on to explore the deep parallels between hypertext, with its emphasis on the decentering of the text and the reduction of authority, and contemporary trends in literary theory, notably the work of Foucault and Derrida. Landow even goes so far as to argue for reconstituting our ideas of authorship:

In reducing the autonomy of the text, hypertext reduces the autonomy of the author. Lack of textual autonomy, like lack of textual centeredness, immediately reverberates through conceptions of authorship.… Similarly, the unboundedness of the new textuality disperses the author as well.[49]

[Figure: The Intermedia system, circa 1989 (courtesy of George P. Landow).]
Echoing Nelson’s call to break down the rigid structures of institutional hierarchy (albeit in a more sober academic voice, and from a position lodged squarely inside the academy), Landow recognizes the politics inherent even in the seemingly simple act of creating a footnote, a transaction predicated on “divisions of text that partake of fixed hierarchies of status and power.” If footnotes are replaced by hyperlinks, that power relationship becomes more fluid and negotiable, and the authority of texts derives not from the author’s act of subordination but from the reader’s own constantly shifting perspective. “Hypertext linking situates the present text at the center of the textual universe, thus creating a new kind of hierarchy.”[50] This new democratization of texts, in which the “marginal” work steps onto a level playing field with privileged texts, meshed nicely with postmodern notions of “deconstruction,” dissolving the illusion of the authoritative writer and creating a new teleology of literature, with readers and writers engaged on more equal footing.
Invoking Derrida’s vision of a new form of hieroglyphic writing, Landow intimated that networked hypermedia authoring might provide an antidote to the linear, sequential, hierarchical model of thinking that had taken deep root in the culture in the form of the printed book. During the age of print, the relentless march toward the linear sequencing of text had effectively suppressed what hypertext scholar Gregory Ulmer calls “the pluridimensional character of symbolic thought.”[51] Hypertext seemed to hold the promise of a remedy for the linear constrictions of the printed text: postmodernism made manifest.
Landow has gone on to write provocatively about the changing nature of authorship, exploring the economic impetus inherent in it. As technology changes the economics of publishing, it inevitably changes the status of authors. Today, writers “exist in a cyborg relation with a complex technology that affects us more than we affect it.”[52] Arguably, however, this relationship is nothing new; technologies have always shaped people’s behavior at a macro level, often masked by the powerful subjective sensations of individuals. Landow admits that by taking such a provocative stance, “I horrify a lot of people in literature and the humanities.”[53]
Throughout the 1980s and early 1990s, the hypertext movement quietly gathered steam, spawning other early experimental environments like Apple’s HyperCard, Perseus, and early Internet information-sharing applications like the Wide Area Information Server (WAIS). None of these systems achieved widespread use, but each contributed to a growing wave of interest in a new humanist mode of computing that took advantage of the possibilities of hypertext.
One application that gained an enthusiastic following in the years leading up to the Web was Gopher, originally developed as a campus information system for the University of Minnesota by Mark McCahill and Farhad Anklesaria. Named for the university’s mascot (and for its ability to “fetch” information), Gopher functioned as a large distributed menu system, enabling content owners to publish material in a unified browsing environment. Gopher presented users with a series of nested menus in simple lists of linear text. Each text link pointed to an object within the directory tree, which might be stored locally on the host machine or, in a glimmer of the Web to come, on a remote computer accessible over the Internet. While the system followed a strictly hierarchical model, allowing navigation between resources only through nested menus, it nonetheless may have paved the way for the evolution of the Web by setting in motion a process of linking documents across disparate geographical locations. Gopher proved enormously popular in the academic computing labs of the early 1990s and, although largely forgotten today, almost certainly played a role in seeding popular interest in the notion of remote file sharing. By 1993 a thriving Internet subculture had emerged in campus computing environments.
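Part of Gopher’s appeal was that the protocol behind those menus (later documented in RFC 1436) was strikingly simple. As a rough Python sketch, assuming a reachable Gopher server on the standard port 70 (the host name below is illustrative, not a real server): a client opens a TCP connection, sends a selector string terminated by CRLF, and reads back a menu of tab-separated lines.

    import socket

    def gopher_fetch(host, selector="", port=70):
        """Send one selector to a Gopher server and read the reply until EOF."""
        with socket.create_connection((host, port), timeout=10) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            data = b""
            while chunk := sock.recv(4096):
                data += chunk
        return data.decode("utf-8", errors="replace")

    def parse_menu(text):
        """Each menu line: one type character, then display string, selector,
        host, and port, separated by tabs; a lone "." ends the listing."""
        items = []
        for line in text.splitlines():
            if line == ".":
                break
            fields = line[1:].split("\t")
            if line and len(fields) >= 4:
                # (type, display, selector, host, port)
                items.append((line[0], *fields[:4]))
        return items

    # Illustrative host name; substitute any reachable Gopher server.
    for kind, display, selector, host, port in parse_menu(gopher_fetch("gopher.example.org")):
        print(kind, display, "->", f"{host}:{port}{selector}")

The strict pairing of one selector with one menu mirrors the hierarchy described above: every item a user sees is either another menu or a terminal object, with no way to link laterally from within a document.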
Paradoxically, this computing subculture would never have emerged had it not been for the U.S. Department of Defense, which had funded development of the Internet since the late 1960s. Working under the auspices of the ARPANET project, pioneering engineers such as Vinton G. Cerf and Robert B. Kahn created distributed networking protocols like Transmission Control Protocol/Internet Protocol (TCP/IP) that would ultimately form the backbone for the modern Internet. In 1985 the National Science Foundation created NSFNet, the first national network specifically designed to support institutions of higher education. Other countries opened up similar academic networks, taking advantage of the open Internet technical standards to create a worldwide network capable of handling e-mail and exchanging files between servers distributed around the globe.
In the years before the National Science Foundation opened the Internet up to commercial use in 1991, an enthusiastic group of dedicated hackers, university staff, and scientific researchers was busily experimenting with new tools for sharing, retrieving, and organizing information online. At the CERN laboratory in Switzerland, a young researcher named Tim Berners-Lee was working on a new software tool that would soon reverberate around the world.
Ted Nelson had predicted that the future of hypertext would spring not from large institutions or top-down computing initiatives but from innovation by free-thinking individuals. Berners-Lee was about to prove him right.
During Tim Berners-Lee’s childhood, his family owned a copy of a nineteenth-century English almanac entitled Enquire Within Upon Everything. In sensible Victorian prose, the book contained a compendium of useful tidbits on a host of seemingly unrelated topics. As the editor of the 1875 edition wrote, “Whether you wish to model a flower in wax; to study the rules of etiquette; to serve a relish for breakfast or supper; to plan a dinner for a large party or a small one; to cure a headache; to make a will; to get married; to bury a relative; whatever you may wish to do, make, or to enjoy, provided your desire has relation to the necessities of life, I hope you will not fail to Enquire Within.”[54]
When Berners-Lee built the first version of his prototype information-sharing tool in 1980, he decided to call it Enquire. Unfortunately for historians of the Web, all traces of this early proto-Web application disappeared in a calamitous hard drive crash. Undeterred, Berners-Lee started from scratch on a new incarnation of his hypertext information retrieval tool. In 1989 he proposed the new system to his colleagues at CERN, and by the end of 1990 he had a working program under a new name: WorldWideWeb.
The rest is not quite history.
As of February 2006, there were more than 20 billion Web pages residing on more than 78 million Web sites around the world. An estimated 1.2 billion people have now used the Internet at least once. In less than two decades the program that took shape on Tim Berners-Lee’s desktop has circled the globe and changed the lives of a huge swath of the planet’s population. As the Web has worked its way into the fabric of humanity’s cultural, economic, and political lives, networked information retrieval has radically reshaped the contours of our old institutional hierarchies. The Web has forced dramatic transformations inside many corporations, posed challenges to the authority of the press, and fundamentally altered the way many people interact. Old fixed systems like library catalogs, controlled vocabularies, and manual indexes seem increasingly problematic in the digital age; new fluid systems like Web search engines and collaborative filtering tools seem to be displacing them. Once again, old hierarchies are giving way to emergent networks, with long-term effects that we cannot entirely foresee.
[Figure: Screen shot of Tim Berners-Lee’s original WorldWideWeb environment running on the NeXT platform (courtesy of CERN).]
Ted Nelson, who arguably started it all, has disavowed the current Web as an ill-conceived perversion of his original idea, a simplistic implementation of “chunking hypertext,” the most rudimentary form of hypertext. Staying true to his iconoclastic form, Nelson issued a defiant polemic on his personal Web site, presented in unadorned ASCII text: