A month or so after the new group began meeting, it became clear to Crocker and others that they had better start accumulating notes on the discussions. If the meetings themselves were less than conclusive, perhaps the act of writing something down would help order their thoughts. Crocker volunteered to write the first minutes. He was an extremely considerate young man, sensitive to others. “I remember having great fear that we would offend whoever the official protocol designers were.” Of course, there were no official protocol designers, but Crocker didn't know that. He was living with friends at the time and worked all night on the first note, writing in the bathroom so as not to wake anyone in the house. He wasn't worried about what he wanted to say so much as he wanted to strike just the right tone. “The basic ground rules were that anyone could say anything and that nothing was official.”

To avoid sounding too declarative, he labeled the note “Request for Comments” and sent it out on April 7, 1969. Titled “Host Software,” the note was distributed to the other sites the way all the first Requests for Comments (RFCs) were distributed: in an envelope with the lick of a stamp. RFC Number 1 described in technical terms the basic “handshake” between two computers—how the most elemental connections would be handled. “Request for Comments,” it turned out, was a perfect choice of titles. It sounded at once solicitous and serious. And it stuck.

“When you read RFC 1, you walked away from it with a sense of, ‘Oh, this is a club that I can play in too,'” recalled Brian Reid, later a graduate student at Carnegie-Mellon. “It has rules, but it welcomes other members as long as the members are aware of those rules.” The language of the RFC was warm and welcoming. The idea was to promote cooperation, not ego. The fact that Crocker kept his ego out of the first RFC set the style and inspired others to follow suit in the hundreds of friendly and cooperative RFCs that followed. “It is impossible to underestimate the importance of that,” Reid asserted. “I did not feel excluded by a little core of protocol kings. I felt included by a friendly group of people who recognized that the purpose of networking was to bring everybody in.” For years afterward (and to this day) RFCs have been the principal means of open expression in the computer networking community, the accepted way of recommending, reviewing, and adopting new technical standards.

Before long, the assemblage began calling itself the Network Working Group, or NWG. It was a high commission for the country's young and exceptionally talented communication programmers. Its main challenge was to agree in principle about protocols—how to share resources, how to transfer data, how to get things done. In real terms, that meant writing programs, or at least adopting certain rules for the way programs got written, rules to which a majority could consent. Agreement was the sine qua non. This was a community of equals. They could all write code—or rewrite the code someone else had written. The NWG was an adhocracy of intensely creative, sleep-deprived, idiosyncratic, well-meaning computer geniuses. And they always half-expected, any day, to be politely thanked for their work and promptly replaced by others whom they imagined to be the field's true professionals. There was no one to tell them that they were as official as it got. The RFC, a simple mechanism for distributing documentation open to anybody, had what Crocker described as a “first-order effect” on the speed at which ideas were disseminated, and on spreading the networking culture.

Anticipating the construction of the network, the Network Working Group continued meeting regularly, and new terms and inventions often emerged by consensus. The very word “protocol” found its way into the language of computer networking based on the need for collective agreement among network users. For a long time the word has been used for the etiquette of diplomacy and for certain diplomatic agreements. But in ancient Greek, protokollon meant the first leaf of a volume, a flyleaf attached to the top of a papyrus scroll that contained a synopsis of the manuscript, its authentication, and the date. Indeed, the word referring to the top of a scroll corresponded well to a packet's header, the part of the packet containing address information. But a less formal meaning seemed even more fitting. “The other definition of protocol is that it's a handwritten agreement between parties, typically worked out on the back of a lunch bag,” Cerf remarked, “which describes pretty accurately how most of the protocol designs were done.”

But the first few meetings of the Network Working Group were less than productive. Over the course of the spring and summer of 1969, the group continued struggling with the problems of host-protocol design. Everyone had a vision of the potential for intercomputer communication, but no one had ever sat down to construct protocols that could actually be used. It wasn't BBN's job to worry about that problem. The only promise anyone from BBN had made about the planned-for subnetwork of IMPs was that it would move packets back and forth, and make sure they got to their destination. It was entirely up to the host computer to figure out how to communicate with another host computer or what to do with the messages once it received them. This was called the “host-to-host” protocol.

The computers themselves were extremely egocentric devices. The typical mainframe of the period behaved as if it were the only computer in the universe. There was no obvious or easy way to engage two diverse machines in even the minimal communication needed to move bits back and forth. You could connect machines, but once connected, what would they say to each other? In those days a computer interacted with the devices that were attached to it, like a monarch communicating with his subjects. Everything connected to the main computer performed a specific task, and each peripheral device was presumed to be ready at all times for a fetch-my-slippers type of command. (In computer parlance, this relationship is known as master-slave communication.) Computers were strictly designed for this kind of interaction; they sent instructions to subordinate card readers, terminals, and tape units, and they initiated all dialogues. But if another device in effect tapped the computer on the shoulder with a signal that said, “Hi, I'm a computer, too,” the receiving machine would be stumped. The goal in devising the host-to-host protocol was to get the mainframe machines talking as peers, so that either side could initiate a simple dialogue and the other would be ready to respond with at least an acknowledgment of the other machine's existence.
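
A rough, modern-day sketch can make the contrast concrete. In the master-slave arrangement only the mainframe could start a conversation; what the NWG wanted was symmetry, where either host could open a dialogue and the other would at least acknowledge it. The class and method names below are invented purely for illustration and do not come from any actual host software of the period.

```python
# A minimal sketch of peer symmetry, assuming nothing about real 1969 host code:
# either peer can initiate, and the other always answers with an acknowledgment.

class Peer:
    def __init__(self, name: str):
        self.name = name

    def hello(self, other: "Peer") -> str:
        """Initiate a dialogue; either peer may call this on the other."""
        return other.receive(f"HELLO from {self.name}")

    def receive(self, message: str) -> str:
        """Acknowledge any machine that announces itself instead of being stumped."""
        return f"ACK from {self.name}: got '{message}'"


ucla = Peer("UCLA Sigma-7")
sri = Peer("SRI SDS 940")
print(ucla.hello(sri))   # UCLA opens the conversation ...
print(sri.hello(ucla))   # ... and SRI can just as easily open one the other way.
```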

Steve Crocker once likened the concept of a host-to-host protocol to the invention of two-by-fours. “You imagine cities and buildings and houses, and so forth, but all you see are trees and forest. And somewhere along the way, you discover two-by-fours as an intermediate building block, and you say, well, I can get two-by-fours out of all these trees,” Crocker recalled. “We didn't have the concept of an equivalent of a two-by-four, the basic protocols for getting all the computers to speak, and which would be useful for building all the applications.” The computer equivalent of a two-by-four was what the Network Working Group was trying to invent.

In conceiving the protocol, the NWG members had to ask themselves a few basic questions. What form should the common base take? Should there be a single, foundational protocol on which to build all application protocols? Or should it be more complex, subdivided, layered, branched? Whatever structure they chose, they knew they wanted it to be as open, adaptable, and accessible to inventiveness as possible. The general view was that any protocol was a potential building block, and so the best approach was to define simple protocols, each limited in scope, with the expectation that any of them might someday be joined or modified in various unanticipated ways. The protocol design philosophy adopted by the NWG broke ground for what came to be widely accepted as the “layered” approach to protocols.
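
The flavor of that building-block philosophy can be suggested with a small sketch: each layer is narrow in scope, knows only about the layer directly beneath it, and can be swapped or recombined without disturbing the others. The layer names and interfaces below are illustrative assumptions, not the NWG's actual designs.

```python
# A hedged sketch of layering as composition: small protocols stacked so that
# each one relies only on the layer below it. All names here are hypothetical.

from typing import Callable

def imp_layer(payload: bytes) -> None:
    """Lowest layer: hand a payload to the IMP subnetwork (stubbed out here)."""
    print(f"IMP accepts {len(payload)} bytes")

def host_layer(payload: bytes, lower: Callable[[bytes], None] = imp_layer) -> None:
    """Middle layer: host-to-host conventions, built only on the IMP layer."""
    lower(b"HOST-HDR" + payload)        # add a host-level header, then pass down

def file_transfer(filename: str, lower: Callable[[bytes], None] = host_layer) -> None:
    """Top layer: an application protocol, built only on the host layer."""
    lower(f"SEND {filename}".encode("ascii"))

file_transfer("report1822.txt")   # each layer stays ignorant of the ones above it
```

Because every layer takes the one beneath it as a replaceable argument, any single protocol could later be joined or modified without rewriting the rest of the stack—the kind of unanticipated reuse the group hoped for.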

One of the most important goals of building the lower-layer protocol between hosts was to be able to move a stream of packets from one computer to another without having to worry about what was inside the packets. The job of the lower layer was simply to move generic unidentified bits, regardless of what the bits might define: a file, an interactive session between people at two terminals, a graphical image, or any other conceivable form of digital data. Analogously, some water out of the tap is used for making coffee, some for washing dishes, and some for bathing, but the pipe and faucet don't care; they convey the water regardless. The host-to-host protocol was to perform essentially the same function in the infrastructure of the network.
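
The pipe-and-faucet idea translates almost directly into code. In the hedged sketch below, a hypothetical lower-layer routine splits whatever bytes it is handed into packets without ever inspecting them; a chunk of a file and a string of terminal keystrokes are treated identically.

```python
# A sketch of a content-agnostic lower layer: it moves opaque bytes and never
# asks what they mean. The function name and packet size are invented here.

def host_to_host_send(payload: bytes) -> list[bytes]:
    """Split any payload into fixed-size packets, indifferent to its contents."""
    PACKET_SIZE = 8
    return [payload[i:i + PACKET_SIZE] for i in range(0, len(payload), PACKET_SIZE)]

# Two very different "applications" hand the same lower layer nothing but bytes.
file_chunk = b"...binary file contents..."
keystrokes = "ls -l\n".encode("ascii")   # an interactive terminal session

for packets in (host_to_host_send(file_chunk), host_to_host_send(keystrokes)):
    print(len(packets), "packets")       # the pipe conveys the water regardless
```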

Designing the host-to-host protocol wasn't the only job before the group. The NWG also had to write the network applications for specific tasks such as transferring files. As the talks grew more focused, it was decided that the first two applications should be for remote log-ins and file transfers.

In the spring of 1969, a few months before Kleinrock and the UCLA host team were expecting to receive the first IMP, a thick envelope arrived from Cambridge. The guys at UCLA had been anticipating it. Inside the package was BBN Report 1822, the newly written set of specifications for connecting host computers to the soon-to-be-delivered IMPs. The ARPA network finally seemed to be coming into place.

After months of guessing, the UCLA team now learned what it was expected to do to get its site ready and build its hardware interface. BBN Report 1822 also instructed the sites in creating a piece of software called a device driver—a collection of code and tables for controlling a peripheral device—to operate the host-to-IMP interface. And, finally, BBN's issuance of the specifications clarified the boundary between the IMP and the host. It was clear that BBN planned to include no special software in the IMP for performing host-to-host communication. That problem would be left, once and for all, to the host computer and, thus, to the NWG.
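
For readers unfamiliar with the term, a rough sketch of what “a collection of code and tables” amounts to may help: a dispatch table mapping message types to handler routines, plus a loop that reads from the interface and consults the table. Every message type, handler, and queue below is a hypothetical stand-in; none of it is drawn from BBN Report 1822.

```python
# A loose sketch of a device-driver structure, assuming invented message types.

# The "tables": map message types arriving from the IMP to handler routines.
def handle_regular(msg: bytes) -> None:
    print("deliver to host software:", msg)

def handle_error(msg: bytes) -> None:
    print("log IMP error indication:", msg)

DISPATCH = {0: handle_regular, 1: handle_error}

# The "code": a driver loop that reads (message_type, body) pairs from the
# interface and dispatches them. Here the hardware is faked by a plain list.
fake_interface_queue = [(0, b"hello from SRI"), (1, b"link down")]

def driver_loop(queue) -> None:
    for msg_type, body in queue:
        DISPATCH.get(msg_type, handle_error)(body)

driver_loop(fake_interface_queue)
```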

This meant a lot of summertime work for the students in Los Angeles. They thought they might be able to finish building the host-to-IMP interface in time. But writing the host-to-host protocol had already stalled Crocker, Cerf, and the entire Network Working Group for months. Rather than try to rush something out in time, they decided to tell every site to patch together its own makeshift protocols for the time being.

UCLA asked technicians at Scientific Data Systems, makers of the Sigma-7, to build the interface hardware for their host-to-IMP connection. The company's response was discouraging: It would take months and probably wouldn't be finished in time for the IMP's arrival. Moreover, the company wanted tens of thousands of dollars for the job. So when a graduate student named Mike Wingfield asked for the job, he got it. And why not? Wingfield was a whiz at hardware and had just finished building a complex graphics interface for another computer.

BBN's specification for the host-to-IMP interactions and connections was a splendid blueprint. A cookbook of sorts, written by Bob Kahn in crystalline prose, the document was accompanied by detailed diagrams. Kahn's specification gave Wingfield the basic requirements for mating the Sigma-7 to an IMP. Almost before Wingfield knew it, the summer had flown by and the interface had been built without a hitch.

One week before the IMP was scheduled to arrive on September 1, Wingfield had the hardware finished, debugged, and ready to connect to the IMP. Crocker was so impressed he described it as a “gorgeous” piece of work. But, trying to get the communications software done, Crocker was running behind. He had a tendency to procrastinate anyway, and the absence of the actual IMP had only encouraged this tendency.

Now, like anyone trying to outsmart a deadline, Crocker looked at the calendar and did a few calculations. He counted on having at least one extra day, since September 1 was Labor Day. Moreover, he had heard BBN was having some troubles with the IMP's timing chain. Synchronizer bugs were horribly nasty. Their bug was his good fortune, and with a little luck it might even buy him an extra week or two, he thought. So he was more than mildly surprised when Len Kleinrock told him that BBN was about to put the IMP on an airplane due to arrive in Los Angeles on Saturday, August 30, two days early.

In Cambridge, Frank Heart was preoccupied with the question of how best to ship the IMP to UCLA. After debating for a couple of days, Heart decreed that it should go by air, and that Ben Barker should go with it. A commercial flight was out of the question. The modified Honeywell 516—now officially BBN Interface Message Processor Number One—was just too big for the cargo bay of a passenger plane. It had to go out by air freight. Had Heart been able to, he would have put Barker straight into the cargo plane's hold with his wrist handcuffed to the IMP. Never mind that he had chosen the machine precisely because it was battle-hardened; the rigors of combat were nothing compared to the punishment airline freight handlers could dish out. “He wanted somebody to be there, yelling at the cargo people to make sure they weren't careless with it,” Barker recalled. But Barker would have to travel separately on a commercial passenger flight. Truett Thach, a technician in BBN's Los Angeles office, would be told to meet the plane.

Once the IMP was crated, Barker took a red Magic Marker and in large letters wrote DO IT TO IT TRUETT on the side of the crate. It was loaded onto an early-morning flight out of Boston's Logan Airport, and Thach was there to meet it at LAX that afternoon. When he arrived, accompanied by a freight mover, he was relieved to watch the crate come off the plane but appalled when he noticed that Barker's message to him was upside down. “Somewhere along the way, the IMP had been inverted an odd number of times,” he observed. Thach had the shippers right the box before loading it on their truck. Then he followed them to UCLA.
