Still a Berkeley freshman, Deutsch had been involved with Genie only a few months when Butler Lampson came across him. Deutsch explained that Genie's goal was to refashion the SDS 930 into a small-scale time-sharing machine, and that the project was run out of the electrical engineering department by David Evans and Mel Pirtle—the first an unassuming computer science professor who limited himself as much as possible to such tasks as raising grant money from the government, the second a garrulous Californian graduate student who designed the hardware.
Lampson felt irresistibly drawn to this remote corner of the university campus. "I found out from Peter what was going on, then I started to hang around there a lot," he recalled. "After a while it became clear that this was going to be a lot more interesting than physics."
With Lampson on board, Genie picked up momentum. The group tore
apart the SDS 930, tacked on new hardware, and wrote an entirely new
operating system. "There weren't any spectacularly new ideas in the project," Lampson said later. "The point was to try to take ideas that other
people had, some of which had been implemented on other machines,
and show you could make it all work in a much less grandiose environment." Genie accomplished its goal, which was to bring time-sharing to
the masses by implementing it on the small machine that Taylor and Currie eventually beguiled Palevsky into marketing as the SDS 940.
The Genie team then turned to the eternal question of what to do for
an encore. They were a powerful group of talents, especially after
Deutsch and Lampson, as good at designing and debugging operating
systems as anyone in the field, were joined by Chuck Thacker, whose
hardware skills represented the third side, so to speak, of a very sturdy triangle.
Growing up poor and fatherless in a suburb of Los Angeles, Thacker
had paid his way through school with a succession of jobs at small local
engineering shops, including one that made the devices Civil Defense
would use to measure ground radiation after the Bomb dropped. (This
was the 1950s, after all.) He had always been an electronics nut—he could still remember the day he acquired his very first transistor as a schoolboy—but it was from these shirt-sleeved shop men that he learned to pare a decent design into a manufacturable one by stripping it down to its frugal essence. "They were the real engineers' engineers," he said.
At Caltech, where he made a short and unsuccessful first run at obtaining a bachelor's degree, physics was divided into two distinct
obtaining a bachelor's degree, physics was divided into two distinct
parts. There was the theory side, which involved a lot of math and cosmological speculation, and what Thacker called "the giant tinker toy
side," which involved building immense, elaborately engineered structures like synchrotrons and cyclotrons. That was the part he loved.
In fact, Thacker was animated by the same love of gadgetry that
lured countless other physicists like himself into computing. When he
moved north to Berkeley to get away from the L.A. smog and give his
faltering academic career a fresh start, he fell in among the computing
crowd, a course that led him inexorably to the same unmarked door
Lampson had discovered a few months earlier.
Now it was 1968. Work on the 940 had ended and Dave Evans had
relocated to the University of Utah, leaving Pirtle and the others to
think about working on a much larger canvas than the 940—a time-sharing system, for example, that would serve not a dozen but 500
users at a time. They imagined a machine with several processors, each
assigned a specific task and all interconnected, like the tentacles of
mating octopuses. It was huge, exciting, innovative, and envisioned not
as an academic or government-funded venture, but strictly as a commercial one. Thus was Berkeley Computer Corporation born.
Although BCC was based on speculative technology, its financial
structure appeared at first glance to be made of sterner stuff. Pirtle
had arranged through his Wall Street connections to secure $2 million
in financing from a company called Data Processing Financial and
General, underwritten by the white-shoe investment firm of White,
Weld & Co. That sounded like plenty, but it was only seed money. The
team figured that bringing the Berkeley 1 computer to market would
consume that sum many times over, which meant they would have to
become very familiar with the demands of bankers and the intricacies
of high finance.
"This was definitely not your two-guys-in-a-garage startup," said
Lampson, who by now held a faculty appointment at Berkeley and set
his own name down as a co-founder. It was, however, something infinitely more risky. The BCC pioneers were about to become victims of
the "second-system effect."
The theory of second systems was formulated by an IBM executive
named Frederick Brooks, whose career supervising large-scale software teams taught him that designers of computer systems tend to
build into their second projects all the pet features that tight finances
or short deadlines forced them to leave out of their first. The result is
an overgrown, inefficient monstrosity that rarely works as expected. As
he put it in his pithy masterpiece, The Mythical Man-Month: "The second is the most dangerous system a man ever designs."
The BCC machine could have sprung full-blown from the pages of
Brooks's text. As Lampson recalled, the designers of the economical
and practical SDS 940 regarded their next machine as an opportunity
to "look at all the things you could make much more wonderful, and
plan to make them all more wonderful by creating a system that could
handle a lot more users and much larger programs and was much faster and used computing resources much more efficiently and was better and more wonderful in every possible way."
"It was
not a very realistic
enterprise," he
acknowledged.
"But
at the
time it seemed great, the proper next
step, as
second systems often do."
Their exuberance made Berkeley Computer, by all accounts, a jolly place to work, its scientists and engineers propelled by pure hubris into working the kind of inhuman hours that would become a Silicon Valley cliché—when Silicon Valley came into its own fifteen years later.
They believed they were breaking new ground in computer design, and they were right. Among other things, their machine incorporated "virtual memory," a system for swapping jobs from disk to memory and back again that enabled it to accommodate much more activity than its physical specifications otherwise would, like a house with the exterior dimensions of a bungalow but the interior floor plan of a regal palace.
In hardware terms, however, the machine was a beast. "The machine consisted of a number of specialized processors, one to handle the disk and drum input/output system, one to handle communications, and one to handle job scheduling," recalled Thacker, who designed them. "And these things cooperated with two processors which were somewhat larger, which were the central processors for the machine which actually ran the user jobs." The processors all had to be physically connected to each other and also to the memory, which required a couple of miles of cable snaking among eight six-foot-tall cabinets full of equipment, and then out to peripherals such as teletypes and line printers.
Some of the workers, including Thacker, could tell early on that the project was getting out of hand. The engineers' engineer possessed the unique trait of aiming for less, not more, in his systems. "This was so unusual for an engineer," recalled Charles Simonyi, a young immigrant from communist Hungary who assisted Thacker, watching as he chain-smoked through the night designing the machine's logic. "He had this word for what was happening. He called it 'biggerism.' I heard this word from him and my English was not that good and I always thought it sounded slightly obscene, because he'd say, you know, 'This project has been biggered.'"
Adding to the challenge, the hardware comprised an unwieldy mix of
ancient and modern components. The processors were built using a
brand-new, and far from bug-free, technology known as TTL (for
"transistor-transistor logic"). But the read-only memory, which carried
the machine's basic operating code, was an array of diodes soldered
onto some 20 circuit boards—hundreds of thousands of tiny diodes
each representing a digital bit. Editing the operating code during the
debugging phase, Simonyi recalled, meant finding the errant diode on
a foot-square circuit board, clipping it by hand with a wire cutter, then
drilling new holes and soldering a fresh diode in place. It was like editing text with a hammer and chisel.