By Douglas Edwards
I was no expert on computer hardware. I had read an article or two about servers, hubs, and routers, but I pronounced "router" as if it rhymed with "tooter" instead of "outer." Given my profound lack of technical expertise and my bad computer karma, why would any company allow me in the same room as its computational nerve center? That requires a bit of explanation.
In late 1999, Google began accelerating its climb to market domination. The media started whispering about the first search engine that actually worked, and users began telling their friends to give Google a try. More users meant more queries, and that meant more machines to respond to them. Jim and Schwim worked balls-to-the-wall to add capacity. Unfortunately, computers had suddenly become very hard to get. At the height of the dot-com madness, suppliers were so busy with big customers that they couldn't be bothered fending off the hellhounds of demand snapping at Google's heels. A global shortage of RAM (memory) made it worse, and Google's system, which had never been all that robust, started wheezing asthmatically.
Part of the problem was that Google had built its system to fail.
"Build machines so cheap that we don't care if they fail. And if they fail, just ignore them until we get around to fixing them." That was Google's strategy, according to hardware designer Will Whitted, who joined the company in 2001. "That concept of using commodity parts and of being extremely fault tolerant, of writing the software in a way that the hardware didn't have to be very good, was just brilliant." But only if you could get the parts to fix the broken computers and keep adding new machines. Or if you could improve the machines' efficiency so you didn't need so many of them.
The first batch of Google servers had been so hastily assembled that the solder points on the motherboards touched the metal of the trays beneath them, so the engineers added corkboard liners as insulation. It looked cheap and flimsy, but it prevented the CPUs (central processing units) from shorting out. Next, Larry focused on using space more efficiently and cutting out as many expensive parts as possible. He, Urs, and a couple of other engineers dumped out all the components on a table and took turns arranging the pieces on the corkboard tray like a jigsaw puzzle.
Their goal was to squeeze in at least four motherboards per tray. Each tray would then slide into a slot on an eight-foot-tall metal rack. Since servers weren't normally connected to displays, they eliminated space-hogging monitor cards. Good riddance—except that when something died the ops staff had no way to figure out what had gone wrong, because they couldn't attach a monitor to the broken CPU. Well, they could, but they'd have to stick a monitor card in while the machine was live and running, because Larry had removed the switches that turned the machines off.
"Why would you ever want to turn a server off?" he wondered. Perhaps because plugging a monitor card into an active computer could easily short out the motherboard, killing the whole machine.
After the engineers crammed four boards onto each tray, the one in the back couldn't be reached from the front. To fix it the technician would have to pull the tray out of the rack, but the trays were packed so tightly that yanking on one would cause the trays directly above it and below it to start sliding. With cables wrapped around every surface like lovelorn anacondas, that could unplug everything and shut down the entire rack.
That's how my chance to perform bypass surgery on Google's still-beating heart came about. My comrades and I would be disconnecting the cables one by one and reconnecting them in tightly tied bundles running in plastic troughs along the side of the server trays instead of in front of them, making it easier to move the trays in and out of the racks. Even marketeers could use a twist-tie, so we were encouraged to get our hands dirty mucking out the server farm.
"CableFest '99 lays the groundwork for the frictionless exchange of information on a global scale and will increase the knowledge available to every sentient being on the planet," I assured my wife.
Kristen looked at me and sadly shook her head. She had a PhD in Soviet history, a job as a professor, and a very sensitive bullshit detector. She tried to be supportive, but her maternal instincts were primarily focused on the three children she now worried would see little of their father. "You took a giant pay cut, and now you're working weekends. You know, the Merc might still want you back."
Saturday morning came and I pulled into the almost empty parking lot of a large, gray, windowless edifice in Santa Clara. There was no sign in front, but it was Exodus, the co-lo that housed our data center.
I joined the movement of people straggling single file through a well-fortified security checkpoint. Marketing, finance, and facilities were all represented. Even Charlie Ayers, our newly hired chef, was there. Photo IDs were checked and badges issued. Stern warnings were given. We were not, repeat, not to touch anyone else's stuff.
And then we were in.
Unless you're a sysadmin, electrician, or NSA stenographer, you may never have been inside a server farm. Imagine an enormous, extremely well-kept zoo, with chain-link walls draped from floor to ceiling creating rows of large fenced cages vanishing somewhere in the far, dark reaches of the Matrix. Inside each cage is a mammoth case (or several mammoth cases) constructed of stylish black metal and glass, crouched on a raised white-tile floor into which cables dive and resurface like dolphins. Glowing green and red lights flicker as disks whir, whistle, and stop, but no human voices are ever heard as frigid air pours out of exposed ceiling vents and splashes against shiny surfaces and around hard edges.
The overwhelming impression, as Jim led us past cage after cage of cooled processing power, was of fetishistic efficiency. Clean, pristine, and smoothly sculpted, these were more than machines; they were totems of the Internet economy. Here was eBay. Here Yahoo. Here Inktomi. Welcome to Stonehenge for the Information Age.
The common design element seemed to be a mechanized monolith centered in each cage, surrounded by ample space to set up a desk and a few chairs, with enough room left over for a small party of proto-humans to dance about beating their chests and throwing slide rules into the air.
At last we arrived at Google's cage. Less than six hundred square feet, it felt like a shotgun shack blighting a neighborhood of gated mansions. Every square inch was crammed with racks bristling with stripped-down CPUs. There were twenty-one racks and more than fifteen hundred machines, each sprouting cables like Play-Doh pushed through a spaghetti press. Where other cages were right-angled and inorganic, Google's swarmed with life, a giant termite mound dense with frenetic activity and intersecting curves. Narrow aisles ran between the rows of cabinets, providing barely enough space to pass if you didn't mind shredding clothes and skin on projecting screws and metal shards.
It was improbably hot after our stroll through a freezer to get there, and we were soon sweating and shedding outerwear. On the floor, sixteen-inch metal fans vibrated and vainly pushed back against the heat seeping out from the racks around us—their feeble force doing little more than raise the temperature of Inktomi's adjacent cage by a few degrees.
We went to work. First the ops team attached Panduit cable troughs to the sides of the cabinets with adhesive tape. Then we began gently placing the free-hanging cables in the troughs and twist-tying them together so they no longer draped over the face of the machines like the bangs of a Harajuku Girl.
I tackled the rack labeled "U." It has long since been retired, but I like to think that those user queries routed to U got their answers a nanosecond or two faster because of my careful combing of the cables.
Why, you might ask, did Google do things this way? In addition to the efficiency gained by running cheap, redundant servers, Google was exploiting a loophole in the laws of co-lo economics. Exodus, like most hosting centers, charged tenants by the square foot. So Inktomi paid the same amount for hosting fifty servers as Google paid for hosting fifteen hundred. And the kicker? Power, which becomes surprisingly expensive when you gulp enough to light a neighborhood, was included in the rent. When Urs renegotiated the lease with Exodus, Jim spelled out exactly how much power he needed. Not the eight twenty-amp circuits normally allocated to a cage the size of Google's; he wanted fifty-six.
"You just want that in case there's a spike, right?" asked the Exodus sales rep with a look of surprise. "There's no way you really need that much power for a cage that size."
"No," Jim told him. "I really need all fifty-six to run our machines."
It's rumored that at one point Google's power consumption exceeded Exodus's projections fifty times over.
It didn't help that Google sometimes started all of its machines at once, which blew circuit breakers left and right until Google instituted five-second delays to keep from burning down the house.
Air-conditioning came standard, too. Again, Exodus based their calculations on a reasonability curve. No reasonable company would cram fifteen hundred micro-blast furnaces into a single cage, because that would require installing a separate A/C unit. Google did. We were a high-maintenance client.
CableFest '99 was the one and only time I entered a Google data center. It gave me an appreciation of the magnitude of what we were building and how differently we were doing it. I can't say it inspired confidence to lay my untrained hands on our cheap little generic servers, lying open to the controlled elements on crumbly corkboards, while next door, Inktomi's high priests tended to sleek state-of-the-art machines that loomed like the Death Star. But the arrangement seemed to work pretty well for us, and I decided not to worry about things that were beyond my ken.
Very smart people were obsessing about the viability of Google's back end, and unbeknownst to me, I would soon be obsessing about the viability of my own.
"Once she had accomplished that," Cindy was explaining to our small marketing team, "she had the world by its oyster."
I smiled. New fodder for the quote board I'd pinned up on my cubicle wall, which still featured Cindy's last pronouncement, "That's what happens when that happens."
Our department consisted of a small cadre with mixed levels of experience in marketing. Cindy was the boss and acting VP. She was close to my age, very funny (usually intentionally), and always in a hurry, which led to an alarming number of emails in which her fingers failed to keep up with her thoughts. She had started as a print journalist, then done PR duty under some of the most notorious tycoons in the Valley, where she had become personally acquainted with every reporter who talked or typed about technology. She focused on public relations, which Larry and Sergey supported as the most cost-effective way to promote the company.
Cindy exuded a wholesome Laura Petrie vibe that I found comforting, and I felt a connection with her because of our common history at newspapers. As she bounced around the department, a whirling dynamo of positive energy, she urged us to take risks, try new things, and let nothing stand in our way. We started referring to her as "Small. But mighty." Those qualities cut both ways.
"Larry and Sergey were always skeptical about traditional marketing," Cindy recalls. "They wanted Google to stand apart from others by not doing what everyone else was doing ... Let the other guys with inferior products blow their budgets on noise-making, while we stayed focused on building a better mousetrap." That skepticism translated into constant questioning about everything marketing proposed. The department only existed because someone (a board member or a friend from Stanford) had insisted the founders needed people to do all the stuff that wasn't engineering.
Cindy pushed back against the constant pressure to prove her department was not a waste of payroll, but she also let us know that expectations were high. When we performed below her professional standards, she rebuked us for "Mickey Mouse behavior" with an intensity as devastating and unexpected as the tornadoes that swept her native Nebraska. I learned to keep an eye out for storm warnings.
My counterpart on the offline branding side of things was Shari Fujii, a thin, thoughtful, hyperkinetic marketing professional with an MBA and a tendency to exclaim that the impact of any given action would be "huge." We often commiserated about Larry and Sergey's abysmal lack of regard for our department and its work. Coming out of a company run by journalists, I found it more of the same, but Shari struggled to make it fit with her experience at brand-driven companies, where marketing summoned the sun to begin each new day.
The other key player in my world was Karen White, the webmaster. Karen had been a casino dealer in North Dakota when she decided to teach herself the ins and outs of creating web pages. Cindy had discovered her at a previous job and brought her to Google. I soon understood why. Karen had the organizational skills and disposition of a NASA launch coordinator. Industrious, objective, unflappable, and willing to stretch her day across multiple time zones, Karen took all the words I threw together and arranged them in pretty columns on our website. She had more influence on the overall look of Google than anyone who worked on it after Larry and Sergey's original "non-design" design.
Other than Susan Wojcicki, who had put her MBA to work at Intel, our group was new to marketing. Google hired Stanford grads in bulk and set them loose in the halls. If they didn't secure a role elsewhere, they rolled downhill to our department, where the assumption seemed to be that no special skills were required.