The Higgs Boson: Searching for the God Particle

Illustration by Slim Films

In December 1994 a full prototype
section of the LHC was operated for 24
hours, demonstrating that the key technical
choices for the magnets are correct.
Since then, tests on prototypes have simulated
about 10 years of running the
LHC. Magnets that surpass the design
criteria are now being produced in industry
and delivered to CERN for final
testing and subsequent installation.

With the 1993 demise of the planned
40-TeV SSC, the 14-TeV LHC became
the only accelerator project in the world
that can support a diverse research program
at the high-energy frontier. The
LHC’s intense beams present those designing
the experiments with remarkable
challenges of data acquisition. The beams
will consist of proton bunches strung like
beads on a chain, 25 billionths of a second
apart. At each collision point, pairs of
these bunches will sweep through each
other 40 million times a second, each time
producing about 20 proton-proton collisions.
Collisions will happen so often that
particles from one collision will still be flying
through the detectors when the next
one occurs!

Of these 800 million collisions a second,
only about one in a billion will involve
a head-on quark collision. To keep
up with this furious pace, information
from the detector will go into electronic
pipelines that are long enough to hold the
data from a few thousand collisions. This
will give “downstream” electronics
enough time to decide whether a collision
is interesting and should be recorded before
the data reach the end of the pipeline
and are lost. LHC detectors will have tens
of millions of readout channels. Matching
up all the pipelined signals that originate
from the same proton-proton collision
will be a mind-boggling task.
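
For readers who want to check the arithmetic, here is a small back-of-envelope sketch in Python. The 25-nanosecond bunch spacing and the 20 collisions per crossing come from the text above; the pipeline depth of 3,000 crossings is an illustrative stand-in for “a few thousand.”

    # Back-of-envelope rates for the LHC readout pipelines (illustrative figures).
    bunch_spacing_s = 25e-9        # 25 billionths of a second between bunch crossings
    collisions_per_crossing = 20   # about 20 proton-proton collisions per crossing

    crossing_rate_hz = 1.0 / bunch_spacing_s                       # 40 million crossings a second
    collision_rate_hz = crossing_rate_hz * collisions_per_crossing

    # "A few thousand collisions" of pipeline depth -- 3,000 crossings assumed here.
    pipeline_depth_crossings = 3000
    decision_time_s = pipeline_depth_crossings * bunch_spacing_s

    print(f"crossings per second : {crossing_rate_hz:,.0f}")      # 40,000,000
    print(f"collisions per second: {collision_rate_hz:,.0f}")     # 800,000,000
    print(f"trigger decision time: {decision_time_s * 1e6:.0f} microseconds")  # ~75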

When Quarks Collide

A toroidal LHC apparatus (ATLAS) detector uses a novel toroidal magnet system. Protons collide in the center, producing a spray of particles. The concentric layers of ATLAS detect different species of particles, some precisely tracking the particle trajectories, others (“calorimeters”) measuring the energy the particles carry.

Illustration by Slim Films

Particle detectors are the physicists’
electronic eyes, diligently watching
each collision for signs of interesting
events. LHC will have four particle detectors.
Two will be giants, each built like
a Russian matryoshka doll, with modules
fitting snugly inside modules and a beam
collision point at the center. Each module,
packed with state-of-the-art technology,
is custom-built to perform specific observations
before the particles fly out to the
next layer. These general-purpose detectors,
ATLAS and CMS, standing up to 22
meters high, will look for Higgs particles
and supersymmetry and will be on the
alert for the unexpected, recording as
much as possible of the collision debris.
Two smaller detectors, ALICE and LHCb,
will concentrate on different specific areas
of physics.

Both ATLAS and CMS are optimized
to detect energetic muons, electrons and
photons, whose presence could signal the
production of new particles, including
Higgs bosons. Yet they follow very different
strategies. Years of computer simulations
of their performance have shown
that they are capable of detecting whatever
new phenomena nature may exhibit.
ATLAS (a toroidal LHC apparatus) is
based on an enormous toroidal magnet
equipped with detectors designed to identify
muons in air. CMS (compact muon solenoid)
follows the more traditional approach
of using chambers inside the
return yoke of a very powerful solenoidal
magnet to detect muons.

COMPACT MUON SOLENOID
(CMS) detector uses a more traditional
magnet design than ATLAS does and is
optimized for detecting muons. CMS has muon
detectors (yellow) interleaved with iron layers (orange) that
channel the magnetic field produced by the superconducting
solenoid coil. The electromagnetic calorimeter (blue) contains 80,000
lead-tungstate crystals for detecting electrons and photons. Above, a computer
simulation shows a collision in which a Higgs particle decays into two muons
(the tracks at about “4 o’clock”) and two jets of hadrons (at about “11 o’clock”).

Illustration by Slim Films

Part of the CMS detector will consist
of crystals that glow, or scintillate, when
electrons and photons enter them. Such
crystals are extremely difficult to make,
and CMS benefits from the experience
gained from a recent CERN experiment,
L3, which also used crystals. (The L3 detector
was one of four that operated from
1989 to 2000 at the LEP collider, performing
precision studies of the weak
force that told us that exactly three types
of zero- or low-mass neutrino exist.) Before
L3, such crystals had been made only
in small quantities, but L3 needed 11,000
of them. Crystals of the type developed
for L3 have been widely used in medical
imaging devices. CMS needs more than
seven times as many crystals made of a
more robust material. In due course the
superior CMS crystals are likely to have
an even bigger effect on the medical field.

ALICE (a large ion collider experiment)
is a more specialized experiment
that will come into its own when the LHC
collides nuclei of lead with the colossal energy
of 1,150 TeV. That energy is expected
to “melt” the more than 400 protons
and neutrons in the colliding nuclei, releasing
their quarks and gluons to form a
globule of quark-gluon plasma (QGP),
which dominated the universe about 10
microseconds after the big bang. ALICE
is based around the magnet of the L3 experiment,
with new detectors optimized
for QGP studies.

There is good evidence that experiments
at CERN have already created a
quark-gluon plasma. Over the coming
years, Brookhaven National Laboratory’s
Relativistic Heavy Ion Collider (RHIC)
has a good chance of studying QGP in detail
by packing 10 times more energy per
nucleon into its collisions than CERN
does. The LHC will extend this by a further
factor of 30. The higher energy at the
LHC will complement the more varied
range of experiments at RHIC, guaranteeing
a thorough study of an important
phase in the universe’s early evolution.

B mesons, the subject of LHCb’s investigations,
could help tell us why the
universe is made of matter instead of
equal amounts of matter and antimatter.
Such an imbalance can arise only if heavy
quarks and antiquarks decay into their
lighter cousins at different rates. The Standard
Model can accommodate this phenomenon,
called CP violation, but probably
not enough of it to account completely
for the dominance of matter in the
universe. Physicists observed CP violation
in the decay of strange quarks in the 1960s,
but data on heavy “bottom” quarks and
antiquarks, the constituents of B mesons,
are also needed to establish whether the
Standard Model description is correct.

In 1999 experiments began at two B factories in California and Japan that can produce tens of millions of B mesons a
year. These experiments have observed
the CP violation predicted by the Standard
Model in one B meson decay mode.
The high luminosity of the LHC beams
can churn out a trillion B mesons a year
for LHCb. This will allow much higher
precision studies in a wider variety of circumstances
and perhaps uncover crucial
exotic decay modes too rare for the other
factories to see clearly.

A Laboratory for the World

Scientific experiments as ambitious
as the LHC project are too expensive
to be palatable for any one country.
Of course, international collaboration
has always played a role in particle
physics, scientists being attracted to the
facilities best suited to their research interests,
wherever situated. As detectors
have become larger and costlier, the size
and geographic spread of the collaborations
that built them have grown correspondingly.
(It was the need to facilitate
communication between the LEP collaborations
that stimulated the invention of
the World Wide Web by Tim Berners-Lee at CERN.)

The LHC accelerator originally had
funding only from CERN’s (then) 19 European
member states, with construction
to occur in two phases on a painfully slow
timetable—a poor plan scientifically and
more expensive in toto than a faster, single-phase development. Fortunately, additional
funds from other countries
(which will provide some 40 percent of
the LHC’s users) will speed up the project.
Contributions of money or labor have
been agreed to by Canada, India, Israel,
Japan, Russia and the U.S. For example,
Japan’s KEK laboratory will supply 16
special focusing magnets. The U.S., with
more than 550 scientists already involved,
will furnish the largest national
group; accelerator components will be
designed and fabricated by Brookhaven,
Fermilab and Lawrence Berkeley National
Laboratory.

Furthermore, 5,000 scientists and engineers
in more than 300 universities and
research institutes in 50 countries on six
continents are building the ATLAS and
CMS detectors. When possible, components
will be built in the participating institutions,
close to students (who get great
training by working on such projects) and
in collaboration with local industries. The
data analysis will also be dispersed. It will
be a formidable challenge to manage
these projects, with their stringent technical
requirements and tight schedules,
while maintaining the democracy and
freedom for scientific initiatives that are
essential for research to flourish.

Until now, CERN has been primarily
a European laboratory. With the LHC, it
is set to become a laboratory for the
world. Already its 7,000 scientific users
amount to more than half the world’s experimental
particle physicists. In 1994
John Peoples, Jr., then director of Fermilab,
summed it up nicely: “For 40 years,
CERN has given the world a living demonstration
of the power of international
cooperation for the advancement of human
knowledge. May CERN’s next 40
years bring not only new understanding
of our Universe, but new levels of understanding
among nations.”

-Originally published: Scientific American 13, 52-59 (March 2003)

The Discovery Machine

by Graham P. Collins

You could think of it as the biggest, most
powerful microscope in the history of
science. The Large Hadron Collider
(LHC), now being completed underneath a circle
of countryside and villages a short drive from
Geneva, will peer into the physics of the shortest
distances (down to a nano-nanometer) and the
highest energies ever probed. For a decade or
more, particle physicists have been eagerly awaiting
a chance to explore that domain, sometimes
called the terascale because of the energy range
involved: a trillion electron volts, or 1 TeV. Significant
new physics is expected to occur at these
energies, such as the elusive Higgs particle
(believed to be responsible for imbuing other particles
with mass) and the particle that constitutes
the dark matter that makes up most of the material
in the universe.

The mammoth machine, after a nine-year
construction period, is scheduled (touch wood)
to begin producing its beams of particles later
this year. The commissioning process is planned
to proceed from one beam to two beams to colliding
beams; from lower energies to the terascale;
from weaker test intensities to stronger
ones suitable for producing data at useful rates
but more difficult to control. Each step along the
way will produce challenges to be overcome by
the more than 5,000 scientists, engineers and
students collaborating on the gargantuan effort.
When I visited the project last fall to get a firsthand
look at the preparations to probe the high-energy frontier, I found that everyone I spoke to
expressed quiet confidence about their ultimate
success, despite the repeatedly delayed schedule.
The particle physics community is eagerly awaiting
the first results from the LHC. Frank Wilczek
of the Massachusetts Institute of Technology
echoes a common sentiment when he speaks
of the prospects for the LHC to produce “a golden
age of physics.”

A Machine of Superlatives

To break into the new territory that is the terascale,
the LHC’s basic parameters outdo those
of previous colliders in almost every respect. It
starts by producing proton beams of far higher
energies than ever before. Its nearly 7,000 magnets,
chilled by liquid helium to less than two
kelvins to make them superconducting, will
steer and focus two beams of protons traveling
within a millionth of a percent of the speed of
light. Each proton will have about 7 TeV of
energy—7,000 times as much energy as a proton
at rest has embodied in its mass, courtesy of
Einstein’s E = mc². That is about seven times the
energy of the reigning record holder, the Tevatron
collider at Fermi National Accelerator Laboratory
in Batavia, Ill. Equally important, the
machine is designed to produce beams with 40
times the intensity, or luminosity, of the Tevatron’s
beams. When it is fully loaded and at
maximum energy, all the circulating particles
will carry energy roughly equal to the kinetic
energy of about 900 cars traveling at 100 kilometers
per hour, or enough to heat the water for
nearly 2,000 liters of coffee.
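
A rough cross-check of these comparisons takes only a few lines of Python. The 2,808 bunches and 1.15 × 10^11 protons per bunch are the nominal LHC design values; the car mass and the coffee temperatures are my own assumptions, chosen only to make the comparison concrete.

    # Rough cross-check of the stored-beam-energy comparisons (assumptions noted).
    joules_per_ev = 1.602e-19

    proton_energy_j = 7e12 * joules_per_ev   # 7 TeV per proton
    bunches_per_beam = 2808                  # nominal design value
    protons_per_bunch = 1.15e11              # "up to 100 billion" in the text

    total_beam_energy_j = 2 * proton_energy_j * bunches_per_beam * protons_per_bunch

    # Kinetic energy of 900 cars at 100 km/h, assuming a 1,500 kg car.
    car_energy_j = 900 * 0.5 * 1500.0 * (100.0 / 3.6) ** 2

    # Heating 2,000 liters of water from 20 C to 95 C for the coffee.
    coffee_energy_j = 2000.0 * 4186.0 * (95.0 - 20.0)

    print(f"both beams     : {total_beam_energy_j / 1e6:.0f} MJ")  # ~720 MJ
    print(f"900 cars       : {car_energy_j / 1e6:.0f} MJ")         # ~520 MJ
    print(f"2,000 L coffee : {coffee_energy_j / 1e6:.0f} MJ")      # ~630 MJ

The three figures land within a factor of about 1.5 of one another, which is as close as such comparisons are meant to be.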

The protons will travel in nearly 3,000
bunches, spaced all around the 27-kilometer
circumference of the collider. Each bunch of up
to 100 billion protons will be the size of a needle,
just a few centimeters long and squeezed
down to 16 microns in diameter (about the same
as the thinnest of human hairs) at the collision
points. At four locations around the ring, these
needles will pass through one another, producing
more than 600 million particle collisions every
second. The collisions, or events, as physicists
call them, actually will occur between particles
that make up the protons—quarks and
gluons. The most cataclysmic of the smashups
will release about a seventh of the energy available
in the parent protons, or about 2 TeV. (For
the same reason, the Tevatron falls short of exploring
terascale physics by about a factor of
five, despite the 1-TeV energy of its protons and
antiprotons.)
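
The difference between the 800-million-per-second figure quoted earlier and the “more than 600 million” here comes from the fact that not every 25-nanosecond slot around the ring actually carries a bunch. A short sketch, assuming the nominal filling scheme of 2,808 bunches in 3,564 possible slots:

    # Effective collision rate with a partially filled ring (nominal filling assumed).
    slots, filled = 3564, 2808
    crossing_rate_hz = 40e6 * filled / slots         # about 31.5 million crossings a second
    collisions_per_second = crossing_rate_hz * 20    # about 20 collisions per crossing
    print(f"{collisions_per_second / 1e6:.0f} million collisions per second")  # ~630

    # Energy available in the hardest quark-gluon collisions: about a seventh
    # of the 14 TeV carried by the two parent protons.
    print(f"about {14 / 7:.0f} TeV")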

Four giant detectors—the largest would
roughly half-fill the Notre Dame cathedral in Paris,
and the heaviest contains more iron than the
Eiffel Tower—will track and measure the thousands
of particles spewed out by each collision
occurring at their centers. Despite the detectors’
vast size, some elements of them must be positioned
with a precision of 50 microns.

The nearly 100 million channels of data
streaming from each of the two largest detectors
would fill 100,000 CDs every second, enough to
produce a stack to the moon in six months. So
instead of attempting to record it all, the experiments
will have what are called trigger and data-acquisition
systems, which act like vast spam filters,
immediately discarding almost all the information
and sending the data from only the
most promising-looking 100 events each second
to the LHC’s central computing system at
CERN, the European laboratory for particle
physics and the collider’s home, for archiving
and later analysis.
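
Two numbers convey how drastic that filtering is. Taking the collision rate from elsewhere in the article and a nominal 700 megabytes per CD (an assumption), the sketch below works out the rejection factor and the raw data rate implied by “100,000 CDs every second.”

    # How severe the "spam filter" has to be (figures from the text; CD size assumed).
    events_produced_per_s = 600e6      # more than 600 million collisions a second
    events_kept_per_s = 100            # only ~100 events a second are archived
    print(f"kept: about 1 event in {events_produced_per_s / events_kept_per_s:,.0f}")  # 1 in 6,000,000

    raw_rate_tb_per_s = 100_000 * 700e6 / 1e12   # 100,000 CDs/s at ~700 MB each
    print(f"raw data rate: roughly {raw_rate_tb_per_s:.0f} TB per second")   # ~70 TB/s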

A “farm” of a few thousand computers at
CERN will turn the filtered raw data into more
compact data sets organized for physicists to
comb through. Their analyses will take place on
a so-called grid network comprising tens of
thousands of PCs at institutes around the world,
all connected to a hub of a dozen major centers
on three continents that are in turn linked to
CERN by dedicated optical cables.

Journey of a Thousand Steps

In the coming months, all eyes will be on the
accelerator. The final connections between adjacent
magnets in the ring were made in early
November, and as we go to press in mid-December
one of the eight sectors has been cooled
almost to the cryogenic temperature required for
operation, and the cooling of a second has
begun. One sector was cooled, powered up and
then returned to room temperature earlier in
2007. After the operation of the sectors has been
tested, first individually and then together as an
integrated system, a beam of protons will be
injected into one of the two beam pipes that carry
them around the machine’s 27 kilometers.

The series of smaller accelerators that supply
the beam to the main LHC ring has already been
checked out, bringing protons with an energy of
0.45 TeV “to the doorstep” of where they will be
injected into the LHC. The first injection of the
beam will be a critical step, and the LHC scientists
will start with a low-intensity beam to reduce
the risk of damaging LHC hardware. Only
when they have carefully assessed how that “pilot”
beam responds inside the LHC and have
made fine corrections to the steering magnetic
fields will they proceed to higher intensities. For
the first running at the design energy of 7 TeV,
only a single bunch of protons will circulate in
each direction instead of the nearly 3,000 that
constitute the ultimate goal.

As the full commissioning of the accelerator
proceeds in this measured step-by-step fashion,
problems are sure to arise. The big unknown is
how long the engineers and scientists will take
to overcome each challenge. If a sector has to be
brought back to room temperature for repairs,
it will add months.

The four experiments—ATLAS, ALICE,
CMS and LHCb—also have a lengthy process of
completion ahead of them, and they must be
closed up before the beam commissioning begins.
Some extremely fragile units are still being
installed, such as the so-called vertex locator detector
that was positioned in LHCb in mid-November.
During my visit, as one who specialized
in theoretical rather than experimental physics
many years ago in graduate school, I was struck
by the thick rivers of thousands of cables required
to carry all the channels of data from the detectors—
every cable individually labeled and needing
to be painstakingly matched up to the correct
socket and tested by present-day students.

Although colliding beams are still months in
the future, some of the students and postdocs already
have their hands on real data, courtesy of
cosmic rays sleeting down through the Franco-
Swiss rock and passing through their detectors
sporadically. Seeing how the detectors respond
to these interlopers provides an important reality
check that everything is working together
correctly—from the voltage supplies to the detector
elements themselves to the electronics of
the readouts to the data-acquisition software
that integrates the millions of individual signals
into a coherent description of an “event.”

All Together Now

When everything is working together, including
the beams colliding at the center of each detector,
the task faced by the detectors and the data-processing
systems will be Herculean. At the design
luminosity, as many as 20 events will occur with
each crossing of the needlelike bunches of protons.
A mere 25 nanoseconds pass between one
crossing and the next (some have larger gaps).
Product particles sprayed out from the collisions
of one crossing will still be moving through the
outer layers of a detector when the next crossing
is already taking place. Individual elements in
each of the detector layers respond as a particle
of the right kind passes through it. The millions
of channels of data streaming away from the
detector produce about a megabyte of data from
each event: a petabyte, or a billion megabytes, of
it every two seconds.
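
The “petabyte every two seconds” figure follows directly from the numbers in this paragraph; the effective crossing rate of about 31.5 million per second (allowing for the larger gaps) is an assumption consistent with the rest of the article.

    # Checking "a petabyte, or a billion megabytes, every two seconds."
    events_per_crossing = 20
    crossing_rate_hz = 31.5e6          # 25 ns spacing, minus the larger gaps (assumed)
    mb_per_event = 1.0                 # about a megabyte of data per event

    mb_per_second = events_per_crossing * crossing_rate_hz * mb_per_event
    print(f"one petabyte every {1e9 / mb_per_second:.1f} seconds")   # ~1.6 s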

The trigger system that will reduce this flood
of data to manageable proportions has multiple
levels. The first level will receive and analyze
data from only a subset of all the detector’s components,
from which it can pick out promising
events based on isolated factors such as whether
an energetic muon was spotted flying out at a
large angle from the beam axis. This so-called
level-one triggering will be conducted by hundreds
of dedicated computer boards, with the logic embodied in the hardware. They will select
100,000 bunches of data per second for more
careful analysis by the next stage, the higher-level
trigger.
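
In spirit, the level-one decision is a simple yes-or-no per bunch crossing. The toy sketch below is nothing like the real firmware, which runs on dedicated boards, and the 20 GeV threshold is purely illustrative; it only shows the kind of question being asked.

    # Toy sketch of a level-one-style decision: keep a crossing if any muon
    # candidate is energetic and heads out at a large angle from the beam axis.
    # The threshold is illustrative, not an actual ATLAS or CMS setting.
    def level_one_accept(muon_pts_gev, threshold_gev=20.0):
        """Return True if the crossing looks promising enough to pass on."""
        return any(pt > threshold_gev for pt in muon_pts_gev)

    print(level_one_accept([4.0, 35.0]))   # True: one hard muon candidate
    print(level_one_accept([2.0, 6.0]))    # False: nothing energetic enough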

The higher-level trigger, in contrast, will receive
data from all of the detector’s millions of
channels. Its software will run on a farm of computers,
and with an average of 10 microseconds
elapsing between each bunch approved by the
level-one trigger, it will have enough time to “reconstruct”
each event. In other words, it will
project tracks back to common points of origin
and thereby form a coherent set of data—energies,
momenta, trajectories, and so on—for the
particles produced by each event.
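
The time budget works out as follows; the 1,000-node farm is an illustrative assumption, not the actual size of any experiment’s filter farm.

    # Why 100,000 accepted crossings a second still leaves time to reconstruct them.
    l1_accept_rate_hz = 100_000
    print(f"average gap between accepted crossings: {1e6 / l1_accept_rate_hz:.0f} microseconds")  # 10

    # Spread over a farm of computers, each event gets a much larger share of time.
    farm_nodes = 1000   # illustrative assumption
    print(f"processing budget per event: about {farm_nodes * 1e6 / l1_accept_rate_hz / 1e3:.0f} milliseconds")  # ~10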

The higher-level trigger passes about 100
events per second to the hub of the LHC’s global
network of computing resources—the LHC
Computing Grid. A grid system combines the
processing power of a network of computing
centers and makes it available to users who may
log in to the grid from their home institutes.

The LHC’s grid is organized into tiers. Tier 0
is at CERN itself and consists in large part of
thousands of commercially bought computer
processors, both PC-style boxes and, more recently,
“blade” systems similar in dimensions to
a pizza box but in stylish black, stacked in row
after row of shelves. Computers are still being purchased and
added to the system. Much like a home user, the
people in charge look for the ever moving sweet
spot of most bang for the buck, avoiding the
newest and most powerful models in favor of
more economical options.

The data passed to Tier 0 by the four LHC experiments’
data-acquisition systems will be archived
on magnetic tape. That may sound old-fashioned
and low-tech in this age of DVD-RAM
disks and flash drives, but François Grey of the
CERN Computing Center says it turns out to be
the most cost-effective and secure approach.

Tier 0 will distribute the data to the 12 Tier 1
centers, which are located at CERN itself and at
11 other major institutes around the world, including
Fermilab and Brookhaven National
Laboratory in the U.S., as well as centers in Europe,
Asia and Canada. Thus, the unprocessed
data will exist in two copies, one at CERN and
one divided up around the world. Each of the
Tier 1 centers will also host a complete set of the
data in a compact form structured for physicists
to carry out many of their analyses.

The full LHC Computing Grid also has Tier 2
centers, which are smaller computing centers at
universities and research institutes. Computers at
these centers will supply distributed processing
power to the entire grid for the data analyses.
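
A condensed way to picture the hierarchy described above; the site names are only the examples mentioned in the text, and the role summaries are paraphrases rather than official definitions.

    # The LHC Computing Grid tiers, as described in the text (paraphrased roles).
    lhc_grid_tiers = {
        "Tier 0": {"sites": ["CERN"],
                   "role": "archive the raw data on tape and distribute it to Tier 1"},
        "Tier 1": {"sites": ["CERN", "Fermilab", "Brookhaven", "nine other major centers"],
                   "role": "hold a second copy of the raw data plus compact analysis sets"},
        "Tier 2": {"sites": ["universities and research institutes"],
                   "role": "supply distributed processing power for the analyses"},
    }

    for tier, info in lhc_grid_tiers.items():
        print(f"{tier}: {info['role']}")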

Too Much Information

With up to 20 collisions occurring at 25-nanosecond intervals at the center of each
detector, the LHC produces more data than can be recorded. So-called trigger systems
select the tiny fraction of the data that has promising features and discard the rest.
A global network of computers called a grid provides thousands of researchers around
the world with access to the stored data and the processing power to analyze it.
