A Brief Guide to the Great Equations

Author: Robert Crease

DESCRIPTION:
How the quantum state of a system – interpreted, for instance, as the probability of a particle being detected at a certain location – evolves over time.

DISCOVERER:
Erwin Schrödinger

DATE:
1926

The Schrödinger equation is the basic equation of quantum theory. The study of this equation plays an exceptionally important role in modern physics. From a mathematician’s point of view the Schrödinger equation is as inexhaustible as mathematics itself.

– F. A. Berezin and M. A. Shubin,
The Schrödinger Equation

The journey taken by the scientific community from Planck’s introduction of the quantum to Schrödinger’s assertion of its universal presence took barely a quarter-century.

When Planck introduced the idea in 1900, it was a tiny speck on the horizon. He used it to make classical theory work for black-body radiation. The theory worked if one assumed that whatever absorbs and emits light (which he treated as ‘resonators’) does so selectively – only in integer multiples of a certain amount of energy. Many scientists saw this as a fudge, as problem avoidance rather than real science, and assumed that eventually they could discard the idea and it would drop back off the horizon.
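
In modern notation, Planck’s condition can be written as

E_n = n h \nu , \qquad n = 1, 2, 3, \ldots

where \nu is the frequency of the resonator and h is the new constant Planck introduced: each resonator can take up or give off energy only in whole-number multiples of the quantum h\nu.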

Growing Extension of the Quantum

But in 1905, in a paper on the photoelectric effect, Einstein extended the idea. The quantum is not due to the selectivity of the resonators, he proposed, but to the fact that light itself is ‘grainy.’ By decade’s end, the quantum had shown up in several different branches of physics. Many who had dismissed it now took notice.

In 1911, a landmark step was taken by Walther Nernst, a Prussian physical chemist who initially (like others) had dismissed quantum theory as the offspring of a ‘grotesque’ formula, but who had used the theory to address what Thomson had called ‘Cloud No. 2’, or the application of classical molecular theory of heat to experimental results involving low-temperature solids, gases, and metals. Nernst declared that, in the hands of Planck and Einstein (and, he should have mentioned, his own), the theory had proven ‘so fruitful’ that ‘it is the duty of science to take it seriously and to subject it to careful investigations.’¹ He organized a conference of leading scientists to do so, holding it in Brussels with the support of a wealthy Belgian industrialist named Ernest Solvay.

The conference, a milestone event, signaled that the quantum – the idea of a fundamental graininess to light and all other forms of energy – was in science to stay.

It was one of those events whose significance was immediately clear. Participants communicated the excitement to others who had not attended. Nobel laureate Ernest Rutherford, returning to Cambridge, England, described the discussions in ‘vivid’ terms to a spellbound, 27-year-old Danish newcomer to his lab named Niels Bohr. In Paris, Henri Poincaré wrote that the quantum hypothesis appeared to involve ‘the greatest and most radical revolution in natural philosophy since the time of Newton.’²
Many scientists who were not present at the meeting caught its spirit from the proceedings. One was a 19-year-old Sorbonne student named Louis de Broglie, a recent convert to physics from an intended civil service career. De Broglie later wrote that the proceedings convinced him to devote ‘all my energies’ to quantum theory.

But the quantum fit uneasily on the Newtonian horizon, even when it solved key problems. It was like a guest whom you could not get around inviting to an event, but who you also knew would be awkward and whose presence you would have to manage carefully. Consider what happened when Niels Bohr used it to explain Rutherford’s until-then obscure idea about atomic structure. In 1911, Rutherford had proposed that atoms were like miniature solar systems, with a central core or ‘nucleus’ surrounded by electrons. This contradicted a basic principle of classical physics: why didn’t the orbiting electrons radiate away energy, as they should according to Maxwell’s theory, and fall into the nucleus? Because, Bohr proposed, using the quantum idea, electrons could only absorb and emit radiation in specific amounts, and thus could only fit in a small number of stationary orbits or states inside the atom, able to absorb and emit only the energy required to jump between such states. It was an odd assumption indeed. It implied that atomic electrons – to employ an image that the American philosopher William James used to describe the stream of consciousness, which may have influenced Bohr – made ‘flights and perchings’ amongst these states, without taking clear paths between them.³ The states were what mattered, not the trajectories – whence the phrase ‘quantum leap.’ Bohr applied this idea to the classic atomic test case – the hydrogen atom, a single electron orbiting a single proton. He showed how his assumption predicted the Balmer formula, an empirical formula for the spectral lines of hydrogen devised by a schoolteacher and numerologist.⁴
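
In modern notation, the Balmer formula for hydrogen’s visible spectral lines reads

\frac{1}{\lambda} = R \left( \frac{1}{2^{2}} - \frac{1}{n^{2}} \right) , \qquad n = 3, 4, 5, \ldots

where \lambda is the wavelength of a line and R the Rydberg constant. Bohr’s stationary states have energies E_n = -hcR/n^{2}, so a leap from state n down to state 2 emits light of exactly the wavelength Balmer’s formula predicts.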

The flitting and perching things – now applied only to light, but soon to matter – would soon create a classification problem. On the classical horizon, the tiniest things came in two basic types: particles and waves. Particles were discrete things: each had its own definite position and momentum, and always followed a specific path in space and time. Waves were continuous things: they spread out spherically from their source without specific position or direction, smoothly broadening and thinning in space and time. Scientists used different theories to describe particles and waves. Particles were addressed by Newtonian theories that assumed masses were located at specific points, pushed by forces, with a definite momentum and position at every moment. Waves were addressed by Maxwellian theories that used continuous functions to describe how processes smoothly evolve in space and time. Both theories were well developed and deterministic: you put in information about the initial state, turned the crank, and out popped a prediction of future behaviour.
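
Schematically, and in modern notation rather than the period’s own, the two descriptions look quite different. A Newtonian particle obeys an equation of motion,

m \frac{d^{2}x}{dt^{2}} = F ,

which, given an initial position and momentum, fixes a definite trajectory x(t); a Maxwellian wave obeys a wave equation such as

\frac{\partial^{2}\psi}{\partial t^{2}} = c^{2} \nabla^{2} \psi ,

whose solutions are continuous functions spread smoothly through space and time. In both cases, fix the initial state and the future follows.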

In which bin should the flitting and perching things be placed? They seemed to have aspects of each. How was that possible?

Einstein provided some of the answer in his 1905 photoelectric effect paper. Traditional optics, he said, treats light as waves because it involves light in large amounts and averaged over time. But when light interacts with matter, as when it is emitted and absorbed, it does so on very short timescales, when it may well be grainy, localized in space, and with energies in integer multiples of hν (‘quanta’ of light, later called ‘photons’). This idea, he proudly wrote to a friend, was ‘very revolutionary.’⁵
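
The graininess shows up directly in the photoelectric effect. In modern symbols, a quantum of light of frequency \nu carries energy

E = h\nu ,

and an electron ejected from a metal surface emerges with kinetic energy at most

E_{\mathrm{max}} = h\nu - W ,

where W is the energy needed to free the electron from the metal. The electrons’ energy thus depends on the light’s frequency, not its intensity – behaviour a purely wave picture could not explain.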

For the next 20 years, physicists tended to be partisans of either particle theory or wave theory, trying to extend one or the other to cover quantum phenomena.

Einstein carried the theoretical banner for the particle side, though not without some reluctance. In an important paper of 1916, he extended his idea that light is absorbed and emitted in the form of physically real quanta, each having a particular direction and momentum (a multiple of hν/c), and – making a general if somewhat overstated point – proclaimed that ‘radiation in the form of spherical waves does not exist.’⁶ This process conserved energy, he now showed, for the amount emitted at one end equaled the amount absorbed at the other. But Einstein also found that he had to incorporate statistics in his theory to make it work, in the form of ‘probability coefficients’ that described the emission and absorption of quanta.⁷ He found this to be a painful sacrifice, but hoped it would be temporary, expecting that his work would soon be replaced by some deeper understanding. Einstein’s experimental allies included Arthur H. Compton, who in 1923 demonstrated the ‘Compton effect’: when photons bounce off electrons, they both come from, and rebound in, definite directions.⁸
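
The quantitative signature of the Compton effect, in modern notation, is the shift in wavelength of a photon scattered through an angle \theta off an electron of mass m_e:

\lambda' - \lambda = \frac{h}{m_{e} c} \left( 1 - \cos\theta \right) .

The formula follows from treating the collision like one between billiard balls, each photon carrying energy h\nu and momentum h\nu/c – exactly the particle-like picture Einstein had championed.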

One champion of waves was physicist Charles G. Darwin, grandson of his more famous naturalist namesake – though he, too, was somewhat dissatisfied with his role. Darwin believed light was emitted as waves, but realized that accommodating quantum phenomena such as the photoelectric effect would severely tax wave theory. In 1919, he wrote a ‘Critique of the Foundations of Physics’, in which he foresaw fundamental changes ahead. Quantum phenomena, he prophesied, might force physicists to abandon long-cherished principles. They might have to entertain wild ideas, he wrote – tongue-in-cheek – such as to ‘endow electrons with free will.’⁹ The least wild thing, he finally decided, would be to keep wave theory by abandoning the conservation of energy for individual events, having it conserved only on average.

Darwin found a sympathizer in Niels Bohr. In 1924, Bohr enlisted two others – Hendrik Kramers and John Slater – in an attempt to eradicate Einstein’s radical idea and develop a more conventional approach that used wave theory to account for how light is emitted and absorbed, and for the photoelectric and Compton effects.¹⁰ The authors found they had to pay a heavy price to murder Einstein’s idea; they would indeed have to abandon the conservation of energy, conserving it only on average, along with any hope of having a visualizable picture of the mechanics of how light is emitted and absorbed.

The word ‘visualizable’ – anschaulich, for the Germans – became something of a technical term in physics around this time. For something in a theory to be visualizable or intuitable, two things had to happen: the variables in the theory had to be connected with physical things like mass, position, energy, etc.; and the operations in the theory had to be connected with familiar operations, such as point-by-point movement, action at a distance, and so forth. Thus for something to be visualizable, or anschaulich, it did not necessarily have to be Newtonian, because something strange and non-Newtonian can still be visualized, as long as it unfolds in space and time. If something were anschaulich, it merely meant that a flip-book-like description could be created in which the pages were like slices of time, locating where everything in an event is at every moment – and that when you ruffle the pages, what is on each page blends smoothly into what is on the next.

But the sacrifices of the Bohr-Kramers-Slater theory – abandonment of the conservation of energy and of Anschaulichkeit – were regarded as too extreme not only by most physicists but even by at least one of its authors; Slater later claimed to have been coerced into signing his name. Few were surprised when, less than a year after publication, the Bohr-Kramers-Slater proposal was decisively refuted by experiment.

The Bohr-Kramers-Slater paper is a unique document in the history of science. It is renowned among historians for being both obviously wrong and strongly influential. It was strongly influential because it brought to a head the conflict between particle and wave theory. It said: This is the kind of sacrifice you have to pay in order to keep what you have. The partisans of each side were only being cautious and conservative, trying to preserve those elements of classical theory which they thought most robust. But quantum phenomena were resisting.

At the end of its first quarter-century, indeed, quantum theory was a mess. Historian Max Jammer called it ‘a lamentable hodgepodge of hypotheses, principles, theorems, and computational recipes rather than a logical consistent theory.’ Each problem had to be first solved as if it were a classical situation, then filtered through a ‘mysterious sieve’ in which quantum conditions were applied, weeding out forbidden states and leaving only a few permissible ones. This process involved not systematic deduction but ‘skillful guessing and intuition’ which resembled ‘special craftsmanship or even artistic technique.’¹¹ A theory was needed that gave the right states from the start. To put it another way, quantum theory was more like a set of instructions for coming up with a way to get from point A to point B, when what you really wanted was a map.

Then, in 1925, came two dramatic breakthroughs from two very different people: Werner Heisenberg and Erwin Schrödinger. Each, struggling to act conservatively by sacrificing as little of the classical framework as possible, ended up a revolutionary.

Heisenberg, who at age twenty-four was young even by physics standards, tried to save classical mechanics by abandoning it at Nature’s bottom rung. Inside the atom, he declared, not only do particles and electron orbits have no meaning, but neither do even such basic classical properties as position, momentum, velocity, and space and time. And because our imaginations require a space-time container, this atomic world cannot be pictured. We have to base our theories, he said, on what he called ‘quantum-theoretical quantities’ that are unvisualizable, or unanschaulich. The next chapter outlines the steps Heisenberg took in developing his approach. At one point, Heisenberg noticed an odd feature: certain sets of quantum-theoretical quantities were noncommutative under the peculiar definition of ‘multiplication’ they obeyed – the order in which they were multiplied mattered. He initially found this feature awkward, and tried to ignore it – but soon came to embrace it as the keystone of quantum mechanics. In 1925 he wrote ‘On the Quantum-Mechanical Reinterpretation of Kinematic and Mechanical Relations’, which provided a method for calculating quantum states that invoked neither particles nor waves. It used mathematical methods that we now call matrices to provide a formal apparatus into which one plugged experimental data, turned the mathematical crank, and out popped the allowed states. His supervisor, Max Born, quickly saw that Heisenberg had rediscovered matrices. But matrix mechanics, as it was called, was difficult to use, and many physicists resisted a theory that told them they could not picture Nature’s bottom rung.
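
A minimal illustration, in the later standard notation rather than Heisenberg’s own: if q and p stand for the matrices representing position and momentum, then

q p - p q = \frac{i h}{2\pi} ,

times the unit matrix, rather than zero as it would be for ordinary numbers. The order of ‘multiplication’ matters, and that non-zero remainder is precisely where Planck’s constant enters the formalism.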

Schrödinger, then thirty-eight, was ancient by physics standards. His approach covered much the same territory, but used a familiar tool of classical mechanics: a wave equation that he had developed, with continuous functions that described processes unfolding smoothly in space and time. For Schrödinger, the bottom rung was made of something quite anschaulich: waves.
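
In its standard modern form, for a single particle of mass m moving in a potential V, the time-dependent Schrödinger equation reads

i \hbar \frac{\partial \psi}{\partial t} = -\frac{\hbar^{2}}{2m} \nabla^{2} \psi + V \psi ,

where \psi is the wave function and \hbar = h/2\pi: a continuous equation describing something that evolves smoothly in space and time, of just the kind Schrödinger’s classical training had equipped him to handle.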
