The New Penguin History of the World

J. M. Roberts and Odd Arne Westad

Ford, like many other great revolutionaries, had brought other men’s ideas to bear on his own. In the process he also transformed the workplace. Stimulated by his example, manufacturers everywhere adopted the assembly line, which became the characteristic way of making consumer goods. On those set up by Ford, the motor car moved steadily from worker to worker, each one of them carrying out in the minimum necessary time the precisely delimited and, if possible, simple task in which he (or, later, she) was skilled. The psychological effect on the worker was soon deplored, but Ford, recognizing that such work was very boring, paid high wages (thus also making it easier for his workers to buy his cars). This was a contribution to another fundamental social change, with cultural consequences of incalculable significance – the fuelling of economic prosperity by increasing purchasing power and, therefore, demand.

COMMUNICATION

Some assembly lines nowadays are ‘manned’ entirely by robots. The single greatest technological change to affect the major industrial societies since 1945 has come in the huge field of what is comprehensively called information technology, the complex science of devising, building, handling and managing electronically powered machines that process information. Few innovatory waves in the history of technology have rolled in so fast. Applications of work done only during the Second World War were widely diffused in services and industrial processes over a couple of decades. This was most obvious in the spread of ‘computers’, electronic data processors of which the first only appeared in 1945. Rapid increases in power and speed, reductions in size and improvements in visual display capacity brought a huge increase in the amount of information that could be ordered and processed in a given time. Quantitative change, though, brought qualitative transformation. Technical operations hitherto unfeasible because of the mass of data involved now became possible. Intellectual activity had never been so suddenly accelerated. Moreover, at the same time as there was revolutionary growth in the power of computers, so there was in their availability, cheapness and portability. Within thirty years a ‘microchip’
the size of a credit card was doing the job that had at first required a machine the size of the average British living room. It was observed in 1965 that the processing power of a ‘chip’ doubled every eighteen months; the 2000 or so transistors carried by a chip thirty years ago have now multiplied to millions. The transforming effects have been felt exponentially, and in every human activity – from money- and war-making, to scholarship and pornography.
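A rough illustration of what that doubling rate implies, arithmetic on the figures quoted above rather than a calculation the authors themselves make: thirty years at one doubling every eighteen months gives

\[
\frac{30\ \text{years}}{1.5\ \text{years per doubling}} = 20\ \text{doublings}, \qquad 2^{20} \approx 1.05 \times 10^{6},
\]

so a component count growing on that schedule multiplies roughly a millionfold over the period.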

Computers are, of course, only part of another long story of development and innovation in communication of all kinds, beginning with advances in the physical and mechanical movement of solid objects – goods and people. The major nineteenth-century achievements were the application of steam to land and sea communication, and later electricity and the internal combustion engine. In the air, there had been balloons, and the first ‘dirigible’ airships flew before 1900, but it was only in 1903 that the first flight was made by a man-carrying ‘heavier than air’ machine (that is, one whose buoyancy was not derived from bags of a gas lighter than air). This announced a new age of physical transport; eighty years later, the value of goods moving through London’s biggest airport was greater than that through any British seaport. Millions now regularly travel by air on business and professional concerns, as well as for leisure, and flight has given a command of space to the individual only faintly imaginable as the century began.

The communication of information had already advanced far into another revolution. The essence of this was the separation of the information flow from any physical connection between source and signal. In the middle of the nineteenth century, poles carrying the wires for the electric telegraph were already a familiar sight beside railway lines, and the process of linking the world together with undersea cables had begun. Physical links were still fundamental. Then Hertz identified the electromagnetic wave. By 1900, scientists were exploiting electromagnetic theory to make possible the sending of the first, literally, ‘wireless’ messages. The transmitter and the receiver no longer needed any physical connection. Appropriately, it was in 1901, the first year of a new century to be profoundly marked by this invention, that Marconi sent the first radio message across the Atlantic. Thirty years later, most of the millions who by then owned wireless receivers had ceased to believe that they needed to open windows for the mysterious ‘waves’ to reach them, and large-scale broadcasting systems existed in all major countries.

A few years before this the first demonstration had been made of the devices on which television was based. In 1936, the BBC opened the first regularly scheduled television broadcasting service; twenty years later the
medium was commonplace in leading industrial societies, as it now is worldwide. Like the coming of print, the new medium had implications so huge as to be incalculable, but for their full measurement they must be placed in the context of the whole modern era of communications development. They were politically and socially neutral or, rather, double-edged. Telegraphy and radio made information more quickly available, and this could be advantageous both to governments and to their opponents. The ambiguities of television became visible even more rapidly. Its images could expose things governments wanted to hide to the gaze of hundreds of millions, but it was also believed to shape opinion in the interests of those who controlled it. By the end of the twentieth century, too, it was clear that the Internet, the latest major advance in information technology, also had ambiguous possibilities. From its origins in the Arpanet – developed by the Advanced Research Projects Agency of the US Department of Defense in 1969 – by 2000 the Internet had 360 million regular users, mostly in the developed countries. By then, the ease of communication that it offered had helped revolutionize world markets and strongly influence world politics, especially in those regions that were moving towards more open political systems. E-commerce – the buying and selling of consumer goods and services through the Internet – became a major part of commerce in the United States in the early 2000s, with companies such as Amazon and eBay among the wealthiest and most influential in the market. By 2005, electronic mail had replaced postal services as the preferred means of communication in North America, Europe and parts of East Asia. But at the same time much of the ever-increasing speed and capacity of Internet transfers was used for watching pornographic films or playing interactive games. And with much of this capacity wasted, the social differences between those who spend much of their day online and those who have no access to the Internet are increasing rapidly.

SCIENCE AND NATURE

By 1950 modern industry was already dependent on science and scientists, directly or indirectly, obviously or not, acknowledged or not. Moreover, the transformation of fundamental science into end products was by then often very rapid, and has continued to accelerate in most areas of technology. The widespread adoption of the motor car, once the principle of the internal combustion engine had been grasped, took about half a century; in recent times, the microchip made hand-held computers possible in about ten years. Technological progress is still the only way in which large numbers of people become aware of the importance of science. Yet there have been important changes in the way in which it has come to shape their lives. In the nineteenth century, most practical results of science were still often by-products of scientific curiosity. Sometimes they were even accidental. By 1900 a change was underway. Some scientists had seen that consciously directed and focused research was sensible. Twenty years later, large industrial companies were beginning to see research as a proper call on their investment, albeit a small one. Some industrial research departments were in the end to grow into enormous establishments in their own right as petrochemicals, plastics, electronics and biochemical medicine made their appearance. Nowadays, the ordinary citizen of a developed country cannot lead a life that does not rely on applied science. This all-pervasiveness, coupled with its impressiveness in its most spectacular achievements, was one of the reasons for the ever-growing recognition given to science. Money is one yardstick. The Cavendish Laboratory at Cambridge, for example, in which some of the fundamental experiments of nuclear physics were carried out before 1914, had then a grant from the university of about £300 a year – roughly $1500 at rates then current. When, during the war of 1939–45, the British and Americans decided that a major effort had to be mounted to produce nuclear weapons, the resulting ‘Manhattan Project’ (as it was called) is estimated to have cost as much as all the scientific research previously conducted by mankind from the beginnings of recorded time.

Such huge sums – and there were to be even larger bills to meet in the post-war world – mark another momentous change, the new importance of science to government. After being for centuries the object of only occasional patronage by the state, it now became a major political concern. Only governments could provide resources on the scale needed for some of the things done since 1945. One benefit they usually sought was better weapons, which explained much of the huge scientific investment of the United States and the Soviet Union. The increasing interest and participation of governments has not, on the other hand, meant that science has grown more national; indeed, the reverse is true. The tradition of international communication among scientists is one of their most splendid inheritances from the first great age of science in the seventeenth century, but even without it, science would jump national frontiers for purely theoretical and technical reasons.

Once again, the historical context is complex and deep. Already before 1914 it was increasingly clear that boundaries between the individual sciences, some of them intelligible and usefully distinct fields of study since
the seventeenth century, were tending to blur and then to disappear. The full implications of this have only begun to appear very lately, however. For all the achievements of the great chemists and biologists of the eighteenth and nineteenth centuries, it was the physicists who did most to change the scientific map of the twentieth century. James Clerk Maxwell, the first professor of experimental physics at Cambridge, published in the 1870s the work in electromagnetism which first broke effectively into fields and problems left untouched by Newtonian physics. Maxwell’s theoretical work and its experimental investigation profoundly affected the accepted view that the universe obeyed natural, regular and discoverable laws of a somewhat mechanical kind and that it consisted essentially of indestructible matter in various combinations and arrangements. Into this picture had now to be fitted the newly discovered electromagnetic fields, whose technological possibilities quickly fascinated laymen and scientists alike.
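For readers who want the content of that work in its later, standard form (modern vector notation, not how the passage or Maxwell himself presented it), Maxwell’s equations relate the electric and magnetic fields to charges and currents:

\[
\nabla\cdot\mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla\cdot\mathbf{B} = 0, \qquad
\nabla\times\mathbf{E} = -\frac{\partial\mathbf{B}}{\partial t}, \qquad
\nabla\times\mathbf{B} = \mu_0\mathbf{J} + \mu_0\varepsilon_0\frac{\partial\mathbf{E}}{\partial t},
\]

and they predict self-propagating waves travelling at \(1/\sqrt{\mu_0\varepsilon_0}\), the speed of light; it was this prediction that Hertz later confirmed and Marconi exploited.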

The crucial work that followed and that founded modern physical theory was done between 1895 and 1914, by Röntgen who discovered X-rays, Becquerel who discovered radioactivity, Thomson who identified the electron, the Curies who isolated radium, and Rutherford who investigated the structure of the atom. They made it possible to see the physical world in a new way. Instead of lumps of matter, the universe began to look more like an aggregate of atoms, which were tiny solar systems of particles held together by electrical forces in different arrangements. These particles seemed to behave in a way that blurred the distinction between matter and electromagnetic fields. Moreover, such arrangements of particles were not fixed, for in nature one arrangement might give way to another and thus elements could change into other elements. Rutherford’s work, in particular, was decisive, for he established that atoms could be ‘split’ because of their structure as a system of particles. This meant that matter, even at this fundamental level, could be manipulated. Two such particles were soon identified: the proton and the electron; others were not isolated until after 1932, when Chadwick discovered the neutron. The scientific world now had an experimentally validated picture of the atom’s structure as a system of particles. But as late as 1935 Rutherford said that nuclear physics would have no practical implications – and no one rushed to contradict him.

What this radically important experimental work did not at once do was supply a new theoretical framework to replace the Newtonian system. This only came with a long revolution in theory, beginning in the last years of the nineteenth century and culminating in the 1920s. It was focused on two different sets of problems, which gave rise to the work designated by the terms relativity and quantum theory. The pioneers were Max Planck
and Albert Einstein. By 1905 they had provided experimental and mathematical demonstration that the Newtonian laws of motion were an inadequate framework for explanation of a fact no longer to be contested: that energy transactions in the material world took place not in an even flow but in discrete jumps – quanta, as they came to be termed. Planck showed that radiant heat (from, for example, the sun) was not, as Newtonian physics required, emitted continuously; he argued that this was true of all energy transactions. Einstein argued that light was propagated not continuously but in particles. Though much important work was to be done in the next twenty or so years, Planck’s contribution had the most profound effect and it was again unsettling. Newton’s views had been found wanting, but there was nothing to put in their place.
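The quantitative core of the idea, standard physics rather than anything spelled out in the text, is that each quantum of radiation carries an energy proportional to its frequency \(\nu\):

\[
E = h\nu, \qquad h \approx 6.63\times 10^{-34}\ \text{J s},
\]

and because Planck’s constant \(h\) is so minute, the jumps are imperceptible at everyday scales, which is why energy had for so long appeared to flow continuously.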
