The Second Book of General Ignorance

Authors: John Lloyd, John Mitchinson

The nose may be our fastest-evolving organ, but further analysis of the human genome shows that we are also evolving elsewhere. Our hair is becoming less thick, but our hearing (perhaps as a result of developing language) is much better than that of chimpanzees. More alarmingly, the Y chromosome, the one that makes a person male (men have both an X and a Y chromosome), is shrinking. It has lost 1,393 of its original 1,438 genes over the last 300 million years, leaving it with just 45. The geneticist Steve Jones points out that one consequence of this is that women are now genetically closer to chimpanzees than men are, because the two X chromosomes they possess have changed much less rapidly.

There’s a widespread assumption that human beings have stopped evolving, because technological advances have insulated us from the environmental pressures that drive natural selection. However, the latest genome research suggests the rate of evolutionary change among humans is much the same as that observed in the rest of nature.

A good example of this is lactose tolerance (the ability to go on digesting milk into adulthood) in some parts of the world (but not others), which arose as the result of a single genetic mutation that took place no more than 5,000 years ago.

What were Bronze Age tools made of?

Stone, mostly.

The Bronze Age, which in Europe is dated at 2300–600 BC, began when mankind first discovered how to make and use bronze, but this would have been a gradual industrial revolution. For much of the period, old technology (using stone and bone) would have been more widespread than metal. Bronze would have been rare and expensive, so most everyday tools and weapons would still have been made from flint and other familiar materials.

And, just as stone flourished in the Bronze Age, so bronze-working didn’t reach its peak until well into the Iron Age (1200 BC–AD 400).

We still use all three materials today. In the twenty-first century, alongside plastic bags and silicon chips, we still produce iron railings, bronze bearings and statues, gravestones and grinding stones. The last people in Britain to make a living working with flint were the flintknappers who supplied the gunflints for firearms. It was a profession that only died out in the nineteenth century, when the percussion cap replaced the flintlock.

The ‘Three-Age System’ – in which the Bronze Age follows the Stone Age, and is succeeded by the Iron Age – stems from the early nineteenth century. It was the brainchild of Christian Jürgensen Thomsen (1788–1865), a Danish museum curator, who was looking for a nice neat way of arranging his exhibits. It was never intended to be more than a fairly crude means of placing artefacts in a chronological relationship with each other, by classifying them according to the relative sophistication of their manufacture.

Many archaeologists believe that the Stone Age – which is itself split into three eras (the Old, Middle and New Stone Ages) – was probably more of a Wood Age, but that wood’s predominant role in prehistory has been hidden by the fact that wooden artefacts rot, while stone ones don’t.

What was not Made in China and not made of china?

Glass.

Though the Chinese invented the compass, the flushing toilet, gunpowder, paper, the canal lock and the suspension bridge long before anyone else, the scientific revolution that transformed the West between the sixteenth and eighteenth centuries completely passed them by.

The reason for this is that they also invented tea.

The earliest known glass artefacts are Egyptian and date back to 1350 BC, but it was the Romans who first produced transparent glass. They liked the way it enabled them to admire the colour of their wine.

By the time the Egyptians worked out how to make glass, the Chinese had been drinking tea (traditionally they began in 2737 BC) for almost 1,400 years. Its colour was less important to them than temperature, and they found it was best served in their most famous invention of all: fine porcelain, or ‘china’.

Since they had no particular use for it, early Chinese glass was thick, opaque and brittle. They mainly used it for making children’s toys – and soon gave up on it altogether. For almost 500 years, from the end of the fourteenth century until the nineteenth, no glass was made in China at all.

Meanwhile, in 1291 the Republic of Venice, concerned about the fire risk to its wooden buildings, moved its glass furnaces offshore to the island of Murano. Here, inspired by migrant Islamic craftsmen, the inhabitants learned to make the finest glass in the world, giving them a monopoly that lasted for centuries.

The impact of high-grade glass on Western culture cannot be overstated. The invention of spectacles towards the end of the thirteenth century added at least fifteen years to the academic and scientific careers of men whose work depended on reading. The precise reflection of glass mirrors led to the discovery of perspective in Renaissance painting. Glass beakers and test tubes transformed ancient alchemy into the modern science of chemistry.

The microscope and the telescope, invented within a few years of each other at the end of the sixteenth century, opened up two new universes: the very distant and the very small.

By the seventeenth century, European glass had become cheap enough for ordinary people to use it for windowpanes (as opposed to mere holes in the wall or the paper screens of the Orient). This protected them from the elements and flooded their houses with light, initiating a great leap forward in hygiene. Dirt and vermin became visible, and living spaces clean and disease free. As a result, plague was eliminated from most of Europe by the early eighteenth century.

In the mid-nineteenth century, transparent, easily sterilised swan-necked glass flasks allowed the French chemist Louis Pasteur to disprove the theory that germs spontaneously generated from putrefying matter. This led to a revolution in the understanding of disease and to the development of modern medicine. Not long afterwards, glass light bulbs changed both work and leisure forever.

Meanwhile, new trade links between East and West in the nineteenth century meant that a technologically backward China soon caught up. Today it is the world’s third-largest industrial power and its largest exporter, with total exports in 2009 of £749 billion.

It is also the world’s largest producer of glass, controlling 34 per cent of the global market.

What’s the name of the chemical that’s bad for you and is found in Chinese food?

Despite its reputation in the press, monosodium glutamate is much less harmful than ordinary table salt.

The list of charges against MSG is a long one. It has been accused of causing obesity, nerve damage, high blood pressure, migraine and asthma, and of altering hormone levels. But every concerned public body that has ever investigated it has given it a clean bill of health.

For centuries, it was agreed that there were only four basic tastes – sweet, sour, bitter and salty – until in 1908 Dr Kikunae Ikeda of Tokyo University discovered a fifth one: a ‘meaty’ taste that he named ‘umami’. This is the taste of MSG. Like soy sauce, it just makes your food a little more delicious.

The MSG scare arose out of so-called ‘Chinese Restaurant Syndrome’. Dr Robert Ho Man Kwok coined the term in 1968, when a number of his patients complained of palpitations and numbness in their neck and arms after eating a large Chinese meal. Dr Kwok blamed monosodium glutamate, and though all subsequent research has proved that to generate such symptoms would require a concentration of MSG in food that would render it completely inedible, the stigma has somehow remained.

We now know that glutamate is present in almost every natural foodstuff (it is particularly high in parmesan and tomato juice) and that the amino acid is so vital to our functioning that our own bodies produce 40 grams of it a day. Human milk contains lots of glutamate, which it uses as an alternative enhancement to sugar – MSG and sugar are the two things that get babies drinking.

A much more dangerous substance is recklessly sprinkled on food every time we eat. Excessive salt intake increases the risk of high blood pressure, strokes, coronary artery disease, heart and kidney failure, osteoporosis, stomach cancer and kidney stones. We’d be safer replacing it in our cruets with MSG.

In the European Union, monosodium glutamate is classified as a food additive – E621.

The dreaded ‘e-numbers’ listed on jars and tin cans are almost all completely benign; the ‘E’ stands for nothing more sinister than ‘European’. It is simply an international way of labelling the different substances (by no means all of them artificial) that are found in our foods. If you wanted to avoid E numbers altogether, you couldn’t: 78 per cent of the air we breathe is E941 (nitrogen) and even the purest water is made entirely from E949 (hydrogen) and E948 (oxygen).

Salt, apparently, doesn’t have an e-number.

JOHNNY VEGAS
This is why I don’t wanna do shows like this!

STEPHEN
Why is that, Johnny?

JOHNNY
Well, ’cause now, I’m gonna lie awake at night, fearing that I’m lactating poison! I feel like I’ve already hurt people enough in my lifetime.

STEPHEN
It’s not poison; it’s good. We’re trying to suggest that MSG is not as bad as it’s been painted. You may not like the flavour, in which case, certainly, don’t have any.

JOHNNY
Yeah, but I don’t want meaty-tasting breasts!

Does eating chocolate give you acne?

No. Nothing we eat ‘causes’ acne (but go easy on the breakfast cereals).

Acne affects over 96 per cent of teenagers at some time during their adolescence.

Each human hair grows in an individual pouch in the skin called a follicle (from the Latin for ‘little bag’). Feeding into each follicle is a gland that secretes a waxy substance called sebum (Latin for ‘grease’ or ‘suet’). Next to each follicle is another gland, which carries sweat up to the surface of the skin through a tiny pore (from poros, Greek for ‘passage’).

During puberty, testosterone levels increase in both boys and girls, giving rise to an over-production of sebum. This spills out into the sweat pores, clogging them up with oily compost ripe for bacteria. The result is a pimple. A colony of these is called acne vulgaris (‘acne’ for short). Boys have higher levels of testosterone than girls, which is why they also tend to have worse acne.

So it’s not chocolate, but testosterone, that ‘causes’ acne. But diet is a factor too, and some foods definitely make it worse.

In 1981 Professor David Jenkins, a Toronto-based nutritionist, measured the effects of carbohydrates on blood-sugar levels. He found that starchy foods (like white bread, cereals and potatoes) raised blood-sugar levels dramatically; but sugary foods had much less effect. Starchy foods have a simpler chemical structure and are easier for the digestive system to convert into glucose, the most absorbable form of sugar. Protein, fats and more complex sugars (like chocolate) are harder to absorb. From this, Jenkins devised a scale called the GI, or Glycaemic Index (from Greek glykys, ‘sweet’, and haima, ‘blood’).
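In modern nutrition research the scale is usually defined like this (a rough sketch rather than Jenkins’s exact wording): a food’s GI is its blood-sugar response expressed as a percentage of the response to pure glucose –

GI = 100 × (area under the blood-glucose curve after eating the test food) ÷ (area under the curve after eating the same amount of pure glucose)

Glucose itself therefore scores 100 by definition, and foods scoring over about 70 are generally classed as ‘high GI’.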

Foods with a high GI score – the ones that raise blood-sugar levels most – create a surge in the production of insulin, the hormone that regulates the body’s intake of glucose. Insulin is itself controlled by testosterone, and dairy products are in turn thought to stimulate testosterone. So at breakfast, it’s the cereal and the milk (a double dose of hormonal stimulants) rather than the sugar that may aggravate the acne.

The English word acne was first used in 1835, but it comes from a 1,500-year-old Assyrian spelling mistake. In the sixth century, Aëtius Amidenus, a physician from the city of Amida (now in modern Turkey), accidentally coined a new word – akne – to describe a pimple. He had meant to write akme (Greek for ‘point’).

Munching your favourite chocolate bar produces endorphins, which help relieve pain, reduce stress and lower the risk of heart disease and cancer. But pure cacao doesn’t have the same effect. Mere chemicals are not enough to satisfy the craving: we also need taste, texture and memories to set our hearts (literally) racing. In 2007 a research company, The Mind Lab, showed that, for some people – especially women – eating a piece of dark chocolate made the heart beat faster, and for longer, than a passionate kiss.

Who gets over-excited by sugary drinks?

Parents.

There isn’t a shred of scientific evidence that children become ‘hyperactive’ when given sugary drinks, sweets or snacks.

In one test, a group of children were all given the same sugar-rich drink, but the parents of half the sample were told they’d been given a sugar-free drink. When questioned afterwards, the parents who thought their children hadn’t had any sugar (even though they had) reported far less hyperactive behaviour.

In another study, some children were put on high-sugar diets and others sugar-free ones. No difference in behaviour was observed. Not even when (according to the British Medical Journal in 2008) the children had already been diagnosed with attention deficit hyperactivity disorder (ADHD). Because parents expect sugar to cause hyperactivity, that’s what they see.

It all began in 1973, when a US allergy specialist called Benjamin Feingold (1899–1982) first showed that hyperactivity in children is linked to what they eat, and proposed a diet for preventing it. He recommended cutting out all artificial colourings and flavourings, including sweeteners such as aspartame. The Feingold Diet didn’t ban sugar but, as medical opinion gradually came to accept the connection between hyperactivity and diet, sugar somehow became confused in the public mind with ‘sweeteners’.

No one has ever come up with a decent theory to explain exactly how sugar might have this effect on youngsters. If high blood-sugar levels were the cause, they’d be more likely to go ballistic after a bowl of rice or a baked potato.

Throughout the centuries, food has been blamed for causing the behaviour that people were most worried about at the time. The sixteenth-century herbalist John Gerard warned against the herb chervil, which ‘has a certain windiness, by means whereof it provoketh lust’. Buddhist monks are forbidden to eat any member of the onion family, because they, too, are thought to cause lust when cooked – and anger when raw.

In the nineteenth century, moralising Victorians attributed ‘degeneracy and idleness’ in the Irish to the supposed soporific effect of the potato. Englishwomen, by contrast, were warned off eating meat. Such ‘stimulating’ food was liable to bring on debilitating periods, nymphomania and insanity.

ALAN
Speaking as an uncle, I am often discouraged from giving them too much chocolate, because they go, in quotes, ‘mental’.
