
Yet not all such old-fashioned ideas are wrong. Take, for example, the concept of self-control. It is critical to adults' ability to manage their lifestyles: to resist that extra cookie, or to walk up two flights of stairs rather than take the elevator. There are many other aspects to self-control; control of emotions, for instance, is a critical part of one's persona. Self-control is one of a number of higher skills, including social skills, task management, and impulse control, that are termed non-cognitive skills (cognitive skills relate to measures of intelligence). Work by developmental psychologists, compiled by the Nobel laureate in economics James Heckman of Chicago, has shown that these non-cognitive capabilities are heavily influenced by the nature of upbringing in the first few years after birth. Vulnerable children from families living in poverty and intergenerational deprivation, families often also subject to various forms of discrimination, are far more likely to have deficiencies in the maturation of their non-cognitive skills. This has important implications for their health, their capacity to learn, whether they graduate from high school, the relationships they form, their later earnings, and the likelihood of adverse encounters with the law.

Intensive interventions in these families early in a child's life can have major beneficial effects, including on nutrition. When nutritional advice is offered to mothers and infants through so-called 'Head Start' programmes, lasting reductions in the risk of obesity are seen. This is clear evidence of the importance of those early years in developing self-control, along with the other non-cognitive capabilities associated with healthy behaviours. As every developed country has families in this situation, there is an obvious need to adopt such programmes more widely. They may seem expensive in the short term but, as Heckman's calculations have shown, they pay real economic dividends in the longer term.

The two-legged stool

There is a second reason why the importance of biological development to chronic disease risk has been ignored: a much simpler model of gene plus environment has been far more comfortable for many scientists and doctors to grasp. Most doctors and medical scientists receive hardly any training in the science of development, and there are strong historical reasons, in both science and medicine, why development has tended to be ignored. It complicates matters too much for many of us.

Since the completion of the Human Genome Project, the technological capacity to look for genetic variation across our 21,000 genes, any of which might relate to disease, has become a science in its own right. It operates at an industrial scale, devouring enormous amounts of funding and sometimes generating confusing and overstated headlines. But the future offered by this project was being projected in hyperbolic terms even a decade ago. We had the spectacle of Tony Blair and Bill Clinton announcing the unravelling of the human genome to the world. It was certainly an important technological and informatics breakthrough, but it was more symbolic than meaningful in its own right.

We heard Nobel laureates making exaggerated claims that they would now find rather embarrassing—namely, that once we had sequenced the genome we would understand essentially everything we needed to know about human destiny. It would be possible, they claimed, to screen populations to determine the risk of disease, then to warn them in time, and to intervene. It would be possible to study the genomic patterns in samples taken from humans with a particular disease, and this might provide information about how the symptoms arose and give us new clues about causation and new treatments. It would be possible to screen the DNA of patients under treatment, to see how their small genomic differences were related to the side effects of the treatments which they experienced. A brave new world for medicine was opening up. Who needed the blurring of focus which developmental biology would add to the picture? It was far too neat and tidy without it.

But as we saw in Chapter 6, the hype and the extravagant claims made at the inception of the Human Genome Project are now dissipating. Hard questions are beginning to be asked. Isn't the possibility of predicting the majority of future cases of chronic disease from the human genome sequence a mirage? The closer we get, the more it disappears. With the wisdom of hindsight we can see that it was always going to be this way: genetic variation is only one leg of the three-legged stool on which our health rests; our development and the way we live form the other two. No stool will stand with only two legs. And it is pointless to ask which leg is most important: they are interdependent. The two-legged stool gives us no solid basis for understanding variations in risk between individuals and groups, or for finding opportunities to intervene. There can be no firm understanding without including development.

Contributing to this scientific confusion was a widely held attitude to developmental science that kept it off the scientific agenda during the early and mid 20th century. Part of the reason was the focus on the genetic basis of inheritance that followed the concept of the gene as the unit of inheritance; part was the view of many evolutionary biologists that development was irrelevant because it was just a pathway to adulthood, and in their view natural selection acted only on the adult. We now know that both these ideas were wrong.

But there were other reasons for looking askance at development. One was a series of well-publicized cases of possible scientific fraud in the first half of the 20th century, which made the subject look very dubious indeed.

Faking it

One of the most controversial biologists of the early 20th century was the Austrian Paul Kammerer, who conducted what even now seems like ground-breaking and fundamental work on development and reproduction in amphibians such as frogs and toads. Working at the privately funded extramural Institute of Experimental Biology (the Vivarium), Kammerer was convinced that a change in the environment could bring out hidden characteristics in these animals, which could then be passed on to the next generation. In his most famous (and infamous) experiment, he induced midwife toads to breed in water by raising its temperature, and he claimed that when this happened the males developed 'nuptial pads' on their feet which enabled them to grasp the slippery females, a feature present in their evolutionary ancestors but not normally seen in modern toads.

Kammerer's finding was dramatic and seemed to support the theory of the inheritance of acquired characteristics, usually associated with Lamarck, a theory that was increasingly unfashionable and, by the 1920s, predominantly rejected. In 1926 it caused bitter scientific controversy, with the opposition led by one of the most famous geneticists of the time, William Bateson, who had coined the term 'genetics'. It was then claimed publicly that Kammerer's experiments were fraudulent, a view that was endorsed when it was found that the only surviving toad, albeit one pickled in a jar, had had Indian ink injected into the apparent nuptial pads to make them visible. We do not know whether this was done by a technician to enhance the pads once preserved, or whether it was part of a complex fraud. The scientific feud became very personal and very nasty. We may never learn the full truth, as Kammerer committed suicide in September 1926, but recent research indicates that he may have been the victim of a right-wing conspiracy at the University of Vienna against left-wing and especially Jewish scientists.

There were even bigger problems for the status of developmental biology emerging in the Soviet Union. At that time, some of the most important work at the nexus of development and evolution was being undertaken in the relatively young Soviet state, where a critical mass of highly innovative thinkers in genetics, evolution, and development had been established. But with Stalin in power, science and politics became entangled, and science became subservient to Leninist theory and ideology. Genetic determinism was seen as bourgeois, in conflict with the Marxist belief that the State could mould its citizens to form a new society. Many scientists tried to resist this intrusion of political belief into the process of science, but biological science came to be dominated by one man, Trofim Lysenko, who was to have a profound influence on generations of Soviet biologists.

Lysenko was an ambitious but poorly trained agricultural biologist who worked on ways to optimize food production. He had come to Stalin’s attention because of his claim to have increased
production in the collectivized farms. Lysenko’s so-called ‘science’ suited Stalin’s politics and, in a rather short time, Stalin was to give him a great deal of control over the Soviet science apparatus. It did not matter that this bogus science had become divorced from reality.

In Stalin's Russia crop yields were low, and this often meant hunger for much of the population, particularly after the failure of mass collectivization, which had led to poor farming practices. Lysenko was aware of a well-tried and tested procedure called vernalization, by which seeds exposed to cold before they are planted germinate earlier. He wondered whether the harsh winters of Russia could be harnessed to condition seeds so that two crops could be planted and harvested each year; what a difference it could make if that were possible! Lysenko's early experiments looked promising, and his ideas became dominant in the Russian Academy of Sciences, despite the justified scepticism of its better and more independent scientists.

Lysenko’s work attracted the attention of Stalin, who was keen to find ways of optimizing the output of the agricultural sector of the state economy to match the factories of the cities. His science fitted the ideology of the day whereas that of genetics did not—merely breeding new strains of crops took too long and did not yield results as dramatic as those claimed by Lysenko. So Lysenko rose to power in the Soviet system. He aggressively rejected mainstream genetics and biological theory, banishing rivals until he controlled the Russian Academy of Sciences. As time progressed his claims became more exaggerated, going beyond wheat and potatoes to fruit trees and lumber plantations. But Lysenko’s work had not gone unnoticed in the West, and shortly after the war delegations of European scientists were dispatched to see it in action for themselves. They couldn’t. There was virtually no ongoing work to see and certainly no convincing data published.

It appeared that many of Lysenko's claims that the environment during the earliest phase of plant development could alter growth and improve crop yields were totally false. The scandal broke in 1948 but rumbled on for many months afterwards. With that, Soviet biology lost all credibility in the West, and the 'Lysenko affair' became symbolic not only of a widespread distrust of Soviet science but also of distrust of the very idea that the environment plays a role in development. Tragically, it set Soviet biology back decades, and some outstanding work by early Russian evolutionary biologists was essentially lost. Even now, Russian biology suffers from this legacy.

Stories like these are very bad for science. Science requires absolute integrity from those who practise it. Scientists pride themselves on reporting what they have found with great accuracy, and even the hint that a researcher is not doing so can be enough to damage their reputation seriously. If there is any question about the scientific probity of an individual, young researchers are advised to have nothing to do with them: not to refer to their work and not to engage with them in any way.

Whatever the rights and wrongs of the Kammerer and Lysenko cases, there have been too many examples of unequivocal fraud for any risks to be taken. Unfortunately, developmental science seems to have had more than its fair share, perhaps because of its complexity. Recently, for example, the claim by Hwang Woo Suk, a stem cell biologist from South Korea, to have cloned human embryonic stem cells, a technical feat of great importance to the emergent field of stem cell research, caused much excitement when he published his work in the world's leading scientific journal. South Korea started to build a whole industry around him. But the excitement rapidly turned to justifiable disgust when his work was found to be fraudulent. His reputation was left in tatters and he was subjected to legal proceedings.

Sadly, there are just too many stories like this, as the stakes in science appear to be getting higher. The commercialization of science, and the financial and career rewards that can now arise from a great breakthrough, drive some scientists to break this code of integrity. After all, science is a human endeavour, and some scientists, like other professionals, succumb to temptation. But the code of practice must be protected at all costs if science is to progress and be trusted. The more important the area of research, the more competitive it is and the greater its implications, the more tempting it becomes for scientists to step over the line between presenting their ideas in the best possible light and actually fabricating data.

Scientific fraud is very bad for society too, because it slows down progress. And because so many of these cases involved that most difficult of sciences, development, the field itself gained a bad reputation. Scientific opinion leaders turned their attention in other directions.

Splitting up

For much of the 20th century, what makes us what we are was considered in terms of only two processes: one genetic, the other environmental. Between them they were thought to hold the keys to whether we would be healthy or sick. We were born with a genetic make-up and then had to face the world. For example, if we have the gene variant that allows us to absorb the lactose in milk, as most people of European stock do, we are likely to have a high dairy intake. If we do not, like most people from Asia, then we have to avoid milk products because they upset our digestion. We term this problem 'lactose intolerance', although in fact the 'natural' evolved state for most humans is to lose the capacity to absorb lactose after weaning. It is only those descended from some proto-European dairy farmers around 8,000 years ago, and from some East African dairy farmers around 2,000 years ago, who carry the genetic mutations that allow them to digest lactose as adults.
