When it comes to our dining habits, there is a giant mismatch between thought and deed, between knowledge and behavior. “Eat food. Not too much. Mostly plants,” says influential food writer Michael Pollan. It’s a wise and simple mantra, much repeated; yet for many it seems anything but simple to follow in daily life. To adhere to it you need to: “Like real food. Not enjoy feeling overstuffed. And appreciate vegetables.” These are skills that many people have not yet acquired, however intelligent or advanced in years they may be. There’s another complication, too. The “not too much” part of Pollan’s dictum needs modifying to take account of those who have learned to eat
too little, or at least not enough of the right foods. I am not just talking about the underweight. The term “malnutrition” now covers obesity as well as starvation; there is evidence that obese populations across the world suffer disproportionately from micronutrient deficiencies, notably vitamins A and D, plus zinc and iron. Learning how to eat better is not about reducing consumption across the board. While we undoubtedly need to eat less of many foods—sugar springs to mind—we need more of others. Among other lost eating skills—see also not “spoiling your appetite” and not “wolfing down your dinner”—we seem to have lost the old-fashioned concept of “nourishing” ourselves.
A tone of judgmental impatience often creeps into discussions of obesity. “It’s not exactly rocket science, is it?” is a frequent observation on newspaper comment boards, made by some of those lucky ones who have never struggled to change their eating, usually followed by the quip that all that is needed to fix the situation is to “eat less and move more.” The implication is that those who do
not eat less and move more are somehow lacking in moral fiber or brains. But consider this. American firefighters, who are not people notably lacking in courage or quick-wittedness, have higher combined rates of overweight and obesity—at 70 percent—than the general population. The way we eat is not a question of worthiness but of routine and preference, built up over a lifetime. As the philosopher Caspar Hare has said, “It is not so easy to acquire or drop preferences, at will.”
Once we accept that eating is a learned behavior, we see that the challenge is not to grasp information but to learn new habits. Governments keep trying to fix the obesity crisis with well-intentioned recommendations. But advice alone never taught a child to eat better (“I strongly advise you to finish that cabbage and follow it with a glass of milk!”), so it’s strange that we think it will work on adults. The way you teach a child to eat well is through example, enthusiasm, and patient exposure to good food. And when that fails, you lie. In Hungary, children are taught to enjoy eating carrots by being told that they bestow the ability to whistle. The point is that before you can become a carrot eater, the carrots have to be desirable.
When this book started taking shape in my head, I thought
my subject was childhood food. Bit by bit, I started to see that many of the joys and pitfalls of children’s eating were still there for adults. As grown-ups, we may still reward ourselves with treats, just as our parents did, and continue to “clean our plates,” though they are no longer there to watch us. We still avoid what disgusts us, though we probably know better than to throw it under the table when no one is looking. Put a lit-up birthday cake in front of anyone and they are young again.
One of the questions I wanted to explore was the extent to which children are born with hardwired food preferences. As I trawled through endless academic papers in the library, I expected to find fierce disagreement among contemporary scientists. On one side I would find those who argued that food likes and dislikes were innate; on the other, those who insisted they were acquired: nature versus nurture. To my astonishment, I found that this was not the case. Far from controversy, there was a near-universal consensus—from psychologists, from neuroscientists, from anthropologists and biologists—that our appetite for specific foods is learned. Within this broad agreement, there are, as you might expect, still plenty of scholarly disputes, such as the brouhaha
over whether our love-hate relationship with bitter vegetables such as Brussels sprouts has a genetic underpinning. There are also competing theories on the extent to which our food learning is mediated by particular genes, hormones, and neurotransmitters. But the fundamental insight that human food habits
are learned is not the subject of scientific debate.
This scientific consensus is remarkable, given that it is the opposite of how we usually discuss eating habits in everyday conversation. There’s a common assumption—shared, curiously enough, by those who are struggling to eat healthily, as well as many of the nutritionists who are trying to get them to eat better—that we are doomed by our biology to be hooked on junk food. The usual story goes something like this: our brains evolved over thousands of years to seek out sweetness, because in the wild, we would have needed a way to distinguish wholesome sweet fruits from bad bitter toxins. In today’s world, where sugary food is abundant, or so the thinking goes, our biology makes us powerless to turn down these “irresistible” foods. We know that tasting something sweet activates the pleasure-generating parts of the brain and even acts as an analgesic, comparable to drugs or alcohol. Paleolithic brain + modern food = disaster.
What’s missing from this account is the fact that, while the taste for sweetness is innate to all human beings and common to all cultures, when it comes to actual sweet foods—and other unhealthy processed foods—we show profoundly varied responses. As one 2012 study of food preferences states, our attitudes to sweetness vary “in terms of perception, liking, wanting and intake.” Different people enjoy sweetness in very different forms. Sweetness could mean a whole cob of corn at the height of summer, or a plate of milky-fresh mozzarella, or fennel cooked long and slow until it is toffee-brown. Our love of sweetness may be universal, but there are vast individual differences in how we learn to ingest it. Put another way: not everyone wants to get their sweet hit in the form of Froot Loops.
Nutritionists use the word “palatable” to describe foods high in sugar, salt, and fat, as if it were impossible to prefer a platter of crunchy greens dressed with tahini sauce to a family-sized bar of chocolate. Yet around a third of the population—Paleolithic brain or not—manages to navigate
the modern food world just fine and select a balanced diet for themselves from what’s available.
I’m not saying that to be thin is necessarily healthy. Some of the non-overweight may be anorexic or bulimic. Others avoid food through cigarettes and drugs, or burn off a junk-food habit with manic exercise. When we speak of an “obesity epidemic,” along with making dieters feel worse than they already do, we miss the fact that the situation is more complex than thin = good, fat = bad. Robert Lustig, a leading specialist on the effects of sugar on the human body and a professor of clinical pediatrics at the University of California, San Francisco, points out that up to 40 percent of normal-weight people have exactly the same metabolic dysfunctions as those associated with obesity, including “diabetes, hypertension, lipid problems, cardiovascular disease . . . cancer and dementia,” while around 20 percent of obese people get none of these diseases and have a normal life-span.
So we cannot assume that everyone who is “normal weight” has a healthy relationship with food. (Incidentally, given that these people are in a minority, isn’t it time that we stopped calling them “normal”? How about “exceptional”?) The situation is more complicated than the numbers suggest. But I’d still hazard that this exceptional one-third of the population has something important to tell us. There are hundreds of millions of individuals who somehow swim against the tide of the dysfunctional modern food supply and feed themselves pretty well. There are those who can eat an ice cream cone on a hot day without needing to punish themselves for being “naughty,” who automatically refuse a sandwich because it isn’t lunchtime yet, who usually eat when they are hungry and stop when they are full, who feel that an evening meal without vegetables isn’t really a meal. These individuals have learned eating skills that can protect them in this environment of plenty.
Viewed through the lens of psychology, eating is a classic
form of learned behavior. There is a stimulus—an apple tart, say, glazed with apricot jam. And there is a response—your appetite for it. Finally, there is reinforcement—the sensory pleasure and feeling of fullness that
eating the tart gives you. This reinforcement encourages you to seek out more apple tarts whenever you have the chance and—depending on just how great you feel after eating them—to choose them over other foods in the future. In lab conditions, rats can be trained to prefer a less sweet diet over a sweeter one when it is packed with more energy and therefore leaves them more satisfied: this is called post-ingestive conditioning.
We know that a lot of this food-seeking learning is driven by dopamine, a neurotransmitter associated with motivation, which is released in the brain when your body does something rewarding, such as eating, kissing, or sipping brandy. Dopamine is one of the chemical signals that pass information between neurons, telling your brain that you are having fun, and its release is one of the mechanisms that “stamps in” our flavor preferences and turns them into habits. Once animals have been trained to love certain foods, the dopamine response can be fired up just by the sight of those foods: monkeys show a dopamine surge when they see the yellow skin of a banana, the response beginning as they anticipate the reward. That anticipated release is the incentive that makes lab rats work hard for another treat by pressing a lever.
Humans, needless to say, are not lab rats.
In our lives, the stimulus-response behavior around food is as complex as the social world in which we learn to eat. It’s been calculated that by the time we reach our eighteenth birthday, we will have had about 33,000 learning experiences with food (based on five meals or snacks a day, every day, for eighteen years). Human behavior is not just a clear-cut matter of cue and consequence, because human beings are not passive objects, but deeply social beings. Our conditioning is often indirect or vicarious. We learn not just from the foods we put in our own mouths, but from what we see others eat, whether in our own families, at school, or on TV.
As children watch and learn, they pick up many things about food besides how it will taste. A rodent can press a lever to get a sweet reward, but it takes an animal as strange and twisted as a human being to inject
such emotions as guilt and shame into the business of eating. Before we take our first bite of a certain food, we may have rehearsed eating it in our minds many times. Our cues about when, what, and how much to eat extend beyond hunger and hormones into the territory of ritual (eggs for breakfast), culture (hot dogs at a baseball game), and religion (turkey at Christmas, lamb at Eid).
It soon became clear to me that I could not get the answers
I sought about how we learn to eat without exploring our wider food environment, which is a matter of mealtimes and cuisine, and parenting and gender, as well as neuroscience.
Our modern food environment is fraught with contradictions. The burden of religious guilt that has been progressively lifted from our private lives has become ever more intense in the realm of eating. Like hypocritical temperance preachers, we demonize many of the things we consume most avidly, which leaves us at odds with our own appetites. Numerous foods that were once reserved for celebrations—from meat to sweets—have become everyday commodities, meaning not only that we overconsume them, but that they have lost much of their former sense of festal joy. The idea that you don’t eat between meals now seems as outdated as thinking you must wear a hat when you step out of the house.
Yet, while the nutritional content of our food supply has changed hugely over the past fifty or so years, other aspects of eating have not changed fast enough to keep pace with the new conditions of modern life. Parents are still using a range of traditional feeding methods—such as urging children to finish what’s on their plate—that were devised for a situation where famine was always around the corner. As we’ll see, such feeding techniques are contributing directly to child obesity in cultures as diverse as China and Kuwait.
The theme I revisit more than any other is families. Most of what we learn about food happens when we’re children—when we’re sitting at the kitchen table (if our family is lucky enough to have one), being fed. Every bite is a memory, and the most powerful memories are the first ones. At this table, we are given both food and love, and we could be forgiven if, later in life, we have trouble distinguishing the two. It is here that we develop our passions and our disgusts, and get a sense of whether it is more of a waste to leave something on the side of the plate or to eat it up when we are not hungry.
Our parents—like our governments—hope we will learn about food from all the things they tell us, but what we see and taste matters more than what we hear. In many ways, children are powerless at the table. They cannot control what is put in front of them, or where they sit, or whether they are spoken to kindly or harshly as they eat. Their one great power is the ability to reject or accept. One of the biggest things many children learn at that table is that their own choices about food—to eat or not to eat—can unleash deep emotions in the grown-ups close to them. We find that we can please our parents or drive them to rage just by refusing dessert. And then the adults complain that
we are difficult at mealtimes!
After a certain point in our lives, it is us, and not our parents, spooning food into our mouths. We discover the glorious liberation of being able to choose whatever we want to eat—budget permitting. But our tastes and our food choices are still formed by those early childhood experiences. Rather alarmingly, it seems that our food habits when we were two—whether we played with our food, how picky we were, the amount of fruit we ate—are a pretty accurate gauge of how we will eat when we are twenty.