The Fascinating Science Behind Your Chocolate Craving
The latest research on humanity’s hardwired—and dangerous—love of sweetness.
Dana Small was 11 years old when she knew she wanted to study the brain. Six years earlier, her mother had developed epilepsy. Small watched as a woman she’d known to be quick-witted and brilliant struggled to get out of bed in the morning. “That’s what sparked my interest in the fragility of consciousness,” Small says, “which is what led to the brain.”
In 2001, she received her PhD from McGill University after conducting one of her first and most famous experiments. In it, she gave subjects Lindt chocolates to suck on as their brains were being scanned. The scans revealed that with each successive chocolate, pleasure—or “reward value,” as the study called it—diminished. The parts of the brain that were brought to glowing life by the first chocolate became more and more dim. The study, published in Brain: A Journal of Neurology, was a breakout success. To this day, Small remembers Wolfram Schultz, one of the world’s most prominent neuroscientists, telling her that it was the most comprehensive study yet of motivation in humans. Since its publication, it has been cited more than 1,300 times by other scientists exploring the nature of pleasure, not to mention by an untold number of journalists who use it as proof of humanity’s hardwired and dangerous love of sweetness.
Despite the widespread acclaim, Small often found herself unsure of what the experiment actually showed. Was sweetness simply a superficial pleasure that diminished with continued exposure, the way a song can become boring when overplayed? Or was something deeper going on? Was the brain’s opinion of those chocolates influenced by signals it was receiving from the body and computations it was making about its needs?
It would take years, but those thoughts eventually crystallized, sending Small down a twisting experimental path that would lead to the biggest finding of her career.
Lessons From Sweet-Blind Mice
It began in 2008 at Yale University, where Small had become a professor. Next door to her office, a Brazilian neuroscientist named Ivan de Araujo had been running some odd-sounding experiments with mice that were “sweet-blind”—they had been genetically engineered to lack the ability to taste sugar. Araujo then fed them, of all things, sugar.
Araujo was interested in “post-ingestive” effects. In 1970, a French physiologist named Michel Cabanac ran an experiment and found that sugar water extinguished appetite only if it was swallowed—if people spat it out, their hunger would persist. That meant sugar’s biological interaction with the human body went beyond the tongue. A message, it seemed, was delivered to the brain that said, “Sugar obtained—cancel hunger.” If the sugar water was spat out, however, no such message was sent. This was what led Araujo to do something as seemingly pointless as give sugar water to mice who were incapable of tasting sweetness.
The mice were put in cages with three sippers. One was filled with water, another filled with sugar water, and a third one with even sweeter sugar water. The mice, being sweet-blind, had no idea which was which, so they fumbled between dispensers, licking them all equally and randomly.
But only for a little while. After six days, the mice’s behaviour was no longer random. They were deliberately drinking sugar water. They consumed it with the same frequency and in the same quantity as mice who were not sweet-blind. Somehow, the “blind” mice knew which sipper was which.
Araujo discovered that the brain chemical dopamine, best known for triggering the feeling of “wanting,” was also keeping a running tally of the energy contained in the food the mice had consumed. With time, dopamine did enough bookkeeping that deep in the brains of the mice, predictions were formed, such as sipper on the left = calories. When those mice became hungry, “wanting” kicked in, compelling them to the sipper with calories, which they drank from, all the while not tasting a thing.
His mice “wanted” calories and ingested them from the sweet sipper until “wanting” turned off. But their desire was never quenched by enjoyment. “We didn’t see the classic liking signs,” Araujo says. There was no paw-licking, no gleeful tongue-poking. The mice were like whiskered little robots.
Araujo followed this study up with one that was even stranger. This time, he used regular lab mice and put them in cages with two sippers; one contained water mixed with sucralose, an artificial sweetener, while the other contained an extremely bitter chemical called denatonium benzoate. The bitter sipper, however, was configured in such a way that when a mouse drank from it, a little burst of sugar would be injected into its stomach. Araujo thus “rewrote” the rules of taste. Sweetness now indicated no calories, while bitterness did the opposite. Once again, the mice behaved like intelligent energy-seeking robots. They gave up on the sweet but ultimately useless sipper and opted instead for the wincingly bitter taste of the denatonium benzoate, as though magnetically drawn to the calories it provided.
The rules of taste, it seemed, weren’t carved in stone. Sweetness may be hardwired, but it is just a cue, a label whose meaning can be overwritten. The stomach wasn’t some dumb and unfillable pit—it was an active participant, sending information to the brain, information that was recorded with each meal and used to make predictions. What the brain ultimately cares about isn’t how food tastes. It cares whether food is useful.
Unexpected Results
Araujo’s findings brought Small’s thoughts back to her Lindt chocolate experiment. What was it exactly, she wondered, that she’d witnessed in the brains of her chocolate eaters? Was it a simple attraction to sweetness fading in those brain scans? Or was it a deeper attraction to calories that was fading to black?
It is a good question, but how do you answer it? How could Small measure in humans what Ivan de Araujo had so exquisitely observed in genetically engineered mice?
This is how: Small created five separate drinks, each with a distinct flavour and colour. She then added a precise amount of the artificial sweetener sucralose, so that each drink tasted as sweet as a drink that contained about 75 calories’ worth of sugar. Finally—this was the key step—she added varying amounts of a chemical called maltodextrin, a simple, high-calorie starch invented in the late 1960s that the subjects could not taste, which allowed her to manipulate the calorie count of each drink while leaving the taste unchanged.
Small had thus created a little arsenal of drinks that tasted equally sweet, but each carried a different energy payload: zero, 37.5, 75, 112.5 and 150 calories. Small gave samples of the drinks to her test subjects so that their brains could “learn” the caloric value of each. Next, she scanned their brains as they sampled each beverage. Any differences she detected in the “wanting” areas of the brain would have to be due to the calories and not the sweetness, because the drinks all tasted equally sweet.
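To make the logic of the design concrete, here is a minimal sketch (in Python) of how the five drinks relate to one another. The sweetness level and calorie counts come from the description above; everything else, including the idea of computing a “gap” between what the tongue signals and what the drink delivers, is purely illustrative and not part of Small’s actual protocol.

```python
# Illustrative sketch of the five-drink design; structure assumed, numbers from the text.
# Every drink is sweetened (with sucralose) to taste like about 75 calories' worth of sugar,
# while tasteless maltodextrin sets the actual energy load.

SWEETNESS_SIGNAL_KCAL = 75  # what the tongue "reports" for every drink

actual_kcal = [0, 37.5, 75, 112.5, 150]  # energy actually delivered by maltodextrin

for kcal in actual_kcal:
    gap = kcal - SWEETNESS_SIGNAL_KCAL  # positive: more energy than signalled; negative: less
    label = "matched" if gap == 0 else "mismatched"
    print(f"tastes like {SWEETNESS_SIGNAL_KCAL} kcal, delivers {kcal:5.1f} kcal -> {label} (gap {gap:+.1f})")
```

Only the 75-calorie drink lines up with what the tongue is told; every other drink, by design, sends the brain a signal that is out of step with the energy the stomach actually receives.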
It was ingenious. Small had found a way to separate calories from sweetness in human test subjects, and now she would be able to measure which was doing what. Everything about the experiment was perfect, except for one thing: it didn’t turn out at all the way she expected.
Small had anticipated that the highest-calorie drink would trigger the biggest brain response. A hundred and fifty calories are more biologically useful, after all, than zero calories, 37.5 calories and so forth. Yet it was the 75-calorie drink that generated the clearest spike of brain activity. What was going on? If calories were driving the desirability of the drinks, then the 75-calorie drink should have produced less motivational oomph than the 150-calorie drink. But it produced more. If calories had nothing to do with desirability, why would a 75-calorie drink be more desirable than a zero-calorie drink? It made no sense.
The more Small thought about it, however, the more a single number came into focus: 75. The drinks had all been designed to taste as though they had 75 calories’ worth of sugar, and it was the 75-calorie drink that produced the biggest brain response: 75 and 75. Was this more than just a coincidence?
To answer that question, Small moved from the brain to the body and measured how each of the drinks was metabolized. It was a simple experiment. Subjects would enter the lab, consume one of Small’s drinks, then be connected to a machine called an indirect calorimeter, a device that measures the oxygen a person breathes in and the carbon dioxide they breathe out and from this estimates the heat the body is producing and, in turn, the quantity of calories being burned. It is a textbook response called the thermic effect of food. When the body takes in calories and uses them, it generates heat as a by-product, the same way the engine of a car heats up after it’s been running. The more calories a person consumes, the greater the thermic effect.
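As a rough illustration of the textbook expectation, the sketch below estimates how much extra heat each drink “should” generate. The roughly 10 percent thermic effect used here is a generic ballpark figure for carbohydrate, assumed for illustration only; it is not a number reported from Small’s experiments.

```python
# Textbook expectation only: more calories in should mean more heat out.
# The 10% thermic effect for carbohydrate is an assumed ballpark, not a value from Small's study.

CARB_THERMIC_EFFECT = 0.10  # assumed fraction of ingested energy released as heat

drink_kcal = [0, 37.5, 75, 112.5, 150]

for kcal in drink_kcal:
    expected_heat = kcal * CARB_THERMIC_EFFECT
    print(f"{kcal:5.1f} kcal drink -> expect roughly {expected_heat:4.1f} kcal of extra heat")
```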
That, at least, is what the textbooks say. But that is not what Small found. She can vividly remember the day her laboratory assistant showed her the initial results. “It blew my mind,” Small says. “I knew right away we were onto something new and exciting.”
“Nutritive Mismatch”
A few days earlier, a test subject, a woman in her 20s, had consumed the 75-calorie drink and was subsequently connected to the indirect calorimeter. On cue, her body produced a little plume of heat, indicating that the 75 calories were being burned.
Days later, the same woman returned and drank the 150-calorie drink and, once again, was connected to the indirect calorimeter. There should have been a bigger uptick in heat production: her body should have produced more heat with the 150-calorie drink than it had with the 75-calorie drink. Then Small’s lab assistant shared data that seemed almost impossible: the indirect calorimeter measured nothing. It was as though the woman had not consumed a single calorie.
The findings were so odd that Small ran the experiment again, but the results did not change. Over time, a distinct pattern emerged: when people consumed the drinks in which the sweetness and calories were not in sync, the calories those drinks delivered would not be properly metabolized. Small calls the phenomenon “nutritive mismatch.” The maltodextrin would splash into their stomachs, where enzymes would convert it into sugar, and the sugar would be absorbed into the blood. But then, oddly, it wouldn’t get burned. Like a film of gasoline floating atop sea water, the sugar just circulated in the blood. When the drinks were “matched,” on the other hand—when the level of sweetness correctly indicated the caloric payload—the calories were burned as expected.
Small’s research journey had taken a 90-degree turn. By attempting to discern what it was about sweet foods that made them desirable, she had unexpectedly discovered something more fundamental. Sweetness wasn’t just some enjoyable but arbitrary taste sensation. It was a metabolic signal, the first spark in a string of biochemical processes by which sugar is turned into energy. Sweetness was like the trumpeter at the castle gates. It heralded not only the arrival of calories but the specific quantity and began making arrangements for how they would be used.
When sweetness and calories match, it all hums along: calories are burned, the brain registers it, and the brain remembers. But when there is an unexpected variance between what the tongue senses and what the stomach receives, the entire metabolic process seems to shut down. “It’s like the system just threw up its hands,” Small says, “and didn’t know what to do.”
Did nutritive mismatch have long-term consequences? This question prompted Small’s next study, which looked for a hallmark of diabetes called insulin resistance, a condition in which cells no longer respond properly to this crucial hormone. She tested drinks with sugar, drinks with no calories, and drinks where sweetness and calories were mismatched. Once again, the results were as amazing as they were alarming. The mismatched drinks—and only the mismatched drinks—impaired insulin sensitivity.
Finally, Small fed mismatched beverages to teenage boys and girls. This was a particularly relevant investigation because adolescents are in a period of heightened body and brain development and so have an outsize calorie appetite, which is one reason teens drink a lot of sugary beverages. The study had barely even begun when Small and her team drew blood from three subjects and discovered, to their great alarm, that two had already become prediabetic. An ethics board reviewed the results and deemed the health risk to be so great that it would be unethical to continue.
A Brief History of Calories
If the thought of spiking blood sugar and prediabetic teenagers alarms you, there is more bad news. Those drinks didn’t even taste good. The highest-scoring beverage inched above “like slightly” but didn’t crack “like moderately.” Small’s brain scans showed plenty of cerebral action, but “liking” wasn’t invited to the party. It was “wanting.” Her beverage-drinking volunteers were like those sweet-blind mice—drawn to consume drinks they did not particularly enjoy.
It makes one wonder: where did this idea that obesity is an excess of pleasure come from in the first place? This is the root of a long-standing stigma against obese people. They indulge themselves to excess. They are too selfish to say, “I may want more, but enough is enough.”
Obesity, the experts keep telling us, is caused by an overabundance of “highly palatable foods”: pizza, ice cream, chicken fingers, cheeseburgers and the like. Lately it has become fashionable to refer to such foods as “hyperpalatable,” the idea being that these ultraconcentrated wallops of sweet and salty calories deliver a hit of bliss so strong it “sensitizes” the brain, just like addictive drugs.
If only life were that simple. It is easy to paint calories as humanity’s enemy, which perhaps explains why we have gleefully been doing so for decades. But to malign the lowly calorie is to fundamentally misunderstand the evolutionary story that brought our species into existence.
Several million years ago, the brains of our evolutionary ancestors were roughly a third the size they are now. Brains are energy hogs. They burn lots of calories. Having a small brain meant our ancestors could survive on a low-calorie, fibrous plant diet. They spent much more of their time foraging and eating, and they had long, slow-moving digestive tracts that were needed to extract nutrients from this kind of diet.
As humans evolved, however, a trade-off took place. Our brains got much bigger, while our guts got smaller and faster. A big brain paired with a small gut meant we had to upgrade to food that packed a bigger caloric punch: fatty meat, nuts, seeds, grains, sweet fruit and the like. Four entirely separate populations of humans—one in Europe, two in Africa and one on the Arabian Peninsula—evolved the ability to digest milk into adulthood, giving them the lifetime ability to consume one of the few sources of fat blended with carbohydrates found in nature.
Eating such a calorie-rich diet brought with it a great advantage: time. We spent less of the day obtaining food. We saved countless hours of needless chewing. Instead, we began to do all the things that make us human: we fashioned tools, erected structures, shared stories, created myths and played games. We invented cooking, which made our rich, easy-to-digest food even easier to digest.
Calories made humanity possible. Calories are what fuelled our big brains. Our calorie-rich diet didn’t reinforce the compulsion to eat; it released us from a food-gripped existence. It gave us the time to do big-brainy things. Just because we require calories does not mean our basic programming compels us to overconsume them, just as requiring oxygen does not compel people to perpetually hyperventilate.
As we became more advanced, there were even more reasons to refrain from overindulgence. Food had to be shared with other members of the tribe, then the village, then the town, especially with children, whose dependence on adults for resources lasts an eternity compared to that of other species.
Eventually, we reached one of the great landmarks in our species’ development: we figured out how to store food. Eleven thousand years ago, we stockpiled grain in purpose-built facilities that kept it dry and free from rodents so we could eat it weeks and months after harvest. Ancient Egyptians collected honey from their apiaries and stored it in clay pots. Five thousand years ago, Indigenous people living on the Great Plains smashed bison bones into pieces and boiled them in steaming vats fashioned from animal hides. When the rendered fat rose to the top, they scooped it off and blended it with dried meat and berries to make a carb-, protein-, and fat-rich calorie bomb called pemmican.
This innovation brought with it an incredible leap forward in energy efficiency. When calories are stored as fat, a great deal of energy is wasted just hefting all that extra weight around. But when food energy is preserved outside the body—in clay pots, in granaries—you achieve far greater efficiency.
Storing food, however, also required an essential mental capability: the ability to resist eating it. Ancient Native Americans didn’t sit there for days at a time stuffing themselves full of pemmican until none was left. They feasted after the bison hunt, but they retained the ability to set aside enough for the brutal winter yet to come. There was an evolutionary fitness advantage in being able to say, “I will eat this later.” If we didn’t have that ability—if we really were slaves to our unending appetite for calories—the human species would have died off long ago.
All of which leaves us with the following paradox: why were more primitive humans able to resist consuming too many calories but advanced humans are not? In Dana Small’s research, we at last have the beginning of an answer.
Ever since organisms began sensing food as it entered their bodies, the information they gathered has been reliable. This is why the ability to taste evolved in the first place. No creature has 40 minutes to sit and digest a meal so its brain can figure out if it should keep eating. It is much more efficient to take a reading as the food comes in. The ability to sense food is so crucial, in fact, that more DNA is devoted to systems that sense the taste and flavour of food—the nose and the mouth—than to any other part of the human body. Tasting food engages more grey matter than any other activity.
This system is designed for accuracy, but it evolved in an environment in which food provided the senses with accurate information. Small’s research shows what happens when that changes: when the food we eat tells the brain a nutritional lie, the system fails. Calories enter the blood but are not burned. “If sweeteners are disrupting how carbohydrates are being metabolized,” Small explains, “then this could be an important mechanism behind the metabolic dysfunction we see in diets high in processed foods.”
To make matters even worse, we appear to be entering a golden age of nutritive mismatch, and ironically, it is in large part due to the widespread panic over sugar. To a food company, nothing could make more sense than blending sugar with artificial sweeteners. Both the calories and the amount of “sugar” appearing in nutritional info panels decrease, but the sweetness stays constant and the food or drink remains as tasty as ever. Creating nutritive mismatch is, simply, good for business. Mismatch is popping up all over the food and beverage aisles.
So here we have found a fundamental aspect of food and eating that has changed: the sensed nutritional value. For as long as humans and their ancestors have existed, the taste of a calorie matched the energy it delivered. In the span of just a few decades, that has changed. Calories don’t “mean” what they used to.
We have tampered with the very way the brain perceives food. This is what has set so many of us on a path to weight gain. We changed food, and it changed us.
Excerpted from The End of Craving. Copyright © 2021, Mark Schatzker. Published by Avid Reader Press, an imprint of Simon & Schuster. Reproduced by arrangement with the Publisher. All rights reserved.