Grain Brain: The Surprising Truth about Wheat, Carbs, and Sugar--Your Brain's Silent Killers

PART I

THE WHOLE GRAIN TRUTH

CHAPTER 4

Not a Fruitful Union

This Is Your Brain on Sugar (Natural or Not)

Evolutionarily, sugar was available to our ancestors as fruit for only a few months a year (at harvest time), or as honey, which was guarded by bees. But in recent years, sugar has been added to nearly all processed foods, limiting consumer choice. Nature made sugar hard to get; man made it easy.

—DR. ROBERT LUSTIG ET AL.1

SUGAR. WHETHER IT’S FROM A LOLLIPOP, Lucky Charms, or a slice of cinnamon-raisin bread, we all know that this particular carbohydrate is not the healthiest of ingredients, especially when it’s consumed in excess or comes from refined or processed forms such as high-fructose corn syrup. We also know that sugar is partly to blame for challenges with our waistlines, appetites, blood sugar control, obesity, type 2 diabetes, and insulin resistance. But what about sugar and the brain?

In 2011, Gary Taubes, the author of Good Calories, Bad Calories,2 wrote an excellent piece for the New York Times titled “Is Sugar Toxic?”3 In it, he chronicles not just the history of sugar in our lives and food products, but the evolving science behind understanding how sugar affects our bodies. In particular, he showcases the work of Robert Lustig, a specialist in pediatric hormone disorders and the leading expert in childhood obesity at the University of California, San Francisco, School of Medicine, who makes a case for sugar being a “toxin” or a “poison.” But Lustig doesn’t harp so much on the consumption of these “empty calories”; his issue with sugar is that it has unique characteristics, specifically in the way the various kinds of sugar are metabolized by the human body.

Lustig likes to use the phrase “isocaloric but not isometabolic” when he describes the difference between pure glucose, the simplest form of sugar, and table sugar, which is a combination of glucose and fructose. (Fructose, which I’ll get to in a moment, is a type of naturally occurring sugar found exclusively in fruit and honey.) When we eat 100 calories of glucose from a potato, for instance, our bodies metabolize it differently—and experience different effects—than if we were to eat 100 calories of sugar comprising half glucose and half fructose. Here’s why.

Your liver takes care of the fructose component of sugar. Glucose from other carbs and starches, on the other hand, is processed by every cell in the body. So consuming both types of sugar (fructose and glucose) at the same time means your liver has to work harder than if you ate the same number of calories from glucose alone. And your liver will also be taxed if it’s hit with liquid forms of these sugars, such as those found in soda or fruit juices. Drinking liquid sugar is not the same as eating, say, an equivalent dose of sugar in whole apples.

Fructose, by the way, is the sweetest of all naturally occurring carbohydrates, which probably explains why we love it so much. But contrary to what you might think, it has the lowest glycemic index of all the natural sugars. The reason is simple: Because the liver metabolizes most of the fructose, it has no immediate effect on our blood sugar and insulin levels, unlike sugar or high-fructose corn syrup, whose glucose ends up in general circulation and raises blood sugar levels. Don’t let that fact fool you, however. While fructose may not have an immediate effect, it has long-term effects when it’s consumed in sufficient quantities from unnatural sources. And the science is well documented: Consuming fructose is associated with impaired glucose tolerance, insulin resistance, high blood fats, and hypertension. And because it does not trigger the production of insulin and leptin, two key hormones in regulating our metabolism, diets high in fructose lead to obesity and its metabolic repercussions. (I will clarify later what this means for those who enjoy eating lots of fruit. Fortunately, for the most part, you can have your fruit and eat it, too. The quantity of fructose in most whole fruit pales in comparison to the levels of fructose in processed foods.)

We hear about sugar and its effects on virtually every other part of the body except for the brain. This, again, is a subject area that’s gotten remarkably little attention in the press. The questions to ask, and which I’ll answer in this chapter, are:

·   What does excess sugar consumption do to the brain?

·   Can the brain distinguish between different types of sugar? Does it “metabolize” sugar differently depending on where it’s coming from?

If I were you, I’d put down that biscuit or biscotti you’re having with your coffee and buckle up. After reading this chapter, you’ll never look at a piece of fruit or sugary treat in quite the same way.

SUGAR AND CARBS 101

Let me begin by defining a few terms. What, exactly, is the difference between table sugar, fruit sugar, high-fructose corn syrup, and the like? Good question. As I’ve said, fructose is a type of sugar naturally found in fruit and honey. It’s a monosaccharide just like glucose, whereas table sugar (sucrose)—the white granulated stuff we sprinkle in coffee or dump into a bowl of cookie batter—is a combination of glucose and fructose, thus making it a disaccharide (two molecules linked together). High-fructose corn syrup, which is what we find in our sodas, juices, and many processed foods, is yet another combination of molecules dominated by fructose—it’s 55 percent fructose, 42 percent glucose, and 3 percent other carbohydrates.

High-fructose corn syrup was introduced in 1978 as a cheap replacement for table sugar in beverages and food products. No doubt you’ve heard about it in the media, which has attacked this artificially manufactured ingredient for being the root cause of our obesity epidemic. But this misses the point. While it’s true we can blame our bulging waistlines and diagnoses of related conditions such as obesity and diabetes on our consumption of high-fructose corn syrup, we can point to all other sugars as well, since they are all carbohydrates, a class of biomolecules that share similar characteristics. Carbohydrates are simply long chains of sugar molecules, as distinguished from fat (chains of fatty acids), proteins (chains of amino acids), and DNA (chains of nucleotides). But you already know that not all carbohydrates are created equal. And not all carbohydrates are treated equally by the body. The differentiating feature is how much a certain carbohydrate will raise blood sugar and, in effect, insulin. During the course of digestion, carbohydrates are broken down and sugar is liberated into the bloodstream, causing the pancreas to increase its output of insulin so glucose can penetrate cells; meals that are higher in carbohydrate, and especially those higher in simple glucose, demand more insulin to store the blood sugar in cells. Over time, higher levels of blood sugar will demand ever-greater insulin output from the pancreas.

The carbs that trigger the biggest surge in blood sugar are typically the most fattening for that very reason. They include anything made with refined flour (breads, cereals, pastas); starches such as rice, potatoes, and corn; and liquid carbs like soda, beer, and fruit juice. They all get digested quickly, flooding the bloodstream with glucose and stimulating a surge in insulin, which then packs away the excess calories as fat. What about the carbs in a vegetable? Those carbs, especially the ones in green vegetables such as broccoli and spinach, are tied up with indigestible fiber, so they take longer to break down. The fiber essentially slows down the process, causing a slower funneling of glucose into the bloodstream. Plus, vegetables contain more water relative to their weight than starches do, and this further dampens the blood sugar response. When we eat whole fruits, which obviously contain fruit sugar, the water and fiber will also “dilute” the blood sugar effect. If you take, for instance, a peach and a baked potato of equal weight, the potato will have a much bigger effect on blood sugar than the watery, fibrous peach. That’s not to say the peach, or any other fruit for that matter, won’t cause problems.4

Our caveman ancestors did in fact eat fruit, but not every day of the year. We haven’t yet evolved to be able to handle the copious amounts of fructose we consume today—especially when we get our fructose from manufactured sources. Natural fruit has relatively little sugar compared to, say, a can of regular soda, which has a massive amount. A medium-sized apple contains about 44 calories of sugar in a fiber-rich blend thanks to the pectin; by contrast, a 12-ounce can of Coke or Pepsi contains nearly twice that—80 calories of sugar. If you juice several apples and concentrate the liquid down to a 12-ounce beverage (thereby losing the fiber), lo and behold you get a blast of 85 sugar calories that could just as well have come from a soda. When that fructose hits the liver, most of it gets converted to fat and sent to our fat cells. No wonder biochemists called fructose the most fattening carbohydrate more than forty years ago. And when our bodies get used to performing this simple conversion with every meal, we can fall into a trap in which even our muscle tissue becomes resistant to insulin. Gary Taubes describes this domino effect brilliantly in Why We Get Fat: “So, even though fructose has no immediate effect on blood sugar and insulin, over time—maybe a few years—it is a likely cause of insulin resistance and thus the increased storage of calories as fat. The needle on our fuel-partitioning gauge will point toward fat storage, even if it didn’t start out that way.”5
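To put the apple, soda, and juice figures above in grams: Carbohydrate supplies roughly 4 calories per gram (the standard Atwater factor), so the calorie counts translate as follows. This is back-of-the-envelope arithmetic, not data taken from the sources cited in this chapter.

\[
\underbrace{\frac{44\ \text{kcal}}{4\ \text{kcal/g}} = 11\ \text{g sugar}}_{\text{whole apple}}
\qquad
\underbrace{\frac{80\ \text{kcal}}{4\ \text{kcal/g}} = 20\ \text{g sugar}}_{\text{12-oz soda}}
\qquad
\underbrace{\frac{85\ \text{kcal}}{4\ \text{kcal/g}} \approx 21\ \text{g sugar}}_{\text{12-oz apple juice}}
\]

In other words, juicing delivers roughly a soda’s worth of sugar with none of the apple’s fiber.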

The most disturbing fact about our addiction to sugar is that when we combine fructose and glucose (which we often do when we eat foods made with table sugar), the fructose might not do much to our blood sugar right away, but the accompanying glucose takes care of that—stimulating insulin secretion and alerting the fat cells to prepare for more storage. The more sugars we eat, the more we tell our bodies to transfer them to fat. This happens not only in the liver, leading to a condition called fatty liver disease, but elsewhere in the body as well. Hello, love handles, muffin tops, beer bellies, and the worst kind of fat of all—invisible visceral fat that hugs our vital organs.

I love how Taubes draws a parallel between the cause-and-effect relationship uniting carbohydrates and obesity, and the link between smoking and cancer: If the world had never invented cigarettes, lung cancer would be a rare disease. Likewise, if we didn’t eat such high-carb diets, obesity would be a rare condition.6 I’d bet that other related conditions would be uncommon as well, including diabetes, heart disease, dementia, and cancer. And if I had to name the kingpin here in terms of avoiding all manner of disease, I’d say “diabetes.” That is to say, don’t become diabetic.

THE DEATH KNELL IN DIABETES

I cannot reiterate enough the importance of avoiding the path to diabetes, and if diabetes is already a card you’re playing with, then keeping blood sugars balanced is key. In the United States there are close to 11 million adults sixty-five years or older with type 2 diabetes, which speaks volumes to the magnitude of the potential catastrophe on our hands if all of these individuals—plus the ones who haven’t been officially diagnosed yet—develop Alzheimer’s. The data that supports the relationship between diabetes and Alzheimer’s disease is profound, but it’s important to understand that diabetes is also a powerful risk factor for simple cognitive decline. This is especially true in individuals whose diabetes is under poor control. Case in point: In June 2012, the Archives of Neurology published an analysis of 3,069 elderly adults to determine if diabetes increased the risk of cognitive decline and if poor blood sugar control was related to worse cognitive performance.7 When first evaluated, about 23 percent of the participants had diabetes, while the remaining 77 percent did not (the researchers intentionally chose a “diverse group of well-functioning older adults”). However, a small percentage of that 77 percent went on to develop diabetes during the nine-year study. At the beginning of the study a panel of cognitive tests was performed, and over the next nine years these tests were repeated.

The conclusion stated the following: “Among well-functioning older adults, DM [diabetes mellitus] and poor glucose control among those with DM are associated with worse cognitive function and greater decline. This suggests that severity of DM may contribute to accelerated cognitive aging.” The researchers demonstrated a fairly dramatic difference in the rate of mental decline among those with diabetes as compared to the non-diabetics. More interesting still, they noted that even at the start of the study, baseline cognitive scores of the diabetics were already lower than those of the controls. The study also found a direct relationship between the rate of cognitive decline and higher levels of hemoglobin A1C, a marker of blood glucose control. The authors stated, “Hyperglycemia (elevated blood sugar) has been proposed as a mechanism that may contribute to the association between diabetes and reduced cognitive function.” They went on to state that “hyperglycemia may contribute to cognitive impairment through such mechanisms as the formation of advanced glycation end products, inflammation, and microvascular disease.”

Before I get to explaining what advanced glycation end products are and how they are formed, let’s turn to one more study done earlier, in 2008. This one, from the Mayo Clinic and published in the Archives of Neurology, looked at the effects of the duration of diabetes. In other words, does how long one has diabetes play into the severity of cognitive decline? You bet. The numbers are eye-popping: According to the Mayo Clinic’s findings, if diabetes began before a person was sixty-five years old, the risk for mild cognitive impairment was increased by a whopping 220 percent. And the risk of mild cognitive impairment in individuals who had diabetes for ten years or longer was increased by 176 percent. If people were taking insulin, their risk was increased by 200 percent. The authors described a proposed mechanism to explain the connection between persistent high blood sugar and Alzheimer’s disease: “increased production of advanced glycation end products.”8 Just what are these glycation end products cropping up in the medical literature in reference to cognitive decline and accelerated aging? I mentioned them briefly in the previous chapter, and I will explain their significance in the next section.

ONE MAD COW AND MANY CLUES TO NEUROLOGICAL DISORDERS

I remember the hysteria that swept the globe in the mid-1990s when fears of mad cow disease spread quickly as people in Britain began to document evidence of transmission of the disease from cattle to humans. In the summer of 1996, Peter Hall, a twenty-year-old vegetarian, died of the human form of mad cow, called variant Creutzfeldt-Jakob disease. He’d contracted it from eating beef burgers as a child. Soon thereafter, other cases were confirmed, and countries, including the United States, started banning beef imports from Britain. Even McDonald’s stopped serving burgers temporarily in some areas until scientists could ferret out the origins of the outbreak and measures were taken to eradicate the problem. Mad cow disease, also called bovine spongiform encephalopathy, is a rare disorder of cattle; the nickname comes from the odd behavior infected cows display. Both the bovine and the human forms are prion diseases, which are caused by deviant proteins that inflict damage as they spread aggressively from cell to cell.

While mad cow disease isn’t usually classified with classic neurodegenerative diseases such as Alzheimer’s, Parkinson’s, and Lou Gehrig’s disease, all of these conditions involve a similar deformation in the structure of proteins needed for normal, healthy functioning. Granted, Alzheimer’s, Parkinson’s, and Lou Gehrig’s disease are not transmissible to people the way mad cow disease is, but they nevertheless result in similar features that scientists are just beginning to understand. And it all boils down to deformed proteins.

Much in the way we now know that dozens of degenerative diseases are linked by inflammation, we also know that dozens of those same diseases—including type 2 diabetes, cataracts, atherosclerosis, emphysema, and dementia—are linked to deformed proteins. What makes prion diseases so unique is the ability of those abnormal proteins to confiscate the health of other cells, turning normal cells into misfits that lead to brain damage and dementia. It’s similar to cancer in that one cell hijacks the normal regulation of another cell and creates a new tribe of cells that don’t act like healthy ones. Working in laboratories with mice, scientists are finally collecting evidence to show that major neurodegenerative conditions follow parallel patterns.9

Proteins are among the most important structures in the body—they practically form and shape the entire body itself, carrying out functions and acting like master switches to our operating manual. Our genetic material, or DNA, codes for our proteins, which are then produced as a string of amino acids. They need to achieve a three-dimensional shape to carry out their tasks, such as regulating the body’s processes and guarding against infection. Proteins gain their shape through a special folding technique; in the end, each protein achieves a distinctive shape that helps determine its unique function.

Obviously, deformed proteins cannot serve their function well or at all, and unfortunately, mutant proteins cannot be fixed. If they fail to fold properly into their correct shape, at best they are inactive and at worst, toxic. Usually cells have built-in machinery to eliminate deformed proteins, but aging and other factors can interfere with this process. When a toxic protein is capable of inducing other cells to create misfolded proteins, the result can be disastrous. Which is why the goal for many scientists today is to find a way to stop the cell-to-cell spread of misshapen proteins and halt these diseases in their tracks.

Stanley Prusiner, the director of the Institute for Neurodegenerative Diseases at the University of California, San Francisco, discovered prions, which earned him the Nobel Prize in 1997. In 2012, he was part of a team of researchers who authored a landmark paper published in the Proceedings of the National Academy of Sciences that showed that amyloid-beta protein associated with Alzheimer’s shares prion-like characteristics.10 In their experiment, they were able to follow the progression of disease by injecting amyloid-beta protein into one side of mice’s brains and observing its effects. Using a light-generating molecule, they could see the marauding proteins collect as the mice’s brains lit up—a toxic chain of events that’s similar to what happens in the Alzheimer’s brain.

This discovery holds clues to more than brain disease. Scientists who focus on other areas of the body also have been looking at the impact of shape-shifting proteins. In fact, “mad” proteins may play a role in a range of diseases. Type 2 diabetes, for example, can be seen from this perspective when we consider the fact that people with diabetes have demented proteins in their pancreas that can negatively affect insulin production (which raises the question: Does chronic high blood sugar cause the deformation?). In atherosclerosis, the cholesterol buildup typical of the disease could be caused by protein misfolding. People with cataracts have rogue proteins that collect in the eye lens. Cystic fibrosis, a hereditary disorder caused by a genetic defect, is characterized by improper folding of the CFTR protein. And even a type of emphysema owes its devastation to abnormal proteins that build up in the liver and never reach the lungs.

Okay, so now that we’ve established that wayward proteins play a role in disease and especially neurological degeneration, the next question is, what causes the proteins to misfold? With a condition like cystic fibrosis, the answer is more clear-cut because we have identified a specific genetic defect. But what about other ailments that have mysterious origins, or that don’t manifest until later in life? Let’s turn to those glycation end products.

Glycation is the biochemical term for the bonding of sugar molecules to proteins, fats, and amino acids; the spontaneous reaction that causes the sugar molecule to attach itself is sometimes referred to as the Maillard reaction. Louis Camille Maillard first described this process in the early 1900s.11 Although he predicted that this reaction could have an important impact on medicine, not until 1980 did medical scientists turn to it when trying to understand diabetic complications and aging.

This process forms advanced glycation end products (commonly shortened, appropriately, to AGEs), which cause protein fibers to become misshapen and inflexible. To get a glimpse of AGEs in action, simply look at someone who is prematurely aging—someone with a lot of wrinkles, sagginess, discolored skin, and a loss of radiance for their age. What you’re seeing is the physical effect of proteins hooking up with renegade sugars, which explains why AGEs are now considered key players in skin aging.12 Or check out a chain-smoker: The yellowing of the skin is another hallmark of glycation. Smokers have fewer antioxidants in their skin, and the smoking itself increases oxidation in their bodies and skin. So they cannot combat the by-products of normal processes like glycation because their bodies’ antioxidant potential is severely weakened and, frankly, overpowered by the volume of oxidation. For most of us, the external signs of glycation show up in our thirties, when we’ve accumulated enough hormonal changes and environmental oxidative stress, including sun damage.

Glycation is an inevitable fact of life, just like inflammation and free radical production to some degree. It’s a product of our normal metabolism and fundamental in the aging process. We can even measure glycation using technology that illuminates the bonds formed between sugars and proteins. In fact, dermatologists are well versed in this process. With Visia complexion-analysis cameras, they can capture the difference between youth and age just by taking a fluorescent image of children and comparing it to the faces of older adults. The children’s faces will come out very dark, indicating a lack of AGEs, whereas the adults’ will beam brightly as all those glycation bonds light up.

Clearly, the goal is to limit or slow down the glycation process. Many anti-aging schemes are now focused on how to reduce glycation and even break those toxic bonds. But this cannot happen when we consume a high-carb diet, which speeds up the glycation process. Sugars in particular are rapid stimulators of glycation, as they easily attach themselves to proteins in the body (and here’s a good bit of trivia: The number one source of dietary calories in America is high-fructose corn syrup, which increases the rate of glycation by a factor of ten).

When proteins become glycated, at least two important things happen. First, they become much less functional. Second, once proteins become bonded to sugar, they tend to attach themselves to other similarly damaged proteins and form cross-linkages that further inhibit their ability to function. But perhaps far more important is that once a protein is glycated, it becomes the source of a dramatic increase in the production of free radicals. This leads to the destruction of tissues, damaging fat, other proteins, and even DNA. Again, glycation of proteins is a normal part of our metabolism. But when it’s excessive, many problems arise. High levels of glycation have been associated with not only cognitive decline, but also kidney disease, diabetes, vascular disease, and, as mentioned, the actual process of aging itself.13 Keep in mind that any protein in the body is subject to being damaged by glycation and can become an AGE. Because of the significance of this process, medical researchers around the world are hard at work trying to develop various pharmaceutical ways to reduce AGE formation. But clearly, the best way to keep AGEs from forming is to reduce the availability of sugar in the first place.

Beyond just causing inflammation and free radical–mediated damage, AGEs are associated with damage to blood vessels and are thought to explain the connection between diabetes and vascular issues. As I noted in the previous chapter, the risk of coronary artery disease is dramatically increased in diabetics, as is the risk of stroke. Many individuals with diabetes have significant damage to the blood vessels supplying the brain, and while they may not have Alzheimer’s, they may suffer from dementia caused by this blood supply issue.

Earlier I explained that LDL—the so-called bad cholesterol—is an important carrier protein bringing vital cholesterol to brain cells. Only when it becomes oxidized does it wreak havoc on blood vessels. And we now understand that when LDL becomes glycated (it’s a protein, after all), this dramatically increases its oxidation.

The link between oxidative stress and sugar cannot be overstated. When proteins are glycated, the amount of free radicals formed is increased fiftyfold; this leads to loss of cellular function and eventually cell death.

This calls our attention to the powerful relationship between free radical production, oxidative stress, and cognitive decline. We know that oxidative stress is directly related to brain degeneration.14 Studies show that damage to lipids, proteins, DNA, and RNA by free radicals happens early in the journey to cognitive impairment, and long before signs of serious neurological disorders such as Alzheimer’s, Parkinson’s, and Lou Gehrig’s disease. Sadly, by the time a diagnosis is made, the damage is already done. The bottom line is that if you want to reduce oxidative stress and the action of free radicals harming your brain, you have to reduce the glycation of proteins. Which is to say, you have to diminish the availability of sugar. Pure and simple.

Most doctors routinely measure one glycated protein in their medical practice. I’ve already mentioned it: hemoglobin A1C. This is the same standard laboratory test used to gauge blood sugar control in diabetics. So, while your doctor may be measuring your hemoglobin A1C from time to time to get an understanding of your blood sugar control, the fact that it’s a glycated protein has vast and extremely important implications for your brain health. But hemoglobin A1C represents more than just a simple measurement of average blood sugar control over a 90- to 120-day period.

Hemoglobin A1C forms when hemoglobin—the protein in red blood cells that carries oxygen—binds to blood sugar, and this binding increases when blood sugar is elevated. While hemoglobin A1C doesn’t give a moment-to-moment indication of what the blood sugar is, it is extremely useful in that it shows what the “average” blood sugar has been over the previous ninety days. This is why hemoglobin A1C is frequently used in studies that try to correlate blood sugar control to various disease processes like Alzheimer’s, mild cognitive impairment, and coronary artery disease.
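If you want to translate an A1C reading into an average blood sugar figure, the conversion commonly printed on lab reports (derived from the ADAG study) is a simple linear formula. The worked values here are my own illustrative arithmetic, not numbers from the studies discussed in this chapter:

\[
\text{estimated average glucose (mg/dL)} = 28.7 \times \text{A1C (\%)} - 46.7
\]

So an A1C of 5.4 percent corresponds to an average blood glucose of roughly \(28.7 \times 5.4 - 46.7 \approx 108\) mg/dL, while an A1C of 9.0 percent corresponds to roughly 212 mg/dL.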

It’s well documented that glycated hemoglobin is a powerful risk factor for diabetes, but it’s also been correlated with risk for stroke, coronary heart disease, and death from other illnesses. These correlations have been shown to be strongest with any measurement of hemoglobin A1C above 6.0 percent.

We now have evidence to show that elevated hemoglobin A1C is associated with changes in brain size. In one particularly profound study, published in the journal Neurology, researchers looking at MRIs to determine which lab test correlated best with brain atrophy found that hemoglobin A1C demonstrated the most powerful relationship.15 When the researchers compared the degree of brain tissue loss in individuals with the lowest hemoglobin A1C (4.4 to 5.2) to that in individuals with the highest hemoglobin A1C (5.9 to 9.0), the high-A1C group lost almost twice as much brain tissue over a six-year period. So hemoglobin A1C is far more than just a marker of blood sugar balance—and it’s absolutely under your control!


An ideal hemoglobin A1C would be in the 4.8 to 5.4 range. Keep in mind that reducing carbohydrate intake, losing weight, and exercising will ultimately improve insulin sensitivity and lead to a reduction in hemoglobin A1C.

You should also know that there’s now documented evidence of a direct relationship between hemoglobin A1C and the future risk of depression. One study looked at more than four thousand men and women whose average age was sixty-three years and showed a direct correlation between hemoglobin A1C and “depressive symptoms.”16 Poor glucose metabolism was described as a risk factor for the development of depression in these adults. The bottom line: The glycation of proteins is bad news for the brain.

EARLY ACTION

As I’ve already described, having normal blood sugar levels may mean that the pancreas is working overtime to keep that blood sugar normal. Based upon this understanding, you can see that high insulin levels will appear long before blood sugar rises and a person becomes diabetic. That’s why it’s so important to check not only your fasting blood sugar, but also your fasting insulin level. An elevated fasting insulin level is an indicator that your pancreas is trying hard to normalize your blood sugar. It’s also a clear signal that you are consuming too much carbohydrate. And make no mistake about it: Even being insulin resistant is a powerful risk factor for brain degeneration and cognitive impairment. It’s not good enough to look at the diabetes data as it relates to brain disease and assume your risk is low simply because you are not diabetic. And if your blood sugar happens to be normal, the only way you will know if you are insulin resistant is to have your fasting blood insulin level checked. Period.

Need more evidence? Consider a study done a few years ago that looked at 523 people aged seventy to ninety years who did not have diabetes or even elevated blood sugar.17 Many of them were insulin resistant, however, as determined by their fasting insulin levels. The study revealed that those individuals who were insulin resistant had a dramatically increased risk of cognitive impairment compared to those within the normal range. Overall, the lower the insulin level, the better. The average insulin level in the United States is about 8.8 micro international units per milliliter (µIU/mL) for adult men and 8.4 for women. But with the degree of obesity and carbohydrate abuse in America, it’s safe to say that these “average” values are likely much higher than what should be considered ideal. Patients who are being very careful about their carbohydrate intake might have insulin levels indicated on their lab report as less than 2.0. This is an ideal situation—a sign that the individual’s pancreas is not being overworked, blood sugars are under excellent control, there is very low risk for diabetes, and there is no evidence of insulin resistance. The important point is that if your fasting insulin level is elevated—anything over five should be considered elevated—it can improve, and I will show you how to do that in chapter 10.

THE FATTER YOU ARE, THE SMALLER YOUR BRAIN

Most everyone has a pretty good idea that carrying around extra weight is unhealthy. But if you needed just one more reason to drop the excess pounds, perhaps the fear of losing your mind—physically and literally—will help motivate you.

When I was studying to be a doctor, the prevailing wisdom was that fat cells were primarily storage bins where unwanted excess calories could hang out silently on the sidelines. But that was a grossly misguided perspective. Today we know that fat cells do more than simply store calories; they are far more involved in human physiology. Masses of body fat form complex, sophisticated hormonal organs that are anything but passive. You read that right: Fat is an organ.18 And it could very well be one of the body’s most industrious organs, serving a lot of functions beyond keeping us warm and insulated. This is especially true of visceral fat—the fat wrapped around our internal, “visceral” organs such as the liver, kidneys, pancreas, heart, and intestines. Visceral fat has also gotten a lot of press lately: We know now that this type of fat is the most devastating to our health. We may lament our thunder thighs, under-arm curtains, love handles, cellulite, and big butts, but the worst kind of fat is the kind many of us cannot even see, feel, or touch. In extreme cases we do see it in the bulging bellies and muffin tops that are the outward signs of fat-enveloped internal organs belowdecks. (For this very reason, waist circumference is often used as a measure of “health,” as it predicts future health challenges and mortality; the higher your waist circumference, the higher your risk for disease and death.19)

It’s well documented that visceral fat is uniquely capable of triggering inflammatory pathways in the body, as well as producing signaling molecules that disrupt the body’s normal course of hormonal actions.20 This, in turn, keeps the cascade of negative effects from visceral fat going. In addition, visceral fat does more than just generate inflammation down the road through a chain of biological events; visceral fat itself becomes inflamed. This kind of fat houses tribes of inflammatory white blood cells. In fact, the hormonal and inflammatory molecules produced by visceral fat get dumped directly into the liver, which, as you can imagine, responds with another round of ammunition (i.e., inflammatory reactions and hormone-disrupting substances). Long story short: Visceral fat is more than merely a predator lurking behind a tree; it is an enemy that is armed and dangerous. The number of health conditions now linked to visceral fat is tremendous, from the obvious ones such as obesity and metabolic syndrome to the not-so-obvious—cancer, autoimmune disorders, and brain disease.

The dots connecting excessive body fat, obesity, and brain dysfunction are not hard to follow given the information you’ve already learned in this book. Excessive body fat increases not only insulin resistance, but also the production of inflammatory chemicals that play directly into brain degeneration.

In a 2005 study, the waist-to-hip ratios of more than 100 individuals were compared to structural changes in their brains.21 The study also looked at brain changes in relation to fasting blood sugar and insulin levels. What the authors wanted to determine was whether or not a relationship existed between the brain’s structure and the size of a person’s belly. And the results were striking. Essentially, the larger a person’s waist-to-hip ratio (i.e., the bigger the belly), the smaller the brain’s memory center, the hippocampus. The hippocampus plays a critical role in memory, and its function is absolutely dependent upon its size. As your hippocampus shrinks, your memory declines. More striking still, the researchers found that the higher the waist-to-hip ratio, the higher the risk for small strokes in the brain, also known to be associated with declining brain function. The authors stated: “These results are consistent with a growing body of evidence that links obesity, vascular disease, and inflammation to cognitive decline and dementia.” Other studies since then have confirmed the finding: For every excess pound put on the body, the brain gets a little smaller. How ironic that the bigger the body gets, the smaller the brain gets.

In a joint research project between UCLA and the University of Pittsburgh, neuroscientists examined brain images of ninety-four people in their seventies who had participated in an earlier study of cardiovascular health and cognition.22 None of the participants had dementia or other cognitive impairments, and they were followed for five years. What these researchers found was that the brains of obese people—defined by having a body mass index above 30—looked sixteen years older than those of their healthy counterparts of normal weight. And the brains of those who were overweight—defined by having a body mass index between 25 and 30—looked eight years older than those of their leaner counterparts. More specifically, the clinically obese people had 8 percent less brain tissue, while the overweight had 4 percent less brain tissue compared to normal-weight individuals. Much of the tissue was lost in the frontal and temporal lobe regions of the brain, where we make decisions and store memories, among other things. The authors of the study rightfully pointed out that their findings could have serious implications for aging, overweight, or obese individuals, including a heightened risk for Alzheimer’s disease.
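Since these two studies hinge on waist-to-hip ratio and body mass index, here is a minimal sketch of how each measure is computed, using the BMI cutoffs cited above (25 to 30 overweight, above 30 obese). The function names, units, and sample numbers are my own illustrative choices, not anything taken from the studies themselves.

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """BMI: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

def bmi_category(bmi: float) -> str:
    """Apply the cutoffs cited in the UCLA/Pittsburgh study above."""
    if bmi > 30:
        return "obese"
    if bmi >= 25:
        return "overweight"
    return "normal or below"

def waist_to_hip_ratio(waist: float, hip: float) -> float:
    """The 'size of the belly' measure from the 2005 study: waist
    circumference divided by hip circumference (any consistent units)."""
    return waist / hip

# Hypothetical example: 95 kg, 1.75 m tall, 102 cm waist, 100 cm hips.
bmi = body_mass_index(95, 1.75)
print(f"BMI {bmi:.1f} -> {bmi_category(bmi)}")             # BMI 31.0 -> obese
print(f"waist-to-hip {waist_to_hip_ratio(102, 100):.2f}")  # waist-to-hip 1.02
```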

Without a doubt, vicious cycles are at play here, each factor feeding the next. Genetics could affect one’s propensity to overeat and gain weight, and this then factors into activity levels, insulin resistance, and risk for diabetes. Diabetes then affects weight control and blood sugar balance. Once a person becomes diabetic and sedentary, it’s inevitable that a breakdown in tissues and organs occurs, and not just in the brain. What’s more, once the brain begins to degenerate and physically shrink, it begins to lose its ability to function properly. That is to say, the brain’s appetite and weight-control centers won’t be firing on all cylinders and could actually be misfiring, and this again feeds the vicious cycle.

It is important to understand that weight loss shouldn’t wait, because damaging changes take place as soon as an individual begins to carry excess body fat. To some degree, we can predict whose brain will suffer thirty years from now simply by measuring body fat. In a 2008 report, California scientists combed through the records of more than sixty-five hundred people who were evaluated in the mid-1960s to 1970s.23 They wanted to know: Who got dementia? When these folks were first evaluated an average of thirty-six years earlier, various measurements were made of their bodies to determine how much fat they had. These included the size of the belly, thigh circumference, and height and weight. Roughly three decades later, those individuals who had more body fat had a dramatically increased risk for dementia. Of the original group, 1,049 were diagnosed as having dementia. When the scientists compared the group with the least body fat to the group with the highest body fat, they found that those in the highest body fat group had an almost twofold increased risk of dementia. The authors reported, “As is the case for diabetes and cardiovascular disease, central obesity [belly fat] is also a risk factor for dementia.”

THE POWER OF WEIGHT LOSS (BESIDES WHAT YOU ALREADY KNOW)

As study after study has proven, weight loss through dieting can have a dramatic effect on insulin signaling and insulin sensitivity. In one report, doctors evaluated 107 obese individuals at least sixty-five years of age over a one-year period and studied their insulin response to an oral dose of glucose.24 The researchers wanted to measure the difference among three distinct groups: those who were put on a weight-loss program, those who were assigned an exercise program, and those put on a diet and exercise program. A fourth group of people was designated as the control for purposes of further comparison. The results six months later? People in the weight-loss group had a 40 percent increase in their insulin sensitivity. So did people in the group that went on a weight-loss program and exercised. The group that didn’t embark on a weight-loss program but did exercise, however, showed no change in insulin sensitivity. When the study was finally concluded after one year, insulin sensitivity had improved by a whopping 70 percent in those who lost weight; people who were exercising while on a diet and lost weight had an 86 percent improvement in their sensitivity to insulin. But the third group, the one that engaged in physical activity without dieting and losing weight, remained far behind. Even after one year they had absolutely no change in their insulin sensitivity.

So the take-home lesson is clear: You can improve insulin sensitivity and reduce your risk of diabetes (not to mention all manner of brain diseases) simply by making lifestyle changes that melt that fat away. And if you add exercise to the dieting, you’ll stand to gain even bigger benefits. By now you should know that I’m going to prescribe a low-carb diet rich in healthy fats, including cholesterol. And don’t take my word for it. Just turn to the latest studies proving the power of this type of diet. Last year the Journal of the American Medical Association published a study of the effects of three popular diets on a group of overweight or obese young adults.25 Each of the participants tried each of the diets for a month—one was low-fat (60 percent of the calories came from carbohydrate, 20 percent from fat, and 20 percent from protein), one was low-glycemic (40 percent of the calories came from carbohydrate, 40 percent from fat, and 20 percent from protein), and the third was a very low carbohydrate diet (10 percent of the calories came from carbohydrates, 60 percent from fat, and 30 percent from protein). All of the diets provided the same number of calories, but those on the low-carb, high-fat diet burned the most calories. The study also looked at insulin sensitivity during the four-week period on each diet, finding that the low-carb diet triggered the biggest improvement in insulin sensitivity—almost twice that of the low-fat diet. Triglycerides, a powerful cardiovascular risk marker, averaged 66 mg/dL in the low-carb group and 107 mg/dL in the low-fat group. (As an aside, elevated triglyceride levels are also a hallmark of too many carbs in the diet.) The authors pointed out that the lab results they measured in the low-fat diet showed changes in people’s blood chemistry that left them vulnerable to weight gain. Clearly, the best diet for maintaining weight loss is a low-carbohydrate, high-fat one.
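To make those percentage splits concrete on the plate, you can convert calorie shares into grams per day using the standard Atwater factors (roughly 4 calories per gram for carbohydrate and protein, 9 for fat). A minimal sketch follows; the 2,000-calorie daily intake is an assumed round number for illustration, not a figure from the JAMA study.

```python
# Standard Atwater energy densities (kcal per gram).
KCAL_PER_GRAM = {"carbohydrate": 4, "fat": 9, "protein": 4}

# Calorie shares for the three diets described in the JAMA study above.
DIETS = {
    "low-fat":       {"carbohydrate": 0.60, "fat": 0.20, "protein": 0.20},
    "low-glycemic":  {"carbohydrate": 0.40, "fat": 0.40, "protein": 0.20},
    "very low-carb": {"carbohydrate": 0.10, "fat": 0.60, "protein": 0.30},
}

def grams_per_day(shares, total_kcal):
    """Convert each macronutrient's calorie share into grams per day."""
    return {m: total_kcal * share / KCAL_PER_GRAM[m]
            for m, share in shares.items()}

for name, shares in DIETS.items():
    grams = grams_per_day(shares, total_kcal=2000)  # assumed daily intake
    print(name, {m: round(g) for m, g in grams.items()})
# very low-carb -> {'carbohydrate': 50, 'fat': 133, 'protein': 150}
```

At 2,000 calories a day, the very low carbohydrate arm works out to only about 50 grams of carbs, compared with the 300 grams implied by the low-fat arm, which is why the two diets feel so different in practice.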

Many other studies have arrived at the same conclusion: A low-carb, high-fat diet will outperform a low-fat, high-carb diet any day, and by virtually every measure in the body, from its internal chemistry to its external waistline. And when we consider all of the parameters that affect health, and specifically brain health, such as weight loss, insulin sensitivity, blood sugar control, and even C-reactive protein, a low-carbohydrate diet is substantially more effective than any other diet. Those other diets will result in outcomes that heighten your risk for a multitude of brain dysfunctions, from daily nuisances like headaches to chronic migraines, anxiety disorders, ADHD, and depression. And if the thought of being as sharp as a tack until your last breath on earth isn’t enough to motivate you, then consider all the benefits that your heart (and virtually every organ in your body) will gain by ditching a low-fat diet. In March 2013, the New England Journal of Medicine published a large landmark study showing that people age fifty-five to eighty who ate a Mediterranean diet were at lower risk of heart disease and stroke—by as much as 30 percent—than those on a typical low-fat diet.26 The results were so profound that scientists halted the study early because the low-fat diet proved too damaging for the people eating lots of commercially baked goods rather than sources of healthy fats. The Mediterranean diet is famous for being rich in olive oil, nuts, beans, fish, fruits and vegetables, and even wine with meals. Although it does allow room for grains, it’s very similar to my dietary protocol. In fact, if you modify the traditional Mediterranean diet by removing all gluten-containing foods and limiting sugary fruits and non-gluten carbs, you have yourself the perfect grain-brain-free diet.

AN APPLE A DAY?

No, an apple a day may not keep the doctor away. Now that I’ve held so many of your favorite foods in contempt, I can hear the uncertainty: “How can the body live on fat and never get fat?” Ah, it’s an excellent question. I’m going to tackle that very conundrum shortly and settle any confusion about how you can live—and thrive—on fats. It sounds absurd to think we can live on virtually no carbs but copious amounts of fat and cholesterol in our diet. But we can, and we should if we’re going to protect our genome. Despite what food marketers would have you believe, a fat-based diet has been shaping our genome for the past 2.6 million years. Why change that? As you’ve already read, when we did, we got fat.

The story of reversing this trend and gaining back the lean, toned, lithe bodies we’re designed to have, and a sharp brain to boot, starts with a look at the brain’s fundamental properties.