It’s important to understand that when you have a blood cholesterol test, the number reported is
actually 75 to 80 percent derived from what your body manufactures and not
necessarily what you’ve eaten. In fact, foods that are high in cholesterol actually decrease the body’s
production of cholesterol. We all make up to 2,000 milligrams of cholesterol every day because we
desperately need it, and this is several times the amount found in our diets. But despite this amazing
ability, it’s critical to obtain cholesterol from dietary sources. Our bodies much prefer that we “spoon-feed”
our cholesterol from the foods we eat rather than manufacture it internally, which is a complex
multistep biological process that taxes the liver. Dietary cholesterol is so important that your body
absorbs as much as it can for use.
So what happens if you restrict your cholesterol intake, as so many people do today? The body
sends out an alarm that indicates crisis (famine). Your liver senses this signal and begins to produce
an enzyme called HMG-CoA reductase, which helps make up for the deficit by using carbohydrates in
the diet to produce an excess supply of cholesterol. (This is the same enzyme that statins target.) As
you can likely predict, it’s a Molotov cocktail in the works: As you eat excessive carbohydrates while
lowering your cholesterol intake, you incite a steady and punishing overproduction of cholesterol in
the body. The only way to stop this internal pathway run amok is to consume an adequate amount of
dietary cholesterol and back way off on carbs. This explains why my “high-cholesterol” patients
who go on my diet can safely return their levels to normal without drugs while enjoying cholesterol-rich
foods.
IS THERE SUCH A THING AS DANGEROUSLY “HIGH CHOLESTEROL”?
Cholesterol is at most a minor player in coronary heart disease and represents an extremely poor predictor of heart attack
risk. Over half of all patients hospitalized with a heart attack have cholesterol levels in the “normal” range. The idea that
aggressively lowering cholesterol levels will somehow magically and dramatically reduce heart attack risk has now been
fully and categorically refuted. The most important modifiable risk factors related to heart attack risk include smoking,
excess alcohol consumption, lack of aerobic exercise, overweight, and a diet high in carbohydrates.
So when I see patients with cholesterol levels of, say, 240 mg/dl or higher, it’s almost a given that they will have
received a prescription for a cholesterol-lowering medication from their general practitioners. This is wrong in thought and
action. As discussed, cholesterol is one of the most critical chemicals in human physiology, especially as it relates to brain
health. The best lab report to refer to in determining one’s health status is hemoglobin A1C, not cholesterol levels. It is
rarely, if ever, appropriate to consider high cholesterol alone to be a significant threat to health.
A good question: Who suffers from high cholesterol? Thirty years ago, the answer was anyone
whose cholesterol level was more than 240 and who had other risk factors, such as being overweight
and smoking. The definition changed after the Cholesterol Consensus Conference in 1984; then it
became anyone with a cholesterol level over 200, regardless of other risk factors. Today, the threshold
is down to 180. And if you’ve had a heart attack, you’re in a totally different category: No matter how
low your cholesterol level is, you’ll likely be prescribed a cholesterol-lowering medicine and told to
maintain a low-fat diet.
SEX ED: IT’S ALL IN YOUR HEAD
Okay. So cholesterol is a good thing. But it’s not just about your brain’s wit, physical health, and
future longevity. It’s also about a very important part of your lifestyle that typically gets shoved under
the carpet in serious health books. I’m talking about your sex life. How sparky is it?
Although I’m a neurologist, I treat a fair share of people who suffer from sexual dysfunction and
either are impotent and avoid sex altogether or hoard bottles of pills to help them out. You know
about these pills—the ones that get advertised like candy on the evening news and promise to
transform your sex life. My patients with sexual health woes obviously don’t come to me for that
specifically, but the problem often surfaces when I ask them about that part of their life in addition to any
neurological issues I am addressing.
A quick anecdote. A seventy-five-year-old retired engineer came to see me with a variety of
complaints, including insomnia and depression. He had been taking sleeping pills for the past forty
years, and his depression had worsened in the two to three months prior to his appointment. At the
time I saw him, he was actually taking a few drugs: an antidepressant, a medication for anxiety, and
Viagra for erectile dysfunction. I first checked him for gluten sensitivity and discovered, to his
surprise, a positive panel. He agreed to adopt a gluten-free, high-fat diet, and we next communicated
by telephone after about one month. That’s when he had magnificent news: His depression had
improved, and he no longer needed to take Viagra in order to have sex with his wife. He thanked me
profusely.
Most everyone can agree that sex has everything to do with what’s going on in the brain. It’s an act
that’s deeply tied to emotions, impulses, and thoughts. But it’s also inexorably connected to hormones
and blood chemistry. Without question, if you’re depressed and not sleeping well, like my engineer
patient, sex is the last thing on your mind. But one of the most common reasons for impotence is
actually neither of these two conditions. It’s what I’ve been talking about through much of this
chapter: abysmally low cholesterol levels. And the studies to date have achieved proof of concept:
Unless you have healthy testosterone levels (this goes for both men and women), you’re not going to
have a hot sex life, if any at all. And what makes testosterone? Cholesterol. What are millions of
Americans doing today? Lowering their cholesterol levels through diet and/or taking statins. In the
meanwhile, they are lowering their libido and ability to perform. Is it any wonder there’s an epidemic
of erectile dysfunction (ED) and demand for ED drugs today, not to mention (perhaps ironically)
testosterone replacement therapy?
Plenty of studies have confirmed these connections.36 Decreased libido is one of the most common
complaints among those taking statins, and lab reports have repeatedly demonstrated low testosterone
in statin consumers.37 Those on statins are twice as likely to have low testosterone levels. Luckily,
this condition is reversible by stopping the statin and increasing cholesterol intake. There are actually
two ways that statins can lower testosterone. The first is by directly lowering levels of cholesterol.
The second is by interfering with the enzymes that create active testosterone.
One study that came out in the United Kingdom in 2010 looked at 930 men with coronary heart
disease and measured their testosterone levels.38 Low testosterone was found in 24 percent of the
patients, and the risk of dying was 12 percent in those with normal testosterone but 21 percent in those
with low testosterone. The conclusion was staring them in the face: If you have coronary disease and
low testosterone, you’re at much greater risk of dying. So again we are giving statin medications to
lower cholesterol, which lowers testosterone… and lower testosterone increases the risk of dying. Is
this crazy or what?
I rest my case.
THE SWEET TRUTH
I’ve covered a lot of ground in this chapter, mostly dealing with the role of fats on the brain. But we
now have to ask ourselves the following: What happens when you inundate the brain with sugar
instead? I started this chapter by addressing the ills of carbohydrates on our bodies, but I’ve saved the
conversation about this particularly devastating carbohydrate for its own chapter. Unfortunately, this
is a subject area that’s gotten remarkably little attention in the press. We increasingly hear about the
relationship between sugar and “diabesity,” sugar and heart disease, sugar and fatty livers, sugar and
metabolic syndrome, sugar and risk for cancer, and so on… but sugar and brain dysfunction? It’s time you
got up close and personal with your brain on sugar.
CHAPTER 4
Not a Fruitful Union
This Is Your Brain on Sugar (Natural or Not)
Evolutionarily, sugar was available to our ancestors as fruit for only a few months a year (at
harvest time), or as honey, which was guarded by bees. But in recent years, sugar has been
added to nearly all processed foods, limiting consumer choice. Nature made sugar hard to get;
man made it easy.
—DR. ROBERT LUSTIG ET AL.1
Sugar. Whether it’s from a lollipop, Lucky Charms, or a slice of cinnamon-raisin bread, we all
know that this particular carbohydrate is not the healthiest of ingredients, especially when it’s
consumed in excess or comes from refined or processed forms such as high-fructose corn syrup. We
also know that sugar is partly to blame for challenges with our waistlines, appetites, blood sugar
control, obesity, type 2 diabetes, and insulin resistance. But what about sugar and the brain?
In 2011, Gary Taubes, the author of Good Calories, Bad Calories,2 wrote an excellent piece for the
New York Times titled “Is Sugar Toxic?”3 In it, he chronicles not just the history of sugar in our lives
and food products, but the evolving science behind understanding how sugar affects our bodies. In
particular, he showcases the work of Robert Lustig, a specialist in pediatric hormone disorders and the
leading expert in childhood obesity at the University of California, San Francisco, School of
Medicine, who makes a case for sugar being a “toxin” or a “poison.” But Lustig doesn’t harp so much
on the consumption of these “empty calories”; his issue with sugar is that it has unique characteristics,
specifically in the way the various kinds of sugar are metabolized by the human body.
Lustig likes to use the phrase “isocaloric but not isometabolic” when he describes the difference
between pure glucose, the simplest form of sugar, and table sugar, which is a combination of glucose
and fructose. (Fructose, which I’ll get to in a moment, is a type of naturally occurring sugar found
chiefly in fruit and honey.) When we eat 100 calories of glucose from a potato, for instance, our
bodies metabolize it differently—and experience different effects—than if we were to eat 100 calories
of sugar comprising half glucose and half fructose. Here’s why.
Your liver takes care of the fructose component of sugar. Glucose from other carbs and starches, on
the other hand, is processed by every cell in the body. So consuming both types of sugar (fructose and
glucose) at the same time means your liver has to work harder than if you ate the same number of
calories from glucose alone. And your liver will also be taxed if it’s hit with liquid forms of these
sugars, such as those found in soda or fruit juices. Drinking liquid sugar is not the same as eating, say, an
equivalent dose of sugar in whole apples. Fructose, by the way, is the sweetest of all naturally
occurring carbohydrates, which probably explains why we love it so much. But contrary to what you
might think, it has the lowest glycemic index of all the natural sugars. The reason is simple: Because
the liver metabolizes most of the fructose, it has no immediate effect on our blood sugar and insulin
levels, unlike sugar or high-fructose corn syrup, whose glucose ends up in general circulation and
raises blood sugar levels. Don’t let that fact fool you, however. While fructose may not have an
immediate effect, it has more long-term effects when it’s consumed in sufficient quantities from
unnatural sources. And the science is well documented: Consuming fructose is associated with
impaired glucose tolerance, insulin resistance, high blood fats, and hypertension. And because it does
not trigger the production of insulin and leptin, two key hormones in regulating our metabolism, diets
high in fructose lead to obesity and its metabolic repercussions. (I will clarify later what this means
for those who enjoy eating lots of fruit. Fortunately, for the most part, you can have your fruit and eat
it, too. The quantity of fructose in most whole fruit pales in comparison to the levels of fructose in
processed foods.)
We hear about sugar and its effects on virtually every other part of the body except for the brain.
This, again, is a subject area that’s gotten remarkably little attention in the press. The questions to ask,
and which I’ll answer in this chapter, are:
What does excess sugar consumption do to the brain?
Can the brain distinguish between different types of sugar? Does it “metabolize” sugar
differently depending on where it’s coming from?
If I were you, I’d put down that biscuit or biscotti you’re having with your coffee and buckle up.
After reading this chapter, you’ll never look at a piece of fruit or sugary treat in quite the same way.
SUGAR AND CARBS 101
Let me begin by defining a few terms. What, exactly, is the difference between table sugar, fruit sugar,
high-fructose corn syrup, and the like? Good question. As I’ve said, fructose is a type of sugar
naturally found in fruit and honey. It’s a monosaccharide just like glucose, whereas table sugar
(sucrose)—the white granulated stuff we sprinkle in coffee or dump into a bowl of cookie batter—is a
combination of glucose and fructose, thus making it a disaccharide (two molecules linked together).
High-fructose corn syrup, which is what we find in our sodas, juices, and many processed foods, is yet
another combination of molecules dominated by fructose—it’s 55 percent fructose, 42 percent
glucose, and 3 percent other carbohydrates.
High-fructose corn syrup was introduced in 1978 as a cheap replacement for table sugar in
beverages and food products. No doubt you’ve heard about it in the media, which has attacked this
artificially manufactured ingredient for being the root cause of our obesity epidemic. But this misses
the point. While it’s true we can blame our bulging waistlines and diagnoses of related conditions
such as obesity and diabetes on our consumption of high-fructose corn syrup, we can also point to all
other sugars as well since they are all carbohydrates, a class of biomolecules that share similar
characteristics. Carbohydrates are simply long chains of sugar molecules, as distinguished from fat
(chains of fatty acids), proteins (chains of amino acids), and DNA. But you already know that not all
carbohydrates are created equal. And not all carbohydrates are treated equally by the body. The
differentiating feature is how much a certain carbohydrate will raise blood sugar and, in effect,
insulin. Meals that are higher in carbohydrate, especially in simple glucose, cause the pancreas to
increase its insulin output in order to store the blood sugar in cells. During digestion, carbohydrates
are broken down and sugar is liberated into the bloodstream, triggering that insulin release so glucose
can penetrate cells. Over time, chronically higher blood sugar demands ever-greater insulin output
from the pancreas.
The carbs that trigger the biggest surge in blood sugar are typically the most fattening for that very
reason. They include anything made with refined flour (breads, cereals, pastas); starches such as rice,
potatoes, and corn; and liquid carbs like soda, beer, and fruit juice. They all get digested quickly,
flooding the bloodstream with glucose and stimulating a surge in insulin, which then packs
away the excess calories as fat. What about the carbs in a vegetable? Those carbs, especially the ones
in green vegetables such as broccoli and spinach, are tied up with indigestible fiber, so they take
longer to break down. The fiber essentially slows down the process, causing a slower funneling of
glucose into the bloodstream. Plus, vegetables contain more water relative to their weight than
starches, and this further dampens the blood sugar response. When we eat whole fruits, which
obviously contain fruit sugar, the water and fiber will also “dilute” the blood sugar effect. If you take,
for instance, a peach and a baked potato of equal weight, the potato will have a much bigger effect on
blood sugar than the watery, fibrous peach. That’s not to say the peach, or any other fruit for that
matter, won’t cause problems.4
Our caveman ancestors did in fact eat fruit, but not every day of the year. We haven’t yet evolved
to be able to handle the copious amounts of fructose we consume today—especially when we get our
fructose from manufactured sources. Natural fruit has relatively little sugar, when compared to, say, a
can of regular soda, which has a massive amount. A medium-sized apple contains about 44 calories of
sugar in a fiber-rich blend thanks to the pectin; by contrast, a 12-ounce can of Coke or Pepsi contains
nearly twice that—80 calories of sugar. If you juice several apples and concentrate the liquid down to
a 12-ounce beverage (thereby losing the fiber), lo and behold you get a blast of 85 sugar calories that
could just as well have come from a soda. When that fructose hits the liver, most of it gets converted
to fat and sent to our fat cells. No wonder fructose was called the most fattening carbohydrate more
than forty years ago by biochemists. And when our bodies get used to performing this simple
conversion with every meal, we can fall into a trap in which even our muscle tissue becomes resistant
to insulin. Gary Taubes describes this domino effect brilliantly in Why We Get Fat: “So, even though
fructose has no immediate effect on blood sugar and insulin, over time—maybe a few years—it is a
likely cause of insulin resistance and thus the increased storage of calories as fat. The needle on our
fuel-partitioning gauge will point toward fat storage, even if it didn’t start out that way.”5
The most disturbing fact about our addiction to sugar is that when we combine fructose and
glucose (which we often do when we eat foods made with table sugar), the fructose might not do much
to our blood sugar right away, but the accompanying glucose takes care of that—stimulating insulin
secretion and alerting the fat cells to prepare for more storage. The more sugars we eat, the more we
tell our bodies to transfer them to fat. This happens not only in the liver, leading to a condition called
fatty liver disease, but elsewhere in the body as well. Hello, love handles, muffin tops, beer bellies,
and the worst kind of fat of all—invisible visceral fat that hugs our vital organs.
I love how Taubes draws a parallel between the cause-and-effect relationship uniting carbohydrates
and obesity, and the link between smoking and cancer: If the world had never invented cigarettes, lung
cancer would be a rare disease. Likewise, if we didn’t eat such high-carb diets, obesity would be a rare
condition.6 I’d bet that other related conditions would be uncommon as well, including diabetes, heart
disease, dementia, and cancer. And if I had to name the kingpin here in terms of avoiding all manner
of disease, I’d say “diabetes.” That is to say, don’t become diabetic.
THE DEATH KNELL IN DIABETES
I cannot reiterate enough the importance of avoiding the path to diabetes, and if diabetes is already a
card you’re playing with, then keeping blood sugars balanced is key. In the United States there are
close to 11 million adults sixty-five years or older with type 2 diabetes, which speaks volumes to the
magnitude of the potential catastrophe on our hands if all of these individuals—plus the ones who
haven’t been officially diagnosed yet—develop Alzheimer’s. The data supporting the relationship
between diabetes and Alzheimer’s disease is profound, but it’s important to understand that diabetes is
also a powerful risk factor for simple cognitive decline. This is especially true in individuals whose
diabetes is under poor control. Case in point: In June 2012, the Archives of Neurology published an
analysis of 3,069 elderly adults to determine if diabetes increased the risk of cognitive decline and if
poor blood sugar control was related to worse cognitive performance.7 When first evaluated, about 23
percent of the participants actually had diabetes, while the remaining 77 percent did not (the
researchers intentionally chose a “diverse group of well-functioning older adults”). A small
percentage of that 77 percent, however, went on to develop diabetes during the nine-year study. At the
beginning of the study a panel of cognitive tests was performed, and over the next nine years these
tests were repeated.
The conclusion stated the following: “Among well-functioning older adults, DM [diabetes
mellitus] and poor glucose control among those with DM are associated with worse cognitive function
and greater decline. This suggests that severity of DM may contribute to accelerated cognitive aging.”
The researchers demonstrated a fairly dramatic difference in the rate of mental decline among those
with diabetes as compared to the non-diabetics. More interesting still, they also noted that even at the
start of the study, baseline cognitive scores of the diabetics were already lower than those of the controls. The
study also found a direct relationship between the rate of cognitive decline and higher levels of
hemoglobin A1C, a marker of blood glucose control. The authors stated, “Hyperglycemia (elevated
blood sugar) has been proposed as a mechanism that may contribute to the association between
diabetes and reduced cognitive function.” They went on to state that “hyperglycemia may contribute
to cognitive impairment through such mechanisms as the formation of advanced glycation end
products, inflammation, and microvascular disease.”
Before I explain what advanced glycation end products are and how they are formed, let’s
turn to one more study done earlier, in 2008. This one, from the Mayo Clinic and published in the
Archives of Neurology, looked at the effects of the duration of diabetes. In other words, does how long
one has diabetes play into the severity of cognitive decline? You bet. The numbers are eye-popping:
According to the Mayo’s findings, if diabetes began before a person was sixty-five years old, the risk
for mild cognitive impairment was increased by a whopping 220 percent. And the risk of mild
cognitive impairment in individuals who had diabetes for ten years or longer was increased by 176
percent. If people were taking insulin, their risk was increased by 200 percent. The authors described a
proposed mechanism to explain the connection between persistent high blood sugar and Alzheimer’s
disease: “increased production of advanced glycation end products.”8 Just what are these glycation
end products cropping up in the medical literature in reference to cognitive decline and accelerated
aging? I mentioned them briefly in the previous chapter, and I will explain their significance in the
next section.
ONE MAD COW AND MANY CLUES TO NEUROLOGICAL DISORDERS
I remember the hysteria that swept the globe in the mid-1990s when fears of mad cow disease spread
quickly as people in Britain began to document evidence of transmission of the disease from cattle to
humans. In the summer of 1996, Peter Hall, a twenty-year-old vegetarian, died of the human form of
mad cow, called variant Creutzfeldt-Jakob disease. He’d contracted it from eating beef burgers as a
child. Soon thereafter, other cases were confirmed, and countries, including the United States, started
banning beef imports from Britain. Even McDonald’s temporarily stopped serving burgers in some
areas until scientists could ferret out the origins of the outbreak and measures were taken to eradicate
the problem. Mad cow disease, also called bovine spongiform encephalopathy, is a rare neurological
disorder of cattle; the nickname comes from the odd behavior sick cows express when
infected. Both the bovine and human forms are prion diseases, which are caused by deviant proteins that inflict
damage as they spread aggressively from cell to cell.
While mad cow disease isn’t usually classified with classic neurodegenerative diseases such as
Alzheimer’s, Parkinson’s, and Lou Gehrig’s disease, all the conditions have a similar deformation in
the structure of proteins needed for normal, healthy functioning. Granted, Alzheimer’s, Parkinson’s,
and Lou Gehrig’s disease are not transmissible to people like mad cow is, but they nevertheless result
in similar features that scientists are just beginning to understand. And it all boils down to deformed
proteins.
Much in the way we now know that dozens of degenerative diseases are linked by inflammation,
we also know that dozens of those same diseases—including type 2 diabetes, cataracts,
atherosclerosis, emphysema, and dementia—are linked to deformed proteins. What makes prion
diseases so unique is the ability of those abnormal proteins to confiscate the health of other cells,
turning normal cells into misfits that lead to brain damage and dementia. It’s similar to cancer in that
one cell hijacks the normal regulation of another cell and creates a new tribe of cells that don’t act like
healthy ones. Working in laboratories with mice, scientists are finally collecting evidence to show that
major neurodegenerative conditions follow parallel patterns.9
Proteins are among the most important structures in the body—they practically form and shape the
entire body itself, carrying out functions and acting like master switches to our operating manual. Our
genetic material, or DNA, codes for our proteins, which are then produced as a string of amino acids.
They need to achieve a three-dimensional shape to carry out their tasks, such as regulating the body’s
processes and guarding against infection. Proteins gain their shape through a special folding
technique; in the end, each protein achieves a distinctive shape that helps determine its unique
function.
Obviously, deformed proteins cannot serve their function well or at all, and unfortunately, mutant
proteins cannot be fixed. If they fail to fold properly into their correct shape, at best they are inactive
and at worst, toxic. Usually cells have built-in technology to extinguish deformed proteins, but aging
and other factors can interfere with this process. When a toxic protein is capable of inducing other
cells to create mis-folded proteins, the result can be disastrous. Which is why the goal for many
scientists today is to find a way to stop the cell-to-cell spread of misshapen proteins and literally halt
these diseases in their tracks.
Stanley Prusiner, the director of the Institute for Neurodegenerative Diseases at the University of
California, San Francisco, discovered prions, which earned him the Nobel Prize in 1997. In 2012, he
was part of a team of researchers who authored a landmark paper published in the Proceedings of the
National Academy of Sciences showing that the amyloid-beta protein associated with Alzheimer’s
shares prion-like characteristics.10 In their experiment, they were able to follow the progression of
disease by injecting amyloid-beta protein into one side of mice’s brains and observing its effects.
Using a light-generating molecule, they could see the marauding proteins collect as the mice’s brains
lit up—a toxic chain of events that’s similar to what happens in the Alzheimer’s brain.
This discovery holds clues to more than brain disease. Scientists who focus on other areas of the
body also have been looking at the impact of shape-shifting proteins. In fact, “mad” proteins may play
a role in a range of diseases. Type 2 diabetes, for example, can be seen from this perspective when we
consider the fact that people with diabetes have deformed proteins in their pancreas that can
negatively affect insulin production (which raises the question: Does chronic high blood sugar cause
the deformation?). In atherosclerosis, the cholesterol buildup typical of the disease could be caused by
protein mis-folding. People with cataracts have rogue proteins that collect in the eye lens. Cystic
fibrosis, a hereditary disorder caused by a defect in the DNA, is characterized by improper folding of
the CFTR protein. And even a type of emphysema owes its devastation to abnormal proteins that build
up in the liver and never reach the lungs.
Okay, so now that we’ve established that wayward proteins play a role in disease and especially
neurological degeneration, the next question is, what causes the proteins to mis-fold? With a condition
like cystic fibrosis, the answer is more clear-cut because we have identified a specific genetic defect.
But what about other ailments that have mysterious origins, or that don’t manifest until later in life?
Let’s turn to those glycation end products.
Glycation is the biochemical term for the bonding of sugar molecules to proteins, fats, and amino
acids; the spontaneous reaction that causes the sugar molecule to attach itself is sometimes referred to
as the Maillard reaction. Louis Camille Maillard first described this process in the early 1900s.11
Although he predicted that this reaction could have an important impact on medicine, not until 1980
did medical scientists turn to it when trying to understand diabetic complications and aging.
This process forms advanced glycation end products (commonly shortened, appropriately, to
AGEs), which cause protein fibers to become misshapen and inflexible. To get a glimpse of AGEs in
action, simply look at someone who is prematurely aging—someone with a lot of wrinkles, sagginess,
discolored skin, and a loss of radiance for their age. What you’re seeing is the physical effect of
proteins hooking up with renegade sugars, which explains why AGEs are now considered key players
in skin aging.12 Or check out a chain-smoker: The yellowing of the skin is another hallmark of
glycation. Smokers have fewer antioxidants in their skin, and the smoking itself increases oxidation in
their bodies and skin. So they cannot combat the by-products of normal processes like glycation
because their bodies’ antioxidant potential is severely weakened and, frankly, overpowered by the
volume of oxidation. For most of us, the external signs of glycation show up in our thirties, when
we’ve accumulated enough hormonal changes and environmental oxidative stress, including sun
damage.
Glycation is an inevitable fact of life, just like inflammation and free radical production to some
degree. It’s a product of our normal metabolism and fundamental in the aging process. We can even
measure glycation using technology that illuminates the bonds formed between sugars and proteins. In
fact, dermatologists are well versed in this process. With Visia complexion-analysis cameras, they can
capture the difference between youth and age just by taking a fluorescent image of children and
comparing it to the faces of older adults. The children’s faces will come out very dark, indicating a
lack of AGEs, whereas the adults’ will beam brightly as all those glycation bonds light up.
Clearly, the goal is to limit or slow down the glycation process. Many anti-aging schemes are now
focused on how to reduce glycation and even break those toxic bonds. But this cannot happen when we
consume a high-carb diet, which speeds up the glycation process. Sugars in particular are rapid
stimulators of glycation, as they easily attach themselves to proteins in the body (and here’s a good bit
of trivia: The number one source of dietary calories in America comes from high-fructose corn syrup,
which increases the rate of glycation by a factor of ten).
When proteins become glycated, at least two important things happen. First, they become much
less functional. Second, once proteins become bonded to sugar, they tend to attach themselves to other
similarly damaged proteins and form cross-linkages that further inhibit their ability to function. But
perhaps far more important is that once a protein is glycated, it becomes the source of a dramatic
increase in the production of free radicals. This leads to the destruction of tissues, damaging fat, other
proteins, and even DNA. Again, glycation of proteins is a normal part of our metabolism. But when
it’s excessive, many problems arise. High levels of glycation have been associated with not only
cognitive decline, but also kidney disease, diabetes, vascular disease, and, as mentioned, the actual
process of aging itself.13
Keep in mind that any protein in the body is subject to being damaged by
glycation and can become an AGE. Because of the significance of this process, medical researchers
around the world are hard at work trying to develop various pharmaceutical ways to reduce AGE
formation. But clearly, the best way to keep AGEs from forming is to reduce the availability of sugar
in the first place.
Beyond just causing inflammation and free radical–mediated damage, AGEs are associated with
damage to blood vessels and are thought to explain the connection between diabetes and vascular
issues. As I noted in the previous chapter, the risk of coronary artery disease is dramatically increased
in diabetics, as is the risk of stroke. Many individuals with diabetes have significant damage to the
blood vessels supplying the brain, and while they may not have Alzheimer’s, they may suffer from
dementia caused by this blood supply issue.
Earlier I explained that LDL—the so-called bad cholesterol—is an important carrier protein
bringing vital cholesterol to brain cells. Only when it becomes oxidized does it wreak havoc on blood
vessels. And we now understand that when LDL becomes glycated (it’s a protein, after all), this
dramatically increases its oxidation.
The link between oxidative stress and sugar cannot be overstated. When proteins are glycated, the
amount of free radicals formed is increased fiftyfold; this leads to loss of cellular function and
eventually cell death.
This calls our attention to the powerful relationship between free radical production, oxidative
stress, and cognitive decline. We know that oxidative stress is directly related to brain degeneration.14
Studies show that damage to lipids, proteins, DNA, and RNA by free radicals happens early in the
journey to cognitive impairment, and long before signs of serious neurological disorders such as
Alzheimer’s, Parkinson’s, and Lou Gehrig’s disease. Sadly, by the time a diagnosis is made, the
damage is already done. The bottom line is that if you want to reduce oxidative stress and the action of
free radicals harming your brain, you have to reduce the glycation of proteins. Which is to say, you
have to diminish the availability of sugar. Pure and simple.
Most doctors routinely measure one glycated protein in their medical practice.
I’ve already mentioned it: hemoglobin A1C. This is the same standard laboratory measurement used
to measure blood sugar control in diabetics. So, while your doctor may be measuring your hemoglobin
A1C from time to time to get an understanding of your blood sugar control, the fact that it’s glycated
protein has vast and extremely important implications for your brain health. But hemoglobin A1C
represents more than just a simple measurement of average blood sugar control over a 90- to 120-day
period.
Hemoglobin A1C is a form of hemoglobin, the oxygen-carrying protein in red blood cells, that has
bound to blood sugar; this binding increases when blood sugar is elevated. While hemoglobin A1C doesn’t give
a moment-to-moment indication of what the blood sugar is, it is extremely useful in that it shows what
the “average” blood sugar has been over the previous ninety days. This is why hemoglobin A1C is
frequently used in studies that try to correlate blood sugar control to various disease processes like
Alzheimer’s, mild cognitive impairment, and coronary artery disease.
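The link between hemoglobin A1C and average blood sugar can be made concrete. As a rough sketch, here is the linear conversion adopted by the American Diabetes Association (the eAG formula itself is standard laboratory practice, not something quoted in this chapter):

```python
def estimated_average_glucose(a1c_percent):
    """Convert a hemoglobin A1C percentage to estimated average
    glucose (eAG) in mg/dL, using the linear formula adopted by
    the American Diabetes Association: eAG = 28.7 * A1C - 46.7."""
    return 28.7 * a1c_percent - 46.7

# A "normal" A1C of 5.5% corresponds to roughly 111 mg/dL average glucose,
# while a diabetic-range A1C of 8% corresponds to roughly 183 mg/dL.
for a1c in (5.5, 6.5, 8.0):
    print(f"A1C {a1c}% -> ~{estimated_average_glucose(a1c):.0f} mg/dL")
```

This is why a small rise in A1C matters so much: each additional percentage point represents nearly 30 mg/dL of extra glucose circulating, around the clock, for months.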
It’s well documented that glycated hemoglobin is a powerful risk factor for diabetes, but it’s also
been correlated with risk for stroke, coronary heart disease, and death from other illnesses. These
correlations have been shown to be strongest with any measurement of hemoglobin A1C above 6.0
percent.
We now have evidence to show that elevated hemoglobin A1C is associated with changes in brain
size. In one particularly profound study, published in the journal Neurology, researchers looking at
MRIs to determine which lab test correlated best with brain atrophy found that the hemoglobin A1C
demonstrated the most powerful relationship.15
When comparing individuals with the lowest hemoglobin A1C (4.4 to 5.2) to those with the highest
(5.9 to 9.0), brain tissue loss in the high-A1C group was almost double over the six-year period. So
hemoglobin A1C is far more than just a marker of blood sugar
balance—and it’s absolutely under your control!
An ideal hemoglobin A1C would be in the 4.8 to 5.4 range. Keep in mind that reducing
carbohydrate ingestion, weight loss, and physical exercise will ultimately improve insulin sensitivity
and lead to a reduction of hemoglobin A1C.
You also should know that there’s now documented evidence proving a direct relationship between
hemoglobin A1C and the future risk of depression. One study looked at more than four thousand men
and women whose average age was sixty-three years and showed a direct correlation between
hemoglobin A1C and “depressive symptoms.”16
Poor glucose metabolism was described as a risk
factor for the development of depression in these adults. The bottom line: The glycation of proteins is
bad news for the brain.
EARLY ACTION
As I’ve already described, having normal blood sugar levels may mean that the pancreas is working
overtime to keep that blood sugar normal. Based upon this understanding, you can see that insulin
levels rise long before blood sugar does and a person becomes diabetic. That’s why it’s
so important to check not only your fasting blood sugar, but also your fasting insulin level. An
elevated fasting insulin level is an indicator that your pancreas is trying hard to normalize your blood
sugar. It’s also a clear signal that you are consuming too much carbohydrate. And make no mistake
about it: Even being insulin resistant is a powerful risk factor for brain degeneration and cognitive
impairment. It’s not enough to look at the diabetes data as it relates to brain disease and be
confident that your risk is low simply because you are not diabetic. And if your blood sugar
happens to be normal, the only way you will know if you are insulin resistant is to have your fasting
blood insulin level checked. Period.
Need more evidence? Consider a study done a few years ago that looked at 523 people aged
seventy to ninety years who did not have diabetes or even elevated blood sugar.17
Many of them were
insulin resistant, however, as determined by their fasting insulin levels. The study revealed that those
individuals who were insulin resistant had a dramatically increased risk of cognitive impairment
compared to those within the normal range. Overall, the lower the insulin level, the better. The
average insulin level in the United States is about 8.8 micro international units per milliliter (µIU/mL)
for adult men and 8.4 for women. But with the degree of obesity and carbohydrate abuse in America,
it’s safe to say that these “average” values are likely much higher than what should be considered
ideal. Patients who are being very careful about their carbohydrate intake might have insulin levels
indicated on their lab report as less than 2.0. This is an ideal situation—a sign that the individual’s
pancreas is not being overworked, blood sugars are under excellent control, there is very low risk for
diabetes, and there is no evidence of insulin resistance. The important point is that if your fasting
insulin level is elevated—anything over five should be considered elevated—it can improve, and I
will show you how to do that in chapter 10.
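To make the cutoffs in this section concrete, here is a minimal sketch. The thresholds (under about 2 µIU/mL ideal, anything over 5 elevated) come straight from the paragraph above; any actual clinical interpretation belongs to your doctor, not to a few lines of code:

```python
def interpret_fasting_insulin(insulin_uiu_per_ml):
    """Rough interpretation of a fasting insulin result (uIU/mL),
    using the cutoffs suggested in the text: below ~2 is ideal,
    and anything over 5 should be considered elevated."""
    if insulin_uiu_per_ml < 2.0:
        return "ideal"
    if insulin_uiu_per_ml <= 5.0:
        return "acceptable"
    return "elevated"

# The U.S. averages cited above (8.8 for men, 8.4 for women) both
# land in the elevated range by this standard.
print(interpret_fasting_insulin(8.8))   # elevated
```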
THE FATTER YOU ARE, THE SMALLER YOUR BRAIN
Most everyone has a pretty good idea that carrying around extra weight is unhealthy. But if you
needed just one more reason to drop the excess pounds, perhaps the fear of losing your mind—
physically and literally—will help motivate you.
When I was studying to be a doctor, the prevailing wisdom was that fat cells were primarily
storage bins where unwanted masses of excess could hang out silently on the sidelines. But that was a
grossly misguided perspective. Today we know that fat cells do more than simply store calories; they
are far more involved in human physiology. Masses of body fat form complex, sophisticated hormonal
organs that are anything but passive. You read that right: Fat is an organ.18
And it could very well be
one of the body’s most industrious organs, serving a lot of functions beyond keeping us warm and
insulated. This is especially true of visceral fat—the fat wrapped around our internal, “visceral”
organs such as the liver, kidneys, pancreas, heart, and intestines. Visceral fat has also gotten a lot of
press lately: We know now that this type of fat is the most devastating to our health. We may lament
our thunder thighs, under-arm curtains, love handles, cellulite, and big butts, but the worst kind of fat
is the kind many of us cannot even see, feel, or touch. In extreme cases we do see it in the bulging
bellies and muffin tops that are the outward signs of fat-enveloped internal organs belowdecks. (For
this very reason, waist circumference is often a measurement of “health,” as it predicts future health
challenges and mortality; the higher your waist circumference, the higher your risk for disease and
death.19)
It’s well documented that visceral fat is uniquely capable of triggering inflammatory pathways in
the body as well as signaling molecules that disrupt the body’s normal course of hormonal actions.20
This, in turn, keeps the cascade of negative effects from visceral fat going. In addition, visceral fat
does more than just generate inflammation down the road through a chain of biological events;
visceral fat itself becomes inflamed. This kind of fat houses tribes of inflammatory white blood cells.
In fact, the hormonal and inflammatory molecules produced by visceral fat get dumped directly into
the liver, which, as you can imagine, responds with another round of ammunition (i.e., inflammatory
reactions and hormone-disrupting substances). Long story short: More than merely a predator lurking
behind a tree, it is an enemy that is armed and dangerous. The number of health conditions now linked
to visceral fat is tremendous, from the obvious ones such as obesity and metabolic syndrome to the
not-so-obvious—cancer, autoimmune disorders, and brain disease.
The dots connecting excessive body fat, obesity, and brain dysfunction are not hard to follow given
the information you’ve already learned in this book. Excessive body fat increases not only insulin
resistance, but also the production of inflammatory chemicals that play directly into brain
degeneration.
In a 2005 study, the waist-to-hip ratios of more than 100 individuals were compared to structural
changes in their brains.21
The study also looked at brain changes in relation to fasting blood sugar and
insulin levels. What the authors wanted to determine was whether or not a relationship existed
between the brain’s structure and the size of a person’s belly. And the results were striking.
Essentially, the larger a person’s waist-to-hip ratio (i.e., the bigger the belly), the smaller the brain’s
memory center, the hippocampus. The hippocampus plays a critical role in memory, and its function
is absolutely dependent upon its size. As your hippocampus shrinks, your memory declines. More
striking still, the researchers found that the higher the waist-to-hip ratio, the higher the risk for small
strokes in the brain, also known to be associated with declining brain function. The authors stated:
“These results are consistent with a growing body of evidence that links obesity, vascular disease, and
inflammation to cognitive decline and dementia.” Other studies since then have confirmed the
finding: For every excess pound put on the body, the brain gets a little smaller. How ironic that the
bigger the body gets, the smaller the brain gets.
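The waist-to-hip ratio used in that 2005 study is simple arithmetic. Here is a minimal sketch; the numeric cutoffs are the WHO’s commonly cited abdominal-obesity thresholds, an assumption on my part, since the study as described here gives no specific values:

```python
def waist_to_hip_ratio(waist_cm, hip_cm):
    """Waist circumference divided by hip circumference
    (any consistent unit works, since the ratio is unitless)."""
    return waist_cm / hip_cm

def abdominal_obesity(waist_cm, hip_cm, sex):
    # WHO cutoffs (an assumption, not from the study described above):
    # a ratio above 0.90 for men or 0.85 for women signals abdominal obesity.
    cutoff = 0.90 if sex == "male" else 0.85
    return waist_to_hip_ratio(waist_cm, hip_cm) > cutoff

print(abdominal_obesity(102, 100, "male"))   # True: ratio 1.02
```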
In a joint research project between UCLA and the University of Pittsburgh, neuroscientists
examined brain images of ninety-four people in their seventies who had participated in an earlier
study of cardiovascular health and cognition.22
None of the participants had dementia or other
cognitive impairments, and they were followed for five years. What these researchers found was that
the brains of obese people—defined by having a body mass index above 30—looked sixteen years
older than their healthy counterparts of normal weight. And those who were overweight—defined by
having a body mass index between 25 and 30—looked eight years older than their leaner counterparts.
More specifically, the clinically obese people had 8 percent less brain tissue, while the overweight had
4 percent less brain tissue compared to normal-weight individuals. Much of the tissue was lost in the
frontal and temporal lobe regions of the brain, the place from which we make decisions and store
memories, among other things. The authors of the study rightfully pointed out that their findings could
have serious implications for aging, overweight, or obese individuals, including a heightened risk for
Alzheimer’s disease.
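The body mass index cutoffs used in the UCLA–Pittsburgh study follow the standard definition, which is easy to compute for yourself:

```python
def body_mass_index(weight_kg, height_m):
    """BMI: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def study_category(bmi):
    # Cutoffs as described in the study above:
    # 25 to 30 is overweight, above 30 is obese.
    if bmi > 30:
        return "obese"
    if bmi >= 25:
        return "overweight"
    return "normal"

bmi = body_mass_index(95, 1.75)   # about 31.0
print(study_category(bmi))        # obese
```

By the study’s findings, someone in the "obese" category here carried roughly 8 percent less brain tissue than a normal-weight peer of the same age.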
Without a doubt, vicious cycles are at play here, each of which is contributing to the other.
Genetics could affect one’s propensity to overeat and gain weight, and this then factors into activity
levels, insulin resistance, and risk for diabetes. Diabetes then affects weight control and blood sugar
balance. Once a person becomes diabetic and sedentary, it’s inevitable that a breakdown in tissues and
organs occurs, and not just in the brain. What’s more, once the brain begins to degenerate and
physically shrink, it begins to lose its ability to function properly. That is to say, the brain’s appetite
and weight-control centers won’t be firing on all cylinders and could actually be misfiring, and this