A commentary by Nora Gedgaudas, CNS, CNT, BCHN
On August 6th, a Science Daily article was released titled, “Paleo diet: Big brains needed carbs.” At the time I was coincidentally down in Florida visiting a close family member painfully dying from Alzheimer’s disease. Upon surreptitiously checking my iPhone for messages, I found I was suddenly receiving a barrage of emails from fans pointing out the just-published article and wondering what I thought. My first thought upon reading the title was “Seriously? You have GOT to be kidding!” The article itself had me shaking my head in utter bewilderment and disbelief. This passes for science?
The premise of this article was clearly predicated on the mistaken idea that glucose must, of absolute necessity, be “the” human brain’s primary source of fuel. In fact, nearly everything about the article rested upon this one assumption. It is among the most misleading foundational ideas taught in medical schools and to mainstream dietitians/nutritionists everywhere: the notion that the brain and body must rely upon glucose as their primary source of fuel. Unfortunately, this misguided assumption is only conditionally true. It is true only if a human being has cultivated a rather unnatural dependence upon glucose as their primary source of fuel through what they choose to habitually eat.
Nature would never have been so stupid as to force a primary dependence upon so volatile and unreliable a source of fuel as blood sugar. Our brains are actually designed to make use of more than one major fuel: sugar (glucose) and fat (in the form of ketones). Fat is at base the human brain’s preferred and most efficient superfuel, but a diet significantly high in carbohydrates (anything close to or over roughly 100 g per day) forces the brain to adapt instead to a less efficient and less dependable reliance on sugar. Thanks to aggressive, government-controlling transnational Big Agribusiness interests, large multinational chemical industries (marketing fertilizers, pesticides, herbicides, etc.) and corporate food industry efforts, our modern-day diet is largely based on sugar- and starch-based carbohydrate foods (refined and otherwise), for the very first time in all of human history. We are officially told we need them in order to be optimally healthy, even though a distinct lack of actual science exists to corroborate such an assertion.
Glucose, a fuel otherwise meant to be an auxiliary or supplemental form of kindling/rocket fuel for bursts of emergency anaerobic energy (with only very small amounts actually required for fueling our red blood cells), has turned into something it was never meant to be. Today people everywhere are relying on tidal waves of insulin to manage the unnatural, chronic blood sugar surges resulting from such diets, with decided consequences. Our already overburdened stress management system (i.e., stress hormones) has been chronically and unnaturally tasked in modern times with handling the subsequent insulin-induced plummets in blood sugar, leading to roller-coastering moods, energy, neurological stability and, yes, cognitive function. We have our modern-day carbohydrate-based diet to thank for the increase, not in our brain size, but in metabolic syndrome, diabetes, heart disease, and the progressive neurodegeneration characteristic of Alzheimer’s disease and other dementias.
It was never “protein” that was the source of our “initial accelerated expansion of brain size” in early humans/pre-humans (as the article contends), but instead the high amounts of accompanying dietary brain-building fat. The human brain is made up of roughly 60 to 80% fat by dry weight and relies upon the dietary fat and cholesterol (yes, eeevil cholesterol) we supply it with in order to maintain its structure and energy-intensive function. Carbohydrates, conversely, provide zero meaningful brain structure. Fat supplies more than twice the calories per gram that even the starchiest carbohydrates ever could, and (when one is well adapted to doing so) provides a steady, reliable, efficiently stored source of fuel for this critical organ, even in the absence of regular meals. Glucose dependence derived from the chronic consumption of sugars and starches, on the other hand, is a highly volatile and unreliable arrangement that must be constantly and vigilantly resupplied and managed to prevent loss of function. This, of course, is a highly profitable form of metabolic enslavement for the industries that produce and market this type of food. The added problem, unfortunately, is that glucose (and other sugars, particularly fructose) also generates a form of cumulative damage known as glycation over time, something the human brain is significantly susceptible to. Cumulative (non-enzymatically controlled) glycation and advanced glycation/glycosylation end-products (AGEs) are known to be responsible for premature aging, adverse metabolic changes and loss of tissue function in diabetes, in aging, and in their various complications.
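For readers who like the arithmetic spelled out, here is a minimal back-of-envelope sketch of the calories-per-gram claim. The energy densities used are the standard Atwater reference values from general nutrition science, not figures quoted in this commentary or the Science Daily article:

```python
# Standard Atwater energy densities (general nutrition reference values;
# illustrative, not taken from the commentary itself).
FAT_KCAL_PER_GRAM = 9.0
CARB_KCAL_PER_GRAM = 4.0

# Fat's caloric advantage per gram over even the starchiest carbohydrate:
ratio = FAT_KCAL_PER_GRAM / CARB_KCAL_PER_GRAM
print(f"Fat delivers {ratio:.2f}x the calories per gram of carbohydrate.")
```

At 9 kcal/g versus 4 kcal/g, fat comes out at 2.25 times the energy per gram, which is the basis for the “more than twice” figure above.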
Alzheimer’s disease (something rather close to home for me right now) is in fact being referred to today as “type III diabetes,” and recent studies are clearly showing pathophysiological changes in the brain regions affected by Alzheimer’s in those having higher blood sugar levels, even in those presenting with supposedly “normal,” acceptable, non-diabetic fasting blood sugar ranges.
If starch in any form were so healthy for the human brain, then certainly more would be better, and the human brain today (given the uniquely starch-based diet of our times) would be larger and better than ever, growing by leaps and bounds. But the opposite is actually true.
We humans have literally lost just over 10% of our brain volume over the last 10,000 years, since the development of agriculture, when cheap and easy-to-produce starch became a much more prevalent part of the human diet. One might attempt to read some manner of “improved brain efficiency” into this recently reduced brain size of ours, but this seems more a rationalization than a viably supportable hypothesis.
That humans have consumed starch as seasonally/climatically available since the universal adoption of fire as a food preparation tool (much more recent in our evolutionary history), particularly in Neolithic times, is not really a subject of much debate. No doubt our Neolithic predecessors spent many a night farting around the campfire during times when starchy tubers were plentiful. Since the advent of agriculture, however, we shifted from a diet comprised of close to 90% animal-source foods rich in brain-building fats to as little as 10%, in favor of starch, and this has yielded some rather obvious consequences (case in point: the popularity of tabloids and “reality TV”).
Nonetheless, no human dietary requirement for any form of carbohydrate has ever been established by science in any medical textbook or textbook of human physiology. Seems like quite the oversight for something that is supposedly responsible in some fundamental way as THE primary fuel for advanced human intelligence…
I’m not buying it.
Stable isotopic evidence from human bone collagen remains, representing vast periods of human history, shows an unmistakable (if not overwhelming) primary dependence upon animal-source foods during the rapid encephalization of the human brain roughly two million years ago (well before we would have had universal access to fire for cooking), and has never shown evidence for a starch-based diet in any truly meaningful way throughout ancient prehistory. As wild humans attempting to eke out our existence in a harsh and uncertain environment, we would certainly have eaten whatever we had or needed to in order to survive. The fact that our Paleolithic ancestors were able to consume starches (or anything else we could put in our mouths in order to survive) by no means provides conclusive evidence that these foods were in any way optimal for our physiological or brain health, much less that we were even able to make meaningful use of them (one reason the subtitle of my book, Primal Body, Primal Mind, reads, “Beyond the Paleo Diet for Total Health and a Longer Life”). In my book I was able to demonstrate how human longevity research actually lends better clues toward what is and isn’t optimal relative to Paleolithic principles. But I digress.
Intense and prolonged heat is required to transform raw starchy tubers, through a process called “gelatinization,” into anything remotely digestible by us. Once the starch was converted into a more digestible form through extensive cooking, the rest of the process would have required viable amylase genes and their active expression in order to actually process this dietary starch within the human body in any meaningful way with respect to energy. We didn’t even have the ability to cook with fire consistently until far more recently in our evolutionary past (not much more than an estimated 75,000–100,000 years ago). By that time our brains were already fully modern, if not even a bit larger than they are now. Non-gelatinized (a.k.a. “resistant”) raw starch-based foods may have provided significant fodder for certain types of gut microorganisms along the way (as does other fibrous plant material), but nowhere near the capacity typically exhibited by animals actually designed to eat a carbohydrate-based diet in the first place. Ours is a hydrochloric acid-based and not a fermentative digestive system, after all. And raw starch foods would have lacked any real nutritional (much less caloric) value for us, given their exceedingly poor digestibility.
Unlike our great ape relatives, we have a greatly expanded small intestine and a greatly shortened large intestine, consistent with a diet much higher in meat and fat. Whereas an ape’s brain utilizes perhaps 8% of its total metabolic energy requirements, the human brain demands a whopping 25% or more of our total daily energy needs. Dietary (and ample stored) fat is effortlessly poised to supply this need with more than double the efficiency of any carbohydrate-based food, and the human body’s fat reserves (in a healthy state of natural ketogenic adaptation) are far more readily able to maintain steady and ample fuel supplies than comparatively paltry glycogen stores could ever hope to. Our advanced brains would have required consistent availability of their structural building/maintenance materials and fuel supply, something fat would have been in an infinitely superior position to provide.
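To put the fat-versus-glycogen comparison in rough numbers, here is a hedged back-of-envelope sketch for a hypothetical 70 kg adult. The body-composition figures (15% body fat, ~0.5 kg of glycogen, a 2,000 kcal/day budget) are illustrative textbook-style assumptions of mine, not values stated in this commentary:

```python
# Illustrative assumptions (NOT from the commentary): a hypothetical 70 kg
# adult at ~15% body fat, roughly 0.5 kg of combined liver + muscle
# glycogen, and a ~2,000 kcal/day energy budget of which the brain
# claims ~25%.
FAT_STORES_G = 70 * 0.15 * 1000       # ~10,500 g of stored body fat
GLYCOGEN_STORES_G = 500               # liver + muscle glycogen, roughly
FAT_KCAL_PER_GRAM = 9.0
CARB_KCAL_PER_GRAM = 4.0
BRAIN_KCAL_PER_DAY = 0.25 * 2000      # ~500 kcal/day for the brain

fat_kcal = FAT_STORES_G * FAT_KCAL_PER_GRAM
glycogen_kcal = GLYCOGEN_STORES_G * CARB_KCAL_PER_GRAM

print(f"Fat reserves:      ~{fat_kcal:,.0f} kcal "
      f"(~{fat_kcal / BRAIN_KCAL_PER_DAY:.0f} days of brain fuel)")
print(f"Glycogen reserves: ~{glycogen_kcal:,.0f} kcal "
      f"(~{glycogen_kcal / BRAIN_KCAL_PER_DAY:.0f} days of brain fuel)")
```

Under these assumptions the fat reserve works out to tens of thousands of kilocalories (months of brain fuel), while total glycogen amounts to only a few days’ worth, which is the “comparatively paltry” contrast described above.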
The development of amylase genes in humans (along with the availability of starch-based foods) varied far more across global populations than basic human brain size or intelligence, yielding a rather poor correlation here. By far the greatest consistency throughout human Paleolithic history was a dependence, first and foremost, upon animal-source foods. And ample dietary animal fat (with its critical 20- and 22-carbon fatty acids, arachidonic acid/AA and docosahexaenoic acid/DHA) consumed with such foods, more than any other source of nourishment, would have provided the necessary substrate and energy to both construct and fuel the uniquely and voraciously demanding human brain.
The “expensive tissue hypothesis,” postulating the importance of increased dietary meat and fat in the rapid enlargement of the human brain during human evolution, is a well-established and well-accepted concept in the field of paleoanthropology and (for good reason) isn’t particularly controversial. We are unique among all primates in our advanced adaptation to high-meat, high-fat diets; and this, more than any other distinguishing characteristic, has contributed to what (up until about 10,000 years ago, anyway) was an unprecedented explosion of brain growth and unparalleled functional sophistication. Although high amounts of protein are ultimately not necessary for optimal health (moderation seems to be significantly more beneficial), fat-based diets, in the absence of high sugar/starch intake, demonstrate significantly superior cognitive and metabolic efficiency over carbohydrate-based diets, something not accounted for in the weak hypothesis proposed by the Science Daily article.
Innumerable corporate interests stand to profit handsomely by investing in the promotion of carbohydrate-based diets for every man, woman and child on planet Earth. Such foods are enormously cheap to produce, highly profitable, and they keep everyone perpetually hungry. Certainly Monsanto and the unscrupulous, profit-hungry multinational Food Industry it supplies have got to LOVE that. Ka-CHING! Carbohydrate-based diets are also increasingly recognized as responsible for the modern-day explosion of metabolic diseases, obesity, cancers, heart disease and chronic inflammatory conditions (nice for the profit-based Medical Industry and the Big Pharma that drives it). The chronic demand for endogenous insulin production, itself tightly tied to sugar and starch consumption, is directly correlated with premature aging and age-related decline, as well as many of the aforementioned metabolic diseases. Add to this the commonly addictive nature of dietary sugars/starches, and you have a recipe for officially sanctioned, ongoing carbohydrate fixation and cravings (regardless of abundant evidence to the contrary), embraced by many wanting to rationalize them as either benign or somehow beneficial. And those promoting carbs are far more likely to win popularity contests than those questioning their health effects. It’s simply the nature of the Beast.
Let’s just say that SOMEONE is clearly benefitting from all this marketing, rationalization and promotion, but it isn’t the vulnerable consumers or their brains.
The bottom line is this: we were exquisitely forged by Nature as Paleolithic hunter-gatherers to be primarily “fat-heads” and not “potato heads” or “grain brains.”
I’ll personally skip the potatoes, thank you very much.