Ötzi the Iceman: Agriculture Didn’t Do Him Any Favors

5,300 years ago, in what are now the Ötztal Alps between Austria and Italy, a man whom researchers would later name “Ötzi” lived and met an untimely, suspicious death. His frozen, mummified body was discovered in 1991 by a pair of German hikers, melting out of glacial ice at nearly 11,000 feet. Archaeologists considered this the discovery of the century: an intact, frozen, semi-prehistoric man, his body and possessions almost perfectly preserved. He was about 46 years of age and 5’2” (fairly average height for his day), and evidently a person of some higher status, judging by the quality of his possessions. His likeliest cause of death, forensic researchers determined, was actually murder…but his health was anything but stellar the day he died.

Even though he lived several thousand years ago, Ötzi was already quite clearly a Neolithic (agricultural-era) human, even if he also hunted for some of his food along the way. Let’s just say that agriculture didn’t do Ötzi any favors. For starters, he suffered severe tooth decay, and the grains that helped cause it were still stuck in his teeth when his body was discovered. Along with lousy oral health, he also showed signs of arteriosclerosis and advanced arthritis. He may be portrayed by the media as a prehistoric human, but his state of health was decidedly forged by the dietary pitfalls of an agricultural society. According to his stomach contents, bread was part of his last meal. Whatever hunting and gathering he may have done along the way, he clearly wasn’t “Paleo” in many of his dietary choices. He was basically modern enough to have suffered some of the earliest diseases of modern civilization. In short, Ötzi was an agriculturally influenced medical mess.

His shorter stature and ill health were actually typical of many post-agricultural societies. Early agriculturalists commonly suffered mineral-deficiency diseases such as rickets, stunted growth and bone mineral disorders. Osteoporosis, skeletal abnormalities, malnutrition, birth defects, dental malocclusion and degenerative disease also became much more commonplace once we adopted the widespread consumption of grains (yes, including whole grains). Cereal-based diets also commonly led to numerous other vitamin and mineral deficiencies not typically shared by our more Stone Age ancestors, such as pellagra, scurvy, beriberi, and vitamin A, iron and zinc deficiencies (the minerals contained in grains are actually poorly available to us, if at all, due to the presence of a substance known as phytic acid, something never mentioned by proponents of the government “food pyramids” or the commercial purveyors of supposedly “healthy whole grains”). According to the researchers’ analysis, Ötzi’s dental problems show the results of switching from a strictly hunter-gatherer diet to an agricultural one.

Grain consumption has been linked with allergies, food sensitivities, autoimmune disorders (now numbering in excess of 100 identified thus far, with 40 additional diseases thought to have an autoimmune component), numerous cancers, pancreatic disorders, mineral deficiencies, arthritis, cardiovascular disease, celiac disease, epilepsy, cerebellar ataxias, dementia, degenerative diseases of the brain and central nervous system, peripheral neuropathies of axonal or demyelinating types, and myopathies, as well as autism and schizophrenia, to name a few of our more common modern-day health issues.

Contrast this with the primary causes of death among Stone Age hunter-gatherers: accident and infection. That’s about it. As long as we survived those, we stood pretty good odds of making it to old age without the chronic degenerative diseases that plague the elderly (and not so elderly) today. Fast-forward to today: as we continue down this slippery slope of embracing the products and byproducts of monoculture agriculture and the ever-powerful food industry, new findings indicate that people today are being diagnosed with disease 33 percent (15 years) sooner than their grandparents were. Age 30 has become the new 45 where the onset of disease is concerned. This is all relatively new, and NOT because we suddenly began consuming more animal-based foods or fats (which our ancestors ate consistently for close to 3 million years)!

Certain popular books today encourage those of us who have been smart enough to shed non-essential and highly problematic grains from our diet to actually bring wheat back into our diets again. Really? Do we really want this (even if it tells some folks what they want to hear)? So much is wrong with this idea that it isn’t even funny, but I’ll save the rest for another article.

Even as an agricultural lifestyle allowed us to live in better protected, safer and more controlled environments, it wasn’t exactly our fast track to optimal health and longevity. Even as human populations exploded (ultimately leading to the planetary overpopulation issues we face today), life expectancy early in the agricultural revolution actually declined by half relative to our supposedly short-lived Paleolithic forebears. In fact, between the Neolithic Revolution and the late 18th century, human life expectancy never exceeded roughly 25 years! What ultimately improved longevity in more modern times sure as heck wasn’t our back-breaking, nutrition- and health-compromising agricultural lifestyle, but rather (apart from protection from the elements and predators) better technology, sanitation, and modern economic growth.

When researchers compared the health of farmers with that of hunter-gatherers, they found that the farmers suffered much higher rates of infection due to the concentration of stationary human populations in farming communities, much poorer nutrition due to increased cereal grain intake and reduced meat/fat intake, and problems with mineral absorption due to the effects of the cereal-based diet (Cohen, 1989). Neolithic farmers were shorter, had many more health issues and had a lower life expectancy than Mesolithic hunter-gatherers. We also became much more vulnerable to famine, not to mention the oppression and control (and the other delightful things) that come with concentrated population centers and opportunistic ruling-class hierarchies, not the least of which has been an unending and unprecedented pattern of full-scale wars.

Agriculture also served to shift our macronutrient content from a focus on dietary fat and protein to a diet based far more upon carbohydrate intake.  The impact on human health in this regard has not been trivial.  Only in modern times, for instance, have we humans developed an emergency need to lower blood sugar through the unnaturally chronic and metabolically dysregulating release of insulin.  We were never designed to tolerate being marinated in a caustic and inflammatory mix of glucose (and other sugars) and insulin 24/7.  This is a modern-day development and one we need to rethink if we are ever to reclaim our primal birthright as a species of foundational health.

Agriculture cost us as a species, big time. After spending more than 100,000 generations (99.9+% of our evolutionary history) on a mostly meat-and-fat diet, we’ve spent the last 500 generations or so (less than 0.4%) eating a diet that is demonstrably and increasingly unnatural to our species. Pre-agricultural Paleolithic hunter-gatherers derived most of their calories from roughly 100-200 different species of wild animals (meat and fat being as much as 90% of their dietary economy), and to a lesser degree from fibrous vegetables, greens, nuts and fruits.
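As a rough sanity check on those proportions (a sketch using the round figures commonly cited alongside these claims: roughly 10,000 years of agriculture against a roughly 2.6-million-year foraging history; these are assumed approximations, not measurements):

```python
# Rough arithmetic check of the "less than 0.4%" claim above.
# Both figures are the article's approximate round numbers, not data.
agricultural_years = 10_000      # since the Neolithic Revolution
foraging_years = 2_600_000       # approximate span of the genus Homo

fraction = agricultural_years / foraging_years
print(f"{fraction:.2%}")  # → 0.38%
```

So the agricultural era really does amount to well under half a percent of the timeline in question.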

Suddenly, a mere 17 or so species of plants, most of them entirely new to the human diet, provide 90 percent of the world’s food supply. That’s an insanely huge flip-flop, and it has NOT been a healthy one. The toll this has taken on our health, not to mention the health of the planet, is immeasurable and tragic.

The energy and nutrient value of the starch-based foods we have come to rely upon in modern times is considerably inferior to that of animal foods, even after cooking. Sugar-based foods in particular have led to an epidemic of metabolic disorders, including obesity, type 2 diabetes and a huge variety of sugar-hungry cancers.

And these foods aren’t just problematic in today’s world.  They have been problematic from the beginning of their widespread consumption. They were literally never the cornerstone of optimal health.

Grains and legumes, much less refined carbohydrates, weren’t necessarily all to blame for the poor health related to starch consumption in pre- (and post-) agricultural humans. An interesting study published in the Proceedings of the National Academy of Sciences in January 2014 revealed that starchy carbohydrate consumption led to ill health and dental disease in prehistoric hunter-gatherers from time to time, as well. In other words, we can’t blame it all on farming. The research, by a team from Oxford University, the Natural History Museum, London, and the National Institute of Archaeological Sciences and Heritage (INSAP) in Morocco, showed that widespread tooth decay occurred in a hunter-gatherer society in Morocco several thousand years before the dawn of agriculture. The team analyzed 52 sets of adult teeth from hunter-gatherer skeletons found at Taforalt in Morocco, dating to between 15,000 and 13,700 years ago. Unexpectedly, they found evidence of decay and abscesses in more than half of the surviving teeth, with only three skeletons showing no signs of cavities. Excavations revealed evidence of the systematic harvesting and processing of wild foods, including sweet acorns, pine nuts and land snails. The guilty food? Sweet (starchy) acorns, apparently. In other words, even naturally occurring starch in its whole-food, wild state wasn’t an optimal food for our ancestors, either.

Dental abnormalities such as malocclusion and dental crowding (attractive as they are) first became common among the world’s earliest farmers some 12,000 years ago in Southwest Asia, according to findings published February 4, 2015 in the journal PLOS ONE. Pre-agricultural Paleolithic hunter-gatherers, on the other hand, show almost no malocclusion or dental crowding in any research to date. In a Science Daily article highlighting the PLOS ONE study, the researchers behind it explained: “Our analysis shows that the lower jaws of the world’s earliest farmers in the Levant are not simply smaller versions of those of the predecessor hunter-gatherers, but that the lower jaw underwent a complex series of shape changes commensurate with the transition to agriculture,” says Professor Ron Pinhasi of the School of Archaeology and Earth Institute, University College Dublin, the lead author on the study.

“Our findings show that the hunter gatherer populations have an almost “perfect harmony” between their lower jaws and teeth,” he explains. “But this harmony begins to fade when you examine the lower jaws and teeth of the earliest farmers.”

In the case of hunter-gatherers, the scientists from University College Dublin, Israel Antiquity Authority, and the State University of New York, Buffalo, found a correlation between inter-individual jawbones and dental distances, suggesting an almost “perfect” state of equilibrium between the two. While in the case of semi-sedentary hunter-gatherers and farming groups, they found no such correlation, suggesting that the harmony between the teeth and the jawbone was disrupted with the shift towards agricultural practices and sedentism in the region. This, the international team of scientists say, may be linked to the dietary changes among the different populations.

The diet of the hunter-gatherer was based on “hard” foods like wild uncooked vegetables and meat, while the staple diet of the sedentary farmer was based on “soft” cooked or processed foods like cereals and legumes. Soft cooked foods demand less chewing, which in turn lessens the size of the jaw; without a corresponding reduction in the dimensions of the teeth, there is not adequate space in the jaw, and this often results in malocclusion and dental crowding.

The link between chewing, diet, and related dental wear patterns is well known in the scientific literature. Today, malocclusion and dental crowding affect around one in five people in modern-world populations. The condition has been described as the “malady of civilization.”

In short, we humans were fundamentally designed by our evolutionary selective pressures and earliest dietary preferences to be fat-heads, not potato-heads (or acorn-heads), much less grain-brains.  

Ketones, the basic energy units of fat, along with free fatty acids, are an extremely abundant, reliable, stable and steady source of fuel, even in the absence of regular meals. Furthermore, brain cells (and delicate brain tissue) are more vulnerable than just about any other tissue to the ravages of glycation and the oxidative, free-radical activity that glucose and glycation attract. The brain actually prefers ketones (or, more specifically, beta-hydroxybutyrate, or BOHB) to glucose as its primary source of fuel, both from the standpoint of being non-damaging and as a “superfuel,” yielding even more ATP than either glucose or free fatty acids. Interestingly, acetoacetate (another form of ketone body) is the preferred source of energy for our heart muscle and renal cortex, but your brain can also use it if it needs to, and it adds needed balance to BOHB in those who may be seizure-prone. Nature was beyond wise to design us this way, but unfortunately most people today fail to take advantage of this incredibly elegant and efficient design, instead diverting away from the energy we were mostly meant to rely upon and forcing an unnatural and enslaving dependence upon carbohydrates as a primary source of fuel.

Think about what we’ve been mostly eating for the last 10,000 years while our brains have been progressively shrinking and our collective health increasingly deteriorating…

The most obvious and far-reaching dietary change over the last 10,000 years is, of course, the enormous drop in consumption of high-energy, fat-rich foods of animal origin, which formed easily 90% of our diet in Paleolithic times but as little as 10% today, coupled with a large rise in consumption of less energy-dense grains. This switch, this literal flip-flop to the carbohydrate-based diet embraced by the corporately controlled USDA Food Pyramid (and the nearly identical guidelines offered in Europe/UK and Australia/New Zealand), is far from optimal and far from benign in its effects.

In fact, there has more recently been a long-overdue, detailed analysis of data from the US National Health and Nutrition Examination Survey (NHANES), which has been measuring dietary trends, caloric consumption and the body heights and weights of Americans for nearly 50 years. What it revealed is both shocking and telling:

It turns out that Americans HAVE been following the rules and taking the mainstream government guidelines to heart in their dietary choices, all the while the health of the public has been deteriorating. And even more shockingly, it seems the more obediently we have followed the rules, the sicker and more obese we have become as a society. Trading animal-source foods for carbs, and animal fats for vegetable oils, has effectively pushed the collective health of Americans (and of others following similar guidelines the world over) to the brink of collapse.

It’s well past time we take the USDA Food Pyramid and turn it upside down.  I think even Ötzi would agree.


(Check out my new weekly online program: Primal Power 52)


1 Rollo, F., et al. “Ötzi’s last meals: DNA analysis of the intestinal content of the Neolithic glacier mummy from the Alps.” PNAS, October 1, 2002, vol. 99.

2 L Hallberg, et al. “Phytates and the inhibitory effect of bran on iron absorption in man.” Am J Clin Nutr. 1987; 45(5): 988.

3 Seiler R, Spielman AI, Zink A, Rühli, F. “Oral pathologies of the Neolithic Iceman, c. 3,300 BC”. European Journal of Oral Sciences (2013) 1-5

4 http://www.livescience.com/28608-otzi-iceman-had-bad-teeth.html

5  Cordain, L. Cereal grains: humanity’s double-edged sword.  World Rev Nutr Diet. 1999; 84:19-73.

6 Hulsegge G, Susan H, Picavet J , et al. Today’s adult generations are less healthy than their predecessors: generation shifts in metabolic risk factors: the Doetinchem Cohort Study. European Journal of Preventive Cardiology. Published online April 10 2013  (http://cpr.sagepub.com/content/early/2013/04/08/2047487313485512.abstract)

7 Angel JL. Health as a factor in the changes from hunting to developed farming in the eastern Mediterranean. In: Cohen MN, Armelagos GJ, editors. Paleopathology at the origins of agriculture. New York: Academic Press, 1984:51–73

8 Fogel RW. The Fourth Great Awakening and the Future of Egalitarianism. Chicago: Univ. Chicago Press, 2000:48, 137–75; and Kuznets S. Modern Economic Growth: Rate, Structure, and Spread. New Haven: Yale Univ. Press, 1966:8–16.

9 Acsádi, G.Y., and J. Nemeskéri (1970). History of Human Life Span and Mortality (Akadémiai Kiadó, Budapest).

10 Cohen, M. N. (1989). Health and the Rise of Civilization (Yale Univ. Press, New Haven).

11 Diamond, J. (1997). Guns, Germs, and Steel: The Fates of Human Societies (Norton, New York).

12 Weisdorf, J. L. (2006). “From Foraging to Farming: Explaining the Neolithic Revolution.” Journal of Economic Surveys, 19, 561-586.

13 Humphrey LT, De Groote I, Morales J, Barton N, Collcutt S, Bronk Ramsey C, Bouzouggar A. “Earliest evidence for caries and exploitation of starchy plant foods in Pleistocene hunter-gatherers from Morocco.” Proceedings of the National Academy of Sciences, 2014; DOI: 10.1073/pnas.1318176111.

14 Pinhasi R, Eshed V, von Cramon-Taubadel N.  “Incongruity between Affinity Patterns Based on Mandibular and Lower Dental Dimensions following the Transition to Agriculture in the Near East, Anatolia and Europe.” PLOS ONE, 2015; 10 (2): e0117301 DOI: 10.1371/journal.pone.0117301

15 “Malocclusion and dental crowding arose 12,000 years ago with earliest farmers.” Science Daily. February 4, 2015.

16 Cahill GF Jr, Veech RL. “Ketoacids? Good medicine?” Trans Am Clin Climatol Assoc. 2003;114:149-61; discussion 162-3

17 John L. Tymoczko, Lubert Stryer and Jeremy Berg. Biochemistry, 7th edition. W. H. Freeman (December 24, 2010), Chapter 22 (Fatty Acid Metabolism).

18  Eaton, S Boyd, Eaton, Stanley B III. Evolution, diet and health. Presented in association with the scientific session, Origins and Evolution of Human Diet. 14th International Congress of Anthropological and Ethnological Sciences, Williamsburg, Virginia, 1998.

19 Cohen E, Cragg M, deFonseka J, et al. “Statistical review of US macronutrient consumption data, 1965-2011: Americans have been following dietary guidelines, coincident with the rise in obesity.” Nutrition. 2015; 31: 727-732.

"Cereal Killer": Be On the Lookout For Gluten


Gluten (from the Latin word for “glue”) is a substance found in numerous grains. It is actually a complex of proteins and lectins present in most known grains, including even rice; however, the form of gluten associated with the most adverse health issues is found particularly in wheat (durum, semolina, spelt, triticale, kamut), rye, and barley. This is the form of gluten I will be referring to in this article.


This large, complex and impossible-to-digest protein has been associated with literally hundreds of health issues, though more misinformation and misunderstanding about it exists in medicine, and even in the field of nutrition, than up-to-date, accurate information. Part of the reason is that the field of immunology, where the newest information clarifying the effects of gluten mostly arises, is made up not of medical doctors but of PhD researchers. There is a real disconnect between medicine and immunology, and there really is no such thing as a “medical immunologist.” As a result, the exponentially growing numbers of patients filling hospital beds with some form of gluten-related issue are seldom addressed on this foundational level. They are treated for their symptoms, while their dietary habits are largely ignored and treated as insignificant.


It turns out this is VERY significant.


For the record, wheat gluten contamination is typically present in oats, soy flour and buckwheat flour, too, due mainly to modern processing and storage methods. Small amounts of wheat gluten contamination are also typically present in processed corn products and corn starch unless otherwise labeled.  All grains and processed foods should be considered suspect. 


What is called “gluten” is actually made up of hundreds of peptides. The only protein typically screened for by doctors is gliadin, which itself consists of the subtypes alpha-, beta-, gamma- and omega-gliadin. All are potentially immunoreactive, but the only fraction of gliadin currently tested for in most serologic or salivary testing is alpha-gliadin, due to its close association with celiac disease, which leaves considerable margin for error in the form of false negatives. Interestingly, and also tragically, celiac disease (which until very recently was the only form of gluten sensitivity conventionally recognized) comprises no more than about 12% of those suffering from the effects of gluten sensitivity. Celiac is literally but the tip of a much larger iceberg. If you happen to be sensitive to a fraction of gliadin (or another commonly reactive epitope of gluten) other than alpha-gliadin, you will likely test negative for “gluten sensitivity.” This is deeply problematic. And only about 40% of celiac patients actually even test positive for alpha-gliadin. A LOT ends up falling through the cracks with standard medical testing. In fact, more falls through the cracks than not.




Gliadin in some form exists in most grains. Wheat, durum, spelt, triticale, barley and rye belong to the family of grains having the most pronounced antigenic effects on those sensitive to gluten, though all grains (including rice) contain some form of gluten. The gluten in these other grains may or may not be significantly problematic, though a general avoidance of dietary grains, for numerous reasons (outlined in detail in my book, Primal Body, Primal Mind), is probably a good idea.



Used in baking, gluten gives bread dough its elasticity and baked goods their fluffiness and chewiness. It is also used as an additive and stabilizing agent in innumerable processed foods and personal care products. Insanely, gluten is nearly everywhere. Laws do not require its labeling on all products, so consumers are left to judge for themselves whether gluten may be an additive or not. I, personally, don’t trust any product that isn’t clearly labeled “100% gluten free.”


For us humans, who have spent nearly all of the last 2.6 million years as hunter-gatherers, gluten (and its closely related compounds) is a very new inclusion in the diet, and it is essentially impossible for us to digest. In fact, according to respected celiac expert and researcher Dr. Alessio Fasano, NO human can actually digest gluten. This, to me, effectively takes it out of the food category and into the category of a contaminant. Researchers at GreenMedInfo.com have identified fully 300 health issues associated with gluten. To say that gluten can add complications to your health is putting things mildly. Problems with gluten are becoming literally epidemic, and although public awareness of the issue is certainly growing, more about it remains poorly understood than not. The consequences of gluten sensitivity (diagnosed or undiagnosed) can literally be lethal, and often are, particularly in tandem with other vulnerabilities. You may not think you’ve heard about this, but the fact is that you hear about it every day. It just goes under different names: cancer, heart disease, autoimmunity. The consequences of gluten exposure are potentially very real.


Although gluten is commonly associated with celiac disease, many do not appreciate its potentially incredible impact on the health of countless individuals, or how commonly people may be afflicted with non-celiac “gluten sensitivity.” In fact, gluten may well be at the silent root of a great many of the health challenges millions of people face today, both physical and mental. Its inflammatory and immune-compromising effects can be a dangerous catalyst for many things, yet it is rarely even suspected as an underlying culprit. Furthermore, the inherent presence in grains of what are called exorphins (morphine-like compounds) makes gluten-containing grains quite addictive for large numbers of people and leaves many in frank denial of the havoc gluten can wreak.


Allow me to elaborate:


A 2009 study in the Journal of the American Medical Association (JAMA Sept 16; 302(11):1171-8) found that those with celiac disease and/or gluten sensitivity, whether diagnosed or undiagnosed, had a significantly higher risk of death, particularly from heart disease and cancer. It is currently estimated (very conservatively) that one in every 100 people suffers from celiac disease, a devastating consequence of gluten-containing grain consumption; some have more recently hypothesized that this number may be closer to one in 30. Non-celiac gluten sensitivity is itself also an autoimmune condition and is considerably more common; in fact, it is currently nearly epidemic in its scope. The effects of, and the markedly increased mortality risks associated with, full-blown celiac disease and gluten sensitivity happen to be virtually identical. Both are autoimmune conditions that create inflammation and immune-system effects throughout the body. They can affect all organ systems (including your brain, heart and kidneys), your nervous system, your mood, your cognitive functioning, your immunological functioning, your digestive system and even your musculoskeletal system: almost literally everything from your hair follicles down to your toenails, and everything in between.




Exposure to gluten in a sensitive individual essentially shuts down blood flow to the prefrontal cortex, the part of our brains that allows us to focus, manage emotional states, plan, organize and exercise short-term memory. The prefrontal cortex is the brain’s “executive function” control center and is the part of the brain that basically makes us most human. The inflammatory response evoked by gluten exposure additionally activates the brain’s microglial cells, which have no built-in inhibitory mechanisms and do not readily wind down again on their own; in some individuals this destructive inflammatory cascade can literally take months, years or potentially even decades to abate. Additionally, these periods of cerebral hypoperfusion followed by reperfusion can be quite damaging (much the way heart muscle cells typically die following reperfusion after the ischemia of a heart attack). The damage and neurodegeneration this can cause over time, together with sympathetic (“fight or flight”) nervous system over-arousal, can be significant.


In routine blood tests, several functional patterns can raise suspicion, especially when found in combination with one another: chronic states of anemia (serum iron below 85 ug/dL, or especially ferritin below 40, plus hemoglobin below 13.5 in women or 14 in men); functionally depressed or elevated serum protein (below 6.9 or above 7.4 g/dL); unusually depressed triglycerides (below 75 mg/dL, especially where carbs play a significant dietary role); elevated CRP; significantly “elevated,” or depressed (below 150 mg/dL), serum cholesterol; alkaline phosphatase significantly below 70 U/L; functionally depressed BUN (below 13 mg/dL); abnormally high HDL (in excess of 75 mg/dL); and chronically (even functionally) elevated SGOT/SGPT liver enzymes, among other chronic inflammatory and malabsorptive markers. None of these is diagnostic here, but together they can be cause for suspicion and further investigation. It takes further testing to be sure, though even some of the best testing methods vary greatly in their accuracy.
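The threshold logic above can be sketched in code, purely to keep the numbers straight. This is an illustrative sketch using the article’s quoted functional cutoffs, with marker names I’ve invented for the example; it is not clinical software or medical advice.

```python
# Illustrative sketch: flag the "functional" lab patterns described above.
# Thresholds are the article's quoted figures; marker names are made up
# for this example. Not a diagnostic tool.

def flag_markers(labs):
    """Return names of markers falling in the article's suspicion ranges.

    `labs` maps marker name -> numeric value; missing markers are skipped.
    """
    rules = {
        "serum_iron_ug_dL":     lambda v: v < 85,            # functional anemia
        "ferritin_ng_mL":       lambda v: v < 40,
        "serum_protein_g_dL":   lambda v: v < 6.9 or v > 7.4,
        "triglycerides_mg_dL":  lambda v: v < 75,            # unusually depressed
        "cholesterol_mg_dL":    lambda v: v < 150,           # depressed
        "alk_phos_U_L":         lambda v: v < 70,
        "BUN_mg_dL":            lambda v: v < 13,
        "HDL_mg_dL":            lambda v: v > 75,            # abnormally high
    }
    return [name for name, rule in rules.items()
            if name in labs and rule(labs[name])]

flags = flag_markers({"ferritin_ng_mL": 22,
                      "triglycerides_mg_dL": 60,
                      "HDL_mg_dL": 55})
print(flags)  # → ['ferritin_ng_mL', 'triglycerides_mg_dL']
```

As the article stresses, a flag here is only grounds for further investigation, never a diagnosis.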


Gluten can also be looked upon as a bit of a “gateway” food sensitivity. It is known to increase levels of a protein in the body known as zonulin, which controls intestinal (and also blood-brain barrier) permeability. Elevated zonulin levels in the presence of gluten can allow other types of undigested proteins to slip past what would otherwise be more selectively permeable barriers and cause additional immunological reactions to other foods. Casein (milk protein) is the most common co-sensitivity and cross-reactive compound with gluten, but the immune system can come to react to almost anything if gluten consumption persists. This can be a very real problem. Once multiple food sensitivities take over, it can amount to a vicious cycle that only worsens with time and becomes extremely difficult to correct. Living with this can be miserable at best.


A study published in 2009 in the peer-reviewed journal Gastroenterology (July; 137(1):88-93) compared 10,000 blood samples from individuals 50 years ago with samples from 10,000 people today and found that there has been a 400% increase in the incidence of full-blown celiac disease (defined by conventional medicine as total villous atrophy of the small intestine)! There are numerous potential reasons for this. Part of it has to do with how heavily wheat has been hybridized: each time wheat is hybridized to improve or modify some characteristic for a specific growing region or other desirable trait, roughly 5% of its proteins are new, proteins that are inherently foreign to us. The deamidation of wheat to broaden its use in processed foods has significantly increased its immunologic reactivity potential. There is also more selection for gluten content in grains by Big Agribusiness. Today wheat is also beginning to be genetically modified, which will unquestionably compound the problem. Furthermore, I believe we are also looking at an increasingly weakened human genome through multi-generational exposure to processed foods, depleted soils, environmental contaminants, pesticides, increasing heavy metal contamination, fluoride, radiation exposure, EMF, and on and on. According to a 2010 study in the Annals of Medicine, “The prevalence of Celiac Disease has increased five-fold overall since 1974. This increase was not due to increased sensitivity of testing, but rather due to an increasing number of subjects that lost the immunological tolerance to gluten in their adulthood.” (Ann Med. 2010 Oct;42(7):530-8)

According to the Journal of Gastroenterology, fully 30-50% of all people carry a gene for celiac disease (known as HLA-DQ2 or HLA-DQ8), and eight times more people with celiac disease have no GI symptoms whatsoever. Gluten sensitivity genes are significantly more common still (HLA-DQB1, alleles 1 and/or 2).
Although oats, soy flour, buckwheat flour and processed corn products are technically not part of the gliadin-containing family of grains, modern methods of processing and storage nearly always ensure gluten contamination of these products, and the presence of actual gluten should always be assumed unless a product is labeled “100% gluten free.” The prolamin content of oats (avenin), however, still makes them at least potentially suspect for inherent sensitivity issues, as does that of corn (zein).


Easily 99% of those who suffer from this entirely curable and potentially devastating condition do so completely unaware of the dangerous vulnerability within themselves.  Although a biopsy of the small intestine is commonly used to diagnose (intestinally-based) celiac disease, fully eight out of ten celiac sufferers exhibit no intestinal or GI symptoms at all. In fact, an article in the journal Neurology (Vol 56/No.3 Feb 13, 2005) states that “gluten sensitivity can be primarily and at times exclusively a neurological disease,” affecting not only the brain and nervous system directly, but also cognitive function and psychiatric health.  In the Journal of Neurology, Neurosurgery and Psychiatry (1997;63:770-775) an article states: “Our finding…implies that immune response triggered by sensitivity to gluten may find expression in organs other than the gut; and the central and peripheral nervous systems are particularly susceptible.”


A 2002 review paper in the New England Journal of Medicine (Jan 17; 346(3):180-188) found that fully 55 diseases were known (at that time) to be potentially caused by gluten.  These include heart disease, cancer, nearly all autoimmune diseases, osteoporosis, irritable bowel syndrome, and many common psychiatric and neurological illnesses, including anxiety issues, ADD, bipolar disorder, depression, dementia, schizophrenia, Hashimoto’s (autoimmune thyroid disorder), migraines, epilepsy, Parkinson’s, ALS, neuropathies (even with a normal EMG), and most other degenerative neurological disorders…as well as autism, which is technically an autoimmune brain disorder.  In my opinion, it is always safest to assume the presence of gluten sensitivity in these populations, or frankly wherever significantly compromised health, mood or brain function is an issue.


Testing for gluten sensitivity


Although there are numerous methods for assessing gluten sensitivity and/or celiac disease, most are unfortunately somewhat unreliable at best (including the so-called “gold standard” of intestinal biopsy), which may be partly why so few sufferers are properly diagnosed even when testing is sought out.  With respect to blood and salivary testing: out of potentially hundreds of different sub-fractions of gliadin, typically only one (alpha-gliadin) is ever tested for.  If you happen to be sensitive to any of the other forms of gliadin (or to other compounds in gluten), the test may never show it.  False negatives are a notorious problem with this type of testing; accuracy where negative results are concerned is never 100%, and in fact most standard serologic or salivary testing is no more than about 30% accurate. Immunoglobulin testing for food sensitivities in those with autoimmune disorders, and particularly Hashimoto’s, is almost always skewed by chronic imbalances of TH-1 (T-cell) and TH-2 (B-cell) immune response.  It is critical to look at multiple markers (although the overwhelming, nearly 100% association between gluten sensitivity and Hashimoto’s and many other autoimmune disorders makes the automatic assumption of gluten sensitivity a good idea).  The most important markers to run in standard serologic testing are: IgA anti-gliadin and anti-endomysial antibodies; IgG anti-gliadin antibodies; IgM antibodies; transglutaminase-2 antibodies (IgA and IgG), which are most associated with villous atrophy of the small intestine; transglutaminase-3 antibodies (IgA and IgG), associated with the epidermal effects of gluten; transglutaminase-6 antibodies (IgA and IgG), associated with the neurodegenerative effects of gluten; gluten antibodies; and total IgA antibodies.  If possible, it may also be helpful to test for the genes HLA-DQ2 and HLA-DQ8, as well as HLA-DQB1, alleles 1 and 2.    WHEW!


Cyrex Labs

In lieu of ordering all these individual tests (with their highly variable sensitivity and accuracy), I’ve found that by far the most accurate assessment may be made by simply using the Array 3 panel offered by Cyrex Labs (www.CyrexLabs.com). Cyrex offers by far the most advanced, comprehensive, sensitive and accurate testing of any lab in the world right now (and likely for some time to come) and is setting a new gold standard for this.  They look at fully 9 different epitopes of gluten (that’s 8 more than anyone else looks for), they test to within one to two standard deviations of sensitivity (an accuracy otherwise unheard of), and they look at IgA, IgG and, in some tests, even IgM immunoglobulin reactivity. You will need to order this testing through your licensed health care provider.  They also offer accurate testing for the presence of gut barrier compromise (what some refer to as “leaky gut”), including the specific nature of that compromise, which makes a big difference in treating it, as well as accurate testing for other major common food sensitivities and gluten cross-reactivities. They also have a test array screening for potentially inappropriate antibodies to 24 different tissue complexes, to help you identify and address developing autoimmune processes sometimes decades before one might otherwise obtain a diagnosis (and by then it is usually too late).

To quote the site itself: “Cyrex™ is an advanced clinical laboratory developing and offering cutting-edge tests based on the latest scientific advances in the field of immunology. These tests cover mucosal, cellular, and humoral immunology and specialize in antibody arrays for complex thyroid, gluten, and other food-associated autoimmunities and related neurodysregulation.”  Make no mistake about it, Cyrex Labs WILL revolutionize the entire field of immunology.  And no, I have no financial stake in this company.  They are really just that good, and that unique.


Elimination diets can at times be an effective means of determining the potential for gluten sensitivity, but they must be strictly adhered to for no less than 2-3 weeks, and ideally at least 6-8 months, to make a genuinely clear determination.  Any exposure of any kind (even a seemingly innocuous, unintentional slip-up) means you must start the elimination period over.  Without a real lab result to motivate you, disciplined adherence to this type of self-testing can be difficult.  Avoidance of gluten must be 100%, from all sources (even hidden ones), with not so much as a single crumb of bread or trace contamination.  Also beware of cross-contamination, where non-gluten foods come into contact with gluten-containing foods via cooking or preparation surfaces and utensils in restaurants or at home (yes, all this matters).  The inflammatory effects of even trace gluten exposure, in the brain especially and throughout the rest of the body, can reverberate for several weeks, months or more in sensitive individuals.


This is decidedly an issue that needs to be taken extremely seriously.


The journal Gastroenterology (2009;137:88-93) states: “During a 45 year follow up, undiagnosed celiac disease was associated with a nearly 4-fold increased risk of death. The prevalence of undiagnosed CD seems to have increased dramatically in the United States during the last 50 years.”  In an individual with either full-blown celiac disease or gluten sensitivity, the risk of death from all causes, according to the journal Lancet (Vol 358, August 4, 2001), was dramatically greater: “Death was most significantly affected by diagnostic delay, pattern of presentation, and adherence to the gluten free diet…Non adherence to the gluten free diet, defined as eating gluten once-per-month, increased the relative risk of death 600%.”  The next time you want to rationalize that “one little piece of bread,” think twice.  Is it really worth playing Russian roulette?
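For readers unfamiliar with how “relative risk” figures like the ones quoted above are computed, here is a minimal sketch. The function is a standard epidemiological ratio; the patient counts below are invented purely for illustration and are NOT taken from the Lancet or Gastroenterology studies cited.

```python
# Hypothetical illustration of relative risk. All numbers are invented
# for clarity; none come from the studies cited in this article.

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Ratio of the event rate in the exposed group (e.g. non-adherent
    to a gluten-free diet) to the rate in the unexposed group (e.g.
    strictly adherent)."""
    rate_exposed = events_exposed / n_exposed
    rate_unexposed = events_unexposed / n_unexposed
    return rate_exposed / rate_unexposed

# Suppose 12 deaths per 1,000 non-adherent patients versus 2 deaths
# per 1,000 strictly adherent patients over the same follow-up period:
print(relative_risk(12, 1000, 2, 1000))  # 6.0, i.e. a six-fold risk
```

A relative risk of 6.0 means the exposed group died at six times the rate of the unexposed group over the follow-up period, which is the kind of multiplier the quoted studies are describing.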


Being “mostly gluten free” or allowing yourself to indulge in gluten-containing foods “only occasionally” just doesn’t cut it. In the case of diagnosed or undiagnosed gluten sensitivity or celiac disease, the popular mantra of “all things in moderation” can literally cost you your health, maybe even your life.

Brain and mood disorders, migraines, osteoporosis, type 1 diabetes, cardiovascular diseases, bowel diseases, autoimmune diseases, inflammatory disorders and cancer are rampant. Grains are rarely suspected as an original culprit, though every one of these disorders, among many more, can potentially be traced to often-insidious gluten exposure. Gluten sensitivity is only rarely obvious to the afflicted, and many are even entirely surprised to learn they have this sensitivity.  I know I was.


That said, one doesn’t even need to have an immunoreactivity to gluten for it to be damaging.  All gluten consumption at least temporarily compromises gut barrier (and likely blood-brain barrier) integrity and can open the gates to all kinds of immunoreactivity and inflammation/neuroinflammation. Wheat germ agglutinin (WGA), which is richest in supposedly “healthier” sprouted grains, can pass right through your blood-brain barrier, attach itself to the myelin on your neurons and other nerve cells, and inhibit nerve growth factor.  (Myelin is the protective sheath on nerves, and nerve growth factor is important for the growth and maintenance of your neurons.)  So even in the absence of immunoreactivity to gluten, wheat can cause brain damage!


Not much more than 1% of all people suffering gluten sensitivity or celiac disease are ever diagnosed, by the way.


The good news is that the devastating symptoms of gluten sensitivity and celiac disease are at times entirely reversible.  The treatment?  You MUST eliminate 100% (not just “most”) of the gluten from your diet, including not just gluten-containing dietary grains but all hidden sources as well, which can include (but are not limited to) commercial soups and broths, processed food mixes, soy, teriyaki and other sauces, corn products and corn starch, and salad dressings. Even buckwheat and soy flours are commonly contaminated with highly significant amounts of gluten due to modern processing and storage methods.  Gluten can be cryptically listed on food labels as vegetable protein, seitan, hydrolyzed vegetable protein, modified food starch and others. Gluten is even an ingredient in many shampoos, cosmetics and lipsticks (and can potentially be absorbed transdermally, through the skin), children’s Play-Doh, medications and vitamins (unless specifically labeled “gluten free”), and even non-self-adhesive stamps and envelopes.


Although I realize all this need for ultra-strict avoidance sounds rather tedious and extreme, an article in the Journal of Neurology, Neurosurgery and Psychiatry (1997; 63; 770-775) states clearly: “Even minute traces of gliadin (gluten) are capable of triggering a state of heightened immunological activity in gluten sensitive people”, meaning prolonged inflammation and other symptoms.  Saying you’ve eliminated “most” gluten from your diet is a bit like saying you’re just “a little bit pregnant”.  Either you are or you’re not.  There are NO in-betweens.  Avoidance must be strict…and total.


Many people will claim they have been adhering to a strict gluten-free diet when, in fact, they have only been avoiding the obvious sources and haven’t been paying enough attention to potentially hidden sources, including their personal care products.  They eventually attribute their lack of positive health results to the idea that they weren’t gluten sensitive after all, and they simply go back to eating whatever they want.  This is a HUGE mistake!


Even where adherence to a genuinely gluten-free diet doesn’t seem to generate the expected turnaround in health and well-being, you have at least removed one very major hurdle to improvement.  There can always be other hurdles yet to conquer, not the least of which is identifying and addressing potential, as-yet-undiagnosed cross-reactivities (Cyrex Labs’ Array 4 panel can help identify these).  In a peer-reviewed article in Current Opinion in Allergy and Clinical Immunology (2008 Feb;8(1):82-6), the authors stated that “the phenomenon of immunologic cross-reactivity occurs when an adaptive immune response to one antigen results in reactivity to structurally related antigens.”


Currently identified cross-reactive compounds with gluten/wheat include:

- Rye, barley, spelt, Polish wheat (also known as “Kamut”)

- Cow’s milk

- Alpha-casein and beta-casein (milk proteins)

- Casomorphin (a peptide created during the digestion of milk protein that produces an opioid effect in the nervous system)

- Milk butyrophilin (a protein in milk fat)

- Whey protein

- Milk chocolate

- Baker’s yeast

- Prepackaged, pre-ground and instant coffee

This isn’t to say that everyone with an immunoreactivity to gluten will necessarily have these cross-reactivities, but some certainly will be reactive to one or more of them.  In that case the immune system sees the cross-reactive compound as indistinguishable from gluten and reacts accordingly, potentially perpetuating symptoms even on an otherwise gluten-free diet!  Obviously, it’s important information to have.  Cyrex offers this testing in their Array 4 panel.


Gluten is certainly not the only modern substance challenging the health of the masses, but it is the most common, the most potentially damaging and the one most likely to facilitate all other food and chemical sensitivities.  Restoring health can be like peeling back the layers of an onion.  It is a process.  Still, often enough, by simply removing this one key dietary antigen the turnaround in some people can seem nothing short of miraculous.  It can also make a massive difference with seemingly more benign issues such as resistant weight loss.


Wait just a minute, back up—did you just say “personal care products”?

Crazy sounding, but true.  You need to examine your shampoos, conditioners and other hair care and skin care products for the presence of wheat protein, sometimes also listed as “hydrolyzed vegetable protein”.  Look for corn-related additives, also.


While you’re at it (digressing a bit here), you might also want to consider avoiding toxic additives like parabens, phthalates, artificial fragrances, sodium lauryl sulfate, methylisothiazolinone (MIT), and petroleum derivatives like mineral oil, toluene, petrolatum and paraffin (slightly off-topic, but extremely noteworthy, nonetheless).  Note that the FDA does nothing to ensure the safety of any chemical used in personal care products, so you’re left to trust the manufacturer.  Ummmm, yeah (er, NOT).  Even the FDA states: “Cosmetic products and ingredients are not subject to FDA premarket approval authority, with the exception of color additives … Cosmetic firms are responsible for substantiating the safety of their products and ingredients before marketing.”  Of the roughly 126 chemicals consumers regularly apply to their skin, 90% have never been tested for safety.  Most people think nothing of the products they apply to their hair or skin, and the cosmetics industry readily capitalizes on this ignorance, at tremendous potential cost to your health and for considerable profit.


Why is this important?  I mean, we’re just talking about skin, right?  It’s not like you’re drinking the stuff…

In fact, it’s probably worse. Keep in mind that your skin is your largest organ and that it is exceedingly thin (less than 1/10th of an inch) and permeable.  If you were to eat or drink these products, several things would come into play to help protect you from direct bloodstream exposure: your gut lining, hydrochloric acid, enzymes, and so on.  In a hot shower, however, with your pores open wide and your skin bare from head to toe, there is very little between you and direct absorption of anything you apply to your scalp and skin, right into your bloodstream, where it is free to travel throughout your body to your brain and all your other organs.  These compounds may even be inhaled with the shower’s steam.  The concern here is very real, especially if you happen to be symptomatically vulnerable.  When you’re reading hair and skin care labels, it’s a good idea to ask yourself whether you would be willing to actually drink the contents of that product.  If you’re reading a long list of difficult-to-pronounce chemicals, and/or seeing wheat protein or vegetable protein on the label, you’d do well to think twice about using it.  And don’t let buzzwords like “organic” or “natural” fool you.  A partial listing of product sources can be found at www.celiac.com.  You can also simply Google “gluten and additive-free hair and skin-care products”.  The potential selection is huge.  If you happen to have a smartphone, there are also numerous “gluten-free apps” available to help you screen individual products, restaurants, grocery stores and other shopping sources at your fingertips.  The good news is that awareness of these issues is spreading rapidly, and resources are likely to grow exponentially in the near future.


So what about gluten-free “substitutes”?

Seeking out gluten-free substitutes is certainly an option, as there are scores of “gluten-free” products of all kinds available today; in fact, it’s big business for food manufacturers these days. Clearly, gluten-free shampoos and cosmetics are a good and necessary idea, as are gluten-free condiments and soup bases (chicken/beef/vegetable broths, etc.).  Unfortunately, even though other grains, such as quinoa (actually more of a starchy seed than a grain), corn, millet, buckwheat and rice do not contain the same gluten as wheat, many are still sources of potential cross-reactivity and high-glycemic starch, and the majority of “gluten-free substitutes” are highly processed foods.   Many are soy-based, as well (don’t get me started on THAT!).   Just because something is “gluten-free” does not mean it is actually healthy for you, any more than the word “organic” does.  “Gluten-free organic brownies” are still junk food.  Don’t be fooled.  Gluten and carbohydrate intolerance in general are far more the rule than the exception in today’s world. Given our modern culture’s innumerable health challenges and vulnerabilities, it is logical to conclude that grain consumption, especially of gluten-containing grains, just isn’t worth the dietary risk. Why play Russian roulette? Why add to the unnecessary, glycating, fattening, neurotransmitter- and hormonally-dysregulating carbohydrate load?  In my view it’s better to take processed food off the radar screen entirely and stick mainly to the foods that don’t need a label you have to read every time.


In short, there is no one alive for whom grains are essential for health, and gluten, in particular, is a health food for no one.


It further stands to reason that the more symptoms a person has, whether physical, cognitive or psychological, the more primitive a diet (in other words, pre-agricultural, or “Primal”) one ought to consider adopting for reclaiming rightful health.  The commonality of degenerative diseases does not make them a normal part of aging, or even remotely inevitable.


The choice is mostly ours.


 ~ Nora Gedgaudas

    Author of Primal Body, Primal Mind


For more information about gluten sensitivity and celiac disease go to www.thedr.com and/or www.celiac.com.


For the most accurate testing and more information go to:  www.cyrexlabs.com.


Copyright 2013 – Nora Gedgaudas



A Question about grains

Q: While I’m in the process of cutting out grains I’m curious about quinoa. Do you recommend cutting it out as well?

And my other question is about the butter used in your nut ball snacker recipe. Is it better to use the salted or unsalted?

Thanks for your time,

~ Samantha

A: I see all grains as being fundamentally starch-based, even when they are gluten-free. That said, I personally tend to avoid grains, period. If a dish happens to have a little quinoa “accenting” it, then OK, but in general I wouldn’t include it in any significant quantity in my diet.

As for the butter: I like the taste of salted butter better. It’s a preference. If you’re concerned about the quality of the salt used, you could always buy unsalted and add, say, Celtic or Himalayan sea salt to your liking.

~ Nora

Primal Body Primal Mind