Meat is best for growing brains

There are multiple lines of evidence that an animal-based diet best supports human brain development in infants and young children.

Human fetuses and infants rely on ketones for brain building.

In a previous post, we wrote about the known (but little-spoken-of) fact that human infants are in mild ketosis all the time, especially when breastfed. In other words, ketosis is a natural, healthy state for infants. Infancy is a critical time for brain growth, so we expect that ketosis is advantageous for a growing brain; otherwise, there would have been a selective advantage to reduced ketosis in infancy. This species-critical, rapid brain growth continues well past weaning. For that reason, we suggested in that post that weaning onto a ketogenic diet would probably be preferable to weaning away from ketosis.

In response to that post, a reader sent us a paper called Survival of the fattest: fat babies were the key to evolution of the large human brain. [1] The authors discuss the apparently unique human trait of having extremely fat babies, and explain it in terms of the unique need for growth of extremely large brains.

A key point they make is that a baby's ample fat provides more than simply a large energy supply (much more than could be stored as glycogen or protein; by their calculations, more than 20 times more): the ketone bodies derived from that fat were themselves important for human brain evolution.

They repeat the usual unwarranted assumption that adult brains use mainly glucose as fuel by default, and that ketone bodies are merely an alternative brain fuel. Nonetheless, when talking about fetuses, they are willing to say that the use of ketones is not merely an "alternative":

In human fetuses at mid-gestation, ketones are not just an alternative fuel but appear to be an essential fuel because they supply as much as 30% of the energy requirement of the brain at that age (Adam et al., 1975).

Second, ketones are a key source of carbon for the brain to synthesize the cholesterol and fatty acids that it needs in the membranes of the billions of developing nerve connections.


Ketones are the preferred carbon source for brain lipid synthesis and they come from fatty acids recently consumed or stored in body fat. This means that, in infants, brain cholesterol and fatty acid synthesis are indirectly tied to mobilization and catabolism of fatty acids stored in body fat.

In other words, the claim is that ketones are the best source of certain brain-building materials, and specifically, that fetuses use them for that purpose.

Moreover, the thesis is that the extra body fat on human babies is there specifically for the purpose of supporting extra brain growth after birth, through continued use of ketones.

Weaning onto meat increases brain growth.

[ Please note that by convention weaning refers to the gradual transition from exclusive breastfeeding (starting with the first foods introduced, while breastfeeding is still ongoing) to the end of breastfeeding, not merely the end itself. ]

We aren't the only ones who have thought weaning onto meat would be a good idea. A couple of studies have compared weaning onto meat with weaning onto cereal.

One showed a larger increase in head circumference [2], which is a good index of brain growth in infants [3] and young children [4]. Moreover, higher increases in head circumference in infants are correlated with higher intelligence, independently of head circumference at birth [5]. In other words, the amount of brain growth after birth is a better predictor of intelligence than the amount of brain growth in gestation.

That study also found that the meat-fed infants had better zinc status, and good iron status despite receiving no iron supplementation, unlike the cereal arm [2]. Zinc and iron are abundant in the brain, and zinc deficiency is implicated in learning disorders and other brain development problems [6]. Iron deficiency is a common risk for infants in our culture because of our dietary practices, which is why infant cereal is fortified with it [7].

Another study showed better growth in general in babies weaned onto primarily meat [8].

Weaning onto meat is easy. Here's how I did it.

It is believed likely that early humans fed their babies pre-chewed meat [9]. I did that, too, although that wasn't my first weaning step. Influenced by baby-led weaning, I waited until my son was expressing clear interest in my food, and then simply shared it with him. At the time this meant:

  • Broth on a spoon, increasingly with small fragments of meat in it.
  • Bones from steaks and chops, increasingly with meat and fat left on them.
  • Homemade plain, unseasoned jerky, which he teethed on, or sucked until it disintegrated.
  • Beef and chicken liver, which have a soft, silky texture and are extremely nutrient-dense.


The brain is an energy-intensive organ that required an animal-based diet to evolve.

In 1995, anthropologists Leslie C. Aiello and Peter Wheeler posed the following problem [10]:

  • Brains require an enormous amount of energy.
  • Humans have much larger brains than other primates.
  • However, human basal metabolic rates are no higher than would be predicted by body mass.

Where do we get the extra energy required to fuel our brains, and how could this have evolved?

Aiello and Wheeler explain this by noting that at the same time as our brains were expanding, our intestines (estimated as comparably energy-intensive) were shrinking by almost exactly the corresponding amount, thereby freeing up the extra metabolic energy needed for the brain. Both adaptations, a large brain and a small gut, independently required our ancestors to adopt a "high-quality" diet, for different reasons.

Let's mince no words: "high-quality" means meat [11]. Meat is more nutrient-dense than plants, in terms of both protein and vitamins. Plants are simply too fibrous, too low in protein and calories, and too seasonal to have been relied on for such an evolutionary change [11], [12]. That meat became an important part of our diets during this transition is the mainstream view in anthropology [13].

Although the need for protein and brain-building nutrients is often cited as a reason for needing meat in the evolutionary diet, energy requirements are also important to consider. It would have been difficult to meet caloric needs from plants (especially before cooking) [13], because they were so fibrous. Herbivores with specialized guts (such as ruminants, like cows with their "four stomachs") and primates with much larger intestines than ours actually use bacteria in their guts to turn significant amounts of fiber into fat; see, e.g., [14]. This strategy is not available to such a small gut [11], [15], which is why we had to find food that was energy-dense as is.

Fortunately, insofar as we were already using animal sources to get protein and nutrients, we also had access to an abundance of fat. The animals we hunted were unlikely to have been as lean as modern game. Evidence supports the hypothesis that human hunting was the most likely cause of the extinction of many megafauna (large animals that were much fatter than the leaner game we have left today) [16]. Humans, like carnivores, prefer to hunt larger animals whenever they are available [17]. It has been proposed that the disappearance of the fatter megafauna exerted a strong evolutionary pressure on humans, who were already fat-dependent, to become more skilled hunters of the small game we have today, to rely more on the fat from eating brains and marrow, and to learn to find the fattest animals among the herds [18].

Animal fat and animal protein provided the energy, protein, and nutrients necessary for large brains, especially given the constraint of small guts.

Because humans wean early, and human brain growth is extended past weaning, the post-weaning diet must support fetal-like brain growth.

Humans wean much earlier than other primates, and yet their brains require prolonged growth. Our intelligence has been our primary selective advantage. Therefore it is critical from an evolutionary standpoint that the diet infants were weaned onto was supportive of this brain growth.

In a (fascinating and well-written) paper on weaning and evolution, Kennedy puts it this way:

"[A]lthough this prolonged period of development i.e., ‘‘childhood’’ renders the child vulnerable to a variety of risks, it is vital to the optimization of human intelligence; by improving the child’s nutritional status (and, obviously, its survival), the capability of the adult brain is equally improved. Therefore, a child’s ability to optimize its intellectual potential would be enhanced by the consumption of foods with a higher protein and calorie content than its mother’s milk; what better foods to nourish that weanling child than meat, organ tissues (particularly brain and liver), and bone marrow, an explanation first proposed by Bogin (1997)."


"Increase in the size of the human brain is based on the retention of fetal rates of brain growth (Martin, 1983), a unique and energetically expensive pattern of growth characteristic of altricial [ born under-developed ] mammals (Portmann, 1941; Martin, 1984). This research now adds a second altricial trait—early weaning—to human development. The metabolically expensive brain produced by such growth rates cannot be sustained long on maternal lactation alone, necessitating an early shift to adult foods that are higher in protein and calories than human milk."

The only food higher in protein and calories than breast milk is meat.

A high-fat animal-based diet best supports brain growth.

Taking these facts together:

  • Even modern fetuses and breastfed infants are in ketosis, which uniquely supports brain growth.
  • Infants who are weaned onto meat get essential brain-building nutrients: nutrients that are deficient in today's plant-centric diets. Moreover, experiments have found that their brains actually grow more than those of babies fed cereal.
  • Human brains continue to grow at a fast rate even past weaning.
  • It is likely that in order to evolve such large, capable brains, human babies were weaned onto primarily meat.

A meat-based, inherently ketogenic diet is not only likely to be our evolutionary heritage, it is probably the best way to support the critical brain growth of the human child.


We would like to thank Matthew Dalby, a researcher at the University of Aberdeen, for helpful discussions about short-chain fatty acid production in the large intestines.



Hypothesis paper

Survival of the fattest: fat babies were the key to evolution of the large human brain.
Cunnane SC, Crawford MA.
Comp Biochem Physiol A Mol Integr Physiol. 2003 Sep;136(1):17-26.

Evidence type: experiment

Krebs NF, Westcott JE, Butler N, Robinson C, Bell M, Hambidge KM.
J Pediatr Gastroenterol Nutr. 2006 Feb;42(2):207-14.

(Emphasis ours)


"This study was undertaken to assess the feasibility and effects of consuming either meat or iron-fortified infant cereal as the first complementary food.


"Eighty-eight exclusively breastfed infants were enrolled at 4 months of age and randomized to receive either pureed beef or iron-fortified infant cereal as the first complementary food, starting after 5 months and continuing until 7 months. Dietary, anthropometric, and developmental data were obtained longitudinally until 12 months, and biomarkers of zinc and iron status were measured at 9 months.


"Mean (+/-SE) daily zinc intake from complementary foods at 7 months for infants in the meat group was 1.9 +/- 0.2 mg, whereas that of the cereal group was 0.6 +/- 0.1 mg, which is approximately 25% of the estimated average requirement. Tolerance and acceptance were comparable for the two intervention foods. Increase in head circumference from 7 to 12 months was greater for the meat group, and zinc and protein intakes were predictors of head growth. Biochemical status did not differ by feeding group, but approximately 20% of the infants had low (<60 microg/dL) plasma zinc concentrations, and 30% to 40% had low plasma ferritin concentrations (<12 microg/L). Motor and mental subscales did not differ between groups, but there was a trend for a higher behavior index at 12 months in the meat group.


"Introduction of meat as an early complementary food for exclusively breastfed infants is feasible and was associated with improved zinc intake and potential benefits. The high percentage of infants with biochemical evidence of marginal zinc and iron status suggests that additional investigations of optimal complementary feeding practices for breastfed infants in the United States are warranted."


Evidence type: authority

(Emphasis ours)

"Today the close correlation between head circumference growth and brain development in the last weeks of gestation and in the first two years of life is no longer disputed. A recently developed formula even allows for calculations of brain weight based upon head circumference data. Between the ages of 32 postmenstrual weeks and six months after expected date of delivery there is a period of very rapid brain growth in which the weight of the brain quadruples. During this growth spurt there exists an increased vulnerability by unfavorable environmental conditions, such as malnutrition and psychosocial deprivation. The erroneous belief still being prevalent that the brain of the fetus and young infant is spared by malnutrition, can be looked upon as disproved by new research results. Severe malnutrition during the brain growth spurt is thought to be a very important non-genetic factor influencing the development of the central nervous system (CNS) and therewith intellectual performance. In the past a permanent growth retardation of head circumference and a reduced intellectual capacity usually was observed in small-for-gestational age infants (SGA). Nowadays, however, there can be found also proofs of successful catch-up growth of head circumference and normal intellectual development after early and high-energy postnatal feeding of SGA infants. The development of SGA infants of even very low birth weight can be supported in such a way that it takes a normal course by providing good environmental conditions, such as appropriate nutrition - especially during the early growth period - and a stimulating environment with abundant attention by the mother."


Evidence type: experiment

Bartholomeusz HH, Courchesne E, Karns CM.
Neuropediatrics. 2002 Oct;33(5):239-41.

(Emphasis ours)


"To quantify the relationship between brain volume and head circumference from early childhood to adulthood, and quantify how this relationship changes with age.


"Whole-brain volume and head circumference measures were obtained from MR images of 76 healthy normal males aged 1.7 to 42 years.


"Across early childhood, brain volume and head circumference both increase, but from adolescence onward brain volume decreases while head circumference does not. Because of such changing relationships between brain volume and head circumference with age, a given head circumference was associated with a wide range of brain volumes. However, when grouped appropriately by age, head circumference was shown to accurately predict brain volume. Head circumference was an excellent prediction of brain volume in 1.7 to 6 years old children (r = 0.93), but only an adequate predictor in 7 to 42 year olds.


"To use head circumference as an accurate indication of abnormal brain volume in the clinic or research setting, the patient's age must be taken into account. With knowledge of age-dependent head circumference-to-brain volume relationship, head circumference (particularly in young children) can be an accurate, rapid, and inexpensive indication of normalcy of brain size and growth in a clinical setting."


Evidence type: experiment

Gale CR1, O'Callaghan FJ, Godfrey KM, Law CM, Martyn CN.
Brain. 2004 Feb;127(Pt 2):321-9. Epub 2003 Nov 25.

"Head circumference is known to correlate closely with brain volume (Cooke et al., 1977; Wickett et al., 2000) and can therefore be used to measure brain growth, but a single measurement cannot provide a complete insight into neurological development. Different patterns of early brain growth may result in a similar head size. A child whose brain growth both pre‐ and postnatally followed the 50th centile might attain the same head size as a child whose brain growth was retarded in gestation but who later experienced a period of rapid growth. Different growth trajectories may reflect different experiences during sensitive periods of brain development and have different implications for later cognitive function.

"We have investigated whether brain growth during different periods of pre‐ and postnatal development influences later cognitive function in a group of children for whom serial measurements of head growth through foetal life, infancy and childhood were available."


"We found no statistically significant associations between head circumference at 18 weeks’ gestation or head circumference at birth SDS and IQ at the age of 9 years."


"In contrast, there were strong statistically significant associations between measures of postnatal head growth and IQ. After adjustment for sex, full‐scale IQ rose by 2.59 points (95% CI 0.87 to 4.32) for each SD increase in head circumference at 9 months of age, and by 3.85 points (95% CI 1.96 to 5.73) for each SD increase in head circumference at 9 years; verbal IQ rose by 2.66 points (95% CI 0.49 to 4.83) for each SD increase in head circumference at 9 months of age, and by 3.76 points (95% CI 1.81 to 5.72) for each SD increase in head circumference at 9 years; performance IQ rose by 2.88 points (95% CI 0.659 to 5.11) for each SD increase in head circumference at 9 months of age, and by 3.16 points (95% CI 1.16 to 5.16) for each SD increase in head circumference at 9 years."


"[W]e interpret these findings as evidence that postnatal brain growth is more important than prenatal brain growth in determining higher mental function. This interpretation is supported by the finding that head growth in the first 9 months of life and head growth between 9 months and 9 years of age are also related to cognitive function, regardless of head size at the beginning of these periods."


Evidence type: review

Pfeiffer CC, Braverman ER.
Biol Psychiatry. 1982 Apr;17(4):513-32.

"The total content of zinc in the adult human body averages almost 2 g. This is approximately half the total iron content and 10 to 15 times the total body copper. In the brain, zinc is with iron, the most concentrated metal. The highest levels of zinc are found in the hippocampus in synaptic vesicles, boutons, and mossy fibers. Zinc is also found in large concentrations in the choroid layer of the retina which is an extension of the brain. Zinc plays an important role in axonal and synaptic transmission and is necessary for nucleic acid metabolism and brain tubulin growth and phosphorylation. Lack of zinc has been implicated in impaired DNA, RNA, and protein synthesis during brain development. For these reasons, deficiency of zinc during pregnancy and lactation has been shown to be related to many congenital abnormalities of the nervous system in offspring. Furthermore, in children insufficient levels of zinc have been associated with lowered learning ability, apathy, lethargy, and mental retardation. Hyperactive children may be deficient in zinc and vitamin B-6 and have an excess of lead and copper. Alcoholism, schizophrenia, Wilson's disease, and Pick's disease are brain disorders dynamically related to zinc levels. Zinc has been employed with success to treat Wilson's disease, achrodermatitis enteropathica, and specific types of schizophrenia."


Evidence type: authority

From the CDC:

"Who is most at risk?

Young children and pregnant women are at higher risk of iron deficiency because of rapid growth and higher iron needs.

Adolescent girls and women of childbearing age are at risk due to menstruation.

Among children, iron deficiency is seen most often between six months and three years of age due to rapid growth and inadequate intake of dietary iron. Infants and children at highest risk are the following groups:

  • Babies who were born early or small.
  • Babies given cow's milk before age 12 months.
  • Breastfed babies who after age 6 months are not being given plain, iron-fortified cereals or another good source of iron from other foods.
  • Formula-fed babies who do not get iron-fortified formulas.
  • Children aged 1–5 years who get more than 24 ounces of cow, goat, or soymilk per day. Excess milk intake can decrease your child's desire for food items with greater iron content, such as meat or iron fortified cereal.
  • Children who have special health needs, for example, children with chronic infections or restricted diets.

Evidence type: experiment

(Emphasis ours)

"Background: High intake of cow-milk protein in formula-fed infants is associated with higher weight gain and increased adiposity, which have led to recommendations to limit protein intake in later infancy. The impact of protein from meats for breastfed infants during complementary feeding may be different.

"Objective: We examined the effect of protein from meat as complementary foods on growths and metabolic profiles of breastfed infants.

"Design: This was a secondary analysis from a trial in which exclusively breastfed infants (5–6 mo old from the Denver, CO, metro area) were randomly assigned to receive commercially available pureed meats (MEAT group; n = 14) or infant cereal (CEREAL group; n = 28) as their primary complementary feedings for ∼5 mo. Anthropometric measures and diet records were collected monthly from 5 to 9 mo of age; intakes from complementary feeding and breast milk were assessed at 9 mo of age.

"Results: The MEAT group had significantly higher protein intake, whereas energy, carbohydrate, and fat intakes from complementary feeding did not differ by group over time. At 9 mo of age mean (± SEM), intakes of total (complementary feeding plus breast-milk) protein were 2.9 ± 0.6 and 1.4 ± 0.4 g ⋅ kg−1 ⋅ d−1, ∼17% and ∼9% of daily energy intake, for MEAT and CEREAL groups, respectively (P < 0.001). From 5 to 9 mo of age, the weight-for-age z score (WAZ) and length-for-age z score (LAZ) increased in the MEAT group (ΔWAZ: 0.24 ± 0.19; ΔLAZ: 0.14 ± 0.12) and decreased in the CEREAL group (ΔWAZ: −0.07 ± 0.17; ΔLAZ: −0.27 ± 0.24) (P-group by time < 0.05). The change in weight-for-length z score did not differ between groups. Total protein intake at 9 mo of age and baseline WAZ were important predictors of changes in the WAZ (R2 = 0.23, P = 0.01).

"Conclusion: In breastfed infants, higher protein intake from meats was associated with greater linear growth and weight gain but without excessive gain in adiposity, suggesting potential risks of high protein intake may differ between breastfed and formula-fed infants and by the source of protein."


From Wikipedia:

"Breastmilk supplement

"Premastication is complementary to breastfeeding in the health practices of infants and young children, providing large amounts of carbohydrate and protein nutrients not always available through breast milk,[3] and micronutrients such as iron, zinc, and vitamin B12 which are essential nutrients present mainly in meat.[25] Compounds in the saliva, such as haptocorrin also helps increase B12 availability by protecting the vitamin against stomach acid.

"Infant intake of heme iron

"Meats such as beef were likely premasticated during human evolution as hunter-gatherers. This animal-derived bioinorganic iron source is shown to confer benefits to young children (two years onwards) by improving growth, motor, and cognitive functions.[26] In earlier times, premastication was an important practice that prevented infant iron deficiency.[27]

"Meats provide Heme iron that are more easily absorbed by human physiology and higher in bioavailability than non-heme irons sources,[28][29] and is a recommended source of iron for infants.[30]"


Hypothesis paper

Leslie C. Aiello and Peter Wheeler
Current Anthropology, Vol. 36, No. 2 (Apr., 1995), pp. 199-221

Evidence type: review

Milton K.
J Nutr. 2003 Nov;133(11 Suppl 2):3886S-3892S.

(The whole paper is worth reading, but these highlights serve our point.)

"Without routine access to ASF [animal source foods], it is highly unlikely that evolving humans could have achieved their unusually large and complex brain while simultaneously continuing their evolutionary trajectory as large, active and highly social primates. As human evolution progressed, young children in particular, with their rapidly expanding large brain and high metabolic and nutritional demands relative to adults would have benefited from volumetrically concentrated, high quality foods such as meat."


"If the dietary trajectory described above was characteristic of human ancestors, the routine, that is, daily, inclusion of ASF in the diets of children seems mandatory as most wild plant foods would not be capable of supplying the protein and micronutrients children require for optimal development and growth, nor could the gut of the child likely provide enough space, in combination with the slow food turnover rate characteristic of the human species, to secure adequate nutrition from wild plant foods alone. Wild plant foods, though somewhat higher in protein and some vitamins and minerals than their cultivated counterparts (52), are also high in fiber and other indigestible components and most would have to be consumed in very large quantity to meet the nutritional and energetic demands of a growing and active child."


"Given the postulated body and brain size of the earliest humans and the anatomy and kinetic pattern characteristics of the hominoid gut, turning increasingly to the intentional consumption of ASF on a routine rather than fortuitous basis seems the most expedient, indeed the only, dietary avenue open to the emerging human lineage (2,3,10,53)."


"Given the probable diet, gut form and pattern of digestive kinetics characteristic of prehuman ancestors, it is hypothesized that the routine inclusion of animal source foods in the diet was mandatory for emergence of the human lineage. As human evolution progressed, ASF likely achieved particular importance for small children due to the energetic demands of their rapidly expanding large brain and generally high metabolic and nutritional demands relative to adults."


Evidence type: review

Kennedy GE.
J Hum Evol. 2005 Feb;48(2):123-45. Epub 2005 Jan 18.

"Although some researchers have claimed that plant foods (e.g., roots and tubers) may have played an important role in human evolution (e.g., O’Connell et al., 1999; Wrangham et al., 1999; Conklin-Brittain et al., 2002), the low protein content of ‘‘starchy’’ plants, generally calculated as 2% of dry weight (see Kaplan et al., 2000: table 2), low calorie and fat content, yet high content of (largely) indigestible fiber (Schoeninger et al., 2001: 182) would render them far less than ideal weaning foods. Some plant species, moreover, would require cooking to improve their digestibility and, despite claims to the contrary (Wrangham et al., 1999), evidence of controlled fire has not yet been found at Plio-Pleistocene sites. Other plant foods, such as the nut of the baobab (Adansonia digitata), are high in protein, calories, and lipids and may have been exploited by hominoids in more open habitats (Schoeninger et al., 2001). However, such foods would be too seasonal or too rare on any particular landscape to have contributed significantly and consistently to the diet of early hominins. Moreover, while young baobab seeds are relatively soft and may be chewed, the hard, mature seeds require more processing. The Hadza pound these into flour (Schoeninger et al., 2001), which requires the use of both grinding stones and receptacles, equipment that may not have been known to early hominins. Meat, on the other hand, is relatively abundant and requires processing that was demonstrably within the technological capabilities of Plio-Pleistocene hominins. Meat, particularly organ tissues, as Bogin (1988, 1997) pointed out, would provide the ideal weaning food."


Plants can become more nutrient dense through cooking. That is the basis of Wrangham's hypothesis:

(From Wikipedia)

"Wrangham's latest work focuses on the role cooking has played in human evolution. He has argued that cooking food is obligatory for humans as a result of biological adaptations[9][10] and that cooking, in particular the consumption of cooked tubers, might explain the increase in hominid brain sizes, smaller teeth and jaws, and decrease in sexual dimorphism that occurred roughly 1.8 million years ago.[11] Most anthropologists disagree with Wrangham's ideas, pointing out that there is no solid evidence to support Wrangham's claims.[11][12] The mainstream explanation is that human ancestors, prior to the advent of cooking, turned to eating meats, which then caused the evolutionary shift to smaller guts and larger brains.[13]"


Evidence type: review

Popovich DG1, Jenkins DJ, Kendall CW, Dierenfeld ES, Carroll RW, Tariq N, Vidgen E.
J Nutr. 1997 Oct;127(10):2000-5.

(Emphasis ours)

"We studied the western lowland gorilla diet as a possible model for human nutrient requirements with implications for colonic function. Gorillas in the Central African Republic were identified as consuming over 200 species and varieties of plants and 100 species and varieties of fruit. Thirty-one of the most commonly consumed foods were collected and dried locally before shipping for macronutrient and fiber analysis. The mean macronutrient concentrations were (mean ± SD, g/100 g dry basis) fat 0.5 ± 0.4, protein 11.8 ± 8.2, available carbohydrate 7.7 ± 6.3 and dietary fiber 74.0 ± 12.9. Assuming that the macronutrient profile of these foods was reflective of the whole gorilla diet and that dietary fiber contributed 6.28 kJ/g (1.5 kcal/g), then the gorilla diet would provide 810 kJ (194 kcal) metabolizable energy per 100 g dry weight. The macronutrient profile of this diet would be as follows: 2.5% energy as fat, 24.3% protein, 15.8% available carbohydrate, with potentially 57.3% of metabolizable energy from short-chain fatty acids (SCFA) derived from colonic fermentation of fiber. Gorillas would therefore obtain considerable energy through fiber fermentation. We suggest that humans also evolved consuming similar high foliage, high fiber diets, which were low in fat and dietary cholesterol. The macronutrient and fiber profile of the gorilla diet is one in which the colon is likely to play a major role in overall nutrition. Both the nutrient and fiber components of such a diet and the functional capacity of the hominoid colon may have important dietary implications for contemporary human health."

We disagree, of course, with the authors' suggested interpretation that humans, too, could make good use of the same dietary strategy, as we haven't the colons for it.
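As a check on the authors' arithmetic, the quoted macronutrient profile can be converted into energy shares. A minimal sketch, assuming standard Atwater factors (4 kcal/g for protein and available carbohydrate, 9 kcal/g for fat) together with the paper's stated 1.5 kcal/g for fermented fiber:

```python
# Reproduce the gorilla-diet energy breakdown from the quoted abstract.
# Assumptions: Atwater factors (4/4/9 kcal/g) plus the authors' 1.5 kcal/g
# for short-chain fatty acids derived from fiber fermentation.

grams_per_100g_dry = {"fat": 0.5, "protein": 11.8, "carb": 7.7, "fiber": 74.0}
kcal_per_gram = {"fat": 9.0, "protein": 4.0, "carb": 4.0, "fiber": 1.5}

kcal = {k: grams_per_100g_dry[k] * kcal_per_gram[k] for k in grams_per_100g_dry}
total = sum(kcal.values())  # ~194 kcal per 100 g dry weight, as the paper states

for k, v in kcal.items():
    print(f"{k}: {100 * v / total:.1f}% of energy")
```

The shares come out close to the abstract's figures (roughly 2% fat, 24% protein, 16% available carbohydrate, and 57% of metabolizable energy from fiber-derived SCFA), confirming how fermentation-dependent that diet is.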


The maximum amount of fat humans could get from fermenting fiber in the gut is unknown. The widely cited value of 10% of calories comes from:

E. N. Bergman
Physiological Reviews Published 1 April 1990 Vol. 70 no. 2, 567-590

"The value of 6-10% for humans (Table 3) was calculated on the basis of a typical British diet where 50-60 g of carbohydrate (15 g fiber and 35-50 g sugar and starch) are fermented per day (209). It is pointed out, however, that dietary fiber intakes in Africa or the Third World are up to seven times higher than in the United Kingdom (55). It is likely, therefore, that much of this increased fiber intake is fermented to VFA and even greater amounts of energy are made available by large intestinal fermentation."

However, it should not be concluded that SCFA production could rise to 70% of energy requirements!

For one thing, as a back-of-the-envelope calculation, you can get at most about 2 kcal worth of SCFA per gram of fermentable carbohydrate. That carbohydrate would come from soluble plant fiber, resistant starch, and regular starch that escapes digestion. To get 70% of calories this way on a 2000 kcal/day diet, you'd need to ingest 700 g of fiber per day.
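The same back-of-the-envelope arithmetic can be written out explicitly; a quick sketch, assuming the ~2 kcal/g upper-bound SCFA yield stated above:

```python
# Grams of fermentable carbohydrate needed for colonic SCFA production
# to supply a given share of daily energy.
SCFA_KCAL_PER_G = 2.0  # upper-bound yield per gram fermented (assumed above)

def fiber_needed_g(daily_kcal: float, energy_share: float) -> float:
    """Grams/day of fermentable fiber for SCFA to cover `energy_share`."""
    return daily_kcal * energy_share / SCFA_KCAL_PER_G

print(round(fiber_needed_g(2000, 0.70)))  # 700 g/day
print(round(fiber_needed_g(2000, 0.10)))  # 100 g/day
```

Even the 10%-of-energy scenario already demands roughly five times the fiber intake that the ADA considers an upper limit.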

Even if you managed to ingest this much, it is unlikely you could absorb it all, and in the process of trying, you would experience gastrointestinal distress, including cramping, gas, diarrhea or constipation, and perhaps worse. Indeed, this would probably happen even at 100 g/day, which would provide only about 10% of energy in a 2000 kcal/day diet. Moreover, it would interfere with mineral absorption, rendering it an unviable evolutionary strategy. Even the ADA, which extols the virtues of fiber, cautions against exceeding its recommendation of 20-35 g per day. See Position of the American Dietetic Association: health implications of dietary fiber.
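The back-of-the-envelope arithmetic above can be made explicit. This is a sketch: the 2 kcal/g SCFA yield is the rough upper bound used in the text, and the 2000 kcal/day figure is illustrative.

```python
# Rough upper bound on energy humans could get from colonic fermentation of fiber.
SCFA_KCAL_PER_G = 2.0  # ~max kcal of absorbable SCFA per g of fermentable carbohydrate

def fiber_needed(energy_fraction, daily_kcal=2000):
    """Grams of fermentable fiber needed to supply a given fraction of daily energy."""
    return energy_fraction * daily_kcal / SCFA_KCAL_PER_G

print(fiber_needed(0.70))  # 700.0 g/day to get 70% of a 2000 kcal diet from SCFA
print(fiber_needed(0.10))  # 100.0 g/day for ~10%, already enough for GI distress
```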


Evidence type: review

"As the mathematical models now seem quite plausible and the patterns of survivors versus extinct species seem inexplicable by climate change and easily explicable by hunting (7,11), it is worth considering comparisons to other systems. Barnosky et al. note that on islands, humans cause extinctions through multiple synergistic effects, including predation and sitzkrieg, and “only rarely have island megafauna been demonstrated to go extinct because of environmental change without human involvement,” while acknowledging that the extrapolation from islands to continents is often disputed (7). The case for human contribution to extinction is now much better supported by chronology (both radiometric and based on trace fossils like fungal spores), mathematical simulations, paleoclimatology, paleontology, archaeology, and the traits of extinct species when compared with survivors than when Meltzer and Beck rejected it in the 1990s, although the blitzkrieg model which assumes Clovis-first can be thoroughly rejected by confirmation of pre-Clovis sites. Grayson and Meltzer (12) argue that the overkill hypothesis has become irrefutable, but the patterns by which organisms went extinct (7,11), the timing of megafauna population reductions and human arrival when compared with climate change (5), and the assumptions necessary to make paleoecologically informed mathematical models for the extinctions to make accurate predictions all provide opportunities to refute the overkill hypothesis, or at least make it appear unlikely. However, all of these indicate human involvement in megafauna extinctions as not only plausible, but likely."


Evidence type: review

William J. Ripple and Blaire Van Valkenburgh
BioScience (July/August 2010) 60 (7): 516-526.

"Humans are well-documented optimal foragers, and in general, large prey (ungulates) are highly ranked because of the greater return for a given foraging effort. A survey of the association between mammal body size and the current threat of human hunting showed that large-bodied mammals are hunted significantly more than small-bodied species (Lyons et al. 2004). Studies of Amazonian Indians (Alvard 1993) and Holocene Native American populations in California (Broughton 2002, Grayson 2001) show a clear preference for large prey that is not mitigated by declines in their abundance. After studying California archaeological sites spanning the last 3.5 thousand years, Grayson (2001) reported a change in relative abundance of large mammals consistent with optimal foraging theory: The human hunters switched from large mammal prey (highly ranked prey) to small mammal prey (lower-ranked prey) over this time period (figure 7). Grayson (2001) stated that there were no changes in climate that correlate with the nearly unilinear decline in the abundance of large mammals. Looking further back in time, Stiner and colleagues (1999) described a shift from slow-moving, easily caught prey (e.g., tortoises) to more agile, difficult-to-catch prey (e.g., birds) in Mediterranean Pleistocene archaeological sites, presumably as a result of declines in the availability of preferred prey."


Evidence type: review

Ben-Dor M1, Gopher A, Hershkovitz I, Barkai R.
PLoS One. 2011;6(12):e28689. doi: 10.1371/journal.pone.0028689. Epub 2011 Dec 9.

"The disappearance of elephants from the diet of H. erectus in the Levant by the end of the Acheulian had two effects that interacted with each other, further aggravating the potential of H. erectus to contend with the new dietary requirements:

"The absence of elephants, weighing five times the weight of Hippopotami and more than eighty times the weight of Fallow deer (Kob in Table 3), from the diet would have meant that hunters had to hunt a much higher number of smaller animals to obtain the same amount of calories previously gained by having elephants on the menu.

"Additionally, hunters would have had to hunt what large (high fat content) animals that were still available, in order to maintain the obligatory fat percentage (44% in our model) since they would have lost the beneficial fat contribution of the relatively fat (49% fat) elephant. This ‘large animal’ constraint would have further increased the energetic cost of foraging."


"Comparing the average calories per animal at GBY and Qesem Cave might lead to the conclusion that Qesem Cave dwellers had to hunt only twice as many animals than GBY dwellers. This, however, is misleading as obligatory fat consumption complicates the calculation of animals required. With the obligatory faunal fat requirement amounting to 49% of the calories expected to be supplied by the animal, Fallow deer with their caloric fat percentage of 31% (Kob in Table 3) would not have supplied enough fat to be consumed exclusively. Under dietary constraints and to lift their average fat consumption, the Qesem Cave dwellers would have been forced to hunt aurochs and horses whose caloric fat ratio amounts to 49% (the equivalent of buffalo in Table 3). The habitual use of fire at Qesem Cave, aimed at roasting meat [23], [45], may have reduced the amount of energy required for the digestion of protein, contributing to further reduction in DEE. The fact that the faunal assemblage at Qesem Cave shows significantly high proportions of burnt and fractured bones, typical of marrow extraction, is highly pertinent to the point. In addition, the over-representation of fallow deer skulls found at the site [9], [45] might imply a tendency to consume the brain of these prey animals at the cave. Combined, these data indicate a continuous fat-oriented use of prey at the site throughout the Acheulo-Yabrudian (400-200 kyr).

"However, the average caloric fat percentage attributed to the animals at Qesem Cave – 40% – is still lower than the predicted obligatory fat requirements of faunal calories for H. sapiens in our model, amounting to 49% (Table 2). This discrepancy may have disappeared should we have considered in our calculations in Table 3 the previously mentioned preference for prime-age animals that is apparent at Qesem Cave [9], [45]. The analysis of Cordain's Caribou fat data ([124]: Figure 5) shows that as a strategy the selective hunting of prime-age bulls or females, depending on the season, could, theoretically, result in the increase of fat content as the percentage of liveweight by 76% from 6.4% to 11.3%, thus raising the caloric percentage of fat from animal sources at Qesem Cave. Citing ethnographic sources, Brink ([125]:42) writes about the American Indians hunters: “Not only did the hunters know the natural patterns the bison followed; they also learned how to spot fat animals in a herd. An experienced hunter would pick out the pronounced curves of the body and eye the sheen of the coat that indicated a fat animal”. While the choice of hunting a particular elephant would not necessarily be significant regarding the amount of fat obtained, this was clearly not the case with the smaller game. It is apparent that the selection of fat adults would have been a paying strategy that required high cognitive capabilities and learning skills."
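To see how the fat percentages in Ben-Dor et al.'s model combine, here is a sketch of the mixing arithmetic, using the caloric fat fractions quoted above (31% for fallow deer, 49% for aurochs/horses). The `blend_fat` function and the 50/50 split are our own illustration, not figures from the paper.

```python
# Sketch of the prey-fat arithmetic implied by Ben-Dor et al.'s figures above.
DEER_FAT = 0.31    # caloric fat fraction of fallow deer (Kob equivalent, Table 3)
LARGE_FAT = 0.49   # caloric fat fraction of aurochs/horses (buffalo equivalent)

def blend_fat(large_fraction):
    """Caloric fat fraction of animal-source calories when `large_fraction`
    comes from large game and the remainder from fallow deer."""
    return large_fraction * LARGE_FAT + (1 - large_fraction) * DEER_FAT

# Reaching the site-wide average of 40% fat requires about half of the
# animal calories to come from the fatter large game:
print(round(blend_fat(0.5), 2))   # 0.4
# Fallow deer alone (31%) cannot meet the model's 49% obligatory fat
# requirement; only an all-large-game mix reaches it:
print(round(blend_fat(1.0), 2))   # 0.49
```

This makes concrete why the authors argue that Qesem Cave dwellers "would have been forced to hunt aurochs and horses": no quantity of fallow deer alone can lift the average caloric fat fraction above 31%.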


The Effect of Ketogenic Diets on Thyroid Hormones

The previous generation of myths about low carb diets focused on organ systems: they warned of things like kidney dysfunction and osteoporosis. As those myths became untenable, new myths have swiftly taken their place: myths, for example, about hormone systems and gut bacteria.

In previous posts, such as here and here, we dispelled misinformation arising from fears about cortisol. In this post we address fears about thyroid.

The idea that ketogenic diets are “bad for thyroid” is spouted in keto-opposed and keto-friendly venues alike. Despite rampant parroting, it is difficult to find evidence to support this idea. The only evidence that we found even suggestive of this idea is the fact that T₃, the most active thyroid hormone, has repeatedly been shown to be lower in ketogenic dieters.

However, this lowered T₃ is not a sign of “hypothyroid”. In fact, it has a beneficial function! In this article, we explain why lower T₃ on a ketogenic diet is beneficial, rather than a sign of dysfunction or cause for alarm.

Low T₃ is not hypothyroid.


Let's first clear up some confusion about “low thyroid”.

Diagnosis is a tricky business. Diseases manifest in unwanted symptoms, and diagnosis is the art of determining the cause. Sometimes symptoms are very good discriminators. They are easy to verify, and they have only one or two common causes. Other times symptoms are common in a variety of illnesses, and by themselves don't help diagnosis much. Hypothyroid tends to be a cluster of these indiscriminate symptoms, and therefore, a lot of people are tempted, in understandable desperation, to diagnose themselves with it.

Ideally in medical research we want to find indicators and predictors of diseases: things we can measure that discriminate well between diseases, or predict the imminent manifestation of those diseases. Often they are measures that are not readily apparent to a patient, for example blood levels of various substances. To verify a suspicion of hypothyroid, we measure thyroid hormones in the blood.

As we have seen again and again, there are often different ways to measure something, and symptoms or outcomes correlated with one measure may or may not correlate with the others.


The most common thyroid measures are the levels of TSH (thyroid stimulating hormone), T₄ (a relatively inactive form of thyroid hormone), and T₃ (the more active form). TSH acts on the thyroid gland, causing T₃ and T₄ to be produced. Further T₃ can be generated from T₄. Hypothyroid is a problem in the gland, where not enough T₃ and T₄ are being produced. It is indicated by high values of TSH (along with low T₃ and T₄).

It is my suspicion that supplementing thyroid hormone in the general case of hypothyroidism may be as foolish as supplementing insulin in Non-Insulin-Dependent Diabetes. Insulin is appropriate in (aptly named) Insulin-Dependent Diabetes, just as thyroid hormone would remain appropriate in Hashimoto's.

The situation is analogous to high insulin in a Type II (Non-Insulin-Dependent) Diabetic: In that case, insulin at normal amounts is not effectively reducing blood sugar as it would in a healthy body, so more and more gets produced to have the needed effect. In the case of hypothyroid, more and more TSH is produced, because TSH is what acts on the thyroid gland to produce T₃ and T₄. In other words, when you have low T₃ and T₄ levels, this signals more TSH to be created, in order to cause more T₃ and T₄ to be made in the gland.

Low T₃ by itself, without high TSH or low T₄, has been studied extensively, and has various names, including “nonthyroidal illness syndrome” (NTIS) [1]. On modern, high carb diets, it appears to happen only in cases of critical illness [1].

Whether low T₃ in critical illness is adaptive or not is a point of controversy [1]. Clearly, either there is a disruption in production caused by the illness, or the body has a functional reason for not raising T₃; that is, that low T₃ helps recovery in some way. The adaptive hypothesis would be supported if supplementing T₃ caused harm. Unfortunately, results have been mixed.

The mixed results are probably an artefact of the lumping together of the various situations in which NTIS occurs. Although NTIS occurs with starvation, ketogenic diets, which share some metabolic similarities with starvation, have not so far been included in this area of research. However, research in calorie, carbohydrate, and protein restriction indicates that in these cases, as with starvation [1], lower T₃ is adaptive.

Lower T₃ spares muscle in conditions of weight loss or inadequate protein.

In weight loss, starvation, or protein deficiency conditions, lowered T₃ is thought to be a functional response that protects against muscle loss [2], [3]. When a diet creates a calorie deficit, or is low in protein, this creates a catabolic state (one in which the body tends to be breaking things down, rather than building them up). If the body did not respond to this by lowering T₃, lean mass would be lost. Moreover, if T₃ is supplemented by a well-meaning person who interprets this adaptation as a detrimental “hypothyroid” condition, the result is again a loss of lean mass, as shown by Koppeschaar et al. [4]. Supplementing T₃ decreases ketosis and increases the insulin-to-glucagon ratio [4], which, as we have previously discussed, is tightly correlated with glucose production. This suggests that supplementing T₃ induces gluconeogenesis; as Koppeschaar et al. put it: "It must be concluded that triiodothyroxine also directly influenced glucose metabolism".

Not only are T₃ levels lower in calorie restriction, but T₃ receptors are downregulated [1], [5], suggesting a second mechanism by which the body adapts away from T₃ use under ketogenic conditions.

If you are on a low-carb diet in which you are losing weight, and your T₃ is low, don't assume you should correct this with supplementation. Lowered T₃ has a purpose, and supplementing it defeats the purpose.

Other research has shown a correlation between lower T₃ and higher ketosis [6], and between lower T₃ and very low carbohydrate levels [7], [8], [9]. It's all very consistent.

In other words, the more ketogenic a weight loss diet is, the better it spares muscle. Lowered T₃ is thought to be part of the mechanism, because it is correlated both with higher βOHB and with muscle sparing, and because supplementing with T₃ reverses the muscle-sparing effect.

As alluded to above, T₃ will also be lowered in a situation where weight is not being lost, and carbs are not ketogenically low, if protein is inadequate [10]. This further underscores the function of T₃ lowering: to spare protein for lean mass.

We are not aware of a study showing the effects of a protein adequate, ketogenic maintenance diet (i.e. not calorie restricted) that measured T₃. Therefore, we are not certain whether lowered T₃ would continue in that context [11].

However, insofar as it may continue, that could be beneficial:

Low T₃ is associated with longevity.

It's possible that the lower T₃ found in ketogenic dieters is an indicator of a lifespan-increasing effect.

First, low T₃ is associated with longevity: it has been found in the very long-lived [12]. This does not appear to be simply an effect of old age, because the correlation also shows up in a genetic study of longevity [13].

Moreover, just as with moderately elevated cortisol, low T₃ is found in animals whose lifespans have been experimentally increased, and therefore (again, as with elevated cortisol) low T₃ is hypothesised to be part of the mechanism increasing lifespan [13], [14].


There is no evidence that we are aware of indicating that ketogenic diets cause hypothyroid, or negatively impact thyroid function. The fact that T₃ is lower in ketogenic dieters is probably part of the mechanism that protects lean mass when fat is being lost. Moreover, low T₃ may even be an indicator of a life-extending effect, an effect we have suggested elsewhere when examining the cortisol profile of ketogenic dieters.



Evidence type: review

Economidou F1, Douka E, Tzanela M, Nanas S, Kotanidou A.
Hormones (Athens). 2011 Apr-Jun;10(2):117-24.

(Emphasis ours)

“The metabolic support of the critically ill patient is a relatively new target of active research and little is as yet known about the effects of critical illness on metabolism. The nonthyroidal illness syndrome, also known as the low T₃ syndrome or euthyroid sick syndrome, describes a condition characterized by abnormal thyroid function tests encountered in patients with acute or chronic systemic illnesses. The laboratory parameters of this syndrome include low serum levels of triiodothyronine (T₃) and high levels of reverse T₃, with normal or low levels of thyroxine (T₄) and normal or low levels of thyroid-stimulating hormone (TSH). This condition may affect 60 to 70% of critically ill patients. The changes in serum thyroid hormone levels in the critically ill patient seem to result from alterations in the peripheral metabolism of the thyroid hormones, in TSH regulation, in the binding of thyroid hormone to transport-protein and in receptor binding and intracellular uptake. Medications also have a very important role in these alterations. Hormonal changes can be seen within the first hours of critical illness and, interestingly, these changes correlate with final outcome. Data on the beneficial effect of thyroid hormone treatment on outcome in critically ill patients are so far controversial. Thyroid function generally returns to normal as the acute illness resolves.”


It remains controversial whether development of the aforementioned changes in thyroid metabolism reflects a protective mechanism or a maladaptive process during illness.

If these changes constitute an adaptation mechanism, then treatment to restore thyroid hormone levels to the normal range could have deleterious effects. In contrast, if these changes are pathologic, treatment may improve an otherwise poor clinical outcome. Current literature data indicate that:

Starvation-induced decrease in serum T₃ concentrations most likely reflects a process of adaptation.

Ketogenic metabolism most closely resembles starvation, though, of course, with the important difference that it is nutritionally complete and there is no reason to believe it would be unhealthy indefinitely. — Amber


Evidence type: experiment

Kaptein EM, Fisler JS, Duda MJ, Nicoloff JT, Drenick EJ.
Clin Endocrinol (Oxf). 1985 Jan;22(1):1-15.

(Emphasis ours)

“The relationship between the changes in serum thyroid hormone levels and nitrogen economy during caloric deprivation were investigated in ten obese men during a 40 d, 400 kcal protein-supplemented weight-reducing diet. This regimen induced increases in the serum levels of total T₄, free T₄ and total rT₃, and decreases of total T₃, while serum TSH remained unchanged. There were progressive decreases in total body weight and urinary losses of total nitrogen and 3-methylhistidine, with the early negative nitrogen balance gradually returning towards basal values during the 40 days. Subjects with the largest weight loss had the most increase in the serum levels of total T₄ and free T₄ index and the greatest decrease in T₃. The magnitude of the increase of the nitrogen balance from its nadir was correlated with the extent of the reduction of T₃ and increase of T₃ uptake ratio and free T₄ levels. The decrease in the urinary excretion of 3-methylhistidine correlated with the increase in free T₄ and rT₃ levels. Nadir serum transferrin values were directly related to peak rT₃ values, and the lowest albumin concentrations occurred in subjects with the highest total T₄ and free T₄ index values. Further, the maximum changes in the serum thyroid hormone levels preceded those of the nutritional parameters. These relationships suggest that: (1) increases in serum rT₃ and free T₄ and reductions in T₃ concentrations during protein supplemented weight reduction may facilitate conservation of visceral protein and reduce muscle protein turnover; and (2) the variation in the magnitude of these changes may account for the heterogeneity of nitrogen economy.”


Evidence type: experiment

(Emphasis ours)

“Although the rate of fat loss was relatively constant throughout the study, wide interindividual variations in cumulative protein (nitrogen) deficit were observed. Total nitrogen losses per subject ranged from 90.5 to 278.7 g. Cumulative nitrogen loss during the first 16 days tended to correlate negatively with initial mean fat cell size and positively with initial lean body mass. Most notable was the strong negative correlation between the size of the decrease in serum triiodothyronine over the 64-day study and the magnitude of the concurrent cumulative N deficit. During severe caloric restriction, one's ability to decrease circulating serum triiodothyronine levels may be critical to achievement of an adaptational decrease in body protein loss.”


Evidence type: experiment

(Emphasis ours)

“Metabolic responses during a very-low-calorie diet, composed of 50 per cent glucose and 50 per cent protein, were studied in 18 grossly obese subjects (relative weights 131-205 per cent) for 28 d. During the last 14 d (period 2) eight subjects (Gp B) served as controls, while the other ten subjects (Gp A) in the low T₃ state were treated with triiodothyronine supplementation (50 micrograms, 3 times daily). During the first 14 d (period 1) a low T₃-high rT₃ state developed; there was an inverse relationship between the absolute fall of the plasma T₃ concentrations and the cumulative negative nitrogen balance as well as the beta-hydroxybutyrate (βOHB) acid concentrations during the semi-starvation period, pointing to a protein and fuel sparing effect of the low T₃ state. Weight loss in the semi-starvation period was equal in both groups; during T₃ treatment the rate of weight loss was statistically significant (Gp A 6.1 +/- 0.3 kg vs Gp B 4.2 +/- 0.2 kg, P less than 0.001). In the control group there was a sustained nitrogen balance after three weeks; in Gp A the nitrogen losses increased markedly during T₃ treatment. Compared to the control group, on average a further 45.4 g extra nitrogen were lost, equivalent to 1.4 kg fat free tissue. Thus, 74 per cent of the extra weight loss in the T₃ treated group could be accounted for by loss of fat free tissue. During the T₃ treatment period no detectable changes occurred regarding plasma triglycerides and plasma free fatty acids (FFA) concentrations; the plasma βOHB acid concentrations decreased significantly as compared to the control group. Plasma glucose concentrations and the immunoreactive insulin (IRI)/glucose ratio increased in Gp A in the T₃ treatment period, reflecting a state of insulin resistance with regard to glucose utilization. 
Our results warrant the conclusion that there appears to be no place for T₃ as an adjunct to dieting, as it enhances mostly body protein loss and only to a small extent loss of body fat.”


"The plasma βOHB concentration declined significantly during T₃ treatment. In accordance with the results of Hollingsworth et al. we observed a decline of the plasma uric acid levels; this decline occurred simultaneously with the decrease in the βOHB levels in the T₃ treated group; as renal tubular handling of uric acid and ketones are closely linked during fasting, this might implicate a diminished renal reabsorption of ketones.

"It is known that renal conservation of ketones prevents large losses of cations during prolonged starvation without T₃ treatment; since ammonium is the major cation excreted in established starvation, the increased renal reabsorption of ketone bodies also minimizes nitrogen loss."
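The nitrogen accounting in the Koppeschaar et al. abstract can be verified with a little arithmetic. This is a sketch; the ~31 g of lean tissue per gram of nitrogen is the conversion implied by the paper's own numbers, consistent with lean tissue being roughly 3% nitrogen by weight.

```python
# Checking the arithmetic in the Koppeschaar et al. abstract quoted above.
extra_nitrogen_g = 45.4    # extra nitrogen lost by the T3-treated group
fat_free_tissue_kg = 1.4   # the paper's stated equivalent in fat-free tissue

# Implied conversion factor: grams of lean tissue per gram of nitrogen.
print(fat_free_tissue_kg * 1000 / extra_nitrogen_g)  # ~30.8 g tissue per g N

# Extra weight lost under T3 treatment: 6.1 kg vs 4.2 kg in controls.
extra_weight_kg = 6.1 - 4.2
# Fraction of that extra weight loss accounted for by fat-free tissue:
print(100 * fat_free_tissue_kg / extra_weight_kg)  # ~74%, as the paper states
```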


Evidence type: review

Schussler GC, Orlando J.
Science. 1978 Feb 10;199(4329):686-8.

"Fasting decreases the ratio of hepatic nuclear to serum triiodothyronine (T₃) by diminishing the binding capacity of nuclear T₃ receptors. In combination with the lower serum T₃ concentration caused by fasting, the decrease in receptor content results in a marked decrease in nuclear T₃-receptor complexes. The changes in T₃ receptor content and circulating T₃ in fasted animals appear to be independent synergistic adaptations for caloric conservation in the fasted state. Unlike changes in hormonal level, the modification of nuclear receptor content provides a mechanism that may protect cells with a low caloric reserve independently of the metabolic status of the whole animal."


Evidence type: controlled experiment

Spaulding SW, Chopra IJ, Sherwin RS, Lyall SS.
J Clin Endocrinol Metab. 1976 Jan;42(1):197-200.
“To evaluate the effect of caloric restriction and dietary composition on circulating T₃ and rT₃, obese subjects were studied after 7—18 days of total fasting and while on randomized hypocaloric diets (800 kcal) in which carbohydrate content was varied to provide from 0 to 100% calories. As anticipated, total fasting resulted in a 53% reduction in serum T₃ in association with a reciprocal 58% increase in rT₃. Subjects receiving the no-carbohydrate hypocaloric diets for two weeks demonstrated a similar 47% decline in serum T₃ but there was no significant change in rT₃ with time. In contrast, the same subjects receiving isocaloric diets containing at least 50 g of carbohydrate showed no significant changes in either T₃ or rT₃ concentration. The decline in serum T₃ during the no-carbohydrate diet correlated significantly with blood glucose and ketones but there was no correlation with insulin or glucagon. We conclude that dietary carbohydrate is an important regulatory factor in T₃ production in man. In contrast, rT₃, concentration is not significantly affected by changes in dietary carbohydrate. Our data suggest that the rise in serum rT₃ during starvation may be related to more severe caloric restriction than that caused by the 800 kcal diet.”

So at least in a very low calorie situation, T₃ becomes low only when the diet is sufficiently low in carbohydrate to be ketogenic, and its level correlates with ketogenesis. We are not told whether any of the diets were protein sufficient, but in this case it doesn't matter. The very low calories make it catabolic, and only when carbohydrate is at ketogenically low levels does the protein sparing effect occur. —Amber


Evidence type: controlled experiment

Mathieson RA, Walberg JL, Gwazdauskas FC, Hinkle DE, Gregg JM.
Metabolism. 1986 May;35(5):394-8.

(Emphasis ours)

“Twelve obese women were studied to determine the effects of the combination of an aerobic exercise program with either a high carbohydrate (HC) very-low-caloric diet (VLCD) or a low carbohydrate (LC) VLCD diet on resting metabolic rate (RMR), serum thyroxine (T₄), 3,5,3'-triiodothyronine (T₃), and 3,3',5'-triiodothyronine (rT₃). The response of these parameters was also examined when subjects switched from the VLCD to a mixed hypocaloric diet. Following a maintenance period, subjects consumed one of the two VLCDs for 28 days. In addition, all subjects participated in thrice weekly submaximal exercise sessions at 60% of maximal aerobic capacity. Following VLCD treatments, participants consumed a 1,000 kcal mixed diet while continuing the exercise program for one week. Measurements of RMR, T₄, T₃, and rT₃ were made weekly. Weight decreased significantly more for LC than HC. Serum T₄ was not significantly affected during the VLCD. Although serum T₃ decreased during the VLCD for both groups, the decrease occurred faster and to a greater magnitude in LC (34.6% mean decrease) than HC (17.9% mean decrease). Serum rT₃ increased similarly for each treatment by the first week of the VLCD. Serum T₃ and rT₃ of both groups returned to baseline concentrations following one week of the 1,000 kcal diet. Both groups exhibited similar progressive decreases in RMR during treatment (12.4% for LC and 20.8% for HC), but values were not significantly lower than baseline until week 3 of the VLCD. Thus, although dietary carbohydrate content had an influence on the magnitude of fall in serum T₃, RMR declined similarly for both dietary treatments.”


Evidence type: controlled experiment

Pasquali R, Parenti M, Mattioli L, Capelli M, Cavazzini G, Baraldi G, Sorrenti G, De Benedettis G, Biso P, Melchionda N.
J Endocrinol Invest. 1982 Jan-Feb;5(1):47-52.

(Emphasis ours)

“The effect of different hypocaloric carbohydrate (CHO) intakes was evaluated in 8 groups of obese patients in order to assess the role of the CHO and the other dietary sources in modulating the peripheral thyroid hormone metabolism. These changes were independent of those of bw. Serum T₃ concentrations appear to be more easily affected than those of reverse T₃ by dietary manipulation and CHO content of the diet. A fall in T₃ levels during the entire period of study with respect to the basal levels occurred only when the CHO of the diet was 120 g/day or less, independent of caloric intake (360, 645 or 1200 calories). Moreover, reverse T₃ concentrations were found increased during the entire period of study when total CHO were very low (40 to 50 g/day) while they demonstrated only a transient increase when CHO were at least 105 g/day (with 645 or more total calories). Indeed, our data indicate that a threshold may exist in dietary CHO, independent of caloric intake, below which modifications occur in thyroid hormone concentrations. From these results it appears that the CHO content of the diet is more important than non-CHO sources in modulating peripheral thyroid hormone metabolism and that the influence of total calories is perhaps as pronounced as that of CHO when a “permissive” amount of CHO is ingested.”


Evidence type: controlled experiment

(Emphasis ours)

“To assess the effect of starvation and refeeding on serum thyroid hormones and thyrotropin (TSH) concentrations, 45 obese subjects were studied after 4 days of fasting and after refeeding with diets of varying composition. All subjects showed an increase in both serum total and free thyroxine (T₄), and a decrease in serum total and free triiodothyronine (T₃) following fasting. These changes were more striking in men than in women. The serum T₃ declined during fasting even when the subjects were given oral L-T₄, but not when given oral L-T₃. After fasting, the serum reverse T₃ (rT₃) rose, the serum TSH declined, and the TSH response to thyrotropin-releasing hormone (TRH) was blunted. Refeeding with either a mixed diet (n = 22) or a carbohydrate diet (n = 8) caused the fasting-induced changes in serum T₃, T₄, rT₃, and TSH to return to control values. In contrast, refeeding with protein (n = 6) did not cause an increase in serum T₃ or in serum TSH of fasted subjects, while it did cause a decline in serum rT₃ toward basal value.

The present data suggest that: (1) dietary carbohydrate is an important factor in reversing the fall in serum T₃ caused by fasting; (2) production of rT₃ is not as dependent on carbohydrate as that of T₃; (3) men show more significant changes in serum thyroid hormone concentrations during fasting than women do, and (4) absorption of T₃ is not altered during fasting.”

Note that in this case, “refeeding” was with an 800 calorie diet (i.e., for the protein group, 200 g of protein). So the refeeding diet is still low calorie, and thus still catabolic. —Amber


Evidence type: controlled experiment

Otten MH, Hennemann G, Docter R, Visser TJ.
Metabolism. 1980 Oct;29(10):930-5.

“Short term changes in serum 3,3',5-triiodothyronine (T₃) and 3,3'5-triiodothyronine (reverse T₃, rT₃) were studied in four healthy nonobese male subjects under varying but isocaloric and weight maintaining conditions. The four 1500 kcal diets tested during 72 hr, consisted of: I, 100% fat; II, 50% fat, 50% protein; III, 50% fat, 50% carbohydrate (CHO), and IV, a mixed control diet. The decrease of T₃ (50%) and increase of rT₃ (123%) in the all-fat diet equalled changes noted in total starvation. In diet III (750 kcal fat, 750 kcal CHO) serum T₃ decreased 24% (NS) and serum rT₃ rose significantly 34% (p < 0.01). This change occurred in spite of the 750 kcal CHO. This amount of CHO by itself does not introduce changes in thyroid hormone levels and completely restores in refeeding models the alterations of T₃ and rT₃ after total starvation. The conclusion is drawn that under isocaloric conditions in man fat in high concentration itself may play an active role in inducing changes in peripheral thyroid hormone metabolism.”

Here, finally, is a study that is explicitly a maintenance diet. It says mostly what we would expect. It was a bit surprising, and contrary to some previous findings, that in the half-carbohydrate, half-fat diet, this much carbohydrate still allowed T₃ to fall. The authors suggest that this is evidence that high fat alone is responsible. Our interpretation, in contrast, is that it was the zero-protein condition that led to the lower T₃. In the body of the paper, the authors, to their credit, acknowledge that they are speculating. We would love to see this example followed by more researchers. —Amber


Ebbeling et al. did measure T₃ on a ketogenic diet intended to be weight stable, but the subjects were losing weight during the ketogenic phase, so no conclusion about T₃ under weight-stable, protein-adequate conditions can be drawn from that study.

Ebbeling CB, Swain JF, Feldman HA, Wong WW, Hachey DL, Garcia-Lago E, Ludwig DS.
JAMA. 2012 Jun 27;307(24):2627-34. doi: 10.1001/jama.2012.6607.

(Emphasis ours)

“Participants Overweight and obese young adults (n=21).

Interventions After achieving 10 to 15% weight loss on a run-in diet, participants consumed low-fat (LF; 60% of energy from carbohydrate, 20% fat, 20% protein; high glycemic load), low-glycemic index (LGI; 40%-40%-20%; moderate glycemic load), and very-low-carbohydrate (VLC; 10%-60%-30%; low glycemic load) diets in random order, each for 4 weeks.”


“Hormones and Components of the Metabolic Syndrome (Table 3)

Serum leptin was highest with the LF diet (14.9 [12.1 to 18.4] ng/mL), intermediate with the LGI diet (12.7 [10.3 to 15.6] ng/mL) and lowest with the VLC diet (11.2 [9.1 to 13.8] ng/mL; P=0.0006). Cortisol excretion measured with a 24-hour urine collection (LF: 50 [41 to 60] μg/d; LGI: 60 [49 to 73] μg/d; VLC: 71 [58 to 86] μg/d; P=0.005) and serum TSH (LF: 1.27 [1.01 to 1.60] μIU/mL; LGI: 1.22 [0.97 to 1.54] μIU/mL; VLC: 1.11 [0.88 to 1.40] μIU/mL; P=0.04) also differed in a linear fashion by glycemic load. Serum T₃ was lower with the VLC diet compared to the other two diets (LF: 121 [108 to 135] ng/dL; LGI: 123 [110 to 137] ng/dL; VLC: 108 [96 to 120] ng/dL; P=0.006).”


Evidence type: observational

Baranowska B, Wolinska-Witort E, Bik W, Baranowska-Bik A, Martynska L, Broczek K, Mossakowska M, Chmielowska M.
Neurobiol Aging. 2007 May;28(5):774-83. Epub 2006 May 12.

(Emphasis ours)

“It is well known that physiological changes in the neuroendocrine system may be related to the process of aging. To assess neuroendocrine status in aging humans we studied a group of 155 women including 78 extremely old women (centenarians) aged 100-115 years, 21 early elderly women aged 64-67 years, 21 postmenopausal women aged 50-60 years and 35 younger women aged 20-50 years. Plasma NPY, leptin, glucose, insulin and lipid profiles were evaluated, and serum concentrations of pituitary, adrenal and thyroid hormones were measured. Our data revealed several differences in the neuroendocrine and metabolic status of centenarians, compared with other age groups, including the lowest serum concentrations of leptin, insulin and T₃, and the highest values for prolactin. We failed to find any significant differences in TSH and cortisol levels. On the other hand, LH and FSH levels were comparable with those in the elderly and postmenopausal groups, but they were significantly higher than in younger subjects. GH concentrations in centenarians were lower than in younger women. NPY values were highest in the elderly group and lowest in young subjects. We conclude that the neuroendocrine status in centenarians is markedly different from that found in early elderly or young women.”


Evidence type: observational

Rozing MP, Westendorp RG, de Craen AJ, Frölich M, Heijmans BT, Beekman M, Wijsman C, Mooijaart SP, Blauw GJ, Slagboom PE, van Heemst D; Leiden Longevity Study (LLS) Group.
J Gerontol A Biol Sci Med Sci. 2010 Apr;65(4):365-8. doi: 10.1093/gerona/glp200. Epub 2009 Dec 16.


“The hypothalamo-pituitary-thyroid axis has been widely implicated in modulating the aging process. Life extension effects associated with low thyroid hormone levels have been reported in multiple animal models. In human populations, an association was observed between low thyroid function and longevity at old age, but the beneficial effects of low thyroid hormone metabolism at middle age remain elusive.


We have compared serum thyroid hormone function parameters in a group of middle-aged offspring of long-living nonagenarian siblings and a control group of their partners, all participants of the Leiden Longevity Study.


When compared with their partners, the group of offspring of nonagenarian siblings showed a trend toward higher serum thyrotropin levels (1.65 vs 1.57 mU/L, p = .11) in conjunction with lower free thyroxine levels (15.0 vs 15.2 pmol/L, p = .045) and lower free triiodothyronine levels (4.08 vs 4.14 pmol/L, p = .024).


Compared with their partners, the group of offspring of nonagenarian siblings show a lower thyroidal sensitivity to thyrotropin. These findings suggest that the favorable role of low thyroid hormone metabolism on health and longevity in model organisms is applicable to humans as well.”


Evidence type: experiment

Fontana L, Klein S, Holloszy JO, Premachandra BN.
J Clin Endocrinol Metab. 2006 Aug;91(8):3232-5. Epub 2006 May 23.


“Caloric restriction (CR) retards aging in mammals. It has been hypothesized that a reduction in T₃ hormone may increase life span by conserving energy and reducing free-radical production.


The objective of the study was to assess the relationship between long-term CR with adequate protein and micronutrient intake on thyroid function in healthy lean weight-stable adult men and women.


In this study, serum thyroid hormones were evaluated in 28 men and women (mean age, 52 +/- 12 yr) consuming a CR diet for 3-15 yr (6 +/- 3 yr), 28 age- and sex-matched sedentary (WD), and 28 body fat-matched exercising (EX) subjects who were eating Western diets.


Serum total and free T₄, total and free T₃, reverse T₃, and TSH concentrations were the main outcome measures.


Energy intake was lower in the CR group (1779 +/- 355 kcal/d) than the WD (2433 +/- 502 kcal/d) and EX (2811 +/- 711 kcal/d) groups (P < 0.001). Serum T₃ concentration was lower in the CR group than the WD and EX groups (73.6 +/- 22 vs. 91.0 +/- 13 vs. 94.3 +/- 17 ng/dl, respectively) (P < or = 0.001), whereas serum total and free T₄, reverse T₃, and TSH concentrations were similar among groups.


Long-term CR with adequate protein and micronutrient intake in lean and weight-stable healthy humans is associated with a sustained reduction in serum T₃ concentration, similar to that found in CR rodents and monkeys. This effect is likely due to CR itself, rather than to a decrease in body fat mass, and could be involved in slowing the rate of aging.”