This blog is about the intersection between evolutionary biology and food. But also about practical applications, sustainable agriculture, and general tasty things.
Every two years or so I notice a cyclical trend in the online “paleo” community. It’s the resurgence of dogmatic carnivory. It has two main themes: plants are “poisons” that cause most of our health problems and humans “evolved to be” very low carb. Always an undercurrent with some very zealous devotees (“The Bear” of Grateful Dead fame was probably one of its most prominent popularizers), it suddenly finds popularity among normally more moderate people, picking up some non-paleo low-carb followers in the process. Then it goes away again, hilariously with some of its top cheerleaders renouncing it in the process (like Danny Roddy).
It’s been back again lately. A few readers have written me about Anna who writes the blog Life Extension*. She is a graduate student in archaeology and social anthropology. Anna’s most popular post so far is “Debunking and Deconstructing Some ‘Myths of Paleo’. Part One: Tubers.” Sadly, an opportunity for greater communication to the public from a much-maligned discipline becomes a manifesto for low-carb diets. The tagline is “Glucose restriction represents not only the most crucial component of ancestral diets but is by far the easiest element to emulate.” I think we’ve heard this one before, but this time it is in language that is more authoritative than usual. This is the kind of writing I would have liked Paleofantasy to take on.
Unfortunately she doesn’t refer to sources directly in her text, so I’ve done my best to figure out which sources she is referring to.
Most archaeologists don’t go around promoting diets, because they recognize the limitations of their field. There is so much that is unknown and unknowable. It’s pretty easy for nearly anyone to pigeonhole what we do have to fit their own narratives.
The reduction in size and robusticity of the human skeleton is a clear temporal trend of newly agricultural communities. Diachronic skeletal comparisons reveal large-scale, significant reductions in growth rates.
Yes, of some newly agricultural communities, and that doesn't mean it stayed this way. I’ve written about it more than I would have liked. I just wrote about it in my last post about Paleofantasy (which cites this review).
Then a funny thing happened on the way from the preagricultural Mediterranean to the giant farms of today: people, at least some of them, got healthier, presumably as we adapted to the new way of life and food became more evenly distributed. The collection of skeletons from Egypt also shows that by 4,000 years ago, height had returned to its preagricultural levels, and only 20 percent of the population had telltale signs of poor nutrition in their teeth. Those trying to make the point that agriculture is bad for our bodies generally use skeletal material from immediately after the shift to farming as evidence, but a more long-term view is starting to tell a different story. - Marlene Zuk
It also brings up how questionable the use of height is in these narratives. The few hunter-gatherer groups that exist today are very, very short (mostly due to genetics), while the rest of the world has grown taller and taller. Staffan Lindeberg in his magnum opus suggests we are too tall from overnutrition. Other markers that extremists use to show a downward health trend in agricultural humans suffer from similar limitations.
Instances of porotic hyperostosis brought on by iron deficiency anaemia increased dramatically in agricultural settings.
“There is a new appreciation of the adaptability and flexibility of iron metabolism; as a result it has become apparent that diet plays a very minor role in the development of iron deficiency anemia. It is now understood that, rather than being detrimental, hypoferremia (deficiency of iron in the blood) is actually an adaptation to disease and microorganism invasion.” - Porotic hyperostosis: A new perspective
Either way, I’m not sure what the transition these communities in upheaval experienced has to do with whether or not tubers or any carbohydrates are bad for you. It wasn’t just the food that changed for these people, it was their entire way of life, and it was a transition that changed their biology. And while there are trends, there is no linear health decline. A more systematic database of human remains and health markers is being created right now, which should be a great resource in the future. At this point a lot of papers claiming a decline are using inappropriate sample sizes and statistical methods.
Far too little evolutionary time has passed for us to be successfully acclimated to the novel conditions of agricultural life.
Another common thread, and one that begs the question: how long is long enough? How many adaptations are enough?
Speaking of evolutionary time:
Spending most of our human history in glacial conditions, our physiology has consequently been modelled by the climatologic record, with only brief, temperate periods of reprieve that could conceivably allow any significant amount of edible plant life to have grown.
Like Nora Gedgaudas’s paleo book Primal Body, Primal Mind, which she cites for unknown reasons, this sentence implies to her lay readers that glacial conditions = something out of the movie Ice Age. Which is just not true. A glacial maximum left some people in the cold, but Africa was still quite warm, and if we are talking about evolutionary time, that’s where we spent most of it. Outside Africa, most humans seem to have clustered in fairly temperate refugia such as Southern Iberia during the last ice age.
“Many think of the late Pleistocene as the ‘Ice Age’, a time when continental glaciers covered much of the earth and where the land not under ice was inhabited by giant cold-adapted animals—wooly mammoth, wooly rhinoceros, and cave bears—pursued by hardy human hunters. While this image may be somewhat accurate for part of the world, most of the earth remained unglaciated throughout the Pleistocene.” - Glacial Environments Beyond Glacial Terrains: Human Eco-Dynamics in Late Pleistocene Mediterranean Iberia
Of course “significant amount” is also going to be a point of contention. Only in the very coldest tip of the arctic do levels of plants in the human diet fall to close to zero. Beyond that, many people might not be aware of the levels of starch and sugar available in the environment because traditions surrounding them have died out. I have written quite a bit about Northern sources of carbohydrates: “Siberian potatoes” and Alaskan native plant foods.
Further information on the evolution of our diet can be garnered from the genetic data of present populations, which demonstrates the historically-late biological adaptation to less than minimal quantities of starch and to only few and specific starch compounds.
I assume this refers to amylase (AMY1) copy number, the function and history of which is not quite clear, much like lactase persistence. For example, I do not possess lactase persistence, even though my ancestors probably raised livestock for dairy. They were diversified pastoralists, so it’s likely there was not enough selective pressure for them to develop this trait: they consumed dairy, but the majority of their diet was not dairy.
It is unlikely the ancestral human diet was as high in starch as some horticulturalist tropical diets are now, where the majority of calories come from starch. But in the end, the differences in AMY1 copy number between humans are small compared to our differences with other primates, indicating that perhaps this was selected for in our own evolution. And in the original paper it is kind of mind-boggling they use the Mbuti as a “low-starch” population given their high starch consumption.
The Mbuti are particularly interesting because they are hunter-gatherers, but trade their surplus meat for starch and have done this for quite some time (when this isn't available there are forest tubers utilized as fall-backs). The only time they don’t trade is when honey is in abundance.
Anna’s assertion that starch is comparatively “inefficient” compared to meat using optimal foraging models doesn’t mean that humans would have chosen to eat only or mostly meat. That data includes game from South American environments, which is unusually fatty in comparison to African game. Even in South America, such game is not available in unlimited amounts in the first place, which is why even hunter-gatherer cultures that have access to it like the Ache also extensively gather and process starch and gather honey.
The consequences of limited availability and time investment of edible Palaeolithic plant foods has been analysed by Stiner, who compared food class returns amongst contemporary hunter-gatherer groups. Stiner found the net energy yield of roots and tubers to range from 1,882 kJ/hour to 6,120 kJ/hour (not to mention the additional time needed to dedicate to preparation) compared to 63,398 kJ/hour for large game.
Anna’s assertions stand in stark contrast to the paper she seems to cite:
Staple plant resources present an extreme contrast to large game animals with respect to prevailing economic currencies (Table 11.1). Large animals generally yield high returns per unit foraging time (kJ per hour) but are unpredictable resources. Seeds and nuts give much lower net yields per increment time (kJ per kilogram acquired), but they have potentially high yields with respect to the volume obtained and the area of land utilized.
Surveys of hunter-gatherers show overwhelmingly that preferred foods are fatty game and honey, highly caloric (and delicious), yet these are not the majority of the diet because they are not available in high predictable amounts, like the modern equivalents are.
As Kim Hill, who studies the Ache, says: “High-ranked items may be so rarely encountered that they represent only a very small proportion of the diet; low-ranked items in the optimal set may be encountered with sufficient frequency to contribute the bulk. It is interesting to note that on several occasions, reports of nearby palm fruit (ranked 12) were ignored, something that did not happen with oranges. On several other occasions people discussed the relative merits of hunting monkeys (ranked 11), reaching consensus that monkeys should not be pursued ‘because they are not fat.’”
Anthropologists have theorized on the importance of having carbohydrate fallback foods in the event that high-fat game is not available, either because of seasonality or over-hunting. In these cases, “rabbit starvation” from excess protein is a real danger. Surviving on game alone is a real challenge, which probably accounts for the fact that humans have exploited seemingly tedious-to-gather plant resources in nearly every environment.
Some of Anna’s arguments indicate that she has decided on some issues that are actually very controversial in anthropology and archaeology, such as the date of regular fire use (Anna asserts it was much later than many think) and that “However, plants have been preserved in the Lower Palaeolithic, and they are used primarily for functional and material – rather than nutritional – purposes.”
She does admit that “I will concede however that absence of evidence is not evidence of absence” but then goes on to list some sites that show possible non-food-related plant use that aren’t even associated with Homo sapiens, many are hominid offshoots that are unlikely to have contributed to our line (except for some of us who have a possible small amount of neanderthal ancestry). Other sites she mentions aren’t dated to the lower Paleolithic anyway.
Later sites such as Kebara she also dismisses, implying that legumes would have been used as fire starters rather than food. But she admits that hominids would have supplemented their diet with “low glycemic” foods when meat was scarce.
Firstly, Neanderthals were highly carnivorous and physiologically inept at digesting plant foods. This can be measured using the megadontia quotient of Neanderthal postcanine tooth area in relation to body mass, which reveals that H. neanderthalensis must have consumed a greater than 80% animal diet. Nonetheless, the evidence of phytoliths and grains from Neanderthal skeletons at Shanidar Cave may reveal the rare consumption of starches in this singular context, but not the deleterious costs to the health of those that ate them.
The megadontia quotient, which is controversial in the first place, is not meant to be used in this way; neither is the also-mentioned expensive tissue hypothesis. They are meant to analyze the use of uncooked fibrous plant foods and are not particularly enlightening in the case of large-brained hominids with cultural adaptations to food such as cooking. Some of the most recent research reappraising the carnivorous theory of Neanderthals is covered in this recent talk by Neanderthal experts Dr. Margaret J. Schoeninger and Dr. Alison S. Brooks.
Humans show up as carnivores, even when they are known corn-eating agriculturalists, like these people. But what happens when you plot other plants?
Now the data makes more sense (remember, this data shows where the protein in the diet came from; it doesn’t tell us how much protein was eaten).
As you can see, initial isotopic studies (which can only show where the protein came from, not the amount of protein in the diet) that showed Neanderthals as top carnivores came into question when farming populations were showing similar values. Researchers realized that they needed to analyze plants based on their most nutritious fractions; after all, when was the last time anyone sane ate something like a whole stalk of corn, husk and all? Another great paper by John D. Speth also summarizes some of the recent research on Neanderthal diets and debunks hypercarnivory.
humans were no longer able to transmute fibre into fat – as other primates can (consequently, they eat a high-fat diet) – through fermentation in the large intestine.
This, as anthropologist Dr. Richard Wrangham has pointed out, could also be an adaptation to cooking. And we didn’t lose this ability; it is just reduced, though no biologist would argue that the SCFAs produced in the colon, which can provide calories and also modulate inflammation, are unimportant. SCFA metabolism is not comparable to longer-chain fatty acid metabolism, so it’s not really appropriate to call these diets “high fat.” Furthermore, there are other primates with guts similar to ours, like capuchins, who most certainly do not eat a carnivorous diet: they eat sugary fruit. But it’s very hard to compare our guts to the guts of other animals, since cultural traits like cooking are so important for our food consumption.
I think it’s a bit amusing to read these posts alongside those of PaleoVeganology, written by a graduate student in paleontology who criticizes many popular paleo narratives. However much I disagree with him on the issue of modern diet choices, I commend him for not using his expertise to promote his chosen diet- he is explicit that his dietary choices are built on modern ethics and not the murky past.
The Shanidar skeletons yielded the first of many analyses of starches on teeth, which rule out theories that plants were only used as decorations or fire starters. Since that first paper was published, others using the same method have followed, and more will. But there is no way to use such data to speculate on how often or how much of these foods were consumed.
The coprolite “paper” that Nora Gedgaudas frequently cites also comes up, which I’ve addressed here.
Another common thread in carnivore narratives is that plants were used “only” as medicinals. I would not consider this as insignificant in any way– in most cultures, the line between food and medicine is a thin one. Many foods we enjoy as foods these days have medicinal roots.
Anna rightly criticizes the use of non-hunter-gatherers as hunter-gatherer proxies in writings about the so-called paleo diet and then cites a study that does the exact same thing:
In an attempt to reconstruct the diet of ice age hominids, a recent study analysed the macronutrient dietary composition of existing hunter-gatherer groups within latitude intervals from 41° to greater than 60°.
But where did this data come from? Anthropologist Katherine Milton responded quite well to this paper by Cordain:
The hunter-gatherer data used by Cordain et al (4) came from the Ethnographic Atlas (5), a cross-cultural index compiled largely from 20th century sources and written by ethnographers or others with disparate backgrounds, rarely interested in diet per se or trained in dietary collection techniques. By the 20th century, most hunter-gatherers had vanished; many of those who remained had been displaced to marginal environments. Some societies coded as hunter-gatherers in the Atlas probably were not exclusively hunter-gatherers or were displaced agricultural peoples. Because most of the ethnographers were male, they often did not associate with women, who typically collect and process plant resources.- Katherine Milton
The Ethnographic Atlas used in the “study” is available online and quite clearly does not contain 229 pure hunter-gatherer cultures. The 229 Cordain uses includes people who trade for or cultivate foods.
There is no evidence that mostly carnivorous groups of humans have particularly high longevity. In fact, mummies, whatever their limits as evidence, have shown that people eating these diets were not in fantastic condition, though of course, like the bad condition of some early agriculturalists, this cannot be blamed on their diet alone.
It is awfully convenient to build a narrative to convince people to eat a limited diet based on the murky unknowns of the far past and near-mythical groups of supposedly extremely healthy carnivorous hominids. The carnivore-ape hypothesis is about as credible as the aquatic ape one.
One of the problems with human evolution, as opposed to, say, rocket science, is that everybody feels that their opinion has value irrespective of their prior knowledge (the outraged academic in the encounter above was a scientist, but not a biologist, still less an evolutionary biologist). The reason is obvious – we are all human beings, so we think we know all about it, intuitively. What we think about human evolution "stands to reason". Hardly a month goes by without my receiving, at my desk at Nature, an exegesis on the reasons how or why human beings evolved to be this way or that. They are always nonsense, and for the same reason. They find some quirk of anatomy, extrapolate that into a grand scheme, and then cherry-pick attributes that seem to fit that scheme, ignoring any contrary evidence. Adherence to such schemes become matters of belief, not evidence. That's not science – that's creationism.
I saw the same story building among vegans, who often craft similar narratives around our lineage’s long plant-eating past. It speaks to a deep desire in people to justify their own choices. What all these dietary narratives have in common is that they confirm that a particular limited diet is our “natural” diet, one that is best for humans, animals, and the environment. It’s not possible for them all to be right, and that’s because none of them are.
Ancient humans ate a large variety of foods, which is why we are adapted to so many. Human variation is high though, since our lineage has become so populous and geographically wide-ranging. There are many reasons for a modern human to adopt a low-carbohydrate or limited carbohydrate diet either temporarily or permanently. None of those have to do with this being the optimal diet for all humans or with a mostly-carnivorous ancestry.
I guess I’m kind of late to the party on reviewing this book, but I actually haven’t noticed a lot of reviews of it, which is surprising given the amount of buzz the articles about it generated. I also suspect some reviewers didn’t actually read it, since they seemed abnormally fixated on defending their paleo diet, when the book only has two out of ten chapters devoted solely to diet and covers many other topics.
Like Marlene Zuk, I am also quite critical of some of the movements that use (and mis-use) evolutionary logic like the paleo diet. So I wanted to like this book.
It has its good moments, but is overall in need of a good editor. It could be much shorter. And much less meandering.
Much of the skepticism is directed towards the frequently-inane postings on online discussion boards, which I have the misfortune of being very familiar with, having moderated one of the most popular until I rage-quit in annoyance.
While a lot of people get dumb advice on internet discussion boards, do they really define these movements? While they are fun strawmen to take down easily, most people don’t take such posts seriously. What they take seriously is the often scientific-sounding books written by various gurus, often with many letters, legitimate and not, preceding and following their names. While she mentions them, it’s only in passing. Her “paleofantasy” seems to consist mainly of a cacophony of crowd-sourced internet discussion.
Not to say you won’t learn anything from this book, but it hardly challenges the status quo, which makes the hysterical reactions of many against it and the author seem all the more ridiculous. A lot of it reminds me of the excellent The Beak of the Finch or The 10,000 Year Explosion. She covers many methods that evolutionary biologists use to understand evolution, why they matter, and common misconceptions about them.
But if only people were talking about evolution when they were talking about the paleo diet. Talk about actual evolutionary biology and you might be met by some of the silent crickets that Zuk studies. Only 54% of paleo dieters in a recent survey accept evolution as a fact.
Beyond that, the increasing specialization of academia becomes a limitation. Zuk specializes in the evolution of crickets, which, yes, does have surprisingly broad applicability, but she spends a long time on that and other similar research that I think a skeptic would find irrelevant and unconvincing. I read The Beak of the Finch, which discusses this type of research at length, in high school, and it didn’t stop me from adopting the paleo diet narrative. I think the most common problems with the “paleo” worldview come from anthropology. For example, misinterpretations of isotopic studies, coprolite fossils, and paleopathological surveys are used very often to justify “paleo” diets.
On the cultural anthropology side of things, people often seem very confused by terms like “hunter-gatherer” or “forager.” Rather than elucidating the complexity of historical human lifestyles, the book muddles this further in parts. If you were confused about this before, you’ll stay confused, and a clarification would improve her arguments anyway. For example, whether or not the Yanomami (of the Chagnon controversy) are relevant to revealing some aspect of hunter-gatherer “human nature” is pretty questionable considering that while they do forage and hunt for some of their food, they are horticulturalists, a lifestyle that is probably not much older than agriculture. The same goes for Jared Diamond’s extrapolations from the horticulturalists of Papua New Guinea in The World Until Yesterday.
This is also common in paleo diet books– authors like Cordain cite starch-cultivating horticulturalists like the Kitavans when convenient, while recommending a diet that bears little resemblance to theirs. I noticed recently that paleo guru Art De Vany’s blog header has a picture of some imposingly muscular tribal warriors. The site doesn’t seem to say anything about them, but I knew they are Asaro “mudmen” of Papua New Guinea, who are horticulturalists and grow many crops that De Vany would view as unhealthy. It is a shame to see them exploited to promote his diet and, as of late, extensive advertising of his own supplements.
Fueled by sweet potatoes, sugary fruit, and peanuts they grew in their forest gardens
If you are confused: for almost all of the Paleolithic, humans were nomadic hunter-gatherers with primitive weapons. There are really no people today who practice this lifestyle. If agriculture is a drop in the bucket of human history from a relative perspective, even the innovations of the Middle and Upper Paleolithic are similarly recent on that timescale. These innovations included better weapons, the atlatl and later the bow and arrow, which would have affected hunting significantly. They also included the culinarily important pottery and grease-processing (smashing up bones to make a fat- and protein-rich broth). I made this crappy timeline that gives a vague idea of some of these innovations in human history. What time do you choose as the optimum?
Our ancestors’ diets clearly changed dramatically and repeatedly over the last tens, not to mention hundreds, of thousands of years, even before the advent of agriculture.
Even the few representatives of nomadic hunter-gatherers that exist on the planet use these relatively modern technologies, like the Hadza’s bows.
I don’t think these groups of people are irrelevant to health discussions though; if anything, they show the diversity of lifeways in which our species has been able to thrive, a thread that seems constant no matter the time. And every lifeway has involved trade-offs. For example, while rheumatoid arthritis, which is common in industrialized first-world cultures these days, seems to have been rare in foraging cultures, osteoarthritis seems to have been more common.
And in the end, while it’s fascinating to think about how much of what we are familiar with is “new” on the scale of geologic time, Zuk rightly points out that evolution works faster than many might imagine.
I think the sections on lactase persistence, which talk about how many places and ways humans acquired this trait, are fascinating. But they also left many unanswered questions that show just how far we have to go in understanding human evolution.
Interestingly, about half of the Hadza people of Tanzania were found to have the lactase persistence gene—a hefty proportion, given that they are hunter-gatherers, not herders. Why did the Hadza evolve a trait they don’t use? Tishkoff and coworkers speculate that the gene might be useful in a different context. The same enzyme that enables the splitting of the lactose molecule is also used to break down phlorizin, a bitter compound in some of the native plants of Tanzania. Could the lactase persistence gene also help with digestion of other substances? No one knows for sure, but the idea certainly bears further investigation.
But she only briefly mentions a little elephant in the room: our microbiota. Of “our” cells, bacterial cells outnumber “human” cells ten to one. And they have had a lot more generations to evolve than “we” have.
Microbiologist Jeffrey Gordon says, “The gut microbial community can be viewed as a metabolic organ—an organ within an organ . . . It’s like bringing a set of utensils to a dinner party that the host does not have.” As our diets change, so does our internal menagerie, which in turn allows us to eat more and different kinds of foods. The caveman wouldn’t just find our modern cuisine foreign; the microbes inside of us, were he able to see them, would be at least as strange.
I like that she takes on the common narrative of “people were really healthy until they became farmers and then they shrunk and had bad teeth etc.” The reality is while some of the earliest agrarian cultures did seem to suffer compared to their predecessors, it wasn’t all about the food, and people by and large recovered. Besides, if we were going to pick diets based on bone and teeth health, we might as well pick pastoralists like the Maasai, who tend to be much, much taller than any hunter-gatherers.
Then a funny thing happened on the way from the preagricultural Mediterranean to the giant farms of today: people, at least some of them, got healthier, presumably as we adapted to the new way of life and food became more evenly distributed. The collection of skeletons from Egypt also shows that by 4,000 years ago, height had returned to its preagricultural levels, and only 20 percent of the population had telltale signs of poor nutrition in their teeth. Those trying to make the point that agriculture is bad for our bodies generally use skeletal material from immediately after the shift to farming as evidence, but a more long-term view is starting to tell a different story.
Many paleo diet books present our species as fragile creatures rather than what we really are: the consummate omnivore, resilient and adaptable enough to thrive on a large range of foods. A curious being that has travelled far and wide and tasted many things, rather than one defined by fear and a narrow food exceptionalism. I’ve even seen people, some of them fairly well-known bloggers, on Twitter and Facebook discussing buying an island where “paleo dieters” could be free from “non-foods” like grains and the people that eat them. It’s not as bad as the blog posts from paleo dieters travelling in foreign countries, who talk about how difficult it is to explain their special food to the local people. Traditional cultures are venerated, maybe even exploited, unless they don’t fit the paleo narrative.
The question is whether the various forms of the paleo diet really do replicate what our ancestors ate.
Unfortunately Paleofantasy focuses on this absurd strawman of dietary replication and only begins to scratch the surface of the neurotic botany of many paleo writings: books that fret about whether or not “nightshades” grew in Paleolithic savanna Africa and about their plant chemicals, while blithely consuming other classes of similarly alien plants with other potentially problematic chemicals. Because that’s what plants are– bundles of chemicals that can be friend or foe depending on amounts and contexts.
The skeptics she cites aren’t much better than the internet commenters representing paleo. They include the Ethnographic Atlas, a survey of modern populations, which she claims puts to rest “the notion of our carnivorous ancestors.” Or the U.S. News & World Report’s rating of diets.
It doesn’t take an evolutionary biologist to understand what the paleo diet has become, especially in alliance with the low-carb diet promoters, industrial supplement companies, and the standard dieting-culture fear-mongers. It functions not as an attempt to use evolutionary biology to understand the human diet; it has become a social engineering scam to sell mediocre books, processed powders, and other crap. It was only about evolution in the beginning; now it’s mostly just a diet in caveman clothing.
Paleofantasy has just come along for the ride. It’s not going to convince very many people caught in the scam. It’s just going to make those who haven’t feel smug. At least it might teach a few people about evolutionary biology.
And I liked the section about attachment parenting, which is surprisingly rational about the matter, a welcome break from so many writings that either are almost religious about it or decry it as some kind of upper middle class fad.
The evolutionary psychology section is also not as critical as I thought it would be from the reactions of those who are enamoured with the subject.
There is a long section on barefoot running, which talks about how some paleo diet proponents like Art De Vany think we did not evolve for long-distance running, while other evolutionary fitness advocates like anthropologist Dan Lieberman think it is a critical part of our evolutionary heritage. I think this highlights the fact that the past is so hazy that it’s pretty easy to use it to support a whole host of contradictory arguments.
It’s a shame Zuk tilted at internet idiot windmills and not at the far more sophisticated arguments that are dressed up as science. I sometimes wonder if publishing companies don’t want authors to criticize other authors. They have 199 Paleo Fried Chicken Recipes (I made that up, but it’s not that far out) and other book-like products to push before people get bored.
These books are also relentlessly shallow shadows of some of the earliest texts in the genre of using the deep past to better understand how we should live. Recently I was struck by the similarity in the cover of The Primal Connection: Follow Your Genetic Blueprint to Health and Happiness and the late Paul Shepard’s Coming Home to the Pleistocene.
I read Coming Home to the Pleistocene when I was twenty. While I certainly don’t agree with everything in it, it is beautifully-written and thought-provoking. It challenged the way I thought about the world. Paul was not afraid to espouse controversial ideas, unlike the books from the diet industry that turn the original ideas into drivelling Flintstones platitudes in order to appeal to everyone. I suspect people will still be reading Shepard in a decade when all the paleo publishing bubble books languish in the bargain bin.
Zuk says in closing that “I am all for examining human health and behavior in an evolutionary context, and part of that context requires understanding the environment in which we evolved.” I agree with this. I think evolution is important and will continue to improve our understanding of our world. And I eagerly await a book that more fervently challenges common misconceptions about it.
In my last post, I wrote about how it's impossible for epigenetic changes from very cold environments 3-4 billion years ago to have been conserved. Somehow people thought I was accusing Dr. Kruse of making up cold-adapted monkey ancestors or something.
No, I realize he doesn't mention cold-adapted monkeys, but he also doesn't stick to bacteria living in sad cold slurries either. He also mentions ancient mammals. Dr. Kruse also has an interesting belief that all mammals “evolved in the polar environments on earth.”
I can't find any evidence that early eutherian mammals evolved in such an environment, or even a later candidate for a polar eutherian that could be a possible ancestor. The earliest known (so far) eutherian fossil, Juramaia sinensis, was discovered last year in China, in a Late Jurassic formation. The climate in the area at the time was relatively warm and dry. Juramaia sinensis' teeth suggest it was an insectivore. Many other early mammal fossils have been found in Asia, but as we know, mammals went on to colonize a variety of environments and climates.
Dr. Kruse says "mammals were ideally adapted for hibernation too, until they got too smart for their own genes sake.” It may well be true that Juramaia sinensis and other early eutherians hibernated. Evolutionary biologists now consider the biological changes distinctive to hibernation to have originated even before the evolution of the class Mammalia, and they are displayed even in reptiles that live in very warm environments.
Why did most mammals stop hibernating, then? As the excellent paper The Evolution of Endothermy and Its Diversity in Mammals and Birds says, “energy-optimization-related selection pressures, often dictated by the energetic costs of reproduction, apparently favored abandonment of the capacity for short- or long-term torpors." In primates, this abandonment seems to have gone hand in hand with large brains and increased adaptability to a variety of foods and climates.
That's too bad, because hibernation (or even torpor, a less extreme form) would be very useful for things like organ transplantation, surgical recovery, or long space flights. In the future, if we figure out how to trigger it, hibernation would be incredibly useful. Unfortunately, the exact trigger for hibernation is not currently known, though there are many promising candidates. Dr. Kruse, however, believes the stimulus is already known: “the stimulus for hibernation in eutherian mammals and their descendants are tied to high dietary carbohydrate intake (proven fact already in science and not controversial).” If only it were that easy. A search of the scientific literature found no papers positing that carbohydrate consumption triggers hibernation, though it is established that carbohydrate metabolism undergoes changes before and after hibernation. Scientists who propose triggering hibernation believe it would probably involve injection of chemicals produced by hibernating animals. This would be possible because many of the genes related to hibernation are still present in primates: not because we hibernate, but because those genes have other functions. We'd also have to figure out how to prevent brain damage, which has been a major challenge to such research, since humans appear to suffer memory loss from brain changes that are normal in hibernating mammals.
Evolution is efficient: while genes that had interesting past uses (wouldn't it be cool if we could "reawaken" gills or the ability to lay eggs??) are often conserved in our genome, they tend to be expressed in radically different ways. The genes that once patterned the gill arches, for example, are now involved in forming the bones of the ear. As for those that don't seem to be in use now, as geneticist Paul Szauter says:
If genes are not expressed in the human genome, they do not survive intact over evolutionary time, because they accumulate mutations in the absence of selection. If there were squid genes in the human genome that could be "activated," it is likely that the accumulated mutations would result in a truncated gene product (3 of the 64 codons are "stop") with many changes in its sequence.
Dr. Kruse believes that humans, like all mammals, are optimized for hibernation and that remnants of mammalian hibernation are activated in humans at certain times of the day: "It appears 12-3 AM are the critical hours at night are where the remnants of mammalian hibernation lies for our species". This is a far cry from the current state of the literature on hibernation. The idea that remnants of hibernation occur in humans at night also goes against the definition of hibernation. An excellent paper authored by another McEwen, Dr. Bruce McEwen, offers a concise definition: "Hibernation is a highly regulated physiological response to adverse environmental conditions characterized by hypothermia and drastic reductions of metabolic rate."
Redefinition and special, idiosyncratic definitions of terms are another of Melia’s characteristics of bad books: “The texts of these books all continue in the same excited first-person voice. They often introduce vague, undefined or invented terms.” A good example of this is in Dr. Kruse’s PaleoFX talk, where he references “geothermal circadian cycles.” It sounds scientific, but there are no known circadian cycles tied to Earth’s internal heat* and it appears Dr. Kruse invented the concept, since it is found nowhere else. It is a particularly deceptive practice, made easier by the fact that many of the terms misused by these authors, such as the species concept or even hibernation, are the subject of some academic contention. But while academics might argue about whether or not bears are “true hibernators,” we can be assured that no one is considering that humans hibernate every night, because that doesn’t fit into the realm of contention or even the fringes of what is considered hibernation.
The only known primate that hibernates is Cheirogaleus medius, member of suborder Strepsirrhini, which diverged from the evolutionary line that led to humans over 70 million years ago. They also store a lot of fat beforehand, so I don't know if I'd like to hibernate like them anyway. I don't think I'd look so good and I probably wouldn't get much work done.
Even if it were conserved, Dr. Kruse makes the mistake of tying hibernation to extreme cold: “Cold environments are found as mammals hibernate in normal circadian biology…….this completely reverses IR in mammals and wakes them up when conditions are better for life.” Dr. Kruse’s cold therapy involves exposure to freezing temperatures, because he thinks that is linked to hibernation in humans. Kruse asks his readers if maybe diabetes has “become thought of as a neolithic disease in humans because we we have simultaneously lost the ability to hibernate because we evolved the ability to control our environment completely?” However, many animals hibernate without exposure to very cold temperatures, and biologists are still debating whether or not relative cold is even needed to trigger hibernation at all. For example, the only hibernating primate, the aforementioned Cheirogaleus medius, hibernates at 30 degrees Celsius (86 F). And if humans had lost the ability to hibernate because we control our environment, we would expect to find the ability in related primates who do not control their environments. But we do not. The northernmost-living non-human primate, Macaca fuscata, shows no evidence of hibernation or even torpor, even in populations that do not visit hot springs. Interestingly, their winter diet does include digging for roots.
Why do so few primates hibernate? Around the equator, where primates evolved, seasons operate quite differently than they do in the arctic and other regions far from the equator. Because the environments and climates of Africa are so diverse, with many micro-climates in certain regions, most primates closely related to humans have evolved to adapt to scarcity regardless of Earth’s axial tilt, through reliance on “fallback foods.” Possibly because of this evolutionary strategy, there is no particular dietary pattern that consistently characterizes the seasons for primates as an order or even within species.
Even among non-primates that live in the north, only a very small percentage hibernate. For example, some squirrels hibernate and some don't. Those that don't will often cache food and eat it later. Some humans are known to raid rodent caches for carbohydrates. This contrasts with Dr. Kruse’s idea of seasonal biological changes being triggered by switching to carbohydrates, or of one season being devoid of carbohydrates: “it appears that dietary carbohydrates, which are only present in long light cycles in the summer in cold places, induce mammals to add PUFA’s to our cells to become fluid so we can function as we hibernate.”
According to Kruse, carbohydrate consumption is tied to hibernation in cold environments, and since we don’t really hibernate now (except sort of, at night, according to Kruse), carbohydrates might not be safe to consume: “Since we no longer hibernate……..maybe you need to consider how you eat carbohydrates within the circadian controls? Maybe what you thought was safe………really is not?” The implication is that carbohydrate consumption is only “safe” for mammals in the context of nature’s “design” for hibernation. In terms of our evolutionary line, that makes little sense. The vast majority of primate species consume diets of mainly carbohydrate, with two main digestive strategies. The evidence is that the ancestors of modern hominins relied on a mainly-carbohydrate diet until fairly recently.
If Dr. Kruse’s line of reasoning were true, most primates have been living out of balance with nature for millions of years. Some of his followers have said that this only applies if you live in the north, since there are somehow circadian controls only in the North that are tied to carbohydrates (zero evidence provided), but then the mice and squirrels eating stored or underground roots are violating nature's law. And the idea that putting an individual primate in the north will change its underlying biology to fit the north's light cycles has no evidence behind it (indeed, the fact that individual humans don't adapt particularly well to northern light cycles is perhaps behind the etiology of many modern illnesses).
As I will write in my next post, some human populations (and possibly other hominin lines) have genetic adaptations to more polar light cycles, but these are recent adaptations and are not shared by all humans. One unique thing is that humans who inhabit cold regions have a raised metabolic rate during the coldest season, not the lowered one characteristic of torpor or hibernation, which suggests adaptations more similar to those found in wolves than in ground squirrels**. I must also discuss longevity being derived rather than ancestral, but I'll leave that to the next post.
* Geothermal according to the OED is “1. Geol. Relating to or resulting from the internal heat of the earth; (of a locality or region) having hot springs, geysers, fumaroles, etc., heated by underlying magma.”
** are non-hibernating squirrels naughty "nature's law" breakers? Particularly if they are eating stored carbohydrates?
In my last post on the subject of Dr. Jack Kruse, a.k.a. “The Quilt,” I briefly touched on the misuse of the ideas of quantum theory. Not long after, the WSJ had an excellent article on the misuse of that subject.
Actually, it's pretty interesting, because I agree that the misuse of evolutionary biology and of quantum science isn't entirely a bad sign: it shows that people really are interested in science. They are just susceptible to bad science, which isn't a surprise considering the abysmal state of scientific education in this country. Only about 4 in 10 Americans accept that evolution is real.
Well, here is some background for new readers. My original post confused some people who do not post on Paleohacks, where my rivalry with Dr. Kruse has a very long history.
The question of what drives popular interest in such evolutionary fantasies is a difficult one. Frequently, they are bundled in with good advice, further increasing credibility among laymen. They contrast with legitimate evolutionary biology in that they contain simple and often epic “just so” narratives that appeal to people, in contrast with the complexities and fervent debates in the academic study of human evolution.
An example of one such internet popularizer is the aforementioned Dr. Jack Kruse, a Tennessee neurosurgeon and dentist who started plying a “paleo” diet and lifestyle program called the Leptin RX Reset on various popular paleo diet websites (particularly Mark’s Daily Apple, which is ranked 2000 on the Alexa web traffic rating system in the US, and Paleohacks, which is ranked 8000) in 2010. Calling himself “The Quilt,” his posts quickly became some of the most popular on these sites. In June of 2011, he launched his own website, containing a blog and his master “Quilt” manifesto and cracked Alexa’s top 100,000 sites in the US within months, hitting 26,581 in March 2012. He gained a further audience from the online Paleo Summit, where his talk was among the most popular and where many people learned about the next part of his Leptin RX Reset program, the Cold Thermogenesis Protocol (CT). Then he was a keynote speaker at PaleoFX, an Austin paleo conference sponsored by The Ancestral Health Society that drew many of the movement’s most popular speakers. In the fall of 2012, he will be a panelist on a debate about “safe starches” at the Ancestral Health Society’s main conference, the Ancestral Health Symposium.
Evolution is central to Dr. Kruse’s ideas and recommendations. In a comment to a reader he explained: “Adapting evolutionary biology to what we learn makes us a better physician not matter what we do.” His writings contain many references to human evolutionary origins and how his readers can use them to improve their health in a modern context. Many of his readers have reported success using his recommendations, but is the evolutionary basis behind them sound?
During a panel at PaleoFX in Austin, Dr. Kruse said:
Only humans who fail to listen to evolutions rule book of engagement die. You can eat a banana in the winter and feel fine but Mother Nature says it’s impossible………therefore we ought not to do it. I will follow her lead over a diet book guru or the opinions of a bunch of people who let their thoughts subjugate their genes. Feelings and thoughts do not trump neural biochemistry …
But his writings reveal a more ambivalent view of evolution. On one hand, in one blog post he says that “The speed of evolutionary change has far out stripped the ability of our paleolithic genes to catch up.” But in his Quilt manifesto he outlines an extremely plastic view: “A human is the only animal that can actually change its DNA just by thinking. Moreover, what we really think is just a biologic secretion. No different than any hormone released by the pituitary gland. Thinking is… well it is a meme that hijacks our brain’s chemistry. Any one thought can alter our genetic and biologic purpose in life.” Therein lies an essential paradox: on one hand our genes are “paleolithic,” but on the other they are malleable simply by thinking differently. Perhaps only part of our genes are “paleolithic.” For example, he refers to our brains as neolithic and able to “subjugate” our paleo genes, with negative consequences: "our neolithic brains allow us to make decisions that subjugate out paleolithic genes all too often.”
The idea that human genetics have not changed since the end of the Paleolithic roughly 10,000 years ago and that they are mismatched to the modern neolithic environment is a common one in paleo diet circles. Unfortunately, its origin can be traced back to academia and it lives on as a popular principle despite the fact that it has been the subject of considerable controversy in evolutionary biology. A common citation in paleo books is to An evolutionary perspective enhances understanding of human nutritional requirements, a paper written by Boyd Eaton, Melvin Konner, and Marjorie Shostak in 1996:
Geneticists believe that the increased human number and mobility associated with civilization have produced more, not less, inertia in the gene pool and that when the humans of 3000-10,000 years ago (depending on locality) began to take up agriculture, they were, in essence, the same biological organisms as humans are today (Neel 1994).
More recent research has come to the opposite conclusion, as newer statistical genetics models have actually found that human evolution has accelerated greatly in the past 40,000 years, certainly not freezing with the advent of agriculture. However, along with many other paleo authors, Dr. Kruse is still of the less dynamic mindset, writing that “Evolutionary pressures are selected for by the environments of our ancestors were exposed to and not for what we face today.”
In academic anthropology, new discoveries in archaeology and rapid advances in genetics have spawned a discipline in which textbooks are outdated as soon as they are published. Human origins are constantly under debate, which means that even if scientific education were adequate in the US, it would be fairly hard for anyone to keep up. But even a basic, outdated education on the topic would help a layman be critical of the several pop-science fringe evolutionary theories that have cropped up, such as the “aquatic ape” theory. Why do such theories become popular? As anthropologist Jim Moore put it:
Even among scientists, as we've seen with Hoyle, there are times when assertions are put forth which are poorly drawn, yet, because they strike a chord, often of wishful thinking, they catch on and are repeated. Yet refuting them can be an exercise in futility. A good scientific review and critique is a lot of work, and takes a lot of time. But it's far easier to pop off with a theory that's poorly researched than it is to accurately go over all the things that the original theorist should've, and to provide a point by point refutation.
In the case of the Aquatic Ape theory, Jim Moore did a major service by creating a website to refute it, since as he elucidates there, there is very little incentive for academics to engage in debates with popular pseudoscientific theories, since they are so focused on doing their own original research for the benefit of their careers.
This is unfortunate, because pseudoscience has great power to shape the public’s consciousness. As anthropologist John Hawks puts it:
Is the Aquatic Ape Theory fairly described as pseudoscience? Every statement of natural causes is potentially scientific. What distinguishes science from pseudoscience is social. Pseudoscience is supported by assertions of authority, by rejection or ignorance of pertinent tests, by supporters who take on the trappings of scientific argument without accepting science's basic rules of refutation and replication. Pseudoscience is driven by charismatic personalities who do not answer direct questions. When held by those in power, like Lysenkoism, it destroys honest scientific inquiry. When held by a minority, it pleads persecution.
In a world where many people are unhappy and unhealthy despite our scientific advances, the idea that conventional science is wrong can be quite appealing. It allows people to buy into fringe pseudoscience for which little evidence exists and not question the lack of evidence.
Dr. Kruse has an interesting new spin on the "paleo" diet, though whether his approach is "paleolithic" is up for debate. His spin on the evolutionary approach lately has emphasized cold. Why?
Considering that 90% of the earth’s current biome lives in extreme conditions on our own planet today still, we might need to consider that what we think is “our normal environment” is not so normal for most of life on our planet or our evolutionary history. Life on Earth evolved in an environment much like we see on Titan today; in a deep ocean frozen solid at its surface with the capability of life buried deep with in it. The only escape was due to ejectants of water vapor from super heated water from underwater volcano’s. All these things are present today on Earth’s crust too. There is one major difference now between the two. We are warmer today than life began. There are others, but when one looks at Titan we see a frozen giant moon with a monsterous ocean beneath it.
This raises the question of whether Dr. Kruse is even promoting a paleo diet, or instead an Archean diet. But if he is referring to the Archean, it is mystifying that he emphasizes cold, considering that the Archean was probably warmer than today. Dr. Kruse has "references" on his blog, but I would challenge anyone to tie the meandering list on his blog post to any actual "facts" in his writing, like that cold promotes autophagy or that Neuropeptide Y is downregulated in cold weather.
Either way, debates as to the origin of life on Earth are still happening, with some (but not a consensus!) emphasizing the possibility that colder temperatures were the ideal place for single-celled life forms to originate. Dr. Kruse believes this is important because “life first adapted to extreme environments and then was naturally selected and adapted to a cyclic warming trend on our planets crust over time.” But he seems to believe that this adaptation was somehow incomplete: “Our hominid species may have adapted during this warming trend, but the DNA we inherited came from animals that were cold adapted.” Did DNA not adapt during this warming trend even though our hominin ancestors themselves adapted? It seems strange, since Dr. Kruse believes that “One thought might just alter your DNA!”
Instead, he believes that these adaptations were mostly epigenetic, once again appealing to the idea that he knows more than conventional science: “We know today that the power of epigenetics dictates a lot more about newer generations adaptations than we even knew ten years ago.” While epigenetics has become an important field, it is clear that a species' adaptation to changing environments involves both epigenetic and genetic changes. Unfortunately, epigenetics is an appealing buzzword that has been co-opted to absurdity. Dr. Kruse says that the warm paleoclimate our primate ancestors were exposed to might not matter as much as we think, because we might still carry an epigenetic imprint from the cold that engulfed Earth 3-4 billion years ago:
The modern science of epigenetics shows that who we came from and what they faced has a direct biologic effect upon subsequent generations DNA and phenotypes. It is crystal clear today, but the biologic implications remain unexplored in all modern day literature. What is happening on Titan maybe like opening up a blackhole back to a reality that used to be our own. The ability to see Earth at life’s evolutionary beginning.
This is confusing, because while here he blames epigenetic relics, in other writings he blames static genetics: “The speed of evolutionary change has far out stripped the ability of our paleolithic genes to catch up. This mismatch causes major problems for modern humans.” Dr. Kruse seems to want to have it both ways.
The idea that epigenetic relics from billions of years ago are affecting us today is questionable, since the latest evidence shows that epigenetic changes are not stable enough to persist for even thirty generations, let alone 3-4 billion years. Evolutionary biologist Jerry Coyne has addressed people who hype up the evolutionary significance of epigenetics: “There are a handful of examples showing that environmentally-induced changes can be passed from one generation to the next. In nearly all of these examples, the changes disappear after one or two generations, so they couldn’t effect permanent evolutionary change.” This is in direct opposition to Dr. Kruse’s view of how evolution works:
I think evolution found that epigenetic modifications to be quite effective way to pass on environemental information to succeeding generations. So successful that it became a backbone law of genomic functioning. Evolution follows fractal patterning. So it is also highly conserved in all species. Today that appears to be true too.
While epigenetics has indeed led to important new understandings, it is less of a game-changer than Dr. Kruse presents: “Evolution uses epigenetics to determine adaptation to environments. We have discarded the strict definition of genetic determinism that came from Watson and Crick, as founders of DNA.” As evolutionary biologist Rama S. Singh succinctly put it:
While the new discoveries of the laws of developmental transformations are enriching our knowledge of the intricate relationship between genotype and phenotype, the findings of epigenetic inheritance do not challenge the basic tenets of the neo-Darwinian theory of evolution, as other than producing new variation no new processes of evolutionary change have been added to the ones we already know — mutation, migration, selection, and drift
Unfortunately, Dr. Kruse’s erroneous beliefs about human evolution have wide-ranging consequences for his philosophy and recommendations, since they lead him to believe in the conservation of many traits for which there is no evidence of conservation in Hominidae. Daniel F. Melia, professor of Celtic studies, recently wrote an article about the characteristics of bad books that promote pseudoscience in his discipline. One of these characteristics is “Confident conclusions are often the result of chains of circumstance and supposition so long that even remembering their origin points while reading the books is difficult.” It’s easy to see that here, and it shows how particularly ingenious this characteristic is, since it makes anyone who tries to criticize the absurdities wade through the mire, further disincentivizing criticism.
One rationale Dr. Kruse offers for carbohydrate restriction is that epigenetics has sped up: “epigenetics has sped up and you eat a warm climate diet you by definition increase mitochondrial ROS that slowly kills you” A search of the scientific literature and academic databases found zero papers on epigenetics speeding up in modern humans. His article cites Wikipedia and himself.
The phenomena he blames on an epigenetic speed up also often do not have known epigenetic causes.
This is more fuel or proof that the “metabolic trap door” I found makes all starches less safe when they are eaten outside how our circadian biology accounts for them in our sped up systems. Evolution power laws has sped up too simultaneously to compound the issue. This explains why kids today are huge and bigger than their preceding genrations. It explains also why they reach puberty faster today.
Scientists are not sure what is causing earlier puberty (see yesterday's NYTimes article). If Dr. Kruse really knows, he should be eligible for a Nobel prize. It’s also interesting because paleo authors are often quick to point out that Paleolithic humans were bigger and stronger than modern humans because of their healthier diets and lifestyles. Yet Dr. Kruse portrays this phenomenon as a bad thing, which raises the question of why Paleolithic humans were bigger. Were they eating too many carbs? Why modern humans are catching up to Paleolithic humans in height is a matter of debate, but the best explanation is greater access to calories and less malnutrition stress.
Overall, on the issue of genetics and epigenetics, Dr. Kruse seems willing to rewrite definitions and build a pseudoscientific narrative that has little basis in reality. It shows that new scientific discoveries that become buzzwords in the public consciousness are very easy to manipulate in order to build such narratives. They make them sound scientific and cutting-edge, when in reality they are empty and devoid of factual basis.
I have MUCH more to write on this subject, including more on the evolution of mammals, hibernation, and hominin adaptations to cold; also a history of how he did not invent most of the regimens his followers praise so highly, as well as the actual, non-pseudoscientific evolutionary-biology reasons they work for those people.