This blog is about the intersection between evolutionary biology and food, but also about practical applications, sustainable agriculture, and tasty things in general.
Every two years or so I notice a cyclical trend in the online “paleo” community. It’s the resurgence of dogmatic carnivory. It has two main themes: plants are “poisons” that cause most of our health problems and humans “evolved to be” very low carb. Always an undercurrent with some very zealous devotees (“The Bear” of Grateful Dead fame was probably one of its most prominent popularizers), it suddenly finds popularity among normally more moderate people, picking up some non-paleo low-carb followers in the process. Then it goes away again, hilariously with some of its top cheerleaders renouncing it in the process (like Danny Roddy).
It’s been back again lately. A few readers have written me about Anna who writes the blog Life Extension*. She is a graduate student in archaeology and social anthropology. Anna’s most popular post so far is “Debunking and Deconstructing Some ‘Myths of Paleo’. Part One: Tubers.” Sadly, an opportunity for greater communication to the public from a much-maligned discipline becomes a manifesto for low-carb diets. The tagline is “Glucose restriction represents not only the most crucial component of ancestral diets but is by far the easiest element to emulate.” I think we’ve heard this one before, but this time it is in language that is more authoritative than usual. This is the kind of writing I would have liked Paleofantasy to take on.
Unfortunately she doesn’t refer to sources directly in her text, so I’ve done my best to figure out which sources she is referring to.
Most archaeologists don’t go around promoting diets, because they recognize the limitations of their field. There is so much that is unknown and unknowable. It’s pretty easy for nearly anyone to bend what evidence we do have to fit their own narratives.
The reduction in size and robusticity of the human skeleton is a clear temporal trend of newly agricultural communities. Diachronic skeletal comparisons reveal large-scale, significant reductions in growth rates.
Yes, of some newly agricultural communities, and that doesn't mean it stayed this way. I’ve written about it more than I would have liked. I just wrote about it in my last post about Paleofantasy (which cites this review).
Then a funny thing happened on the way from the preagricultural Mediterranean to the giant farms of today: people, at least some of them, got healthier, presumably as we adapted to the new way of life and food became more evenly distributed. The collection of skeletons from Egypt also shows that by 4,000 years ago, height had returned to its preagricultural levels, and only 20 percent of the population had telltale signs of poor nutrition in their teeth. Those trying to make the point that agriculture is bad for our bodies generally use skeletal material from immediately after the shift to farming as evidence, but a more long-term view is starting to tell a different story. - Marlene Zuk
It also brings up how questionable the use of height is in these narratives. The few hunter-gatherers that exist today are very, very short (mostly due to genetics). The rest of the world has grown taller and taller. Staffan Lindeberg in his magnum opus suggests we are too tall from overnutrition. Other markers that extremists use to argue that agricultural humans show a downward health trend suffer from similar limitations.
Instances of porotic hyperostosis brought on by iron deficiency anaemia increased dramatically in agricultural settings.
“There is a new appreciation of the adaptability and flexibility of iron metabolism; as a result it has become apparent that diet plays a very minor role in the development of iron deficiency anemia. It is now understood that, rather than being detrimental, hypoferremia (deficiency of iron in the blood) is actually an adaptation to disease and microorganism invasion.” - Porotic hyperostosis: A new perspective
Either way, I’m not sure what the transition these communities in upheaval experienced has to do with whether or not tubers or any carbohydrates are bad for you. It wasn’t just the food that changed for these people, it was their entire way of life, and it was a transition that changed their biology. And while there are trends, there is no linear health decline. A more systematic database of human remains and health markers is being created right now, and it should be a great resource in the future. At this point a lot of papers claiming a decline are using inappropriate sample sizes and statistical methods.
Far too little evolutionary time has passed for us to be successfully acclimated to the novel conditions of agricultural life.
Another common thread, and one that begs the question: how long is long enough? How many adaptations are enough?
Speaking of evolutionary time:
Spending most of our human history in glacial conditions, our physiology has consequently been modelled by the climatologic record, with only brief, temperate periods of reprieve that could conceivably allow any significant amount of edible plant life to have grown.
Like Nora Gedgaudas's paleo book Primal Body, Primal Mind, which she cites for unknown reasons, this sentence implies to her lay readers that glacial conditions = something out of the movie Ice Age. Which is just not true. A glacial maximum left some people in the cold, but Africa was still quite warm, and if we are talking about evolutionary time, that’s where we spent most of it. Outside Africa, most humans seem to have clustered in fairly temperate refugia such as Southern Iberia during the last ice age.
“Many think of the late Pleistocene as the “Ice Age”, a time when continental glaciers covered much of the earth and where the land not under ice was inhabited by giant cold-adapted animals—wooly mammoth, wooly rhinoceros, and cave bears—pursued by hardy human hunters. While this image may be somewhat accurate for part of the world, most of the earth remained unglaciated throughout the Pleistocene.” - Glacial Environments Beyond Glacial Terrains: Human Eco-Dynamics in Late Pleistocene Mediterranean Iberia
Of course “significant amount” is also going to be a point of contention. Only in the very coldest parts of the Arctic do levels of plant foods in the human diet fall close to zero. Beyond that, many people might not be aware of the levels of starch and sugar available in the environment because traditions surrounding them have died out. I have written quite a bit about Northern sources of carbohydrates: “Siberian potatoes” and Alaskan native plant foods.
Further information on the evolution of our diet can be garnered from the genetic data of present populations, which demonstrates the historically-late biological adaptation to less than minimal quantities of starch and to only few and specific starch compounds.
I assume this refers to amylase (AMY1) copy number, the function and history of which is not quite clear, much like lactase persistence. For example, I do not possess lactase persistence even though my ancestors probably raised livestock for dairy. They were diversified pastoralists, so it’s likely there was not enough selective pressure for them to develop this trait: they consumed dairy, but the majority of their diet was not dairy.
It is unlikely the ancestral human diet was as high in starch as some horticulturalist tropical diets are now, where the majority of calories come from starch. But in the end, the differences in AMY1 copy number between humans are small compared to our differences with other primates, indicating that perhaps this was selected for in our own evolution. And in the original paper it is kind of mind-boggling they use the Mbuti as a “low-starch” population given their high starch consumption.
The Mbuti are particularly interesting because they are hunter-gatherers, but trade their surplus meat for starch and have done this for quite some time (when this isn't available there are forest tubers utilized as fall-backs). The only time they don’t trade is when honey is in abundance.
Anna’s assertion that starch is comparatively “inefficient” compared to meat under optimal foraging models doesn’t mean that humans would have chosen to eat only or mostly meat. That data includes game from South American environments, which is unusually fatty in comparison to African game. And even in South America, such game is not available in unlimited amounts, which is why even hunter-gatherer cultures with access to it, like the Ache, also extensively gather and process starch and collect honey.
The consequences of limited availability and time investment of edible Palaeolithic plant foods has been analysed by Stiner, who compared food class returns amongst contemporary hunter-gatherer groups. Stiner found the net energy yield of roots and tubers to range from 1,882 kj/hour to 6,120 kj/hour (not to mention the additional time needed to dedicate to preparation) compared to 63,398 kj/hour for large game.
Anna’s assertions stand in stark contrast to the paper she seems to cite:
Staple plant resources present an extreme contrast to large game animals with respect to prevailing economic currencies (Table 11.1). Large animals generally yield high returns per unit foraging time (kJ per hour) but are unpredictable resources. Seeds and nuts give much lower net yields per increment time (kJ per kilogram acquired), but they have potentially high yields with respect to the volume obtained and the area of land utilized.
Surveys of hunter-gatherers show overwhelmingly that the preferred foods are fatty game and honey, both highly caloric (and delicious), yet these are not the majority of the diet because they are not available in high, predictable amounts the way their modern equivalents are.
As Kim Hill, who studies the Ache, says: “High-ranked items may be so rarely encountered that they represent only a very small proportion of the diet; low-ranked items in the optimal set may be encountered with sufficient frequency to contribute the bulk. It is interesting to note that on several occasions, reports of nearby palm fruit (ranked 12) were ignored, something that did not happen with oranges. On several other occasions people discussed the relative merits of hunting monkeys (ranked 11), reaching consensus that monkeys should not be pursued ‘because they are not fat.’”
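Hill's point falls straight out of the classic prey-choice (diet-breadth) model from optimal foraging theory, which is what all these ranking arguments implicitly rest on. For the coders among my readers, here is a minimal sketch in Python; the energy values and encounter rates are invented for illustration, not Ache or Hadza field data:

```python
# Toy prey-choice (diet-breadth) model from optimal foraging theory.
# All numbers are invented for illustration; they are NOT field data.

prey = [
    # (name, kJ per item, handling hours per item, encounters per search hour)
    ("large game", 60000, 8.0, 0.01),  # high rank, rarely encountered
    ("honey",      12000, 2.0, 0.05),
    ("tubers",      3000, 1.0, 0.60),  # low rank, encountered constantly
]

# Rank prey by profitability: energy per unit handling time.
prey.sort(key=lambda p: p[1] / p[2], reverse=True)

def return_rate(diet):
    """Overall kJ/hour if the forager pursues every encountered item in `diet`."""
    total_e = sum(enc * e for _, e, _, enc in diet)  # kJ gained per search hour
    total_h = sum(enc * h for _, _, h, enc in diet)  # handling hours per search hour
    return total_e / (1 + total_h)

# Classic inclusion rule: add prey in rank order as long as an item's
# profitability exceeds the return rate of the diet above it.
diet = []
for p in prey:
    if not diet or p[1] / p[2] > return_rate(diet):
        diet.append(p)

for name, e, h, enc in diet:
    print(f"{name:10s} rank value {e/h:7.0f} kJ/h, contributes {enc*e:6.0f} kJ per search hour")
print(f"overall return rate: {return_rate(diet):.0f} kJ/hour")
```

With these made-up numbers, the lowest-ranked resource (tubers) ends up contributing the most energy per search hour, which is exactly Hill's point: profitability rank determines what gets pursued on encounter, while encounter rates determine what actually fills the belly.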
Anthropologists have theorized on the importance of having carbohydrate fallback foods in the event that high-fat game is not available, whether because of seasonality or over-hunting. In these cases, “rabbit starvation” from excess protein is a real danger. Surviving off of game alone is a real challenge, which probably accounts for the fact that humans have exploited seemingly tedious-to-gather plant resources in nearly every environment.
Some of Anna’s arguments indicate that she has made up her mind on issues that are actually very controversial in anthropology and archaeology, such as the date of regular fire use (Anna asserts it was much later than many think) and the claim that “plants have been preserved in the Lower Palaeolithic, and they are used primarily for functional and material – rather than nutritional – purposes.”
She does admit that “I will concede however that absence of evidence is not evidence of absence,” but then goes on to list some sites showing possible non-food-related plant use that aren’t even associated with Homo sapiens; many are hominid offshoots that are unlikely to have contributed to our line (except for those of us with a possible small amount of Neanderthal ancestry). Other sites she mentions aren’t dated to the Lower Paleolithic anyway.
She also dismisses later sites such as Kebara, implying that legumes would have been used as fire starters rather than food, but admits that hominids would have supplemented their diet with “low glycemic” foods when meat was scarce.
Firstly, Neanderthals were highly carnivorous and physiologically inept at digesting plant foods. This can be measured using the megadontia quotient of Neanderthal postcanine tooth area in relation to body mass, which reveals that H. neanderthalensis must have consumed a greater than 80% animal diet. Nonetheless, the evidence of phytoliths and grains from Neanderthal skeletons at Shanidar Cave may reveal the rare consumption of starches in this singular context, but not the deleterious costs to the health of those that ate them.
The megadontia quotient, which is controversial in the first place, is not meant to be used in this way. Neither is the also-mentioned expensive tissue hypothesis. These measures are meant to analyze the use of uncooked fibrous plant foods and are not particularly enlightening in the case of large-brained hominids with cultural adaptations to food such as cooking. Some of the most recent research reappraising the carnivory theory of Neanderthals is covered in this talk by Neanderthal experts Dr. Margaret J. Schoeninger and Dr. Alison S. Brooks.
Humans show up as carnivores, even when they are known corn-eating agriculturalists, like these people. But what happens when you plot other plants?
Now the data makes more sense (remember, this data shows where the protein in the diet came from; it doesn't tell us how much protein was eaten).
As you can see, initial isotopic studies (which can only show where the protein came from, not the amount of protein in the diet) that showed Neanderthals as top carnivores came into question when farming populations were showing similar values. Researchers realized that they needed to analyze plants based on their most nutritious fractions, since when was the last time anyone sane ate something like a whole stalk of corn, husk and all? Another great paper by John D. Speth also summarizes some of the recent research on Neanderthal diets and debunks hypercarnivory.
humans were no longer able to transmute fibre into fat – as other primates can (consequently, they eat a high-fat diet) – through fermentation in the large intestine.
This, as anthropologist Dr. Richard Wrangham has pointed out, could also be an adaptation to cooking. And we didn’t lose this ability, it is just reduced, though no biologist would argue that the SCFA produced in the colon, which can provide calories and also modulate inflammation, are unimportant. SCFA metabolism is not comparable to longer-chain fatty acid metabolism, so it’s not really appropriate to call these diets “high fat.” Furthermore, there are other primates with guts similar to ours, like capuchins, who most certainly do not eat a carnivorous diet: they eat sugary fruit. But it’s very hard to compare our guts to the guts of other animals since cultural traits like cooking are so important for our food consumption.
I think it’s a bit amusing to read these posts alongside those of PaleoVeganology, written by a graduate student in paleontology who criticizes many popular paleo narratives. However much I disagree with him on the issue of modern diet choices, I commend him for not using his expertise to promote his chosen diet: he is explicit that his dietary choices are built on modern ethics and not the murky past.
The Shanidar skeletons yielded merely the first of many analyses of starches on teeth, analyses which rule out theories that plants were used only as decorations or fire starters. Since that first paper was published, others using the same method have followed, and more will. But there is no way to use such data to speculate on how often or how much of these foods were consumed.
The coprolite “paper” that Nora Gedgaudas frequently cites also comes up, which I’ve addressed here.
Another common thread in carnivore narratives is that plants were used “only” as medicinals. I would not consider this as insignificant in any way– in most cultures, the line between food and medicine is a thin one. Many foods we enjoy as foods these days have medicinal roots.
Anna rightly criticizes the use of non-hunter-gatherers as hunter-gatherer proxies in writings about the so-called paleo diet and then cites a study that does the exact same thing:
In an attempt to reconstruct the diet of ice age hominids, a recent study analysed the macronutrient dietary composition of existing hunter-gatherer groups within latitude intervals from 41° to greater than 60°.
But where did this data come from? Anthropologist Katherine Milton responded quite well to this paper by Cordain:
The hunter-gatherer data used by Cordain et al (4) came from the Ethnographic Atlas (5), a cross-cultural index compiled largely from 20th century sources and written by ethnographers or others with disparate backgrounds, rarely interested in diet per se or trained in dietary collection techniques. By the 20th century, most hunter-gatherers had vanished; many of those who remained had been displaced to marginal environments. Some societies coded as hunter-gatherers in the Atlas probably were not exclusively hunter-gatherers or were displaced agricultural peoples. Because most of the ethnographers were male, they often did not associate with women, who typically collect and process plant resources.- Katherine Milton
The Ethnographic Atlas used in the “study” is available online and quite clearly does not contain 229 pure hunter-gatherer cultures. The 229 cultures Cordain uses include peoples who trade for or cultivate foods.
There is no evidence that mostly carnivorous groups of humans have particularly high longevity, and in fact mummies, whatever their limits as evidence, have shown that people eating these diets were not in fantastic condition. Of course, as with the poor condition of some early agriculturalists, this cannot simply be blamed on their diet.
It is awfully convenient to build a narrative to convince people to eat a limited diet based on the murky unknowns of the far past and near-mythical groups of supposedly extremely healthy carnivorous hominids. The carnivore-ape hypothesis is about as credible as the aquatic ape one.
One of the problems with human evolution, as opposed to, say, rocket science, is that everybody feels that their opinion has value irrespective of their prior knowledge (the outraged academic in the encounter above was a scientist, but not a biologist, still less an evolutionary biologist). The reason is obvious – we are all human beings, so we think we know all about it, intuitively. What we think about human evolution "stands to reason". Hardly a month goes by without my receiving, at my desk at Nature, an exegesis on the reasons how or why human beings evolved to be this way or that. They are always nonsense, and for the same reason. They find some quirk of anatomy, extrapolate that into a grand scheme, and then cherry-pick attributes that seem to fit that scheme, ignoring any contrary evidence. Adherence to such schemes become matters of belief, not evidence. That's not science – that's creationism.
I saw the same story building among vegans, who often craft similar narratives around our lineage's long plant-eating past. It speaks to a deep desire in people to justify their own choices. What all these dietary narratives have in common is that they confirm a particular limited diet is our “natural” diet and the one that is best for humans, animals, and the environment. It’s not possible for them all to be right, and that’s because none of them are.
Ancient humans ate a large variety of foods, which is why we are adapted to so many. Human variation is high though, since our lineage has become so populous and geographically wide-ranging. There are many reasons for a modern human to adopt a low-carbohydrate or limited carbohydrate diet either temporarily or permanently. None of those have to do with this being the optimal diet for all humans or with a mostly-carnivorous ancestry.
How hard is it to read a scientific study? Should you bother to learn? I recently commented on a blog post on that subject.
Reading a study to figure out what to tell other people to do is hard. Almost all the people who plant a PubMed reference in front of you to tell you to eat magic macronutrient XYZ or to avoid food X forever lest you perish from cancer are unqualified to pontificate on the subject. That includes many people with fancy titles. The people really qualified to talk about these things are not going to be pontificating. Nutrition science is too young for such surety.
But there is a much lower bar: being able to look up a reference and say whether or not it actually even possibly supports what the author who referenced it was saying. That's fairly easy a lot of the time, since many news outlets apparently don't care to fact-check. I took a science journalism class in college and was taught a very meticulous and accurate way of writing that I don't see very often. A perfect and wonderfully topical example cropped up recently. The headline reads "Uh-Oh, Paleo: Cavemen Ate Less Meat Than Previously Thought." Surprisingly, the Fox News title, while stupid, is not completely inaccurate: "Secrets of the Caveman Diet." I get the feeling they are more interested in the SEO value of the paleo diet than in ancient diets.
It took me more time than I would have liked to find the actual paper because they don't even link to the abstract. It turns out this paper is open-access, so anyone can read it, which makes not linking to it even more suspicious. Well, I understand why. Just do a ctrl-F for "paleolithic." Don't bother with "caveman" because that's not even a technical or meaningful term. When I did that, my computer made that annoying noise that I keep forgetting to disable, the one that means it didn't find the word at all. Well, let's just try "paleo." Aha, something...but it's in the references...it's a paper in the journal "Biogeochemical approaches to paleodietary analysis." I could Google "paleodietary" and realize that the term encompasses all archaeologically-studied diets from any time period, but without even reading anything, I've gained a lot of skepticism for the conclusions of the news articles. The Fox News article is crafty and does say "first farmers" but makes a tenuous connection to the paleolithic.
I can then read the abstract and the discussion, the least science-y parts of the paper, which have several standouts anyone reading this blog post can probably pick up. Oh, look at this sentence: "This larger value goes some way to resolving the conundrum of interpretations of very high animal protein intake in isotopic studies of prehistoric farmers." Wait, so this whole thing was comparing to prehistoric farmers and not hunter-gatherers? Another minus point for the news articles. If you are a good reader, you can also figure out that the reason they did this study is that stable isotope analysis of diet was based on animal data and they wanted some human data to compare it to.
If you want, you can delve further by reading a bit about that method. Considering how many of my readers are procrastinating computer coders with the next tab over open to their GitHub account or some API, I think a lot of them can handle this. The Fox News article, just like my science journalism teacher taught, describes this method.
To see how much meat ancient people ate, archeologists rely on the fact that protein is the only macronutrient that contains nitrogen. Different foods have different ratios of heavy and light nitrogen isotopes, or atoms of the same element with a different number of neutrons. So in a given ecosystem, scientists can reconstruct ancient diets by measuring the fraction of heavy-to-light nitrogen isotopes in fossilized bones.
But the body also preferentially stores heavier isotopes of nitrogen, so scientists calculate an offset to adjust for that tendency when determining what a person actually ate. Historically, the offset was derived from studies in which animals were fed diets with different protein amounts.
Using that offset, many studies estimate that between 60 and 80 percent of the prehistoric human diet came from proteins, with most of that from animal sources.
I'll just Google "isotopes diet." If you've taken a basic college-level class in geology or archeology you probably know to Google "stable isotopes diet." The first results are a free and fairly readable paper and a blog post by a physical anthropology professor, John Hawks. Neither of these is easy to read, but if you can read .php or .ru files or are just a good reader, you can probably figure out the basics of the method. Fox News starts to get it right. But that last sentence is flat-out wrong. Isotope analysis is a way to determine the trophic level of the protein in the diet, that is, where the protein might have come from in the local ecosystem. It is simply not capable of telling you what percentage of the total diet was protein.
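For the coders among you, the core arithmetic is tiny. Here's a toy sketch in Python using the textbook ~3.4 permil enrichment per trophic step; every δ15N value below is invented for illustration, and none of them come from the paper in question:

```python
# Toy trophic-level arithmetic for nitrogen isotopes. The ~3.4 permil
# step per trophic level is a common textbook value; every d15N input
# below is invented for illustration, not data from the study.

def trophic_level(d15n_collagen, d15n_plants, step=3.4):
    """Trophic level of the dietary *protein* implied by bone collagen,
    with local plants defined as level 1. Note that the fraction of
    total calories that was protein appears nowhere in this formula."""
    return 1.0 + (d15n_collagen - d15n_plants) / step

plants = 3.0  # permil, an invented local plant baseline

print(trophic_level(6.4, plants))  # ~2.0: protein came from plants
print(trophic_level(9.8, plants))  # ~3.0: protein came from herbivore meat

# The point of the paper the news articles garbled: if the human
# diet-to-collagen offset is larger than the animal-derived ~3.4,
# the same bone implies a lower trophic level.
print(trophic_level(9.8, plants, step=5.0))  # ~2.4
```

Note what the function can and can't say: it locates the dietary protein in the local food web, and shifting the offset (the actual subject of the study) shifts the answer. Total protein intake never enters the calculation, which is why "60 to 80 percent of the diet was protein" cannot come out of it.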
There are more complexities to the method I could go into, such as its potential inaccuracies, but that's the overall gist of it. I'd note that I've also seen this method butchered in books popular with paleo dieters, which claim that because some Paleolithic skeletons indicate their owners got most of their protein at the same trophic level as arctic foxes, their diet was like that of an arctic fox. That's the kind of thing this study is relevant to: whether or not we can accurately extrapolate animal data to humans in stable isotope analysis of diet. That's probably not as good for sexy headlines or SEO though, is it? The reality is that if we applied this we'd find Paleolithic humans ate many different diets, with plant protein increasing with sedentism and with certain local ecologies. But in the wild, plant proteins are not easy to come by. Most of them are not digestible by humans, and many that are, such as certain wild legumes, are seasonal. And in the end, both of those articles fail to make the issue relevant in any comprehensible way, the Blisstree piece taking nonsensical potshots at the paleo diet:
Many contemporary paleo diet gurus advocate a diet that’s 50 percent or more animal products (though contrary to what some people think, this doesn’t just mean chowing down on bacon and burgers — paleo dieters stress the importance of eating lean meat, fish and eggs that come from grass-fed livestock). This is based on the conventional wisdom that paleolithic humans ate a diet of between 60% and 80% protein, mostly from animal sources.
First, I don't know where I can get grass-fed fish but it sounds cool and if you know any sources, email me. Second of all, since when are animal products just protein? The ones I eat have plenty of fat. Maybe there is a parallel universe where I eat a 60% protein diet and have already wasted away from rabbit starvation, but in this universe I don't know anyone who eats an 80% protein paleo diet. Most people naturally gravitate away from absurd protein intakes because it's unappetizing and makes you feel bad, though lately I've found many people persist on diets that are exactly that for years and even decades. I don't like feeling bad or eating bad food, so I've never had that long-term problem.
We don't know what percentage of a Paleolithic hunter-gatherer's diet was protein; we don't know that for "cavemen" or for early farmers. It's just not knowable right now and probably never will be. We do know that for modern humans there seems to be a physiological ceiling for protein intake, which John Speth addresses quite readably in his excellent, though bizarrely expensive, book (worth getting on interlibrary loan). That ceiling requires humans not to eat like an arctic fox, but to be innovative and seek out either fat or carbohydrate in order to avoid the potential costs of high protein intake. But the exact ceiling is controversial.
So there, those two news articles are essentially debunked and we didn't even have to discuss various nitrogen isotopes or anything really truly technical. In the end, we realize that the study in question doesn't tell us how those in the past really ate or what we should eat now. It's just a little piece of a large completely unsolvable puzzle. To even be able to realize that gives you immense power not to be deceived.
In the debate surrounding the NYC ban on large soft drinks earlier this year, the argument came up that we had to regulate them because liquid calories are evolutionarily novel and inappropriate for our species to consume because we cannot consume them moderately and their metabolism is harmful to our bodies. At the time I had already started reading Patrick McGovern's Uncorking the Past, which looks at human history through the very lens of liquid carbohydrates.
Not soda, something a bit more delicious and perhaps more enticing. I'm talking about alcoholic drinks. It was in the form of such a drink that I first encountered McGovern's work. I was not pulling in very much money at the time and my indulgence in luxury food and drink primarily came from volunteering at ritzy galas. After one long night, I was delighted to find a vendor had left quite a lot of good beer behind. One of them was Dogfish Head's Midas Touch. With a musky wine-like flavor, it was clear this was not a normal beer.
The idea for the beer came from a golden residue found in a tomb where either the real King Midas or his father was buried around 700 B.C. Archeologist Patrick McGovern had analyzed this residue, teasing out the various ingredients using infrared spectrometry, gas and liquid chromatography, and mass spectrometry. Grapes, honey, and barley were the ingredients of this ancient beverage. Together with the brewers at renowned microbrewery Dogfish Head, McGovern set out to recreate something with these elements for the modern palate. The result was well-received, and the first of the Ancient Ales series went to market.
McGovern is the "Scientific Director of the Biomolecular Archeology Laboratory for Cuisine, Fermented Beverages, and Health at the University of Pennsylvania Museum in Philadelphia." He primarily works at analyzing ancient pottery residues to figure out what exactly our ancestors were imbibing. And for fun he recreates some of these beverages for modern people to enjoy.
Uncorking the Past says "No containers have yet been recovered from the Palaeolithic period, not even one made from stone. Objects made of wood, grass, leather, and gourds have disintegrated and disappeared." Since it came out, several Paleolithic pottery specimens have been described, mainly from China. Earlier this year, one set of shards was dated to 20,000 years ago. It would not be surprising to me if much earlier pottery is discovered in Africa. Evidence for the earliest food grind stones used to process seeds has been dated to 105,000 years ago. It is possible though that humans in that region were using other containers for liquid such as skins or gourds, but pottery would have been a major advance useful for extracting fat from bones, detoxifying and cooking starches, and creating fermented drinks.
Such drinks would have not been terribly novel even then. As McGovern points out, our evolutionary line is frugivorous in origin, having inhabited warm tropical climates where "as the fruit matured, it would have fermented on the tree, bush, and vine. Fruits with broken skins, oozing liquid, would have been attacked by yeast and the sugars converted into alcohol. Such a fruit slurry can reach an alcohol content of 5 percent or more." Many cases of wild animals getting drunk on ripe fruit have been documented.
Malaysian tree shrews subsist mainly on fermented palm nectar that is up to 3.8% alcohol. The researchers concluded:
The pentailed treeshrew is considered a living model for extinct mammals representing the stock from which all extinct and living treeshrews and primates radiated. Therefore, we hypothesize that moderate to high alcohol intake was present early on in the evolution of these closely related lineages.
Wherever primates live, they seem drawn to sugars. Chimpanzees use tools to gather honey in Africa. Hominids there have been adept at exploiting honey for a very long time, devising elaborate gathering systems to thwart the aggressiveness of native bees. Surveys of foraging tribal peoples like the Hadza and Pygmies have revealed that honey is the food they most prefer. It can also be used to make alcohol:
Many African peoples have been drinking some variation of a fermented honey beverage for a very long time throughout the continent. The strongest versions have been reported from the Rift Valley, where added fruit (e.g., of the sausage tree, Kigellia africana, and tamarind), with additional yeast to spur an extended fermentation, boosted the alcohol concentration. Sub-Saharan Africa is a honey-eater's and mead-drinker's paradise.
It's not just shrews that enjoy palm wine either. Evidence for human exploitation of palm goes back 18,000 years in Africa:
The most important species for making palm wine are the oil palm (Elaeis guineensis), the ron or Palmyra palm (Borassus aethiopum), and the raphia palm (Raphia vinifera), which are concentrated along the humid east and west coasts as well as in the dense jungles of the interior...A healthy tree can produce nine or ten liters a day and about 750 liters over half a year...Within two hours, palm wine ferments to about a 4 percent alcohol content; give it a day, and the alcohol level goes up to 7 or 8 percent.
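Those percentages are easy to sanity-check with standard brewing arithmetic: fermentation converts roughly half the mass of sugar to ethanol (the theoretical maximum is about 0.51 g per gram of glucose; the rest leaves as CO2), and ethanol's density is about 0.789 g/mL. A quick sketch, assuming complete fermentation and my own ballpark for palm sap sugar content, not McGovern's figure:

```python
# Back-of-the-envelope ABV from sugar content. Assumes complete
# fermentation of simple sugars; real palm sap and fruit slurries
# are messier, so treat this as a rough sanity check only.

ETHANOL_YIELD = 0.51      # g ethanol per g glucose, theoretical maximum
ETHANOL_DENSITY = 0.789   # g/mL at room temperature

def abv_from_sugar(sugar_g_per_liter):
    """Approximate percent alcohol by volume from fully fermented sugar."""
    ethanol_g = sugar_g_per_liter * ETHANOL_YIELD
    ethanol_ml = ethanol_g / ETHANOL_DENSITY
    return ethanol_ml / 1000 * 100  # mL ethanol per 100 mL of drink

# Palm sap runs very roughly 10-15% sugar; at 120 g/L:
print(f"{abv_from_sugar(120):.1f}% ABV")  # about 7.8%, in line with the day-old figure above
```

The same arithmetic covers the tree-shrew nectar and the 5 percent fruit slurry: a few percent sugar can only ever become a few percent alcohol, which is why these indigenous ferments sit in the 3-8 percent range rather than anywhere near modern spirits.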
Now back to those food grind stones. The papers that describe them typically talk as if they were used to make the world's crappiest bread out of miserable wild grains. Other grind stones had more obvious uses: they ground pigments for decoration. Why not smear your face with makeup and go out and party? What if the "food" grind stones were really used for making alcoholic drinks? What if people domesticated grains mainly for use in the creation of alcoholic drinks? That seems like more of an incentive than making bitter, flat, fibrous bread disks.
It would also explain why the wild relatives of so many grains are mystifying. Looking at teosinte, the wild ancestor of corn, it's kind of baffling why humans would have bothered with the plant at all:
A series of careful DNA studies identified teosinte (genus Tripsacum) as the wild ancestor of maize. This mountain grass grows in the Rio Balsas drainage of southwestern Mexico. One cannot imagine a less inspiring plant to domesticate. The ears of this primitive corn, which are barely three centimeters long and contain only five to twelve kernels, are trapped in a tough casing. Even if you manage to free up the kernels, their nutrient value is essentially nil.
The mystery might be solved by quids, chewed and spit-out fibrous plant material. This might sound gross, but chewing plant material and spitting it into a container is an alcohol-making process that has been documented around the world. It seems very likely that the stalks of teosinte were used for this purpose.
The human mouth converts the starch to more easily fermentable sugar using amylase. Some mouths are better at this than others. Starch-consuming peoples typically have greater amylase copy numbers, though all humans have a greater copy number than primates like chimpanzees and bonobos. "Higher AMY copy numbers and protein levels probably improve the digestion of starchy foods and may buffer against the fitness-reducing effects of intestinal disease." Stephan Guyenet and I have discussed how the copy number question is interesting because salivary amylase, even at high copy numbers, contributes very little to the digestion of starch relative to pancreatic amylase. What, then, is the increased copy number for salivary amylase for? Perhaps for chewing starches like rice and corn to make delicious alcoholic beverages.
Chicha made with saliva remains an important part of the diet of many South American tribes, and a woman's ability to make it is important for her husband's social status. It is rude to refuse it, as this account written up in Salon describes:
Patton maintains that the bulk of an Achuar’s daily calories do not come from meat. They come from chicha, a mildly alcoholic, vaguely nutritious, watered-down manioc mash. Achuar men drink up to four gallons a day. Isaac’s wife and mother are in constant motion, serving bowls of chicha to the 10 or so guests. Chicha is the backbone of Achuar society. As with the ankle bone and the knee bone, you feel an unalterable pressure to accept. Chicha is the holy communion, the Manischewitz, the kava-kava of Achuar life. It’s present at every ceremony, every visit, every meal. An Achuar woman’s desirability rests in no small part on her skill at chicha brewing and serving.
Given the amount of calories and nutrients such beverages can provide, it amazes me that many ethnographic and anthropological surveys seem to ignore or downplay their presence, as if they were mere recreation.
Corn chicha, widely consumed in South America, could not only explain the domestication of teosinte, but it could also account for the fact that isotope studies during the time of corn's domestication don't seem to show people got their protein from corn:
Some very interesting results emerged when human bones from sites throughout the New World were examined. Because maize had been domesticated by about 6000 B.P., one would have expected to see a specific carbon-isotope composition that reflected the increased consumption of maize, but it was strangely missing. Some scientists have proposed an explanation for this anomaly. Because the analyses measured only the collagen in bone, its main proteinaceous connective tissue, they were biased toward detecting high-protein foods. Solid foods made from maize, including gruel or bread (e.g., tortillas), fit this requirement, but not fermented beverages like maize chicha, largely composed of sugar and water. Consequently, if people between 6000 and 3000 B.P. were consuming their maize as chicha, very little protein would have been incorporated into the collagen of their bones. The researchers speculated that humans began using maize as a solid food only after its ear had been substantially enlarged by selective breeding, around 3000 B.P. After this point, the carbon isotope compositions of bones dramatically changed.
Interestingly, going further north, the Native Americans there didn't seem to have any alcoholic beverages, or if they did, they had spread from the South. Charles Mann's 1491 discusses the hypothesis that North and South America were peopled differently, South America being populated by a sea-faring coastal society rather than overland from Beringia. McGovern describes the culture of the coastal peoples, who consumed a tantalizing array of berries, fish, mollusks, wild tubers, mastodon meat and fat (they processed enough fat that it congealed on the floor, which my sister's roommate reenacted recently by pouring some bacon grease directly down the drain), bulrushes, and seaweed. It is theorized that the cold snap of the Younger Dryas, around 13,000 years ago, may have forced them to rely more and more on underground tubers, spurring on the domestication of the potato.
However, the Siberians, like the North Americans, do not have alcoholic beverages (that we know of at least), relying on other resources for a buzz:
In place of any alcoholic beverage, the Siberian peoples engaged in shamanistic practices based on the hallucinogenic fly agaric mushroom (Amanita muscaria). When European explorers finally braved the frigid tundra of Siberia, beginning in the mid-seventeenth century, they recorded how the shaman often dressed in a deer costume with antlers, like the Palaeolithic creature depicted in Les Trois Freres cave (see chapter 1). After consuming the mushroom, he would beat on a large drum, whose monotonous repetition reinforced the effects of the active hallucinogenic compounds (ibotenic acid and muscimole) and took him into the ancestral dreamtime.
Northern peoples in the Americas also smoked tobacco. Meanwhile, people in the southern parts of North America certainly did imbibe alcohol. The Pima, who are so infamous in nutritional circles, consumed a sweet cactus wine. The health effects of another regional beverage, pulque, which is made by fermenting agave sap, have been explored a bit. Among highland tribes that consume it, it was found to account for much of the iron and vitamin C intake of pregnant women. Pregnant women who consume too much of it, or none at all, are more likely to have infants with low BMI and reduced mental performance. Consumption of pulque might also increase the bioavailability of vitamins in other traditional foods.
The use of agave in fermented beverages should be considered when looking at data from that region that suggests a high fiber consumption from these plants, particularly given the presence of quids and the fact that these fermented beverages could enhance digestion of the fructooligosaccharides in these plants. I've seen such papers conclude that this means humans in these regions ate absurd amounts of fiber and that we should emulate them. What is more likely: that anatomically modern humans were eating 255 grams of fiber a day from plants like agave, well above what any known living culture consumes, or that they were making something a bit like tequila?
It is possible that such drinks have been under-emphasized because of the very real issues of alcoholism that plague many modern indigenous peoples. However, most of these traditional alcoholic beverages are not like the modern alcohol that is abused. Indigenous beverages are typically 3-6% alcohol, seasonal, and contain many nutrients and phytochemicals, which are biologically active plant chemicals. McGovern's lab has been working on exploring the medicinal properties of many of these phytochemicals.
McGovern describes how many of the early beverages in the Middle East, ancestors of our modern wines and beers, contained potent medicinals. Early grape wines, for example, often contained tree resins:
Tree resins have a long and noble history of use by humans, extending back into Palaeolithic times. They could be used as glues and were perhaps even chewed to give pain relief, as suggested by lumps of birch resin with tooth marks that were found in a Neolithic Swiss lake dwelling...Resinated wines were greatly appreciated in antiquity, as we have come to see in analyzing wines from all over the Middle East, extending from the Neolithic down to the Byzantine period. Although some wine drinkers today turn up their noses at a resinated wine, now made only in Greece as retsina, the technique is analogous to ageing in oak. The result can actually be quite appealing: the Gaia Estate's Ritinitis has a mildly citrusy flavor, achieved by adding a very slight touch of Aleppo pine resin to a Greek grape variety. Even the Romans added resins such as pine, cedar, terebinth (known as the "queen of resins"), frankincense, and myrrh to all their wine except extremely fine vintages. According to Pliny the Elder, who devoted a good part of book 14 of his Natural History to resinated wines, myrrh-laced wine was considered the best and most expensive.
It is a powerful reminder to consider ancient diets holistically, that things were not just consumed for their nutritive value, but for recreational, medicinal, and religious purposes. And possibly some of these substances were "unwise" traditions and may account for some of the diseases found in mummies and skeletal remains if people drank too much or adulterated their beverages with carcinogens and other poisons. Even today, adaptation to alcohol seems uneven and imperfect in humans, as many Asians who experience Alcohol Flush Reaction will attest. Distilled high-alcohol spirits are also very much an evolutionary novelty. As someone with alcoholism running in the family, I very much understand that consumption of these kinds of alcohol can be difficult for certain people to moderate with terrible, even deadly consequences.
I think renewed study and emphasis on fermented alcoholic beverages in human evolution will provide much insight into human adaptations to food and the development of domesticated crops. Even with the knowledge we have now, I think it's wholly inappropriate to describe liquid carbohydrates as evolutionarily novel. Soda is novel in that it is a liquid carbohydrate devoid of any of the nutrients or phytochemicals in indigenous beverages, but mainly we need to look to modern science and biochemistry to tell us what effect soda has on the body and mind.
It's also fascinating to see some of these ancient beverages recreated and revived. I've since tasted several of McGovern's collaborations with Dogfish Head, such as Chateau Jiahu, which is made from a rice, honey, and fruit recipe gleaned from 9,000-year-old Chinese pottery. I've also enjoyed some of the more modern spit-free chicha at several Peruvian restaurants, and being a lightweight, I appreciate that it's pretty low in alcohol and also very tasty. There has also been renewed interest in home brewing ancient herbal ales. You can do it yourself with the book Sacred & Herbal Healing Beers. There are also some herbal beers on the market. I've enjoyed Williams Brothers' spruce, seaweed, and heather beers. Unfortunately, none of these beers are gluten-free, which is slightly disappointing since the original Jiahu pottery probably did not contain barley.
I enjoyed Uncorking the Past, but it does read a bit like a textbook at times, which is why it took me so long to get through it. I'm looking forward to enjoying more of his brews though. Dogfish Head even tried making chicha the old-fashioned way, though it didn't exactly work out, since it was more labor-intensive than they expected.
If you like shrews, especially if you like them parboiled, you'll want to devour a 1994 study published in the Journal of Archaeological Science. Called Human Digestive Effects on a Micromammalian Skeleton, it explains how and why one of its authors – either Brian D Crandall or Peter W Stahl; we are not told which – ate and excreted a 90mm-long (excluding the tail, which added another 24mm) northern short-tailed shrew (Blarina brevicauda).
This was, in technical terms, "a preliminary study of human digestive effects on a small insectivore skeleton", with "a brief discussion of the results and their archaeological implications". Crandall and Stahl were anthropologists at the State University of New York in Binghamton. The shrew was a local specimen, procured via trapping at an unspecified location not far from the school. For the experiment's input, preparation was exacting. After being skinned and eviscerated, the report says, "the carcass was lightly boiled for approximately 2 minutes and swallowed without mastication in hind and fore limb, head, and body and tail portions".
Here's how Crandall and Stahl handled the output: "Faecal matter was collected for the following 3 days. Each faeces was stirred in a pan of warm water until completely disintegrated. This solution was then decanted through a quadruple-layered cheesecloth mesh. Sieved contents were rinsed with a dilute detergent solution and examined with a hand lens for bone remains." They then examined the most interesting bits with a scanning electron microscope, at magnifications ranging from 10 to 1,000 times.
A shrew has lots of bony parts. All of them entered Crandall's gullet, or maybe Stahl's. But despite extraordinary efforts to find and account for each bone at journey's end, many went missing. One of the major jawbones disappeared. So did four of the 12 molar teeth, several of the major leg and foot bones, nearly all of the toe bones, and all but one of the 31 vertebrae. And the skull, reputedly a very hard chunk of bone, emerged with what the report calls "significant damage".
The vanishing startled the scientists. Remember, they emphasise in their paper, that this meal was simply gulped down: "The shrew was ingested without chewing; any damage occurred as the remains were processed internally. Mastication undoubtedly damages bone, but the effects of this process are perhaps repeated in the acidic, churning environment of the stomach."
Chewing, they almost scream at their colleagues, is only part of the story. In each little heap of remains from ancient meals, there be mystery aplenty. Prior to this experiment, archaeologists had to, and did, make all kinds of assumptions about the animal bones they dug up, especially what those partial skeletons might indicate about the people who presumably consumed them. Crandall and Stahl, through their disciplined lack of mastication, have given their colleagues something toothsome to think about.
The human stomach was more capable of digesting bones than they expected. This isn't terribly surprising to me, as many cultures consume whole bone-in animals and there is plenty of archaeological evidence for this. Here's a bit from John Speth's book:
Well-preserved prehistoric human coprolites (feces) recovered in large numbers from dry caves throughout western North America are full of pulverized bone fragments, including pieces of broken skulls, as well as fur and feathers, indicating that rodents, rabbits, birds, lizards, snakes, and amphibians were often cooked whole, pounded in a wooden mortar or on a milling stone, and then consumed in their entirety – bones, fur, feathers, and all, including the precious DHA in the brains (Reinhard et al. 2007; Sobolik 1993; Yohe et al. 1991).
It would appear that the Desha people at Dust Devil Cave ate rabbit legs more-or-less whole, then pounded the rest of the carcass before eating it... The consumption of wood rats (Neotoma spp.), also known as pack rats, has been noted ethnographically. They were regarded as good food by the Yaqui (Spicer, 1954: 49), constituted a staple for all tribes along the lower Colorado River (Castetter & Bell, 1951: 217), and many were eaten by the Tohono O’Odham. The Cocopah set fire to their nests, clubbing the rats as they emerged, undoubtedly fragmenting some bone in the process.
In the past, there was perhaps more focus on big game hunting. And while big game bones are nice, they are harder to process than little animal bones. Primates have probably been digesting little bones for much longer than they have been breaking open larger bones for marrow. Excessive focus on big game has led to ignoring the contribution of small game to human nutrition, which has also led to the misconception that women don't hunt since some anthropologists classified small game hunting as gathering.
It would be interesting to know if other primates can also digest bone. Chimpanzees seem to degrade the bones of other primates they hunt and consume (PDF). Salad lovers might be interested to know that when chimpanzees consume a meal of meat, they consume it with leaves.
It would also be fascinating to know if humans possess the same ability as some other carnivores to use animal parts as de facto fiber and ferment them into SCFA.
At some point in human evolution, humans developed technology to extract nutrients from bones more efficiently than their own stomachs can, which is referred to as "grease processing" in many archaeological papers but is close to what we do in making broths today. It is understandable why humans developed this, considering a meaty meal for a chimpanzee can take nearly the entire day to consume. Frankly, while I like a 6-hour tasting menu sometimes, I don't have time for that very often.
But today, could many humans handle bone? With dietary and medical factors like widespread use of proton pump inhibitors reducing acidity of the digestive tract, are we losing this capability?
For the record, I have never eaten a whole rodent, bones and all, though I have eaten many small whole bony fish. There is some indication that humans degrade fish bones more completely, leading to their relative scarcity in coprolites and an underestimation of their importance in the diet.
Perhaps whole-rat eating is becoming trendy again, though: a posh rat dinner was featured in the New York Times recently.
Suggestions that humans may have obtained more calories from SCFA in the past are rooted in estimates of fiber consumption from the Paleolithic. Evidence is rather sparse and limited to coprolites. In the burgeoning field of evolutionary medicine, anthropologists have become very interested in the Paleolithic diet and its relevance for promoting health today. Some of the landmark papers in the field have cited these coprolite studies as evidence for fiber intakes as high as 150 grams a day, well over what any known human culture currently consumes (Konner & Eaton, 2010). Even if the method for estimating fiber consumption from coprolites is accurate, the estimates may not represent some species-level optimum; they may in fact suggest a ceiling for safe fiber consumption that great apes do not share.
The argument for studying the Paleolithic diet in order to see what is optimal for modern humans to eat stems from the fact that archeological remains suggest that Paleolithic hunter-gatherers were healthier than their agrarian descendants (Eaton, Konner, & Shostak, 1996). Generally, they were taller, had fewer skeletal pathologies, and their teeth were in much better condition.[1] One of the sources for these high fiber estimates is coprolites from prehistoric Indians of the North American desert southwest, who consumed as much as 150 to 225 grams of fiber a day (Leach & Sobolik, 2010). Far from being in admirable health, evidence shows they had extensive dental caries. Some have speculated that this was caused by high levels of phytoliths in their diet, which wore down their teeth (Danielson & Reinhard, 1998). But other populations with extensive tooth wear do not exhibit high levels of caries[2] (B. H. Smith, 1984). This raises the question of whether wear really caused their caries or whether their fiber consumption did.
Clues come from modern nutritional biochemistry. Dietary fiber has the ability to reduce blood levels of vitamin D, which is vital in tooth and bone mineralization (Batchelor & Compston, 1983). This may be the reason that some populations with high-fiber diets in Asia exhibit vitamin D deficiency despite adequate sun exposure. Children on macrobiotic diets, which are high in fiber, have higher than normal rates of rickets (Dagnelie et al., 1990). However, macrobiotic diets and those of rural Asians are notably low in animal products and high in plants different from those our Paleolithic ancestors ate, plants that contain mineral-leaching phytic acid (Raboy, 2001). The fiber in other primate diets, and presumably in Paleolithic diets, is mostly dicot vegetable fiber, whereas modern grain fibers come from monocotyledonous plants (Milton, 1989). It’s also possible that these problems would not occur at Paleolithic levels of animal product consumption, as animal products are rich sources of vitamin D and minerals (Konner & Eaton, 2010).
Other anthropologists have tried to infer ancestral fiber consumption from the diets of modern foraging and agrarian populations, but these attempts have run into their own problems. Incorrect laboratory methods of analysis marred early data sets, though some of these data sets are still being cited in newer papers. Examples include an estimate of Ugandan fiber consumption of 150 grams a day that was revised to 70 grams a day, and an estimate for Kenyans of 130 grams a day reduced to 86 grams a day (Wrangham, Conklin-Brittain, & C. C. Smith, 2002). Other problems have come from analyzing fiber outside of its dietary context. For example, much like we don’t consume the peels of bananas, the Hadza don’t consume whole wild tubers. When they eat tubers, they chew them and spit out the excess fiber. Obviously, estimates of fiber consumption based on the whole tuber are overestimates (Schoeninger, 2001).
Fiber spit out by the Hadza
Given this, the use of extremely high estimates of Paleolithic fiber intake, based on limited data, as a baseline for optimal consumption seems misplaced. No known culture consumes over 100 grams of fiber a day. The highest recent estimate was 86 grams for some agrarian cultures in Africa (Wrangham et al., 2002).
Some of the issue is also an overemphasis on fiber, when other food constituents that play a similar role may have been more important in human evolution. Early optimism that high fiber intake could prevent many diseases of civilization, like heart disease and type II diabetes, spurred many studies on the matter. These have had mixed results. There have been several expensive failed studies, such as the Dietary Modification Trial of the Women’s Health Initiative, which followed over forty-nine thousand women and found that increasing dietary fiber had no effect on risk of colon cancer, breast cancer, or heart disease and no effect on weight loss (Beresford et al., 2006). Those who cling to the fiber hypothesis insist that the trials have not been long enough or high enough in fiber (Byers, 2000).
Past focus on fiber centered on its ability, as indigestible bulking matter, to speed digestive transit and bind up certain food constituents (J. Smith, Yokoyama, & German, 1998). The dominant theory was that slower transit allowed carcinogens and other potential toxins to fester in the body. This idea spawned a cottage industry of quacks and religious movements advertising “cleanses” (Kellogg, 1923) that has remained robust to this day, but it has not stood up to scientific scrutiny.
Next up: fiber or bacteria??
1. This argument seems suspect: while early agrarians seem to have had high levels of disease based on skeletal evidence, later agrarians and pastoralists are often much taller than Paleolithic people and also exhibit a low incidence of pathology.
2. Including cultures that purposefully file down their teeth.
Batchelor, A. J., & Compston, J. E. (1983). Reduced plasma half-life of radio-labelled 25-hydroxyvitamin D3 in subjects receiving a high-fibre diet. The British journal of nutrition, 49(2), 213-6. Retrieved May 2, 2011, from http://www.ncbi.nlm.nih.gov/pubmed/6299329.
Beresford, S. A. A., Johnson, K. C., Ritenbaugh, C., Lasser, N. L., Snetselaar, L. G., Black, H. R., et al. (2006). Low-fat dietary pattern and risk of colorectal cancer: the Women's Health Initiative Randomized Controlled Dietary Modification Trial. JAMA: The Journal of the American Medical Association, 295(6), 643-54. doi: 10.1001/jama.295.6.643.
Byers, T. (2000). Diet, colorectal adenomas, and colorectal cancer. The New England journal of medicine, 342(16), 1206-7. doi: 10.1056/NEJM200004203421609.
Dagnelie, P., Vergote, F., Staveren, W. van, Berg, H. van den, Dingjan, P., & Hautvast, J. (1990). High prevalence of rickets in infants on macrobiotic diets. Am J Clin Nutr, 51(2), 202-208. Retrieved May 2, 2011, from http://www.ajcn.org/cgi/content/abstract/51/2/202.
Danielson, D. R., & Reinhard, K. J. (1998). Human dental microwear caused by calcium oxalate phytoliths in prehistoric diet of the lower Pecos region, Texas. American journal of physical anthropology, 107(3), 297-304. doi: 10.1002/(SICI)1096-8644(199811)107:3<297::AID-AJPA6>3.0.CO;2-M.
Eaton, S. B., Konner, M. J., & Shostak, M. (1996). An evolutionary perspective enhances understanding of human nutritional requirements. The Journal of Nutrition, 126(6), 1732-40. Retrieved March 26, 2011, from http://www.ncbi.nlm.nih.gov/pubmed/8648449.
Kellogg, J. H. (1923). The Natural Diet of Man.
Konner, M., & Eaton, S. B. (2010). Paleolithic nutrition: twenty-five years later. Nutrition in Clinical Practice, 25(6), 594-602. doi: 10.1177/0884533610385702.
Leach, J. D., & Sobolik, K. D. (2010). High dietary intake of prebiotic inulin-type fructans in the prehistoric Chihuahuan Desert. British Journal of Nutrition, 103(11), 1558-1561. Retrieved May 10, 2011, from http://journals.cambridge.org/abstract_S0007114510000966.
Milton, K. (1989). Primate diets and gut morphology: implications for hominid evolution. In M. Harris & E. B. Ross (Eds.), Food and Evolution: Toward a Theory of Human Food Habits (p. 93). Temple University Press. Retrieved May 8, 2011, from http://books.google.com/books?hl=en&lr=&id=xHYxSHr86T8C&pgis=1.
Raboy, V. (2001). Seeds for a better future: “low phytate” grains help to overcome malnutrition and reduce pollution. Trends in plant science, 6(10), 458-62. Retrieved May 9, 2011, from http://www.ncbi.nlm.nih.gov/pubmed/11590064.
Schoeninger, M. (2001). Composition of Tubers Used by Hadza Foragers of Tanzania. Journal of Food Composition and Analysis, 14(1), 15-25. doi: 10.1006/jfca.2000.0961.
Smith, B. H. (1984). Patterns of molar wear in hunter-gatherers and agriculturalists. American Journal of Physical Anthropology, 63(1), 39-56. doi: 10.1002/ajpa.1330630107.
Smith, J., Yokoyama, W., & German, J. B. (1998). Butyric Acid from the Diet: Actions at the Level of Gene Expression. Critical Reviews in Food Science and Nutrition, 38(4), 259-297. doi: 10.1080/10408699891274200.
Wrangham, R., Conklin-Brittain, N.-L., & Smith, C. C. (2002). A Two-Stage Model of Increased Dietary Quality in Early Hominid Evolution: The Role of Fiber. In P. S. Ungar & M. F. Teaford (Eds.), Human diet: its origin and evolution (p. 206). Greenwood Publishing Group. Retrieved May 9, 2011, from http://books.google.com/books?hl=en&lr=&id=6GDELypdTUcC&pgis=1.
This will be one of the few series posts I'll actually finish since it's already written :) I'd like to thank Stephan Guyenet, Chris Masterjohn, and Professor Holloway for their tips, critiques, and inspiration! I welcome more such educated thoughts in the comments. Full disclosure: yes, I did write this for a class, but I thought some people might enjoy it and then I could also kill two birds with one stone. Haha.
In 1995, anthropologists Leslie C. Aiello and Peter Wheeler published a paper on a theory they termed The Expensive Tissue Hypothesis (ETH). Expensive refers to our brain tissue, which is uniquely metabolically demanding compared to other primate brains (Aiello & Wheeler, 1995). However, our total metabolic rate is close to what would be predicted for a primate our size, so according to the ETH, humans compensated for the increased metabolic costs of the brain by evolving less metabolically expensive splanchnic organs, which include the gut and liver. Humans were able to fuel their large brains using only a relatively small gut because increased dietary quality reduced the need for gut mass. The hypothesis was that the main driver of this increased dietary quality was the increased use of animal products.
Aiello and Wheeler
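To spell out the energetic bookkeeping behind the ETH (my own framing of the logic, not figures from the paper): resting metabolism is roughly the sum over organs of mass times mass-specific metabolic rate, while across mammals total resting metabolism scales with body mass according to Kleiber's law:

$$\mathrm{BMR} \approx \sum_i m_i \, r_i, \qquad \mathrm{BMR} \propto M^{3/4}$$

If total BMR sits on the primate line while brain mass sits far above it, the extra brain cost must be offset elsewhere, roughly as $\Delta m_{\mathrm{brain}}\, r_{\mathrm{brain}} \approx -\,\Delta m_{\mathrm{gut}}\, r_{\mathrm{gut}}$. Because brain and splanchnic tissues have comparably high mass-specific metabolic rates, the trade can happen nearly gram for gram, which is why the gut is the natural candidate organ.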
This hypothesis rests on the assumption that reduced gut size coincided with the major jump in encephalization experienced by hominids millions of years ago. In their calculations, Aiello and Wheeler used the modern human gut to demonstrate its uniquely small size. Unfortunately, using the modern human gut as a benchmark has some problems: there is evidence that it was reduced in size by dietary innovations that may have taken place long after encephalization, and that it has continued to evolve since. The trend in human innovation has been toward a diet of increased quality, and this innovation continues even today. In response to these dietary changes, the human population shows variation in dietary adaptations. The reorganization and variation of the human colon provide important clues about this process.
Exactly how unusual is the modern human gut? Based on a reduced major axis equation computed for higher primates, the human gut should be about 781 grams larger than it actually is (Aiello & Wheeler, 1995).
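For readers curious about the shape of that calculation: a reduced major axis regression fits a power law between organ mass and body mass on log-log axes, and the “expected” human gut is read off the primate line at our body mass. A minimal sketch, with $a$ and $b$ as placeholder coefficients rather than Aiello and Wheeler's published values:

$$\log_{10} m_{\mathrm{gut}} = a + b \, \log_{10} M \quad\Longrightarrow\quad \hat{m}_{\mathrm{gut}} = 10^{a} M^{b}$$

The 781-gram figure is then the gap $\hat{m}_{\mathrm{gut}} - m_{\mathrm{gut,\,observed}}$ between that prediction at human body mass and the gut we actually have.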
It is hard to know when this change started, as guts do not fossilize, nor do they leave impressions the way brains do in endocasts. However, it is possible to infer some information from post-cranial anatomy. Living apes with big guts have protuberant abdomens to accommodate them.
Skeletally, they have a rounded abdomen continuous with the lower portion of the rib cage, giving the trunk a funnel shape, as well as a wide pelvis with flared upper margins. In the fossil record, we can see that Australopithecus afarensis had skeletal anatomy that would indicate a large gut, if this pattern holds.
Figure 3: Chimpanzee, human, and Australopithecus afarensis, from Aiello and Wheeler
In contrast, the human pelvis is reduced in size and the abdomen has a defined waist region. This pattern appears in the fossil record beginning with Homo erectus, about 1.5 million years ago. However, there is some evidence that this anatomical change may not have to do with gut size. For one, the pattern is not entirely consistent among hominids: a reconstruction of a post-cranial Neanderthal skeleton based on the 70,000-year-old La Ferrassie 1 and 60,000-year-old Kebara 2 specimens shows a wider trunk appearing again (Sawyer & Maley, 2005).
It is possible that the trunk and pelvis size represented adaptations to cold, a type of hunting, or some other lifestyle variable (Bramble & Lieberman, 2004). Until more data is collected and analyzed tying post-cranial anatomy to gut mass, it is hard to tell if the inference is valid.
In a response to the ETH paper in 1995, Katherine Milton questioned whether the data presented were really representative of our species. She stated that our guts may have played a larger role before the relatively recent invention of agriculture, when fiber consumption was much greater, and that “gut plasticity” might have made them larger then. What really sets us apart from our primate relatives, she argued, is the morphological reorganization of the gut rather than its size.
Compared with other primates, the human gut is reorganized: the colon is much reduced and the small intestine enlarged. The human colon takes up 17-23% of the digestive tract, while in chimpanzees, orangutans, and gorillas it occupies 52-54%. Instead of a large colon, humans have a small intestine that represents 56-67% of the gut (Milton, 1989).
These proportions are important because of each region's role in digesting food. The small intestine is where a primate's own enzymes digest the nutrients immediately available in food and where those nutrients are absorbed. The colon, in contrast, can be thought of as a bioreactor, where bacteria turn otherwise useless dietary constituents into important nutrients and other chemical byproducts. These include short-chain fatty acids (SCFA): organic fatty acids with 1-6 carbon atoms, created by the fermentation of polysaccharides, oligosaccharides, protein, peptides, and glycoprotein precursors in the colon. The major source of SCFA in primates is the fermentation of fiber and some types of starch. The major difference here between humans and the other great apes is that apes such as the gorilla can use their larger colons to obtain as much as 60% of their caloric intake from SCFA alone (Popovich et al., 1997), while upper estimates for human caloric use of SCFA range from seven to nine percent (McNeil, 1984).
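To make those percentages concrete, here is a back-of-the-envelope comparison of my own, assuming for illustration the same 2,000 kcal daily intake for both species (gorillas of course eat far more in absolute terms):

$$\text{human: } 0.07\text{--}0.09 \times 2000 \approx 140\text{--}180\ \text{kcal}, \qquad \text{gorilla: } 0.60 \times 2000 = 1200\ \text{kcal}$$

Even at the upper estimate, human colonic fermentation supplies less than a sixth of the energy share that a gorilla's colon provides.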
Figure 6: The contribution of SCFA to metabolism in gorillas from Popovich, et al.
Aiello, L. C., & Wheeler, P. (1995). The Expensive-Tissue Hypothesis: The Brain and the Digestive System in Human and Primate Evolution. Current Anthropology, 36(2), 199. doi: 10.1086/204350.
Bramble, D. M., & Lieberman, D. E. (2004). Endurance running and the evolution of Homo. Nature, 432(7015), 345-52. Nature Publishing Group. doi: 10.1038/nature03052.
McNeil, N. (1984). The contribution of the large intestine to energy supplies in man. Am J Clin Nutr, 39(2), 338-342. Retrieved May 2, 2011, from http://www.ajcn.org/cgi/content/abstract/39/2/338.
Milton, K. (1989). Primate diets and gut morphology: implications for hominid evolution. In M. Harris & E. B. Ross (Eds.), Food and Evolution: Toward a Theory of Human Food Habits (p. 93). Temple University Press. Retrieved May 8, 2011, from http://books.google.com/books?hl=en&lr=&id=xHYxSHr86T8C&pgis=1.
Popovich, D. G., Jenkins, D. J. A., Kendall, C. W. C., Dierenfeld, E. S., Carroll, R. W., Tariq, N., et al. (1997). The Western Lowland Gorilla Diet Has Implications for the Health of Humans and Other Hominoids. J. Nutr., 127(10), 2000-2005. Retrieved April 28, 2011, from http://jn.nutrition.org/cgi/content/abstract/127/10/2000.
Sawyer, G. J., & Maley, B. (2005). Neanderthal reconstructed. Anatomical record. Part B, New anatomist, 283(1), 23-31. doi: 10.1002/ar.b.20057.
Physical anthropologist John D. Speth wrote a fantastic book called The Paleoanthropology and Archaeology of Big-Game Hunting: Protein, Fat, or Politics? It's kind of a crime that it's not more widely available: it sells for $134 on Amazon, which is totally lame. If you are a student, though, you can probably get it for free; for my institution, SpringerLink had a free ebook download! I don't have time to do it justice right now, but there is a great chapter I just wanted to mention. It's about the high-fat African game animals that are disproportionately represented in many sites tied to Paleolithic hunting.
This is an opportune moment to take another brief detour into the realm of archaeology, this time to look at views about the hunting capabilities of hominins who occupied sub-Saharan Africa between about 300,000 years ago and roughly 40,000 years ago, give or take a few millennia. For those not too familiar with archaeology, in Africa this period of the Paleolithic is known as the Middle Stone Age (MSA). During more or less the same period of time, Europe and western Asia were inhabited by Neanderthals, and in these more northerly latitudes of the Old World the comparable part of the archaeological record is referred to as the Middle Paleolithic (MP). Richard Klein has written extensively about the hunting strategies of MSA peoples, focusing particularly on the faunal record from two well-preserved and widely known cave sites located east of Cape Town along South Africa’s Indian Ocean coast – Klasies River Mouth and Die Kelders.
Klein has argued for many years that MSA hominins lacked not only the technological know-how of the people who followed them during the ensuing Later Stone Age (LSA), but they also lacked the cognitive wherewithal. Interestingly, eland remains in these caves are central to his line of thinking, and hence the reason for this detour. And, as I have been doing throughout the book, I will let Klein speak for himself.
In contrast to the other ungulates, the eland in MSA sites include a large proportion of prime-age adults, and the age profile has a catastrophic shape…. The most likely explanation is that MSA people had learned that, unlike most other large African bovids, eland can be easily driven, without much personal risk. An eland herd caught in the right position could be forced over a cliff or into a trap…. However, MSA people could not have driven eland herds to their death very often or the species would have become extinct, since its reproductive vitality would have been sapped by the continuing loss of a large proportion of the available prime adults. Not only did the eland survive, but there is no evidence that it became less numerous during the long MSA time span…
Thus, MSA people were probably not very successful at hunting eland, and this makes it especially interesting that eland is the most abundant ungulate in the MSA faunas. The clear implication is that MSA people must have been even less successful at hunting other species that are less common in the sites but were more common in the environment. In short, MSA impact on the large mammal fauna was negligible. By extension, it may be argued that LSA peoples, in whose sites eland and other species are represented more in proportion to their live abundance, probably took a higher proportion of game overall. In short, LSA people were almost certainly more proficient hunters. Klein (1987:36–37)
I think this argument needs to be turned on its head. Judging by the many quotes from historic accounts that I have already presented, all of which extol the virtues of the eland as the “game-of-game” in a land of fat-poor animals, the eland is precisely the animal that one should target if the animal is available and the hunters possess the means. If anything the abundance of prime-adult elands in MSA sites is testimony to just how good, and successful, they were as hunters, not evidence of their impoverished cognitive capacity. It is the LSA hunters that should be the focus of interest here. Why were they (as it would seem) compelled to concentrate more on the far leaner and smaller game, the prey that explorer after explorer considered inferior food, especially when they were short of adequate sources of carbohydrates or alternative means of acquiring fat? It seems far more likely that the hunters of the LSA were under some level of stress, either because they managed to overhunt the elands, or perhaps because environmental changes reduced the numbers of elands. All of this remains speculative, of course, but I think the one conclusion we can safely draw from this is that the presence or absence of eland in archaeological sites tells us nothing about innate cognitive capacities.
Incidentally, the abundance of prime adults, evidence that led Klein to postulate that MSA hunters may have driven groups of eland over cliffs or into traps (the “catastrophic” age profile that Klein refers to – that is, an age structure that resembles what one would observe in a living population) need not imply mass kills. Since the faunal assemblages are aggregates or palimpsests of countless individual hunting episodes, the abundance of adult animals in their prime is what one might expect if hunters often deliberately sought out animals that were at their peak in condition, but also now and then killed whatever eland came within their sights, regardless of age. It would be interesting to know what proportion of the adult eland at Klasies and Die Kelders were males….
Hmm, possible dissertation topic? What's so important about eland? Why would hunters target them?
An 18th-century Swedish naturalist quoted in the book gives some clues:
This animal [“Cape-elk” or eland] has a great deal of fat, especially about the heart: from an old male which we gave chase to and shot, we got such a quantity of fine and tender fat, as we could with difficulty get into a box that would hold about ten pounds of butter. As at the commencement of our journey homewards through the desert, the hounds we had with us had unluckily devoured our stock of butter, a farmer, who still accompanied us, showed us how to prepare the fat from about the heart of the elk, and to use it for dressing victuals with and for eating on bread in the same manner as is generally practised with goose-grease and hogs-lard. The taste of it also was very similar to these, and to the full as good; and, indeed, if I may be supposed to have been able to form any judgment of the matter at a time when we were so sharp set, and in absolute want of any thing else of the kind, it was rather better. The breast is likewise extremely fat, and is always looked upon as a great delicacy. The flesh is universally of a finer grain, more juicy and better tasted than that of the hart-beest. Sparrman (1785:207–208)
Speth has great information on early food containers in which hunter-gatherers may have stored things like fat or boiled bone grease. Pottery may date back as far as 20,000 years, but it's also possible to store and cook with liquids in skins and other containers that would not be present in the record. He also takes down the common use of the San (bushmen) as Paleolithic proxies.
I'll post more about this book soon. I've been very busy with school, but luckily this is what I study, so I have ample fodder for posts now. I feel very bad for people who don't have access to these things.
It's possible they will take outdated old books and try to push their own rather narrow conclusions. Although you don't need research journal access to find out that Boyd Eaton thinks his own conclusions were wrong (though Konner is still holding out):
Meanwhile, paleo eating continues to evolve. In 1985, Eaton and Konner allowed foods like skim milk and whole-wheat bread. Konner still thinks that was the right call, and believes his original concerns about fat were prudent. “You can’t just go to the supermarket and buy meat loaded with fat and say you’re doing the Paleolithic diet. You’re not.” Animals of 10,000 years ago, Konner says, were less fatty—so we must compensate by eating leaner meats, and less. Eaton has gone the other way. He says he had failed to consider the contribution of non-muscle meat like brain and fat depots, and thus underestimated the amount of fat we need. “It makes me feel stupid!”
In full disclosure, I don't think there is enough evidence either way to draw a conclusion about fat in the Paleolithic, and we are dependent on modern nutritional science to elucidate whether fat (or which types of fat) is healthy. I am also a big fan of the idea that human evolution is ongoing and didn't freeze in Lower Paleolithic Africa. I personally cycle between low- and high-fat, but do best on high-fat (I lose my period on low-fat, for example).
As interesting as Venus-gate is, I don't think art from the Paleolithic really tells us much about the health of the average person. Think of some famous artwork from our era, and imagine a nuclear disaster destroys everything except that one piece. What incorrect conclusions would a future society come to if that was all it had? Humans have an incredible ability to see things in nothing, like this "time traveler" discovered in an old photo.
But Venus isn't the only victim. Remember, there is no evidence that the Venus of Willendorf was a portrait of a person: it's not an n=1 situation, it's an n=0 situation. There is no way to prove that the carver must have been familiar with excess adiposity in order to create a figurine like that, and there is certainly nothing you can draw from that single figurine to suggest we should eat a low-fat diet!
Another great example of grasping at straws to use art in paleopathology (the study of disease in archeological remains, though this is obviously stupid here since there ARE no remains, just random statues) is Male Genital Representation in Paleolithic Art: Erection and Circumcision Before History.
In this paper the authors suggest that various features of phallic figurines from the Paleolithic indicate all sorts of pathologies, such as phimosis. Furthermore, the authors claim the figurines show that circumcision was practiced, though they reluctantly admit the depictions could also be of retracted penises. There is a reason this isn't in the American Journal of Physical Anthropology, but in Urology instead. Here is some of the diagnostic "evidence":
Why are the weird phallic pieces proof of phimosis, but the drawings with giant phalli not proof that people back then had abnormally ginormous penises? Why would you assume any sort of anatomical correctness in these sorts of things? Just because you didn't put a foreskin in your art doesn't mean your subjects didn't have one. I can report that in bars in Europe, where few men are circumcised, the penises people draw in the bathroom (co-ed bathrooms are rather unpleasant, BTW) look like those in Figure 5.
So yeah, using art to muse about what people were like back then is interesting. Using it to diagnose illness or make inferences about the population is just silly.
Edit: apparently it is true that circumcision is practiced by at least 7 forager groups. Interesting.
FYI: if you've told your family about the paleo diet, sometime this week you are liable to get sent the article Neanderthals may have feasted on meat and two veg diet by your Aunt Maude, who was dismayed last Thanksgiving when you didn't want two helpings of her refried bean casserole.
Since apparently reporters are unable to write the name of the paper or the journal in their lame articles, I went to the trouble of finding this paper, which is from the Proceedings of the National Academy of Sciences and is titled Microfossils in calculus demonstrate consumption of plants and cooked foods in Neanderthal diets (Shanidar III, Iraq; Spy I and II, Belgium).
While the method used in this paper shows considerable ingenuity, the theory that Neanderthals ate plants has been around for quite some time. This evidence is more convincing than past evidence, though, which focused mostly on the environments where Neanderthals were thought to live, such as the 2004 paper The Exploitation of Plant Resources by Neanderthals in Amud Cave (Israel): The Evidence from Phytolith Studies.
As usual, journalists didn't get the memo: they have been portraying Neanderthals as top-level carnivores (which they were in some regions) and parroting the idea that they died out because they couldn't diversify their diet.
Michael, a commenter, pointed me to that paper, and other commenters criticized it because it relied on plant remains in cave sediment, which could have been there for other reasons. True, but this paper is much more convincing because the researchers looked at the actual teeth of Neanderthals. Apparently these Neanderthals had some dental calculus (it would be interesting to know if more northern populations had it too), and in that calculus there were some teeny tiny microfossils. Those microfossils provided evidence of particular plants consumed, as well as evidence that some of them had been cooked. So there were cooked plants in the mouths of these Neanderthals, which makes it likely that they were eating plants.
Some of the fossils identified were from date palms (tasty), water lily roots, flowering legumes, and grasses. Of course there could be errors in identifying these fossils and we can't know how many of these foods were eaten.
But as more and more of this evidence comes to light, the idea that a diet low in plant antinutrients = a paleo diet is going to be increasingly disputed.
Rough chart of this idea. Comments on additions/subtractions welcome.
The writer of what was formerly Plant Poisons and Other Rotten Stuff has been saying this for some time. I think it's wise to avoid the major neolithic agents of inflammation: gluten, sugar, and high omega-6 vegetable oils, but what you eat beyond that should be targeted towards your own constitution rather than what "Grok" ate.
Chris Kresser had a great post about that today, which I suggest reading.
I was curious recently about the use of bones as food in the Paleolithic. One interesting paper I found was Gazelle bone fat processing in the Levantine Epipalaeolithic. The Epipalaeolithic is an era confined to the Levant, in the Eastern Mediterranean, roughly contemporaneous with the Mesolithic of Western Europe (21-11.5 thousand years ago). Hunter-gatherers of this era had more advanced tools than their predecessors, and one thing they apparently used them for was extracting greater nutrition from animal bones. The major products from bones were marrow and grease. Humans may not have the jaws of hyenas, which also consume bones, but we have the smarts to devise tools to get at these nutritionally valuable products. The amount of time spent processing bones speaks to their nutritional importance, and the processing also leaves good archaeological evidence.
While bone marrow is hard to extract, it was worth it for these hunter-gatherers considering how nutritious it is. There is evidence for marrow processing as early as 5 million years ago. Grease is also present in bones, within their spongy microstructure, but it requires more technology to extract than marrow does. The Epipalaeolithic represented a bridge between foraging and sedentism, so by this point food was being stored: grease could be kept as solid cakes, in skin bags, or mixed with meat as pemmican. Extracting grease required pounding or breaking the bones and then boiling them; the grease could then be skimmed from the surface. Back then most containers were made of organic matter, which means there isn't much good evidence for their exact nature. I remember seeing an argument on a paleo message board some time ago about containers; this paper references evidence of organic containers that were probably heated with hot rocks from a fire, a method still used in some saunas I have attended.
How much and what kind of grease and marrow varied by animal species, age, season, weight, and physical condition. The species found in the Levant sites studied included fallow deer, tortoises, hare, and partridge.
This paper interested me because I've been thinking a lot about cooking methods and adopting ones that are gentler than frying. The evidence is quite clear that boiling has been in use for a long time and represents an excellent way to extract further nutrition from animals. The point that bones need to be broken open to get the most out of them is worth remembering: ask your butcher to cut your bones so you can enjoy the marrow and make more nutritious and delicious stock. A cookbook with some great info about what cuts to ask for is Bones by Jennifer McLagan.