Decolonizing the Diet: Nutrition, Immunity, and the Warning from Early America

Decolonizing the Diet challenges the common claim that Native American communities were decimated after 1492 because they lived in “Virgin Soils” that were biologically distinct from those in the Old World. Comparing the European transition from Paleolithic hunting and gathering with Native American subsistence strategies before and after 1492, the book offers a new way of understanding the link between biology, ecology and history. Synthesizing the latest work in the science of nutrition, immunity and evolutionary genetics with cutting-edge scholarship on the history of indigenous North America, Decolonizing the Diet highlights a fundamental model of human demographic destruction: human populations have been able to recover from mass epidemics within a century, whatever their genetic heritage. They fail to recover from epidemics when their ability to hunt, gather and farm nutritionally dense plants and animals is diminished by war, colonization and cultural destruction. The history of Native America before and after 1492 clearly shows that biological immunity is contingent on historical context, not least in relation to the protection or destruction of long-evolved nutritional building blocks that underlie human immunity.


by Gideon Mailer, Nicola Hale

eBook

$22.99 (original price $30.36; save 24%)






Product Details

ISBN-13: 9781783087167
Publisher: Anthem Press
Publication date: 03/22/2018
Sold by: Barnes & Noble
Format: eBook
Pages: 354
File size: 744 KB

About the Author

Gideon A. Mailer is associate professor in early American history at the University of Minnesota, Duluth, USA.

Nicola E. Hale, specializing in genetics, cell biology and biochemistry, has worked in assistant scientist positions at the University of Cambridge, UK.

Read an Excerpt

CHAPTER 1

THE EVOLUTION OF NUTRITION AND IMMUNITY: FROM THE PALEOLITHIC ERA TO THE MEDIEVAL EUROPEAN BLACK DEATH

To understand the biological effects of nutritional disruption on Native American immunity and fertility after 1492, it is necessary to consider what we know, and what we do not yet know, about three vital stages in human nutritional history. The earliest two stages affected the nature of human evolution. The first began more than 2.5 million years ago, when nutrient-dense foods from land mammals allowed an increasingly small human gut to complement an expanding brain. The second, according to a newer and more controversial hypothesis, took place between 200,000 and 90,000 years ago, when coastal marine migrations from Africa provided greater access to the omega-3 fatty acid docosahexaenoic acid (DHA). Those migrations contributed to what some scholars now describe as a second stage in the evolving growth of the human brain, due to the greater reliance on DHA as an exogenous nutrient. These evolutionary interactions, and their nutritional basis, coincided with a hunter-gatherer lifestyle and preceded the intensified farming of "Neolithic" foods such as wheat and corn, which began around 10,500 years ago throughout the world. We consider the rupturing of hunter-gatherer food systems as a third major stage in human nutritional history, beginning with the rise of Neolithic agriculture in Europe and Asia, and slightly later with the rise of maize intensification in parts of North America.

Assessing what we define as the third stage in human nutritional history allows us to consider how immunity, and thus demography, might be compromised by rupturing the food requirements that developed during the two earlier evolutionary stages. Scholarship on declining health markers and increasing disease in Europe and Asia during the Neolithic era, and slightly later during the rise of maize intensification in North America, offers an important model and conceptual framework to explain similarities and differences in post-contact North America, when populations were also faced with sudden changes to subsistence strategies and threats to their immunological health. There is still much that we do not know regarding the evolution of the human immune system, including the extent to which it continued to evolve and adapt during the rise of the Homo genus around two million years ago; or even the nature of the role of an immune system during the evolutionary divergence between vertebrates and invertebrates before that period. Nonetheless, we are comfortable with the suggestion that selective pressures may well have contributed to ongoing refinement of the inflammatory and immunological response during the Paleolithic era, coinciding with the development of a small gut in relation to a large brain, including in relation to the micronutrient and metabolic requirements for optimal immune function.

The brain utilizes micronutrients as well as energy. The former are required for the proper function of enzymes and other features that underlie chemical and hormonal signaling between the brain and the rest of the body. The relatively recent enlargement of the brain thus risked constraining the function of other parts of the body that preceded its evolutionary growth, including the cellular processes that allow the immune system to function against pathogens, or perceived pathogens. The consumption of nutrient-dense foods during the Paleolithic era thus allowed a smaller digestive system in relation to the enlarged brain, while also continuing to supply the immune system with all that it continued to require; or even with micronutrients and metabolic sources that allowed it to continue to evolve advantageously. If those micronutrient-dense foods were suddenly replaced with nutrient-poor foods, the consequences for optimal health, including immune function, would be deleterious. We ought to examine those consequences at relevant historical junctures. The problematic health consequences of the Neolithic transition to agriculture 10,000 years ago, for example, may have compromised immune function at just the point when the changing societal context made diseases more likely to proliferate. Examining that phenomenon provides a paradigm to understand the problematic curtailment of nutrient-dense foods in Native American history, particularly if we can inform our assessment of both historical phenomena and their mutual relevance with new insights from the fast-developing scientific literature on the association between nutrition and immunity.

The scientific literature on the link between nutrition and immunity has developed significantly since anthropologists and archaeologists first identified declining health in transitional Neolithic populations. It has evolved even further since historians such as Jones and Kelton began to highlight ruptured food access as one of several contingent factors in the decline of post-contact Native American populations. A vast number of human immune cells reside in the human gut. The immune response to pathogens that enter the body via the gut begins with these immune cells. Yet we have only very recently begun to realize the full extent of the inflammatory response that follows the gut's encounter with foods that are maladapted to its evolved structural and hormonal mechanisms: a release of inflammatory proteins that upregulate the human immune response, often chronically. It is likely that in such a chronically inflamed state, the efficacy of the acute immune response to pathogens is reduced. In this chapter we examine whether such a state was likely during the third stage of human nutritional history, which corresponded with the rise of Neolithic grains in Europe and the Middle East, and which also witnessed the proliferation of new diseases.

We take care to avoid overstating the importance of the concept of chronic inflammation, given that its scientific literature is still in its infancy, leading some in the sphere of functional medicine toward possible exaggeration or misunderstanding. Doing so will require examination of the optimal operation of the immune response to invading pathogens both in relation to and separate from the process of inflammation. We consider the connection between inflammatory health markers, declining working immunity to disease and the introduction of new subsistence patterns in the Neolithic Old World. Scholarship on these connections — including that which has examined Neolithic skeletal evidence — offers important models and conceptual frameworks to explain similarities and differences in post-contact North America, when populations were also faced with sudden changes to subsistence strategies and threats to their immunological health.

By synthesizing the latest historical, archaeological and anthropological assessments of Neolithic health outcomes with the most recent biological literature on nutrition, inflammation, autoimmunity and immunity, we will be able to form a related hypothesis, which will frame the chapters that follow: whether during the intensification of maize agriculture in precontact indigenous North America from around 4,500 years ago, or following the disruptive arrival of Europeans and European agriculture among Native Americans, autoimmunity and chronic inflammation, and a compromised immune system, could have been strongly affected by dietary changes that deviated from the repertoire of foods that Native Americans, like all human populations, had evolved to consume during the Paleolithic era (from around 2.4 million years ago to around 10,000 years ago). This phenomenon represented a centrally important contingent factor in Native American demographic decline, which was distinct from any supposed genetic differences in the working immunity of Native Americans as compared to Old World populations.

Expanding the Expensive Tissue Hypothesis: Evolutionary Nutritional Interactions between the Small Gut and the Large Brain

It is now well accepted among evolutionary biologists that the increased consumption of animal meat by early hominids played a profound role in the evolution of modern humans. It was fundamental in allowing the development of the exceptionally large brain of Homo sapiens. A new generation of scholars has identified the importance of marine animals during a later period of brain evolution, separate from that which was enabled by access to land mammals. It is worth identifying and synthesizing the latest scholarship on these interactions, to understand why nutrition is central to the enhancement or diminishment of working immunity in human populations, including those in indigenous North America who suffered an assault on long-evolved nutritional frameworks at just the point when diseases began to proliferate. The nutrient sources that allowed the development of a small human gut in relation to a large human brain likely included other benefits, particularly in relation to the growth and optimal function of the human immune system.

The exact point in our evolutionary history at which we started eating meat is uncertain, but meat eating likely originated before the appearance of "a human-like primate some 6–8 million years ago," before the Paleolithic period that extended from the earliest known use of stone tools by early hominids, around 2.4 million years ago, to the end of the Pleistocene around 11,500 years ago. Several lines of evidence, including changes in morphology through the evolution of early hominids to modern humans, and archaeological evidence of tools used in hunting and meat consumption, suggest that meat eating increased from the earliest hominids to modern humans. By 500,000 years ago, moving from the Lower Paleolithic toward the Middle Paleolithic, there is clear evidence of meat consumption by early humans.

With these evolutionary and archaeological discussions in mind, scholars have drawn an association between meat consumption and the inverse relationship between the size of the human gut and the human brain. Animals with large (and even multiple) guts, such as ruminants, spend much energy converting nutrient-poor foods, such as grass, into nutrient-dense end products (their own tissue). With an increasingly small gut, conversely, evolving humans required nutrient density exogenously, from other animal meats. There is a linear correlation between body weight and basal metabolism in terrestrial mammals, suggesting that the supply of metabolic fuel to the brain is a limiting factor for brain growth, and that an increase in brain volume must be compensated for by a decrease in the size of other organs. In The Expensive Tissue Hypothesis, therefore, Aiello and Wheeler propose that gut tissue was sacrificed as brain tissue expanded in human evolution. As brain growth occurred alongside the increased consumption of animal products, the digestive organs were able to decrease in size without compromising nutrient supply: the nutrient density of meat enabled a smaller digestive system to provide all the nutrients required for a metabolically demanding larger brain.

As Hardy et al. have recently suggested, it is not necessary to rule out the consumption of plant carbohydrates in any discussion of the important role of animal meat in the evolution of the brain. Both, in their view, were "necessary and complementary dietary components in hominin evolution." Discussing work by Conklin-Brittain et al. and Wrangham, they do not discount the hypothesis that "concentrated starch from plant foods was essential to meet the substantially increased metabolic demands of an enlarged brain [...] [and] to support successful reproduction and increased aerobic capacity." Immune cells require glucose, either from gluconeogenesis (the production of glucose from non-carbohydrate sources within the body) or from ingested carbohydrates, suggesting that starch consumption may have been important even as animal meats contributed to the enlargement of the brain and the shrinking of the gut. Animal products, as we shall see, supply many of the nutrients that are necessary for the multicellular processes that enable optimal immune function. But the metabolic energy used for sound immune function may have relied in part on exogenous starch consumption throughout the history of modern human beings.

More recent human evolutionary studies have suggested an association between marine animal consumption and enhanced brain function, revising our focus solely on land animals. The Expensive Tissue Hypothesis ought to be understood alongside, and even synthesized with, an expanding separate literature on the role of DHA in human evolution. Though the levels of DHA from terrestrial sources might have been sufficient for smaller-brained early humans, several recent assessments suggest that exploitation of seafood resources from coastal and estuarine environments later in human evolution was vital in allowing a continued increase in brain size, coinciding with changes in human behavior that likely required greater processing capacity.

The brain began to increase in size long before the evolution of modern humans. Between the evolution of Homo erectus and modern humans, brain size almost doubled. Yet as recently as the last 200,000 years, the increase in size may have been "exponential." The negative logarithmic relationship between body size and brain size to which most other mammalian species conform does not apply to modern humans. Given that DHA is thought to be one of the nutrients whose abundance acts as a limiting factor for brain size, some scholars have posited that increased consumption of DHA was central in allowing the possible continued expansion of the brain in more recent human evolutionary history.

Although early humans may not have been able to exploit marine resources intensively, they may have consumed marine foods sporadically, similarly to the periodic consumption of marine foods by primates such as monkeys and chacma baboons, thereby consuming more DHA than could be found in terrestrial sources alone. Intense exploitation of coastal resources would have required regional societal knowledge of the association between lunar dates and tidal cycles. The seasonal variability of marine foods would have necessitated movement toward and away from the coast at different times of year. The high level of cognition required for the exploitation of marine resources may in part explain why marine foods did not form a substantial part of the human diet earlier in human history.

Alpha-linolenic acid (ALA), the precursor for all omega-3 fatty acids, cannot be manufactured by mammals, and so must be obtained from dietary sources. Plants contain ALA, which can be converted into the omega-3 fatty acids eicosapentaenoic acid (EPA) and DHA. However, this conversion is thought to be inefficient and unreliable. Consequently, several researchers now suggest that DHA is "conditionally essential" — challenging previous assumptions that "ALA is the essential omega-3 nutrient." Thus the brain size of other mammalian species may have been limited by their lesser supply of DHA, which is not manufactured by the "primary source of the terrestrial food chain" and can only be produced in small quantities in mammals, making it a scarce resource for terrestrial mammals in their historical development. A significant part of human evolution, conversely, occurred with at least some access to coastal regions with abundant availability of marine foods, which would have supplied their nutrients.

We can also point to the potential importance of other trace elements (particularly iodine and selenium) that are found at much higher levels in marine foods than terrestrial foods, and which are crucial for brain function. Their scarcity may also have acted as a limiting factor in brain size before the widespread consumption of marine foods. Thus the more recent dramatic expansion of the brain from 200,000 to 90,000 years ago may have required a higher consumption of DHA, and therefore more intense and organized exploitation of marine resources, through a process defined by scholars as "coastal adaptation." In this framework, we can identify "a potential early coastal route of modern humans out of Africa via the Red Sea coast." This second evolutionary stage may have increased brain size further, and possibly also reduced the size of the gut, thanks in part to newly abundant dietary DHA from marine animals.

Such a hypothesis, to be sure, does not reflect a consensus among scholars of the nutritional framework for human brain evolution, not least due to the vexing nature of identifying marine fossil evidence versus that from land mammals. But although the hypothesis remains relatively controversial, the overlooked evolutionary importance of DHA for immunity and health more generally, as described by Crawford, Cunnane and other proponents of the hypothesis, is much less contentious. DHA has been defined as anti-inflammatory both within and without the gut, and has been linked to enhanced immunological function, in addition to its role in cognitive reasoning and brain development.

(Continues…)


Excerpted from "Decolonizing the Diet"
by Gideon A. Mailer and Nicola E. Hale.
Copyright © 2018 Gideon A. Mailer and Nicola E. Hale.
Excerpted by permission of Wimbledon Publishing Company.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Acknowledgments; Introduction; Nutrition and Immunity in Native America: A Historical and Biological Controversy; Chapter 1: The Evolution of Nutrition and Immunity: From the Paleolithic Era to the Medieval European Black Death; Chapter 2: More Than Maize: Native American Subsistence Strategies from the Bering Migration to the Eve of Contact; Chapter 3: Micronutrients and Immunity in Native America, 1492–1750; Chapter 4: Metabolic Health and Immunity in Native America, 1750–1950; Epilogue: Decolonizing the Diet: Food Sovereignty and Biodiversity; Notes; Index.

What People are Saying About This

From the Publisher

"Mailer and Hale challenge us to consider how colonization’s multiple consequences—dietary changes, diseases, and settler invasions—resulted in long-term problems for Indigenous Peoples. Based on cutting-edge research on nutrition, immunity and ethnohistory, Decolonizing the Diet offers a fascinating analysis that both illuminates the past and informs the present."
—Paul Kelton, Professor of History and Robert David Lion Gardiner Chair, Department of History, Stony Brook University, USA


"Mailer and Hale provide a powerful critique of Virgin Soil theory and its claim that epidemics were the inevitable consequence of European colonization. Drawing on cutting-edge nutrition science, immunology, and archeology, they conclusively demonstrate how the chaos of encounter disrupted American Indian agriculture, triggered widespread malnutrition and left Indians susceptible to dire mortality."
—David S. Jones, A. Bernard Ackerman Professor of the Culture of Medicine, Faculty of Arts and Sciences and the Faculty of Medicine, Harvard University, USA
