Protecting Ourselves from Food

Spices and morning sickness may shield us from toxins and microorganisms in the diet


March-April 2001

Volume 89, Number 2
Page 142

DOI: 10.1511/2001.18.142

Although a well-prepared meal may be an unalloyed pleasure, the act of eating is one of the most dangerous things many of us do every day. Ingesting bits and pieces of the outside world provides a free pass to the bloodstream for whatever lurks within.

Microorganisms and toxins are pervasive—benign or not, they are almost always present at some level in the food we eat. Indeed most of us have experienced the unpleasant consequences of "food poisoning." Likewise, we often read newspaper accounts of deaths and illnesses associated with contaminated foods. And all of this occurs despite our modern practices of refrigeration and hygienic food preparation.


One can imagine a time in the past, however, when the absence of refrigeration and the scarcity of food encouraged people to eat meals that might not have met the highest standards of sanitation. The dangers of eating leftovers in those days must have been great indeed. Certainly there must have been many more deaths and illnesses in the past. What did people do?

As evolution-minded biologists, we were intrigued by traditions associated with preparing and eating food. Might they provide some means of protecting us from the dangers of contamination? Consider the humble cookbook. Traditional versions contain recipes that have been prepared for hundreds, perhaps even thousands, of years. Surely one reason for a recipe's success must be its palatability, but could a recipe also have some adaptive value? That is, might it be adaptive to find certain recipes palatable and others not? Cooking food not only enhances its flavor in certain instances, but also has the added value of killing potentially harmful microorganisms. Might some of the ingredients in a recipe do the same?

Spices, for example, are used almost universally throughout the world. We wondered whether there might be a relation between their use and their effectiveness in protecting us from food-borne microorganisms. In the first part of this article we discuss research that seeks to test the validity of this idea.

In the second half of this article we approach a seemingly unrelated subject: the adaptive value of "morning sickness"—nausea and vomiting during early pregnancy. About two-thirds of pregnant women experience some degree of morning sickness. From an evolutionary perspective, the prevalence of the phenomenon raises two questions: Why do so many women feel nauseated, as opposed to some other symptom, early in pregnancy? And how does "morning sickness" affect the outcome of pregnancy? We believe that the answers to these questions suggest that morning sickness is another mechanism, this time a physiological one, that protects us from the inherent dangers of eating.

Cooking with Spice

Spices are plant products. They come from various woody shrubs, vines, trees and aromatic lichens, as well as the roots, flowers, seeds and fruits of herbaceous plants. Each spice has a unique aroma and flavor that derives from "secondary compounds," chemicals that are secondary (not essential) to the plant's basic metabolism. Most spices contain dozens of them. These phytochemicals evolved in plants to protect them from being eaten. Phytochemicals are legacies of multiple coevolutionary races between plants and their enemies—parasites, pathogens and herbivores. These chemical cocktails are the plants' recipes for survival.


People have made use of plant secondary compounds in food preparation and embalming for thousands of years. So valuable were spices that when Alaric, a Gothic leader, laid siege to Rome in 408 A.D. he demanded as ransom various precious metals and 3,000 pounds of pepper! In the Middle Ages, the importance of spices was underscored by the willingness of seafarers like Marco Polo, Ferdinand Magellan and Christopher Columbus to undertake hazardous voyages to establish routes to spice-growing countries. The spice trade was so crucial to national economies that rulers repeatedly mounted costly raids on spice-growing countries, and struggles for their control precipitated several wars. In modern times the spice trade still flourishes. Black pepper, for example, is the world's most widely used spice even though Piper nigrum grows naturally only in southern India.

What accounts for the enduring value of spices? The obvious answer is that they enhance the flavor, color and palatability of food. This proximate, or immediate-cause, explanation is true, but it does not address the ultimate, or long-term, questions of why people find foods more appealing when they contain pungent plant products, why some secondary compounds are tastier than others and why preferences for these chemicals differ among cultures. Answers to proximate and ultimate questions are complementary rather than mutually exclusive. A full understanding of spice use, or any other trait, requires explanations at both these levels of analysis.

The ultimate reason plants possess secondary compounds is to protect themselves from their natural enemies. This may be a clue to the ultimate reason we use these chemicals. Our foods also are attacked by bacteria and fungi, often the same ones that afflict the spice plants. If spices were to kill microorganisms or inhibit their growth or production of toxins, then spice use could protect us from food-borne illnesses and food poisoning.

The Antimicrobial Hypothesis

In a series of recent studies, one of us (Sherman), along with Jennifer Billing and Geoffrey Hash, set out to test this "antimicrobial hypothesis." We located 107 "traditional" cookbooks from 36 countries, representing every continent and 16 of the world's 19 major linguistic groups. Written primarily as archives of the native cuisine, these cookbooks are artifacts of human behavior. Our cookbook database contains information on the use of 42 spices in 4,578 meat-based recipes and 2,129 vegetable-based recipes (containing no meat). Although salt has antimicrobial properties and has been used in food preparation and preservation for centuries, it was not included in our analyses because it is a mineral, not a plant product.
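
Conceptually, the database analysis amounts to tallying, for each spice, how often it appears in meat-based versus vegetable-based recipes. The minimal Python sketch below illustrates that bookkeeping; the field names and the three miniature sample recipes are our own illustrative stand-ins, not the authors' actual data format.

```python
from collections import Counter

# Toy sample only; the real database holds 4,578 meat-based and 2,129
# vegetable-based recipes drawn from 107 traditional cookbooks.
recipes = [
    {"country": "India",  "type": "meat",      "spices": {"onion", "garlic", "cumin", "chili pepper"}},
    {"country": "Norway", "type": "meat",      "spices": {"black pepper"}},
    {"country": "Norway", "type": "vegetable", "spices": set()},
]

def spice_frequencies(recipes, recipe_type):
    """Fraction of recipes of the given type that call for each spice."""
    subset = [r for r in recipes if r["type"] == recipe_type]
    counts = Counter(spice for r in subset for spice in r["spices"])
    return {spice: n / len(subset) for spice, n in counts.items()}

print(spice_frequencies(recipes, "meat"))
# -> each spice appears in half of the two toy meat recipes; in the full
#    database, onion and black pepper each exceed 60 percent.
```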


The cookbooks confirm what every gastronome knows—across the world there is tremendous variability in the use of different spices. Black pepper and onion are the most frequently used, each appearing in more than 60 percent of the meat-based recipes, with garlic, chili pepper and lemon/lime juices following close behind (Figure 3). Most spices, however, are rarely used. In our sample, about 80 percent of spices are used in fewer than 10 percent of the meat-based recipes, with horseradish, fennel and savory at the bottom of the list.

To test the antimicrobial hypothesis, we developed five critical predictions and examined them by combining information from the microbiology literature with analyses of traditional recipes.

Prediction 1: Spices used in cooking should exhibit antimicrobial activity. Microbiologists and food scientists have challenged various food-borne bacteria and fungi with spice chemicals. Although the data are heterogeneous—owing to differences in laboratory techniques, phytochemical concentrations and definitions of microbial inhibition—there is, nonetheless, overwhelming evidence that most spices have antimicrobial properties (Figure 3).

Inhibition of bacteria is especially important because they are more common in outbreaks of food-borne diseases and food poisoning than are fungi. Of the 30 spices for which information was available, all inhibited or killed at least one-quarter of the bacterial species on which they were tested, and half the spices inhibited or killed three-quarters of these bacteria. The four most potent spices—garlic, onion, allspice and oregano—killed every bacterial species tested. Most of the bacteria that were tested are widely distributed and are frequently implicated in food-borne illness.

The antimicrobial hypothesis assumes that the amounts of spices called for in a recipe are sufficient to produce the desirable effects and that cooking does not destroy the active chemicals. Although the efficacy of spices in prepared meals has not been evaluated directly, both assumptions are reasonable. The minimum concentrations of purified phytochemicals necessary to inhibit growth of food-borne bacteria in vitro are well within the range of spice concentrations used in cooking. Phytochemicals are generally thermostable, and spices containing those that are not, such as cilantro and parsley, are typically added after cooking, so their antimicrobial effects are not lost.

Prediction 2: Use of spices should be greatest in hot climates, where unrefrigerated foods spoil quickly. Uncooked foods and leftovers, particularly meat products, can build up massive bacterial populations if they are stored at room temperature for more than a few hours, especially in the tropics where temperatures are highest and the diversity of food-borne pathogens also is the greatest. Under the antimicrobial hypothesis, there should be a positive relationship between annual temperatures and spice use, assuming that traditional recipes predate the availability of refrigeration.


In the 36 countries sampled, the average annual temperatures range from 2.8 degrees Celsius in Norway to 27.6 degrees in Thailand. Consistent with this prediction, the use of spices is greater in countries with comparatively high temperatures (Figure 4). In particular, the fraction of both meat- and vegetable-based recipes that called for at least one spice, the mean number of spices per recipe and the number of different spices used all were greater in the warmer countries. These trends were especially strong for the "highly inhibitory" spices—those that reduced the growth of 75 percent or more of the bacterial species tested—including chili peppers, garlic, onion, cinnamon, cumin, lemongrass, bay leaf, cloves and oregano. Only the use of dill and parsley was negatively correlated with temperature, and neither is highly inhibitory.

In five of the six hottest countries (India, Indonesia, Malaysia, Nigeria and Thailand) every meat-based recipe called for at least one spice, whereas in the two coldest countries (Finland and Norway) more than a third of the meat-based recipes did not call for any spices at all. In India, 25 different spices were used, and the average meat-based recipe called for about nine of them, whereas the Norwegians used only 10 spices in total and usually fewer than two spices per recipe. In general, the most powerful spices are the most popular, and they become increasingly popular as the climate gets hotter.

Prediction 3: The spices used in each country should be particularly effective against the local bacteria. Spices are only useful if they inhibit or kill the indigenous microbes. Unfortunately, comprehensive lists of native food-borne bacteria do not exist for any country. To work around this problem, we decided to look at the effectiveness of native recipes in killing or inhibiting 30 common food-borne bacteria, including such well-known nasties as Clostridium botulinum, Escherichia coli, Salmonella pullorum, Staphylococcus aureus and Streptococcus faecalis. Most of these are distributed worldwide, and they often are implicated in outbreaks of food-borne disease. Consistent with this prediction, as annual temperatures increased among countries, the estimated percentage of bacteria that would be inhibited by the spices in an average recipe from each country also rose (Figure 4).
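
The country-level estimate can be thought of as a simple pooling exercise: for each recipe, combine the sets of test bacteria inhibited by any of its spices, average that fraction over a country's recipes, and relate the country averages to mean annual temperature. The sketch below shows one way such an estimate might be computed; the inhibition sets and the two-country sample are hypothetical placeholders, not the published laboratory values.

```python
from statistics import mean, correlation  # statistics.correlation needs Python 3.10+

# Hypothetical stand-ins for the laboratory data: indices of the 30 test
# bacteria that each spice inhibits or kills.
INHIBITS = {
    "garlic": set(range(30)),        # the most potent spices killed every species tested
    "onion": set(range(30)),
    "chili pepper": set(range(23)),
    "black pepper": set(range(8)),
    "dill": set(range(5)),
}

def recipe_inhibition(spices):
    """Fraction of the 30 test bacteria inhibited by at least one spice in the recipe."""
    inhibited = set().union(*(INHIBITS.get(s, set()) for s in spices))
    return len(inhibited) / 30

def country_score(recipes):
    """Average inhibition fraction over a country's meat-based recipes."""
    return mean(recipe_inhibition(r) for r in recipes)

# Toy sample: (mean annual temperature in degrees C, list of recipes as spice sets).
countries = {
    "Thailand": (27.6, [{"garlic", "chili pepper"}, {"onion", "black pepper"}]),
    "Norway":   (2.8,  [{"black pepper"}, {"dill"}]),
}
temps  = [temp for temp, _ in countries.values()]
scores = [country_score(recipes) for _, recipes in countries.values()]
print(correlation(temps, scores))
# With only two toy countries this is trivially 1.0; the real analysis used 36
# countries and, consistent with Prediction 3, found a positive relationship.
```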

Prediction 4: Within a country, meat recipes should be spicier than vegetable recipes. Unrefrigerated meats are more often associated with food-borne disease outbreaks and food poisoning than are vegetables. This may be because dead plants are naturally better protected from microbial invasions. As we noted earlier, many contain inhibitory phytochemicals, but plants also have specially modified cell walls that contain cellulose and lignin, which most aerobic microorganisms find difficult to decompose. Furthermore, the pH of plant cells (ranging from 4.3 to 6.5) is below the ideal growth range (pH 6.6 to 7.5) for most bacteria.

In contrast, the cells of dead animals are relatively unprotected chemically and physically, and their internal pH is usually greater than 5.6. The primary defense of animal tissues against bacteria is the immune system, which ceases to function at death. For all these reasons, any relation between spoilage and spice use should be more apparent in meat-based recipes.


Indeed, traditional meat-based recipes from all 36 countries combined called for an average of about 3.9 spices per recipe, significantly more than the 2.4 spices in the average vegetable-based recipe. This was true even within the collection of recipes from each of the 36 countries, indicating that the availability of spice plants in different countries cannot account for the global distinctions we observed (Figure 5).

Prediction 5: Within a country, recipes from lower latitudes and altitudes should be spicier because of the presumably greater microbial diversity and growth rates in these regions. Although we could not find altitude-specific cookbooks, there were large samples of traditional recipes from different latitudes in China and the United States. The results were consistent with this prediction. Recipes from the southern latitudes used a greater variety of spices and used individual spices more often in their recipes. Moreover, the typical southern recipe contained an assortment of spices that was more likely to kill or inhibit bacteria.

Alternative Hypotheses

One can imagine other reasons why spices may have become so common in the human diet. Perhaps the most prominent of these suggests that spices disguise the smell and taste of spoiled foods. Although this idea is consistent with the greater use of spices in hotter climates, it ignores the potential dangers of ingesting bacteria-infested foods. Even a very hungry person would do better to err on the side of caution, passing up contaminated foods rather than covering up their taste, since such foods could be deadly to someone in a weakened condition.

Another popular idea suggests that spices might serve as medicaments. It is true that many spice plants have pharmacological uses in traditional societies. This includes use as topical or ingested antimicrobials, as aids to digestion, as treatment for high blood pressure, as sources of micronutrients and as aphrodisiacs. However, the use of spices in food preparation differs from that in traditional medicine in a number of ways. In cooking, spices are routinely added to specific recipes in relatively small quantities and regardless of the diner's health status. By contrast, in medicinal usage phytochemicals are taken occasionally, usually in response to particular maladies, in much larger quantities and not necessarily in association with food. Use of a spice as a medicament is more like taking a pill than preparing a meal.

It has also been suggested that spicy foods might be preferred in hot climates because they increase perspiration and thus help cool the body. Indeed, chilis and horseradish do cause some people to sweat. However, the use of many other spices also increases with temperature, and these do not increase perspiration. In general, physiological mechanisms of temperature regulation operate to keep us cool without the necessity of finding, eating and dealing with the side effects of phytochemicals.

Finally, it could be that people use whatever aromatic plants grow locally just because they taste good. Under this proximate-level hypothesis, spice chemicals should be highly palatable, and spice-use patterns should correspond to availability. Neither prediction is fully supported. There is no relation between the number of countries in which each spice plant grows and either the number of countries in which it is used or their annual temperatures. Second, pungent spices like garlic, ginger, anise and chilis are initially distasteful to most people. For most unpalatable substances, an initial negative response is sufficient to maintain avoidance for a lifetime. Yet most people come to like spicy foods, often as a result of urging by family members and friends—another sign that spice use is beneficial.

Morning Sickness

There also are costs to using spices. When eaten in sufficient amounts, many phytochemicals act as allergens, mutagens, carcinogens, teratogens and abortifacients. This may explain why pre-adolescent children typically dislike spicy foods: Children are particularly susceptible to mutagens because some of their tissues are undergoing rapid cell division. Their safest course is to avoid foods that might contain either pathogens or phytochemicals; perhaps this is why children have acquired a reputation as "picky eaters."


Rapid cell division also takes place within the body of a pregnant woman. Moreover, pregnant women are especially susceptible to food-borne illnesses and infectious diseases because their cell-mediated immune response is depressed—lest the woman's body reject the foreign tissue that is her baby-to-be. The risks for the mother create even greater dangers for the embryo. Miscarriages and birth defects can result if a pregnant woman contracts an illness, especially during the first trimester. Toxoplasma gondii, for example, is a common food-borne parasite that can be acquired by handling or eating raw or undercooked meat. It is usually only dangerous to those whose immune systems are compromised, such as pregnant women. The parasite has been linked to congenital neurological birth defects, spontaneous abortions and neonatal diseases.

What's a mother to do? She might consider cooking with powerful spices, but this would expose the embryo to phytotoxins. We believe that natural selection has provided another type of answer: morning sickness.


Morning sickness is the common term for nausea and vomiting during pregnancy. It is actually a complete misnomer: Symptoms occur throughout waking hours, not just in the morning, and "sickness" implies pathology whereas healthy women experience the symptoms and bear healthy babies. For these reasons, the medical community uses the acronym NVP, short for "nausea and vomiting of pregnancy." Symptoms first appear about five weeks after the last menstrual period, peak during weeks 8 to 12, and gradually decline thereafter (Figure 7). Some women experience symptoms severe enough to dramatically disrupt their daily lives, and about one out of a hundred experiences symptoms so severe that she requires hospitalization, a condition known as hyperemesis gravidarum.

As with spice use, the origin of NVP can be analyzed at two levels. At the proximate level, the physiological mechanisms underlying the symptoms are relatively well studied. Reproductive hormones set the stage by sensitizing neural pathways that trigger the symptoms. Notable among these is chorionic gonadotropin, which follows a time course that closely parallels the symptoms of NVP. At the ultimate level, two questions arise: Why do women feel nauseated, as opposed to some other symptom, early in pregnancy? And how does NVP affect the outcome of pregnancy?

Protecting Mother and Child

Other scientists have preceded us in exploring the function of morning sickness. In the 1970s, Ernest Hook, then at Albany Medical College, suggested that nausea and vomiting protect the embryo by expelling potentially dangerous foods and causing women to develop aversions to foods that might contain harmful chemicals, such as alcohol and caffeinated beverages. In the 1980s, Margie Profet, then at the University of California, Berkeley, extended Hook's hypothesis, focusing on the idea that NVP protects the embryo from phytochemicals in strong-tasting vegetables and spices.

We recently evaluated this "embryo-protection hypothesis" quantitatively, and our results generally support the idea. However, the hypothesis proposed by Hook and Profet did not predict all the foods that trigger nausea and vomiting. We therefore developed a more comprehensive and specific hypothesis: Morning sickness protects the developing embryo from teratogens and also protects both the mother and her embryo from food-borne microorganisms. We tested five critical predictions of this "maternal-and-embryo-protection" hypothesis using information from the medical, psychological and anthropological literature.

Prediction 1: NVP symptoms should peak when the embryo is most susceptible to disruption. Embryonic tissues are most sensitive when cells are rapidly dividing and differentiating into organs. Teratogenic chemicals and infections rarely cause congenital anomalies during the first four post-menstrual weeks. In the fifth week, the developing central nervous system, heart and ears become sensitive. Other organ systems follow soon thereafter, and most embryonic organ systems reach their peak sensitivity during weeks 6 to 12 and then decline. The central nervous system is the notable exception, as it continues to be critically sensitive through week 18. A plot of the time course of NVP and the periods of tissue sensitivity shows an obvious—indeed a striking—correspondence (Figure 7).


Prediction 2: Foods that pregnant women find aversive should potentially contain phytotoxins and pathogenic microorganisms, but foods that they crave should not. We located 20 studies of food aversions (among 5,432 women) and 21 studies of food cravings (among 6,239 women), which were based on questionnaires administered to women during pregnancy or soon after parturition. In general, pregnant women were most often averse to foods categorized as "meat, fish, poultry and eggs" and "nonalcoholic beverages," mostly caffeinated ones. They also found "vegetables" and "alcoholic beverages" to be aversive. Interestingly, the pattern of food cravings was virtually the opposite: the categories "fruit and fruit juice," "sweets, desserts and chocolate" and "dairy" were the most sought after (Figure 8).


Consistent with this prediction, the three most aversive food categories were the ones most likely to contain microorganisms (meat products) and phytochemicals (vegetables, coffee and tea). Alcohol, the fourth most aversive category, is also a teratogen in sufficient quantities. In contrast, the food categories that were more often craved than found aversive (fruits, grains, sweets and dairy products) were the ones least likely to contain microorganisms or phytochemicals. Surprisingly, however, aversions to "ethnic, strong and spicy" foods were as rare as cravings for them.

Prediction 3: Aversions to foods that potentially contain harmful substances should peak in the first trimester, when embryonic organogenesis is most sensitive to disruption. Detailed information on dietary preferences of women throughout pregnancy indicates that aversions to all food categories are highest in the first trimester and decline dramatically thereafter (Figure 9). Judith Rodin of Yale University and Norean Radke-Sharpe of Bowdoin College found that women in their first trimester report significantly more aversions than nonpregnant controls to all food categories, particularly to meat, fish, poultry and eggs.


Prediction 4: NVP should be associated with positive pregnancy outcomes. Information on the relation between NVP symptoms and miscarriage is available from nine studies involving 22,305 pregnancies. (Only seven are reproduced in Figure 10.) Consistent with this prediction, in all nine studies women who experienced NVP were significantly less likely to miscarry than women who experienced no symptoms. Also, the more severe the NVP symptoms (within the normal range), the less likely the pregnancy was to end in miscarriage. However, the occurrence of NVP did not correlate with the incidence of stillbirth, low birth weight or birth defects.

Prediction 5: The expression of NVP should depend on diet, occurring least often among women whose staple foods are unlikely to contain dangerous substances. Across the modern world there is considerable variation in the frequency of NVP, from a high of 84 percent among Japanese women to a low of 35 percent among women in India. This variability might be used to evaluate the prediction, but no one has quantified diets of pregnant women in relation to NVP within or among countries. However, information on both diet and the occurrence of NVP among 27 traditional societies exists in a database called the Human Relations Area Files. Interestingly, ethnographers reported that NVP did not occur in seven of these. Compared to the 20 societies in which NVP did occur, these seven were significantly less likely to have meat as a dietary staple and significantly more likely to have only plants as staple foods. Societies without NVP also were significantly more likely to have corn as a staple. Most strains of domesticated corn have few secondary compounds, and dried corn is resistant to many microorganisms. Societies whose diets consist primarily of corn or other bland vegetables rarely encounter foods that are predicted to trigger NVP symptoms—and they exhibit the lowest incidence of NVP.

Alternative Hypotheses

If NVP indeed serves a protective function, then alleviating the symptoms should leave the mother and embryo more vulnerable. Arnold Seto and his colleagues at the University of Toronto conducted a meta-analysis of studies of women who had taken antihistamines to treat NVP in their first trimester. They reported that these women were slightly less likely to bear children with major malformations than women who had not taken these antinauseants. However, this result does not necessarily disconfirm the hypothesis because the direction of cause and effect is unclear. For example, if women carrying the healthiest embryos were the most likely to experience symptoms severe enough to seek chemical relief, then use of antihistamines was an effect, not the cause, of the positive outcome. In addition, since women took the drugs to alleviate persistent symptoms, they already may have developed aversions to foods that could contain pathogens or teratogens before NVP was suppressed.

Nonetheless, this raises the question of whether NVP could be better explained by an alternative hypothesis. Three have been proposed. First, nausea and vomiting may be inevitable side effects of the high hormonal levels associated with viable pregnancies. If so, the symptoms themselves have no function, as Zena Stein and Mervyn Susser of Columbia University have suggested. However, contrary to this nonadaptive hypothesis, NVP is neither a necessary concomitant of a viable pregnancy nor only associated with viable pregnancies. Data compiled by Ronald and Margaret Weigel of the University of Illinois revealed that of 5,235 pregnancies in which NVP did not occur, 90 percent resulted in live births. And of 13,192 pregnancies in which NVP did occur, 4 percent resulted in miscarriages. Moreover, as we noted earlier, there are seven societies in which NVP has never been reported, although viable pregnancies routinely occurred. Finally, this hypothesis does not predict or explain the specificity of food aversions and cravings.
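
The Weigel figures quoted above can be turned into a quick check of the nonadaptive hypothesis: if NVP were nothing more than a by-product of a viable pregnancy, pregnancies without NVP should routinely fail and pregnancies with NVP should rarely do so, yet neither is the case. A small worked calculation, using only the numbers reported in the text:

```python
# Figures compiled by Weigel and Weigel, as quoted in the text.
no_nvp_pregnancies = 5_235
no_nvp_live_birth_rate = 0.90    # 90 percent of pregnancies without NVP ended in live births

nvp_pregnancies = 13_192
nvp_miscarriage_rate = 0.04      # 4 percent of pregnancies with NVP ended in miscarriage

print(f"Live births despite no NVP: about {no_nvp_pregnancies * no_nvp_live_birth_rate:,.0f}")
print(f"Miscarriages despite NVP:   about {nvp_pregnancies * nvp_miscarriage_rate:,.0f}")
# NVP is therefore neither necessary for a successful pregnancy nor a
# guarantee against miscarriage.
```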

Another hypothesis, proposed by Anthony Deutsch of the University of California, San Diego, suggests that nausea and vomiting may be communication signals that alert a woman's husband and family to her impending need for additional food and protection, or to the desirability of reducing sexual intercourse. This hypothesis is contradicted by three facts. First, intercourse does not affect the viability of a pregnancy, except possibly during the final 4–6 weeks of gestation, by which time NVP has typically waned. Second, under this hypothesis NVP should occur whenever there is someone with whom to communicate, so the apparent absence of NVP in seven societies is inexplicable. Third, NVP peaks six to eight weeks after conception, by which point other, less costly and less uncomfortable indications of pregnancy are apparent, such as the cessation of menstruation, which is universally recognized as a sign of pregnancy.

Rachel Huxley of the University of Oxford has suggested that NVP may reduce energy intake in early pregnancy, thereby suppressing maternal tissue synthesis, which results in placental weight increases. Under this hypothesis, nausea and vomiting help ensure that scarce nutrients are partitioned in favor of the developing placenta. However, this hypothesis does not explain why the vast majority of pregnancies that do not involve NVP are successful. It also does not account for the seven societies in which NVP has not been observed, or the fact that pregnant women crave energy-rich foods such as fruits, sweets, grains and starches, whereas energy-poor foods such as caffeinated beverages and vegetables are aversive.

In 1993, David Haig of Harvard University concluded his review of genetic conflicts in human pregnancy by stating that "various hypotheses have been proposed to account for nausea during pregnancy, but I am unable to come to clear conclusions because the evidence remains equivocal." We believe a front-running hypothesis has emerged: maternal and embryo protection.

Evolutionary Anachronisms?

Nowadays our vegetables have minimal secondary compounds, owing to artificial selection. Refrigeration, supplemented by cooking, salting and the use of artificial preservatives, protects us from food-borne pathogens. Are spice use and morning sickness just evolutionary anachronisms? We think not. Food-borne pathogens are still a major health threat, and new food-borne diseases are always evolving. In the United States alone food-borne illnesses afflict an estimated 80 million people per year, and one in ten Americans experiences bacterial food poisoning annually. Elsewhere in the world food-borne illnesses exact far greater human and economic tolls.

The potential importance of spice use in quelling modern pathogens is illustrated by the rates of food-borne illnesses in Japan and Korea, which are neighboring countries with similar, temperate climates. A study by Won-Chang Lee of Kon-Kuk University and his colleagues found that from 1971 to 1990, food poisoning, primarily of bacterial origin, affected 29.2 out of every 100,000 Japanese but only about 3.0 of every 100,000 Koreans. These authors suggested that the difference was due to commercial food-handling procedures. In addition, Korean recipes are spicier. They call for spices more frequently and include more of the highly antimicrobial spices per recipe. Indeed, Korean food is among the hottest in the world. As a result, the estimated fraction of food-borne bacteria inhibited by the average Korean recipe is significantly greater than that of the average Japanese recipe.

A possible reason why traditional Japanese recipes call for so few spices is that they date from times when fresh seafood was continuously available from local waters. Today more food is imported and it comes from farther away, so there is more time for the growth of microbial populations. Studies show that imported seafood is about five times more likely to contain pathogenic bacteria than the domestic catch and that imported seafood is a major source of food-borne diseases in Japan. Traditional Japanese recipes may just not include enough spices to cope with the pathogens in imported food. It is probably not a coincidence that the highest reported rates of NVP in the world also occur in Japan.

The possibility that NVP may be adaptive in the 21st century is supported by the relation between the occurrence of NVP and reduced chances of miscarriage that has been documented multiple times in the past two decades (Figure 10). This does not mean that pregnant women should eliminate animal products and vegetables from their diets. These foods contain essential vitamins and nutrients. Rather, the message is that NVP is not a "disease" in the normal sense of that word, but rather serves a useful function. There are good evolutionary reasons why so many women respond adversely to the smells or tastes of particular foods. Uncomplicated nausea and vomiting will not hurt the embryo, and may even help to protect it. We hope pregnant women will derive some comfort from this analysis, knowing that NVP indicates the evolved "wisdom" of their bodies rather than their frailty.

Acknowledgments

The authors thank Jennifer Billing and Geoffrey A. Hash for their enthusiastic participation in the spice study, and Jennifer Billing and Mark E. Hauber for comments on preliminary drafts. Funding was provided by the National Science Foundation, the Howard Hughes Foundation, the Olin Foundation, and the College of Agriculture and Life Sciences at Cornell University.

Bibliography

  • Billing, J., and P. W. Sherman. 1998. Antimicrobial functions of spices: Why some like it hot. Quarterly Review of Biology 73:3–49.
  • Deutsch, J. A. 1994. Pregnancy sickness as an adaptation to concealed ovulation. Rivista di Biologia 87:277–295.
  • Flaxman, S. M., and P. W. Sherman. 2000. Morning sickness: A mechanism for protecting mother and embryo. Quarterly Review of Biology 75:113–148.
  • Haig, D. 1993. Genetic conflicts in human pregnancy. Quarterly Review of Biology 68:495–532.
  • Hook, E. B. 1976. Changes in tobacco smoking and ingestion of alcohol and caffeinated beverages during early pregnancy: Are these consequences, in part, of feto-protective mechanisms diminishing maternal exposure to embryotoxins? In Birth Defects: Risks and Consequences, eds. S. Kelly, E. B. Hook, D. T. Janerich and I. H. Porter. New York: Academic Press, pp. 173–183.
  • Huxley, R. R. 2000. Nausea and vomiting in early pregnancy: Its role in placental development. Obstetrics and Gynecology 95:779–782.
  • Inoue, S., A. Nakama, Y. Arai, Y. Kokubo, T. Maruyama, A. Saito, T. Yoshida, M. Terao, S. Yamamoto and S. Kumagai. 2000. Prevalence and contamination levels of Listeria monocytogenes in retail foods in Japan. International Journal of Food Microbiology 59:73–77.
  • Lee, W.-C., T. Sakai, M.-J. Lee, M. Hamakawa, S.-M. Lee and I.-M. Lee. 1996. An epidemiological study of food poisoning in Korea and Japan. International Journal of Food Microbiology 29:141–148.
  • Profet, M. 1992. Pregnancy sickness as adaptation: A deterrent to maternal ingestion of teratogens. In The Adapted Mind: Evolutionary Psychology and the Generation of Culture, eds. J. H. Barkow, L. Cosmides and J. Tooby. New York: Oxford University Press, pp. 327–365.
  • Rodin, J., and N. Radke-Sharpe. 1991. Changes in appetitive variables as a function of pregnancy. In Chemical Senses. Volume 4: Appetite and Nutrition, eds. M. I. Friedman, M. G. Tordoff and M. R. Kare. New York: Marcel Dekker, pp. 325–340.
  • Seto, A., T. Einarson and G. Koren. 1997. Pregnancy outcome following first trimester exposure to antihistamines: Meta-analysis. American Journal of Perinatology 14:119–124.
  • Sherman, P. W. 1988. The levels of analysis. Animal Behaviour 36:616–619.
  • Sherman, P. W., and J. Billing. 1999. Darwinian gastronomy: Why we use spices. BioScience 49:453–463.
  • Sherman, P. W., and G. A. Hash. In press. Why vegetable recipes are not very spicy. Evolution and Human Behavior.
  • Stein, Z., and M. Susser. 1991. Miscarriage, caffeine, and the epiphenomena of pregnancy: The causal model. Epidemiology 2:163–167.
  • Weigel, R. M., and M. M. Weigel. 1989. Nausea and vomiting of early pregnancy and pregnancy outcome: A meta-analytical review. British Journal of Obstetrics and Gynaecology 96:1312–1318.
