During the nineteenth and early twentieth centuries, as in many preceding centuries, wheat therefore changed little. The Pillsbury’s Best XXXX flour my grandmother used to make her famous sour cream muffins in 1940 was little different from the flour of her great-grandmother sixty years earlier or, for that matter, from that of a distant relative two or three centuries before that. Milling became industrialized in the late nineteenth and twentieth centuries, yielding finer flour on a larger scale, but the basic composition of the flour remained much the same.
That all ended in the latter half of the twentieth century, when an upheaval in hybridization methods transformed this grain. What now passes for wheat has changed, not through the forces of drought or disease or a Darwinian scramble for survival, but through human intervention.
Wheat has undergone more drastic transformation than the Real Housewives of Beverly Hills, stretched, sewed, cut, and stitched back together to yield something entirely new, nearly unrecognizable when compared to the original and yet still called by the same name: wheat.
Modern commercial wheat production has been intent on delivering features such as increased yield, decreased operation costs, and large-scale production of a consistent commodity. All the while, virtually no questions have been asked about whether these features are compatible with human health. I submit that, somewhere along the way during wheat’s history, perhaps five thousand years ago but more likely sixty years ago, wheat changed in ways that yielded exaggerated adverse effects on human health.
The result: A loaf of bread, biscuit, or pancake of today is different from its counterpart of a thousand years ago, different even from what our grandmothers made. They might look the same, even taste much the same, but there are fundamental biochemical differences. Small changes in wheat protein structure, for instance, can spell the difference between a devastating immune response to wheat protein and no immune response at all.
WHAT HAPPENED TO THE FIRST WHEAT-EATERS?
After not consuming the seeds of grasses for the first 99.6 percent of our time on this planet, we finally turned to them for sustenance ten thousand years ago. Desperation, caused by a shortage of wild game and plants due to a natural shift in climate, prompted Neolithic hunter-gatherers to view the seeds of grasses as food. But just as we cannot save the clippings from mowing our lawns to sprinkle on a salad with a little vinaigrette, early humans learned the hard way that the leaves, stalks, and husks of grasses are tasteless and inedible, wreaking gastrointestinal havoc such as nausea, vomiting, abdominal pain, and diarrhea when ingested, or simply passing through the gastrointestinal tract undigested. The grasses of the earth are indigestible to humans (unlike herbivorous ruminants, which possess adaptations that allow them to graze on grasses, such as multicompartment stomachs and spiral colons harboring unique microorganisms that break grasses down).
It must have taken considerable trial and error to figure out that the seeds of grass, removed from the husk, then dried, pulverized with stones, and heated in water, would yield something that could be eaten and provide carbohydrate nourishment. Over time, increased efficiencies in harvesting and grinding allowed grass seeds to play a more prominent role in the human diet.
So what became of those first humans who turned to the seeds of wheat grass to survive?
Anthropologists tell us that there was an explosion of tooth decay and tooth abscess; microorganisms of the mouth and colon changed; the maxillary bone and mandible of the skull shrank, resulting in crooked teeth; iron deficiency anemia became common; the frequency of knee arthritis doubled; and bone length and diameter decreased, resulting in a reduced height of five inches in males, three inches in females.1, 2, 3, 4
The explosion of tooth decay, in particular, is telling: Prior to the consumption of the seeds of grasses, tooth decay was uncommon, affecting only 1 to 3 percent of all teeth recovered. This is extraordinary, as non-grain-eating humans had no fluoridated water or toothpaste, no toothbrushes, no dental floss, no dentists, no dental insurance card, yet had perfectly straight, healthy teeth even into old age. (Yes, ancient humans lived into their fifties, sixties, and seventies, contrary to popular opinion.) When humans first turned to grains—einkorn wheat in the Fertile Crescent, millet in sub-Saharan Africa, and maize and teosinte in Central America—tooth decay exploded: 16 to 49 percent of teeth showed decay and abscess formation, as well as misalignment, even in young people.5
Living in a wild world, hunting and gathering food, humans needed a full set of intact teeth to survive, sometimes having to eat their food raw, which required prolonged, vigorous chewing. The dental experience with wheat and grains encapsulates much that is wrong with their consumption. The amylopectin A carbohydrate that provides carbohydrate calories may allow survival for another few days or weeks, but it is also responsible for the decline in dental health months to years later—trading near-term survival for long-term crippling changes in health at a time when mercury fillings and dentures were not an option. Over the centuries, human grain consumers learned that they had to take extraordinary steps to preserve their teeth. Today, of course, we have a multibillion-dollar industry of dentists, orthodontists, toothpaste manufacturers, and so forth, all largely devoted to countering the decay and misalignment of teeth that began when humans first mistook the seeds of grasses for food.
WHEAT BEFORE GENETICISTS GOT HOLD OF IT
Wheat is uniquely adaptable to environmental conditions, growing from Jericho, 850 feet below sea level, to Himalayan mountainous regions 10,000 feet above sea level. Its latitudinal range is also wide, extending from as far north as Norway, at 65° north latitude, to Argentina, at 45° south latitude. Wheat occupies sixty million acres of farmland in the United States, an area more than twice the size of the state of Ohio. Worldwide, wheat is grown on an area ten times that figure, or twice the total acreage of Western Europe. After all, Domino’s has lots of pizzas to sell at $5.99.
The first wild, then cultivated, wheat was einkorn, the great-granddaddy of all subsequent wheat. Einkorn has the simplest genetic code of all wheat, containing only fourteen chromosomes. Circa 3300 BC, hardy, cold-tolerant einkorn wheat was a popular grain in Europe. This was the age of the Tyrolean Iceman, fondly known as Ötzi. Examination of the intestinal contents of this naturally mummified Late Neolithic hunter, killed by attackers and left to freeze in the mountain glaciers of the Italian Alps, revealed the partially digested remains of einkorn wheat consumed as unleavened flatbread, along with remains of plants, deer, and ibex meat.6
Shortly after humans first cultivated einkorn, the emmer variety of wheat, the natural offspring of einkorn and an unrelated wild grass, Aegilops speltoides, or goatgrass, made its appearance in the Middle East.7 Consistent with the peculiar promiscuity unique to grasses, goatgrass added its genetic code to that of einkorn, resulting in the more complex twenty-eight-chromosome emmer wheat. Grasses such as wheat have the ability to retain the sum of the genes of their forebears. Imagine that, rather than each of your parents contributing half their chromosomes to give you forty-six, they had instead combined all forty-six chromosomes from Mom with all forty-six chromosomes from Dad, totaling ninety-two chromosomes in you. This, of course, doesn’t happen in higher species. Such additive accumulation of chromosomes in grasses is called polyploidy; you and other mammals such as hedgehogs and squirrels are incapable of it. But the grasses of the earth, including the various forms of wheat, are capable of such chromosomal multiplication.
Einkorn and its evolutionary successor emmer wheat remained popular for several thousand years, sufficient to earn their place as food staples and religious icons, despite their relatively poor yield and less desirable baking characteristics compared to modern wheat. (These denser, cruder flours would have yielded lousy ciabattas or bear claws.) Emmer wheat is probably what Moses referred to in his pronouncements, as well as the kussemeth mentioned in the Bible, and the variety that persisted up until the dawn of the Roman Empire.
Sumerians, credited with developing the first written language, left us tens of thousands of cuneiform tablets. Pictographic characters, dated to 3000 BC, describe recipes for breads and pastries, all made by taking mortar and pestle or hand-pushed grinding wheel to emmer wheat. Sand was often added to the mixture to hasten the laborious grinding process, leaving bread-eating Sumerians with sand-chipped teeth.
Emmer wheat flourished in ancient Egypt, its cycle of growth suited to the seasonal rise and fall of the Nile. Egyptians are credited with learning how to make bread “rise” by the addition of yeast. When the Jews fled Egypt, in their hurry they failed to take the leavening mixture with them, forcing them to consume unleavened bread made from emmer wheat.
Sometime in the millennia predating Biblical times, twenty-eight-chromosome emmer wheat (Triticum turgidum) mated naturally with another grass, Triticum tauschii, yielding the primordial forty-two-chromosome Triticum aestivum, genetically closest to what we now call wheat. Because its forty-two chromosomes contain the sum total of the chromosomal content of three unique grasses, it is the most genetically complex of the wheats. It is therefore also the most genetically “pliable,” a quality that would serve genetics researchers well in the millennia to come.
Over time, the higher-yielding and more baking-compatible Triticum aestivum species gradually overshadowed its parents, einkorn and emmer wheat. In the ensuing centuries, Triticum aestivum wheat changed little. By the mid-eighteenth century, the great Swedish botanist and biological cataloger Carolus Linnaeus, father of the Linnaean system of the categorization of species, counted five different varieties falling under the genus Triticum.
Wheat did not evolve naturally in the New World, but was introduced by Christopher Columbus, whose crew first planted a few grains in Puerto Rico in 1493. Spanish explorers accidentally brought wheat seeds in a sack of rice to Mexico in 1530, and later introduced it to the American Southwest. The namer of Cape Cod and discoverer of Martha’s Vineyard, Bartholomew Gosnold, first brought wheat to New England in 1602, followed shortly thereafter by the Pilgrims, who transported wheat with them on the Mayflower.
WILL THE REAL WHEAT PLEASE STAND UP?
What was the wheat grown ten thousand years ago and harvested by hand from wild fields like? That simple question took me to the Middle East—or more precisely, to a small organic farm in western Massachusetts.
There I found Elisheva Rogosa. Eli is not only a science teacher but an organic farmer, advocate of sustainable agriculture, and founder of the Heritage Grain Conservancy (www.growseed.org), an organization devoted to preserving ancient food crops and cultivating them using organic principles. After living in the Middle East for ten years and working with the Jordanian, Israeli, and Palestinian GenBank project to collect nearly extinct ancient wheat strains, Eli returned to the United States with seeds descended from the original wheat plants of ancient Egypt and Canaan. She has since devoted herself to cultivating the ancient grains that sustained her ancestors.
My first contact with Eli began with an exchange of e-mails that resulted from my request for 2 pounds of einkorn wheat grain. She couldn’t stop herself from educating me about her unique crop, which was not just any old wheat grain, after all. Eli described the taste of einkorn bread as “rich, subtle, with more complex flavor,” unlike bread made from modern wheat flour that she believes tastes like cardboard.
Eli bristles at the suggestion that wheat products might be unhealthy, citing instead the yield-increasing, profit-expanding agricultural practices of the past few decades as the source of the adverse health effects of wheat. She views einkorn and emmer as the solution, restoring the original grasses, grown under organic conditions, to replace modern industrial wheat.
And so it went, a gradual expansion of the reach of wheat plants with only modest and continual evolutionary selection at work.
Today einkorn, emmer, and the original wild and cultivated strains of Triticum aestivum have been replaced by thousands of modern human-bred offspring of Triticum aestivum, as well as Triticum durum (pasta) and Triticum compactum (yielding very fine flours used to make cupcakes and other products). To find einkorn or emmer today, you’d have to look for the limited wild collections or modest human plantings scattered around the Middle East, southern France, northern Italy, or Eli Rogosa’s farm. Courtesy of modern human-managed hybridizations and other genetic manipulations, Triticum species of today are thousands of genes apart from the original einkorn wheat that grew naturally, farther apart than you are from the primates hanging from trees in the zoo.
Modern Triticum wheat is the product of breeding to generate greater yield and characteristics such as disease, drought, and heat resistance. In fact, wheat has been modified by humans to such a degree that modern strains are unable to survive in the wild without human support such as nitrate fertilization and pest control.8 (Imagine this bizarre situation in the world of domesticated animals: an animal that can survive only with human assistance, such as special feed or antibiotics, and would otherwise die.)
Differences between the wheat of the Natufians and what we call wheat in the twenty-first century are evident to the naked eye. Original einkorn and emmer wheat were “hulled” forms, simply meaning that the seeds clung tightly to the stem. Modern wheats are “naked” forms, in which the seeds depart from the stem more readily, a characteristic that makes threshing (separating the seed from the chaff) easier, determined by mutations at the Q and Tg (tenacious glume) genes.9 But other differences are even more obvious. Modern wheat is much shorter. The romantic notion of tall fields of wheat grain gracefully waving in the wind has been replaced by “dwarf” and “semi-dwarf” varieties that stand barely a foot or two tall, yet another product of breeding experiments to increase yield and reflecting the extensive genetic changes that this grass has undergone.
SMALL IS THE NEW BIG
For as long as humans have practiced agriculture, farmers have striven to increase yield. Marrying a woman with a dowry of several acres of farmland was, for many centuries, the primary means of increasing crop output, an arrangement often accompanied by several goats and a sack of potatoes. The twentieth century introduced mechanized farm machinery that replaced animal power and increased efficiency, providing another incremental increase in yield per acre. While production in the United States was usually sufficient to meet demand (with distribution limited more by poverty than by supply), many other nations were unable to feed their populations, resulting in widespread hunger.
In modern times, humans have tried to increase yield by creating new strains, crossbreeding different wheats and grasses and generating new genetic varieties in the laboratory. Hybridization efforts involved techniques such as introgression and “back-crossing,” in which offspring of plant breeding are mated with their parents or with different strains of wheat or even other grasses. Such efforts, though first formally described by Austrian priest and botanist Gregor Mendel in 1866, did not begin in earnest until the mid-twentieth century, when concepts such as heterozygosity and gene dominance were better understood. Since Mendel’s early efforts, geneticists have developed elaborate techniques to obtain a desired trait, though much trial and error is still required.
Much of the current world supply of purposefully bred wheat is descended from strains developed at the International Maize and Wheat Improvement Center (IMWIC), located at the foot of the Sierra Madre Oriental mountains east of Mexico City. IMWIC began as an agricultural research program in 1943 through a collaboration of the Rockefeller Foundation and the Mexican government to help Mexico achieve agricultural self-sufficiency. It grew into an impressive worldwide effort to increase the yield of corn, soy, and wheat, with the admirable goal of reducing world hunger. Mexico provided an efficient proving ground for plant hybridization, since the climate allows two growing seasons per year, cutting the time required to hybridize strains by half. By 1980, these efforts produced thousands of new strains of wheat, the highest-yielding of which have since been adopted worldwide, from Third World countries to modern industrialized nations, including the United States.
One of the practical difficulties IMWIC confronted in its push to increase yield is that, when large quantities of synthetic nitrogen-rich fertilizer are applied to wheat fields, the seed head at the top of the plant grows to enormous proportions. The top-heavy seed head, however, buckles the stalk (what agricultural scientists call “lodging”). Lodging kills the plant and makes harvesting problematic. University of Minnesota–trained agricultural scientist Norman Borlaug, working at IMWIC, is credited with developing the exceptionally high-yielding semi-dwarf wheat that was shorter and stockier, allowing the plant to maintain erect posture and resist buckling under the large seed head. Short stalks are also more efficient; they reach maturity more quickly, which means a shorter growing season with less fertilizer required to generate the otherwise useless stalk.
Dr. Borlaug’s wheat-hybridizing accomplishments earned him the title of “Father of the Green Revolution” in the agricultural community, as well as the Presidential Medal of Freedom, the Congressional Gold Medal, and the Nobel Peace Prize in 1970. On his death in 2009, the Wall Street Journal eulogized him: “More than any other single person, Borlaug showed that nature is no match for human ingenuity in setting the real limits to growth.” Dr. Borlaug lived to see his dream come true: His high-yield semi-dwarf wheat did indeed help solve world hunger, with the wheat crop yield in China, for example, increasing eightfold from 1961 to 1999.
Semi-dwarf wheat today has essentially replaced virtually all other strains of wheat in the United States and much of the world thanks to its extraordinary capacity for high yield. According to Allan Fritz, PhD, professor of wheat breeding at Kansas State University, semi-dwarf wheat now comprises more than 99 percent of all wheat grown worldwide.
BAD BREEDING
The peculiar oversight in the flurry of breeding activity, such as that conducted at IMWIC, was that, despite dramatic changes in the genetic makeup of wheat and other crops in achieving the goal of increased yield, no animal or human safety testing was conducted on the new genetic strains that were created. So intent were the efforts to increase yield, so confident were plant geneticists that hybridization yielded safe products for human consumption, so urgent was the cause of world hunger, that products of agricultural research were released into the food supply without human safety concerns being part of the equation.
It was simply assumed that, because breeding efforts yielded plants that remained essentially “wheat,” new strains would be perfectly well tolerated by the consuming public. Agricultural scientists, in fact, scoff at the idea that breeding manipulations have the potential to generate strains that are unhealthy for humans. After all, breeding techniques have been used, albeit in cruder form, in crops, animals, even humans for centuries. Mate two varieties of tomatoes, you still get tomatoes, right? Breed a Chihuahua with a Great Dane, you still get a dog. What’s the problem? The question of animal or human safety testing was never raised. With wheat, it was likewise assumed that variations in gluten content and structure, modifications of other enzymes and proteins, qualities that confer susceptibility or resistance to various plant diseases, would all make their way to humans without consequence.
Judging by research findings of agricultural geneticists, such assumptions are unfounded and just plain wrong. Analyses of proteins expressed by a wheat hybrid compared to its two parent strains have demonstrated that, while approximately 95 percent of the proteins expressed in the offspring are the same, 5 percent are unique, found in neither parent.10 Wheat gluten proteins, in particular, undergo considerable structural change with a method as basic as hybridization. In one hybridization experiment, fourteen new gluten proteins were identified in the offspring that were not present in either parent wheat plant.11 Moreover, when compared to century-old strains of wheat, modern strains of Triticum aestivum express a higher quantity of genes for gluten proteins that are associated with celiac disease.12
The changes introduced into wheat go even further, involving a process called chemical mutagenesis. BASF, the world’s largest chemical manufacturer, holds the patent on a strain of wheat called Clearfield that is resistant to the herbicide imazamox (Beyond). Clearfield wheat is impervious to imazamox, allowing farmers to spray the herbicide on their fields to kill weeds without killing the wheat, similar to corn and soy that are genetically modified to be resistant to glyphosate (Roundup). In its marketing, BASF proudly declares that Clearfield is not the product of genetic modification. So how did they get Clearfield wheat to be herbicide resistant?
Clearfield wheat was developed by exposing seeds and embryos to sodium azide, a toxic chemical used in industrial settings. If the compound is mixed with water or an acid, or comes into contact with metal (for example, as a result of a laboratory accident), it can create a potentially deadly toxic gas. The sodium azide was used to induce genetic mutations in wheat seeds and embryos until the desired mutation was obtained. Problem: Dozens of other mutations were also induced, but as long as the wheat plant did its job in yielding satisfactory bagels and biscuits, no further questions were asked and the end product was sold to the public.13 In addition to chemical mutagenesis, there are also gamma ray and high-dose X-ray mutagenesis, all relatively indiscriminate methods of introducing mutations.
In the semantic game that Big Agribusiness likes to play, these methods do not fall under the umbrella of “genetic modification,” even though they introduce far more genetic change than genetic modification itself. Clearfield wheat is now grown on about a million acres in the Pacific Northwest of the United States.
Surely the wheat industry deserves an honorary doctorate from the Vladimir Putin College of Obfuscation.
A GOOD GRAIN GONE BAD?
Given the genetic distance that has evolved between modern-day wheat and its evolutionary predecessors, is it possible that ancient grains such as emmer and einkorn can be eaten without the unwanted effects that accompany modern wheat products?
I decided to put ancient wheat to the test, grinding 2 pounds of whole einkorn grain into flour, which I then used to make bread. For comparison, I also ground flour from modern organic whole wheat seed. I made bread from both flours using only water and yeast, with no added sugars or flavorings. The einkorn flour looked much like conventional whole wheat flour, but once water and yeast were added, differences became evident: The light brown einkorn dough was less stretchy, less pliable, and stickier than a traditional dough, and it lacked the moldability of modern wheat flour dough. It smelled different, too, more like peanut butter than the standard neutral smell of dough. And it rose less, expanding only slightly, compared to the doubling in size of modern bread dough. Finally, as Eli Rogosa claimed, the bread itself did indeed taste different: heavier, nutty, with an astringent aftertaste. I could envision this loaf of crude einkorn bread on the tables of ancient Amorites or Mesopotamians.