The old, epic story of agriculture in North America had two heroes, long sung and much venerated. One was human ingenuity. The other was corn.
That story went something like this. On this continent, agriculture—and therefore civilization—was born in Mesoamerica, where corn happened to be abundant. The more advanced people there began cultivating this knobbly little plant and passed their knowledge north, to people in more temperate climes. When Europeans arrived, corn ruled the fields, a staple crop, just like wheat across the ocean. If the Middle East’s Fertile Crescent was agriculture’s origin point for Europe, Mexico was agriculture’s origin point here. This very human innovation had unspooled in the same rare way in these two places. Superior men tamed nature and taught other superior men to follow.
Part of this story is true. The first ear of corn—although calling it corn might be a stretch—likely grew somewhere in the highlands of Central Mexico, as far back as 10,000 or so years ago. The oldest known bits of recognizable corn, a set of four cobs each smaller than a pinky finger, are some thousands of years younger than that. They were uncovered in Oaxaca, in 1966, and that site, cuna del maíz, the “cradle of corn,” is in concept a landmark of human advancement on Earth. In appearance, like many archaeological sites, it is unimpressive, a cave so shallow that even the designation “cave” is questionable. But sometimes a whole history is preserved by chance on a dry cave floor. Sometimes a handful of seeds can help confirm a theory about the dawn of agriculture, or help unravel it.
Humans have been living in the valley of Oaxaca for ages; now the main road passes a boomlet of mezcalerías, flat fields of corn, and an antique cliffside etching of a cactus. Some nearby caves, too, have traces of ancient wall paintings—a jaguar, two stick figures, and la paloma, “the dove.” When, starting in 1964, the archaeologist Kent Flannery came to this valley looking for a place to dig, he examined more than 60 of these caves, tested 10 or so, and eventually focused his work on just two. And in one of those, he found some notably old corn cobs. Today, that cave is contained in a biological preserve where council members of the nearest town patrol the grounds and, from time to time, guide visitors up the ridge. Mostly they show off the ancient paintings, in vaulted caves with views that stretch for miles.
The corn cave, which is no taller or roomier than a modest corner office, likely served as a storeroom or shelter for nomadic peoples, who left behind bones and plant detritus as far back as 10,000 years ago. Amid the remains of deer, rabbit, mud turtle, mesquite, pine nuts, squash, and prickly pear, Flannery and his crew found those four scant specimens of corn. These days, the cobs are usually stored in Mexico City’s fabulous Museo Nacional de Antropología, but the winter I visited they happened to be on display in Oaxaca’s cultural museum. They, too, are not much to look at—skinny nubbins of plant, black and cragged with empty spaces where kernels once grew. Really, they’re hardly corn. And that gap, the distance between these hardly-corns and the flush, fleshy ears that sustain nations, is where the old story of agriculture’s origins starts to break down.
The development of agriculture, the Marxist archaeologist V. Gordon Childe declared in 1935, was an event akin to the Industrial Revolution—a discovery so disruptive that it spread like the shocks of an earthquake, transforming everything in its path. Childe’s work on what he termed “the Neolithic Revolution” focused on just one site of innovation in the Near East, the famous Fertile Crescent, but over time archaeologists posited similar epicenters in the Yangtze River valley of East Asia and in Mesoamerica. From that third point of origin, corn is supposed to have converted naive, nomadic hunter-gatherers into rooted, enlightened farmers throughout the continent, all the way up into the northern plains.
This long-held narrative now seems to be incomplete, at best. After all, corn took its sweet time fomenting that revolution—thousands of years to transform from scraggly specimens like the ones found in Oaxaca to full-on corn, thousands more to migrate up from Mesoamerica, and still more to adapt to the growing season at higher latitudes. In the rolling fields of the Midwest, the breadbasket of the United States, maize-based agriculture took over only with Mississippian culture, which began just one short millennium ago.
Over the past few decades, a small group of archaeologists have turned up evidence that supports a different timeline, which begins much, much earlier. Plant domestication in North America has no single center, they have discovered. In the land that’s now the U.S., domestication was not an import from farther south; it emerged all on its own. Before Mexico’s corn ever reached this far north, Indigenous people had already domesticated squash, sunflowers, and a suite of plants now known, dismissively, as knotweed, sumpweed, little barley, maygrass, and pitseed goosefoot. Together, these spindly grasses formed a food system unique to the American landscape. They are North America’s lost crops.
The lost crops tell a new story of the origins of cultivation, one that echoes discoveries all around the world. Archaeologists have now identified a dozen or more places where cultivation began independently, including Central America, Western and Eastern Africa, South India, and New Guinea. Even in the Fertile Crescent, the old story of a single agricultural revolution does not hold. People there domesticated more than one kind of wheat, and they did it multiple times, in disparate places. The agricultural revolution was both global and fragmented, less an earthquake than an evolutionary shift.
If correct, this new reading would debunk what is effectively a “Great Yeoman Theory of History.” No isolated bolts of human inspiration caused a wholesale shift in how humans live and eat; instead, one of civilization’s most important turns would be better understood as the natural outcome, more or less, of biology and botany, a marvel that could (and did) occur almost everywhere that people lived. The global food system that we have now is based on just a tiny fraction of all the plants on Earth. But other paths were always open.
It used to be that few people believed in America’s lost crops. The evidence was too limited, their seeds too small. Think of how tiny quinoa seeds are; pitseed goosefoot is closely related, but its seeds are even smaller—too small to register with Americans as food. A prominent lost-crops scholar, Gayle Fritz, once called this the “real men don’t eat pigweed” problem. At an archaeological symposium in the 1980s, a giant in the field dismissed these plants as little more than food for birds: Fritz recalls him saying something like, “All of the crops that have been recovered from the entire Eastern United States would not feed a canary for a week.”
The evidence that he was wrong has been sitting in archaeological archives for decades. Back in the ’30s, just as the idea of the Neolithic Revolution was taking hold, an archaeologist named Volney Jones was studying seeds found in a rock shelter in eastern Kentucky, similar to Flannery’s cave in Oaxaca. The Kentucky cave was littered with the remains of corn, gourds, and squash, along with the ancient seeds of sumpweed and goosefoot—“local prairie plants,” Jones called them. These plants did register as food to people back then: Some of their seeds were found preserved in human fecal matter. And the seeds were unusually large for plants of the kind, a sign of domestication.
Determining the age of archaeological specimens is an inexact art, and before radiocarbon dating was invented, in the ’40s, it was still less exact. Jones couldn’t say for sure how old the prairie seeds were, but if they were older than the corn and squash, he wrote, “we could hardly escape the startling conclusion that agriculture had a separate origin in the bluff shelter area.” He passed over this idea quickly, perhaps because it seemed so impossible. Even in American archaeology, a relatively quiet corner of human prehistory, a Kentucky cliff was considered a nothing place, where nothing important could have happened. If agriculture had a separate origin here, Western narratives of global human development would have to be rewritten.
“The Ozarks were supposed to be a backwater,” Fritz, who is a paleoethnobotanist and professor emerita at Washington University in St. Louis, told me. “We called it the ‘hillbilly hypothesis of Ozark nondevelopment.’ You know, they were probably mostly hunter-gatherers, throwbacks to the Archaic.” Deep into the first millennium A.D., these people were supposed to have been stuck in subsistence-level living. “Well, it turns out that’s just not true,” Fritz said.
Early in her career, Fritz came across a collection of ancient seeds from the Ozarks, beautiful specimens, many of which were unusually large and some of which had never been examined closely for subtle signs of domestication. Domesticated seeds develop traits that make them more appealing to humans: They are larger than wild ones, offering more nutrition, and sometimes their seed coats are thinner, granting easier access to the succulent bits. When Fritz examined the Ozarks goosefoot seeds, which had been excavated from yet another unassuming cave, she found that by the standards of wild seeds, their seed coats were notably thin. She spent some of her scant funding on accelerator-mass-spectrometry analysis, a new type of radiocarbon dating, to show that the seeds were older than anyone had imagined. “We thought the Ozark rock-shelter assemblages didn’t have much in the way of time depth, maybe 1,000 to 500 years,” she told me. “My dates went back 3,000 years.”
This was in the ’80s. “That was what the game was at that time,” Bruce D. Smith, an archaeologist who dedicated much of his career to plant domestication, told me. “You wanted to get a date and demonstrate the specimen was different from all the wild specimens of the same species.” Smith is now retired (he lives in New Mexico and writes mystery novels), but for decades he was a curator at the Smithsonian Institution’s National Museum of Natural History, in Washington, D.C. He began to look at seed collections held at the museum and found the same results: People in eastern North America had cultivated prairie plants as food. His and Fritz’s analyses, along with similar work from a small group of like-minded scholars, made a convincing archaeological case: People had grown these spindly grasses deliberately, saved their seeds, and then eaten them. Sumpweed, little barley, and goosefoot, these birdseed plants that couldn’t possibly be of interest to humans—they weren’t wild things anymore, but crops.
The seeds Smith studied are still in the collection at the National Museum of Natural History; Logan Kistler, who’s now the museum’s curator of archaeobotany and archaeogenomics, showed them to me. Many are kept these days in one-dram vials, each containing 100 seeds, but Smith originally found 50,000 seeds stored in a single cigar box in the museum’s attic. Under a microscope, a domesticated goosefoot seed looks like a golden disc; some of the seeds in the Smithsonian’s collection are early enough in the process of domestication that they still resemble lumps of coal, black and uneven. It is not entirely clear what about them would have attracted human attention, or led someone to taste one.
Go back far enough, and this is true of so many plants we now eat: Their ancestors were unpalatable, possibly inedible, or even toxic to the human body. Corn itself is descended from a grass called teosinte, the obvious appeal of which is so limited that some researchers once hypothesized that ancient humans were first drawn to the plant for its stalk, as a base for an alcoholic brew. Smith had a theory to explain the draw of the lost crops, though: They were easily available. Ancient people would have encountered them in the flood plains of the Missouri and Mississippi River basins, where water would have cleared ground as a farmer tills a field, creating bountiful spreads of plant-based food.
Or perhaps, as a pair of younger paleoethnobotanists have proposed, it was not only the landscape, but animals—large animals—that led people to these plants. Robert Spengler, who studied with Fritz and now directs the paleoethnobotany labs at the Max Planck Institute for the Science of Human History, thinks that all over the world, people have been attracted to plants that evolved to appeal to grazing animals. In the Mississippi basin, those animals would have been bison. When Spengler first told Natalie Mueller, once his grad-school colleague, now a professor at their alma mater, Washington University in St. Louis, that he thought bison could have led people to the lost crops, she was skeptical. “I was like, ‘Rob, what the hell are you talking about?’” she told me.
But she started to find hints that he might be onto something. Most of the lost crops are rarities these days: Throughout her career, Mueller had painstakingly sought them out on the disturbed land at the edge of human development—the strip between a farmed field and the road, or by a path leading to an old mine. Bison, too, are scarce, but where they have been reintroduced to the prairie, she has had little trouble finding the lost crops. They were growing in the places the animals had cleared.
In other words, before anyone thought to save sumpweed seeds, or plant little barley, perhaps those plants, having come to depend on bison for their survival, were changing to fit the tastes of humans who wandered along the bisons’ trails, gathering food from the stands of grass growing there. In 2019, Mueller started visiting a prairie preserve in Oklahoma more regularly, to see what she might find, and she invited me along. Once I saw the prairie, she told me, I would understand what she meant—that the bison and these plants, thriving together, make their own case. Being there had made her imagine the past anew, and it could do the same for anyone willing to carefully consider how a few overlooked plants now behaved in a landscape that more closely resembled the one where humans would have first met them.
The early morning fog erased the rolling hills of the Joseph H. Williams Tallgrass Prairie Preserve. It erased most of the road ahead, and any sign of the bison—“our big boys,” as Mueller and Ashley Glenn, her friend and go-to botanist, liked to call them. It muted the sun into a smear of yellow; it washed color from the grass, graying the prairie into a dense muddle that hid birds, spiders, and the coyote (or was it a wolf?) that called somewhere in the near distance. But even on a clear morning, I could not have picked out the plant we were seeking—sumpweed, or Iva, as Mueller called it, from its scientific name, Iva annua. Perhaps it should have stuck out: Fall had purpled its leaves and seeds, and it grew tall enough. But mixed among the other grasses, the plant was easy to miss.
Every time Mueller saw it, she perked up. The first specimen we found was puny, but its fruit was chonky—“really big,” she noted with satisfaction—and as we drove through the preserve, she pointed out the Iva lining the road to me and Fritz, who had come on the trip as well: “Oh, there’s Iva … It’s all Iva over here … Look at this stand; it’s a beautiful one.” At one point, she stopped the car suddenly by the roadside, having spotted, she thought, a sunflower (domesticated, too, on this continent, around the same time as Iva), the first she had seen on the preserve, growing right next to Iva, a coincidence that was going to make her head explode, she was saying, when Glenn, who had wandered deeper afield, cupped her hands around her mouth and yelled—
She was standing in a pool of purple that in the late-day light stood out like a bruise against the fading green of the prairie. Even I could pick it out, easily. So much bushy sumpweed surrounded her that she could have stayed in that one spot and harvested for hours.
And to Mueller, that made perfect sense. “Usually the bison are all over this spot,” she told me.
Like humans, bison are landscapers, and their influence on their environs could have been what led people to the lost crops to begin with. Out on the prairie, where the grass and sky swallowed our gangly bipedal figures, the bison were scaled to fit. From a distance, their dark, curved backs dotted hillsides. One morning we found a herd of them gathered near the fence. Spread out in a column 100-some strong, they began to run, harrumphing through the grass, hurtling up and down the dips and ditches beside the road, muscling forward half tons of flesh and clearing paths through the tall grass.
Many of the bison traces we walked were just about wide enough for a single person, and it’s easy to imagine that people traveling the prairies millennia ago would have chosen to follow these paths. Without the bison, the tall grasses grow so thick together that moving anywhere requires tramping down thickets of ornery stalks almost guaranteed to be hiding snakes or other dangers. Whenever we left the road, we sought out these bison traces.
Just like a flood on the banks of a river, bison create the fresh-turned earth that an annual grass needs to sow its seeds. When they’re not galloping across the prairie, bison graze patches into the grass, or wallow in it, clearing plots of land with their massive bulk as effectively as any farmer might and opening ground for small fields of Iva and other lost crops. During one of her first spring visits, Mueller stood in a green pool of growth and marveled at three of them—little barley, maygrass, and tiny Iva seedlings—mingled together, as if someone had planted them for an archaeologist to find. Based on their observations at the preserve, Mueller and Glenn have argued, along with Spengler, that ancient foragers might have first thought of the lost crops as a potential food when they encountered these dense stands along bison trails.
So many domesticated plants started out this way, as what we now derisively refer to as weeds. They showed up and showed up and showed up at the edges of human experience, until someone started interacting with them. Wild grasses would not have been so different from the wolves that hung around the edges of human campgrounds and over time evolved into dogs. Though we rarely give plants credit for such improvisation, some of the more flexible species could have found opportunity, too, in the disturbed ground of those campsite edges.
Seeing the Iva in such abundance on the prairie only reinforces the notion that humans might have begun to gather its seeds, so that selection pressure eventually shaped the plant into a form ever more appealing. In a way, this story is simpler than one that casts humans as heroic inventors who discover agriculture with their big human minds. And this less deliberate version could have happened over and over again, in many places across the planet.
Wheat, barley, and lentils; corn, squash, and beans; rice, peas, potatoes—humans didn’t necessarily choose them as domesticates, and we’re a rebound relationship for some. Like any species, plants can be opportunistic, and many that we now eat had other partners in a previous era, when megafauna dominated North and South America. Squash, for example, started as compact fruit packed with bitter compounds that only mastodons and their ilk could handle. Avocados, too, evolved to feed these giant creatures, with big shiny pits that slid down megafaunal gullets as easily as raspberry seeds pass through ours. But we turned out to be excellent seed distributors too.
We also have our own predilections. Agriculture has slowly rid fruits of bitterness, but the seeds that Mueller and her colleagues harvest from fields, or from the experimental gardens where they’ve grown lost crops, have not undergone that long negotiation with human taste. Boiled or sautéed, goosefoot greens still have a bitter bite. Mueller and the archaeologist Elizabeth T. Horton, another lost-crops scholar, have both tried cooking Iva, with similar outcomes. “It smelled really, really bad,” Horton said. One student had more success grinding it up and making a simple bread. It had “a light herbal flavor,” Mueller reported.
She has in the past dropped off seeds for Rob Connoley, the chef of the St. Louis restaurant Bulrush, whose tasting menus feature locally foraged foods. When I asked him how he handled the lost crops, he described air-popping goosefoot seeds into garnishes, or working them into chocolate, as a sort of “foraged Nestle’s Crunch Bar.” Raw, the seeds have an unappealing flavor—“dusty, earthy, but oily,” in his experience. Iva is even harder to cook with. Connoley and his crew tried shelling, popping, and toasting the seeds, and only that last strategy worked, kind of. Ground into a paste, the toasted seeds were edible, technically, but “imagine tasting house paint,” Connoley said. “It’s not the best thing by itself.”
Confronted with teosinte, corn’s wild ancestor, a chef might have the same trouble. Like the lost crops, teosinte so little resembles what we think of as food that for decades archaeologists argued over whether it could possibly have given rise to corn, or whether they were missing some link, an ancient form of maize. Now that debate is settled: Teosinte is it. At first glance, its long, green leaves do seem like corn’s—I saw a small stand in Oaxaca, grown in the city’s ethnobotanical garden. But it’s wider than corn, less organized in its makeup, and only thin, dried tendrils keep its seeds connected. When the seeds fall to the ground, they look like lost human teeth, gnarled and off-white.
Genetic evidence suggests that domestication makes more sense when you think of it as a long, drawn-out process, rather than an event. At the beginning of a human-plant relationship, humans would have unconsciously exerted selection pressure on plants, which would respond by, say, producing larger seeds or clustering their seeds near the top. Eventually, humans started choosing plants with certain qualities on purpose. Thinking about agriculture’s origins in this way fills some of the gaping holes in the traditional narrative. For instance: How does a person envision a domesticated plant if they’ve never seen a domesticated plant? (They don’t have to.) And how does a society keep after that vision, generation after generation, for the thousands of years that domestication can take? The slow, evolutionary story, as opposed to the fast, revolutionary one, “doesn’t rely on a few clever people in every society making the decision,” Kistler said. “It just happens. It emerges.”
In this evolutionary process, the domestication of any particular plant need not be a one-off. Again, genetic evidence bears this out: Rice was domesticated at least three separate times, in Asia, South America, and Africa. In the Fertile Crescent, domestication took about 2,000 years, and early versions of wheat and other important crops were spread across the region.
Kistler is an archaeologist by training, and he might, on any given day, have ancient plant samples—pale-orange squash, when I visited—sitting out in his cavernous office in the museum’s back halls. Although he sometimes travels far afield in search of new plant material, much of his actual work takes place on a computer, as he searches the genetic code of ancient seeds for secrets about plants’ pasts. His work has helped show, for example, that teosinte’s journey to become fully domesticated corn took thousands of years and spanned continents. And that hardy bottle gourds likely reached the Americas by floating across the Atlantic, to be independently domesticated on this side of the ocean. Looking at domestication at this level of detail has teased out how each emerging partnership between human and plant has its own story: Cassava, a perennial vine whose roots are packed with enough cyanide compounds to cause paralysis or death, necessarily took a different route to domestication than teosinte. A plant that evolved fruits to attract some animal or bird as a seed disperser might have a different meet-cute with humans than one that serves us its seeds or roots.

Some of these stories have ended. In the Middle East, a different type of wheat was domesticated in parallel with the one we eat now, grown for hundreds of years, and then, for some reason, slowly abandoned. It is now extinct. In South India, a staple crop called browntop millet largely disappeared. Almost certainly, archaeologists have yet to unearth evidence of other lost crops; some we’ll never rediscover. The era of agriculture still accounts for only a fraction of human history’s 200,000 years, and even in this short time we have narrowed down our options, discarding whole crop systems. We think of ourselves as omnivorous foodies, but we are picky eaters, dedicated to a small group of select foods.
North America’s lost crops were already disappearing from the archaeological record by A.D. 1200, though here and there people were still cultivating them, sometimes for hundreds of years more. An archaeological site in Arkansas, for instance, contained a trove of fat Iva seeds that date to the 15th century A.D., and a couple of glancing references in the journals of early European arrivals hint that some people might still have been eating goosefoot in the 16th century. Perhaps the upheaval of European colonization ended this agricultural heritage altogether. But by then it was already disappearing.
Why did these plants fall out of use? And, in turn, why did corn succeed? On a genetic level, changes in certain parts of the plant genome are associated with domesticated traits, but no one knows exactly which genetic traits might predispose a plant to flip from wild to domesticated, or which might act as barriers to domestication. If we understood that, it would be possible to say more definitively why so few plants have made it into the human diet and stuck there. “There are 300,000 plant species, and humans have a known use for, like, 10 percent of them,” Kistler said. “We get half our calories from three of them. And we owe our history to a lot more than the ones we think about right now.”
According to its partisans, maize was simply a better crop. But scholars of the lost crops have gone to great pains to show that goosefoot, Iva, and the others are nutritionally competitive with corn. They also know that corn did not supplant the lost crops for hundreds of years. At one moment, corn and those crops thrived as compatible, complementary foods. In a spot not far from where St. Louis sits today, the ancient city of Cahokia, the largest ever discovered dating to the Mississippian period in what’s now the U.S., used to host feasts. Often, Cahokia is considered a corn city, built on maize-centric agriculture, but in the remains of those feasts, squash, sunflower seeds, and all five of the lost crops—maygrass, goosefoot, knotweed, little barley, and sumpweed—are preserved alongside corn cobs.
Those cobs are still only a few inches long, neither the catalyst for domestication in this part of the world nor a panacea that transformed human life here immediately. Corn now rules American fields, but is that a historical contingency, one of those realities that swung a particular way by chance, or the necessary end to the story of American agriculture? In the Andes, goosefoot’s cousin, quinoa, stayed a staple; why didn’t goosefoot settle in America’s midwestern plains? In some parts of the world, crops we think of as winners—crops such as rice—began the process of domestication and then disappeared, nudged into obscurity by biology, history, or both. “I don’t think we’re ready to answer why we have the few dominant crops we have,” Kistler told me.
With the right care and attention, the lost crops might still reveal their allure. They are, Mueller and her colleagues have found, eager to please. In plots scattered across the country, she and a small group of other archaeologists have started cultivating these plants, the first time in hundreds of years that humans have treated them as food. Mueller originally planted her garden with seeds sourced from across the Midwest, including Iva seeds from Arkansas, where Horton had started growing Iva and other lost crops too. For a while, she and Mueller competed over how tall they could get their Iva, Mueller told me. And Horton kept winning.
The plants started with a population of Iva that Horton found right outside her old office, at the Arkansas Archaeological Survey. (She now has her own macrobotanical consulting company, Rattlesnake Master.) That original stand of sumpweed grows “big and healthy and lush and gorgeous,” she told me, but never more than about five feet in height, typical for wild Iva. In the Arkansas garden, the first year, the Iva grew six feet. The next year, seven. Then eight, and sometimes nearly nine feet tall. Already, she’s finding unusually large seeds too. Mueller and Horton think these plants might have descended, distantly, from domesticated Iva, which could explain their quick changes. Or perhaps Iva’s plasticity lets it respond easily to environmental influences. Transforming the plant’s genes such that it becomes a true domesticate might take ages, but perhaps Iva has a natural flexibility in how it expresses those genes. A plant like that, which responds to human influence so readily, might have been attractive, too, even to someone with no conception of domestication.
Ultimately, Mueller hopes that the lost crops might help reveal the fundamental mechanisms of domestication. When I visited her experimental garden plot, she was growing goosefoot, Iva, and erect knotweed, in configurations that might tell her a little more about the secrets their seeds hold. “What I want to do is redomesticate them,” she told me.
Who knows? A generation from now goosefoot could be rebranded as North American quinoa, and eaten across the world; Iva could become an acquired taste. By rediscovering the crops that we’ve lost, we could revitalize our idea of what counts as food. We tend to think that we, in our globalized world, eat a variety of goodies greater than any available to humanity in eras past, but like the professor who couldn’t abide pigweed, we have a narrow vision of what passes muster. Historically, domesticating a particular species might have taken thousands of years, but archaeological experiments have shown that the same work can be done in just a few dozen. If we took our cues from ancient diets, we could quickly expand our pantries again. By sampling some of the first foods humans ever grew themselves, we might think again about the possibilities of the world and its growing things, or of rekindling old relationships for millennia to come. We might notice other plants that are growing on the edge of our experience, and wonder what they have to offer.
The post Our Food System Could Have Been So Different appeared first on The Atlantic.