By Paul Roberts, published on March 1, 1998 - last reviewed on June 13, 2012
Our food is better than ever. So why do we worry so much about what we eat? An emerging psychology of food reveals that when we swap sit-down for take-out, we cut our emotional ties to the table and food ends up fueling our worst fears. Call it spiritual anorexia.
Early in the 1900s, as America struggled to digest yet another wave of immigrants, a social worker paid a visit to an Italian family recently settled in Boston. In most ways, the newcomers seemed to have taken to their new home, language, and culture. There was, however, one troubling sign. "Still eating spaghetti," the social worker noted. "Not yet assimilated." Absurd as that conclusion seems now—especially in this era of pasta—it aptly illustrates our long-standing faith in a link between eating and identity. Anxious to Americanize immigrants quickly, U.S. officials saw food as a critical psychological bridge between newcomers and their old culture and as a barrier to assimilation.
Many immigrants, for example, did not share Americans' faith in large, hearty breakfasts, preferring bread and coffee. Worse, they used garlic and other spices, and mixed their foods, often preparing an entire meal in a single pot. Break these habits, get them to eat like Americans—to partake in the meat-heavy, superabundant U.S. diet—and, the theory confidently held, you'd have them thinking, acting, and feeling like Americans in no time.
A century later, the link between what we eat and who we are is not nearly so simple. Gone is the notion of a correct American cuisine. Ethnic is permanently in, and the national taste runs from the red-hot spices of South America to the piquancy of Asia. U.S. eaters are in fact inundated by choice—in cuisines, cookbooks, gourmet magazines, restaurants, and, of course, in food itself. Visitors are still struck dumb by the abundance of our supermarkets: the myriad meats, year-round bonanza of fresh fruits and vegetables, and, above all, the variety—dozens of kinds of apples, lettuces, pastas, soups, sauces, breads, gourmet meats, soft drinks, desserts, condiments. Salad dressings alone can take up several yards of shelf space. All told, our national supermarket boasts some 40,000 food items, and, on average, adds 43 new ones a day—everything from fresh pastas to microwavable fish sticks.
Yet if the idea of a correct American cuisine is fading, so, too, is much of that earlier confidence we had in our food. For all our abundance, for all the time we spend talking and thinking about food (we now have a cooking channel and the TV Food Network, with celebrity interviews and a game show), our feelings for this necessity of necessities are oddly mixed. The fact is, Americans worry about food--not whether we can get enough, but whether we are eating too much. Or whether what we eat is safe. Or whether it causes diseases, promotes brain longevity, has antioxidants, or too much fat, or not enough of the right fat. Or contributes to some environmental injustice. Or is a breeding ground for lethal microbes. "We are a society obsessed with the harmful effects of eating," grouses Paul Rozin, Ph.D., professor of psychology at the University of Pennsylvania and a pioneer in the study of why we eat the things we eat. "We've managed to turn our feelings about making and eating food—one of our most basic, important, and meaningful pleasures—into ambivalence."
Rozin and his colleagues aren't just talking here about our frighteningly high rates of eating disorders and obesity. These days, even normal American eaters are often culinary Sybils, by turns approaching and avoiding food, obsessing over and negotiating (with themselves) what they can and can't have—generally carrying on in ways that would have flabbergasted our ancestors. It's the gastronomic equivalent of too much time on our hands.
Liberated from the "nutritional imperative," we've become free to write our own culinary agendas--to eat for health, fashion, politics, or many other objectives—in effect, to use our food in ways that often have nothing to do with physiology or nutrition. "We love with it, reward and punish ourselves with it, use it as a religion," says Chris Wolf, of Noble & Associates, a Chicago-based food marketing consultancy. "In the movie Steel Magnolias, somebody says that what separates us from the animals is our ability to accessorize. Well, we accessorize with food."
One of the ironies regarding what we eat—our psychology of food—is that the more we use food, the less we seem to understand it. Inundated by competing scientific claims, buffeted by conflicting agendas and desires, many of us simply wander from trend to trend, or fear to fear, with little idea of what we're seeking, and almost no certainty that it will make us happier or healthier. Our entire culture "has an eating disorder," argues Joan Gussow, Ed.D., professor emeritus of nutrition and education at Teachers College, Columbia University. "We are more detached from our food than at any time in history."
Beyond clinical eating disorders, the study of why people eat what they eat remains so uncommon that Rozin can count his peers on two hands. Yet for most of us, the idea of an emotional link between eating and being is as familiar as, well, food itself. For eating is the most basic interaction we have with the outside world, and the most intimate. Food itself is almost the physical embodiment of emotional and social forces: the object of our strongest desire; the basis of our oldest memories and earliest relationships.
Lessons from Lunch
In childhood, eating and mealtimes figure hugely in our psychic theater. It's through eating that we first learn about desire and satisfaction, control and discipline, reward and punishment. I probably learned more about who I was, what I wanted, and how to get it at my family dinner table than anywhere else. It was there that I perfected the art of haggling—and had my first major test of will with my parents: an hours-long, almost silent struggle over a cold slab of liver. Food also gave me one of my first insights into social and generational distinctions. My friends ate differently than we did—their moms cut the crusts off, kept Tang in the house, served Twinkies as snacks; mine wouldn't even buy Wonder bread. And my parents could not do Thanksgiving dinner like my grandmother.
The dinner table, according to Leon Kass, Ph.D., a culture critic at the University of Chicago, is a classroom, a microcosm of society, with its own laws and expectations: "One learns self-restraint, sharing, consideration, taking turns, and the art of conversation." We learn manners, Kass says, not only to smooth our table transactions, but to create a "veil of invisibility," helping us avoid the disgusting aspects of eating and the often violent necessities of food production. Manners create a "psychic distance" between food and its source.
As we reach adulthood, food takes on extraordinary and complex meanings. It can reflect our notions of pleasure and relaxation, anxiety and guilt. It can embody our ideals and taboos, our politics and ethics. Food can be a measure of our domestic competence (the rise of our souffle, the juiciness of our barbecue). It can also be a measure of our love—the basis of a romantic evening, an expression of appreciation for a spouse—or the seeds of a divorce. How many marriages begin to unravel over food-related criticisms, or the inequities of cooking and cleaning?
Nor is food simply a family matter. It connects us to the outside world, and is central to how we see and understand that world. Our language is rife with food metaphors: life is "sweet," disappointments are "bitter," a lover is "sugar" or "honey." Truth can be easy to "digest" or "hard to swallow." Ambition is a "hunger." We are "gnawed" by guilt, "chew" over ideas. Enthusiasms are "appetites," a surplus, "gravy."
In fact, for all its physiological aspects, our relationship with food seems more a cultural thing. Sure, there are biological preferences. Humans are generalist eaters—we sample everything—and our ancestors clearly were too, leaving us with a few genetic signposts. We're predisposed to sweetness, for example, presumably because, in nature, sweet meant fruit and other important starches, as well as breast milk. Our aversion to bitterness helped us avoid thousands of environmental toxins.
A Matter of Taste
But beyond these and a few other basic preferences, learning, not biology, seems to dictate taste. Think of those foreign delicacies that turn our own stomachs: candied grasshoppers from Mexico; termite-cakes from Liberia; raw fish from Japan (before it became sushi and chic, that is). Or consider our capacity to not only tolerate but cherish such inherently off tastes as beer, coffee, or one of Rozin's favorite examples, hot chilies. Children don't like chilies. Even youngsters in traditional chili cultures like Mexico require several years of watching adults consume chilies before assuming the habit themselves. Chilies do spice up the otherwise monotonous diet—rice, beans, corn—many chili cultures must endure. By rendering starchy staples more interesting and palatable, chilies and other spices, sauces, and concoctions made it more likely that humans would eat enough of their culture's particular staple to survive.
In fact, for most of our history, individual preferences were not only probably learned, but dictated (or even subsumed entirely) by the traditions, customs, or rituals a particular culture had developed to ensure survival. We learned to revere staples; we developed diets that included the right mix of nutrients; we erected complex social structures to cope with hunting, gathering, preparation, and distribution. This isn't to say we had no emotional connection with our food; quite the contrary.
The earliest cultures recognized that food was power. How tribal hunters divided their kill, and with whom, constituted some of our earliest social relations. Foods were believed to bestow different powers. Certain tastes, such as tea, could become so central to a culture that a nation might go to war over it. Yet such meanings were socially determined; scarcity required hard and fast rules about food—and left little room for differing interpretations. How one felt about food was irrelevant.
Today, in the superabundance that characterizes more and more of the industrialized world, the situation is almost entirely reversed: food is less a social matter, and more about the individual—especially in America. Food is available here in all places at all times, and at such low relative cost that even the poorest of us can usually afford to eat too much—and worry about it.
Not surprisingly, the very idea of abundance plays a large role in American attitudes toward food, and has since colonial times. Unlike most developed nations of the time, colonial America began without a peasant diet reliant on grains or starches. Faced with the New World's astonishing natural abundance, especially of fish and game, the European diets many colonists brought over were quickly modified to embrace the new cornucopia.
Yankee Doodle Diet
Gluttony this wasn't; our early Protestantism allowed no such excesses. But by the 19th century, abundance was a hallmark of American culture. The portly, well-fed figure was positive proof of material success, a sign of health. At the table, the ideal meal featured a large portion of meat—mutton, pork, but preferably beef, long a symbol of success—served separately from, and unsullied by, other dishes.
By the 20th century, this now-classic format, which English anthropologist Mary Douglas has dubbed "1A-plus-2B"—one serving of meat plus two smaller servings of starch or vegetables—symbolized not only American cuisine but citizenship. It was a lesson all immigrants had to learn, and which some found harder than others. Italian families were constantly lectured by Americanizers against mixing their foods, as were the rural Polish, according to Harvey Levenstein, Ph.D., author of Revolution at the Table. "Not only did [Poles] eat the same dish for one meal," Levenstein notes, "they also ate it from the same bowl. They therefore had to be taught to serve food on separate plates, as well as to separate the ingredients." Getting immigrants from these stew-cultures, which extended meat via sauces and soups, to adopt the 1A-plus-2B format was regarded as a major success for assimilation, adds Amy Bentley, Ph.D., professor of food studies at New York University.
The emerging American cuisine, with its proud protein emphasis, effectively reversed eating habits developed over thousands of years. In 1908, Americans consumed 163 pounds of meat per person; by 1991, according to government figures, this had climbed to 210 pounds. According to food historian Elisabeth Rozin, author of The Universal Kitchen, our tendency to top one protein with another—a slab of cheese on a beef patty, for example—is a habit many other cultures still regard as wretched excess, and is only our latest declaration of abundance.
There was more to America's culinary cockiness than mere patriotism; our way of eating was healthier—at least according to the scientists of the day. Spicy foods were overstimulating and a tax on digestion. Stews were non-nutritious because, according to the theories of the time, mixed foods couldn't efficiently release nutrients.
Both theories were wrong, but they exemplify how central science had become to the American psychology of food. The early settlers' need for experimentation—with food, animals, processes—had helped feed a progressive ideology that, in turn, whetted a national appetite for innovation and novelty. When it came to food, newer nearly always meant better. Some food reformers, like John Kellogg (inventor of corn flakes) and C. W. Post (Grape-Nuts), focused on increasing vitality through newly discovered vitamins or special scientific diets--trends that show no signs of fading. Other reformers lambasted the poor hygiene of the American kitchen.
In short order, the very concept of homemade, which had sustained colonial America—and is so prized today—was deemed unsafe, obsolete, and low-class. Far better, reformers argued, were heavily processed foods from centralized, hygienic factories. Industry was quick to comply. In 1876, Campbell's introduced its first tomato soup; in 1920, we got Wonder bread and in 1930, Twinkies; 1937 brought the quintessential factory food: Spam.
Some of these early health concerns were valid—poorly canned goods are deadly—but many were pure quackery. More to the point, the new obsessions with nutrition or hygiene marked a great step in the depersonalization of food: the average person was no longer deemed competent to know enough about his or her food to get along. Eating "right" required outside expertise and technology, which American consumers increasingly embraced. "We just didn't have the food traditions to hold us back from the helter-skelter of modernity," says Gussow. "When processing came along, when the food industry came along, we didn't put up any resistance."
By the end of the Second World War, which brought major advances in food processing (Cheerios arrived in 1942), consumers were increasingly relying on experts—food writers, magazines, government officials, and, in ever-greater proportions, advertisements—for advice on not only nutrition but cooking techniques, recipes, and menu planning. More and more, our attitudes were being shaped by those selling the food. By the early '60s, the ideal menu featured plenty of meat, but also dishes concocted from the growing pantry of heavily processed foods: Jell-O, canned or frozen vegetables, green-bean casserole made with cream of mushroom soup and topped with canned french-fried onions. It sounds silly, but then so are our own food obsessions.
Nor could any self-respecting cook (read: mother) serve a given meal more than once a week. Leftovers were now a blight. The new American cuisine demanded variety—different main courses and side dishes every night. The food industry was happy to supply a seemingly endless line of instant products: instant puddings, instant rice, instant potatoes, gravies, fondues, cocktail mixers, cake mixes, and the ultimate space-age product, Tang. The growth in food products was staggering. During the late 1920s, consumers could choose among only a few hundred food products, only a portion of them branded. By 1965, according to Lynn Dornblaser, editorial director at the Chicago-based New Product News, nearly 800 products were being introduced every year. And even that number would soon seem small. In 1975, there were 1,300 new products; in 1985 there were 5,617; and, in 1995, a whopping 16,863 new items.
In fact, in addition to abundance and variety, convenience was rapidly becoming the center of American food attitudes. As far back as Victorian times, feminists had eyed central food processing as a way to lighten the homemakers' burdens.
While the meal-in-a-pill ideal never quite arrived, the notion of high-tech convenience was all the rage by the 1950s. Grocery stores now had freezer cases with fruits, vegetables, and—joy of joys—pre-cut french fries. In 1954, Swanson made culinary history with the first TV dinner—turkey, cornbread stuffing, and whipped sweet potatoes, configured in a compartmentalized aluminum tray and packaged in a box that looked like a TV set. Although the initial price—98 cents—was high, the meal and its half-hour cooking time were hailed as a space-age marvel, perfectly in sync with the quickening pace of modern life. It paved the way for products ranging from instant soup to frozen burritos and, as importantly, for an entirely new mind-set about food. According to Noble & Associates, convenience is the first priority in food decisions for 30 percent of all American households.
Granted, convenience was, and is, liberating. "The number-one attraction is spending time with the family instead of being in the kitchen all day," explains Wenatchee, Washington, restaurant manager Michael Wood, of the popularity of take-out home-cooked meals. These are called "home meal replacement" in industry parlance. But convenience's allure wasn't limited to the tangible benefits of time and saved labor.
Anthropologist Conrad Kottak has even suggested that fast-food restaurants serve as a kind of church, whose decor, menu, and even conversation between counter-clerk and customer are so unvaried and dependable as to have become a kind of comforting ritual.
Yet such benefits aren't without considerable psychic cost. By diminishing the wide variety of social meanings and pleasures once associated with food—for example, by eliminating the family sit-down dinner--convenience diminishes the richness of the act of eating and further isolates us.
New research shows that while the average upper-middle class consumer has some 20 contacts with food a day (the grazing phenomenon), the amount of time spent eating with others is actually falling. That's true even within families: three-quarters of Americans don't breakfast together, and sit-down dinners have fallen to just three a week.
Nor is convenience's impact simply social. By replacing the notion of three square meals with the possibility of 24-hour grazing, convenience has fundamentally altered the rhythm food once bestowed upon each day. Less and less are we expected to wait for dinner, or avoid spoiling our appetites. Instead, we eat when and where we want, alone, with strangers, on the street, on a plane. Our increasingly utilitarian approach to food creates what the University of Chicago's Kass calls "spiritual anorexia." In his book The Hungry Soul, Kass notes that, "Like the one-eyed Cyclops, we, too, still eat when hungry, but no longer know what it means."
Worse, our increasing reliance on prepared foods coincides with a diminished inclination or capacity to cook, which in turn, only further separates us—physically and emotionally—from what we eat and where it comes from. Convenience completes the decades-long depersonalization of food. What is the meaning—psychological, social, or spiritual—of a meal prepared by a machine in a factory on the other side of the country? "We're almost to the point where boiling water is a lost art," says Warren J. Belasco, head of American studies at the University of Maryland and author of Appetite for Change.
Add Your Own... Water
Not everyone was satisfied with our culinary progress. Consumers found Swanson's whipped sweet potatoes too watery, forcing the company to switch to white potatoes. Some found the pace of change too quick and intrusive. Many parents were offended by the pre-sweetened cereals in the 1950s, preferring, apparently, to spoon the sugar on themselves. And, in one of the true ironies in the Age of Convenience, lagging sales of the new just-add-water cake mixes forced Pillsbury to un-simplify its recipes, excluding powdered eggs and oil from the mix so that homemakers could add their own ingredients and feel they were still actively participating in cooking.
Other complaints weren't easily assuaged. The post-WWII rise of factory food sparked rebellions by those who feared we were becoming alienated from our food, our land, our nature. Organic farmers protested the rising reliance on agri-chemicals. Vegetarians and radical nutritionists repudiated our meat passion. By the 1960s, a culinary counterculture was underway, and today, there are protests not just against meat and chemicals, but fats, caffeine, sugar, sugar substitutes, as well as foods that are not free-range, that contain no fiber, that are produced in an environmentally destructive way, or by repressive regimes, or socially unenlightened companies, to name but a few. As columnist Ellen Goodman has noted, "Pleasing our palates has become a secret vice, while fiber-fueling our colons has become an almost public virtue." It has fueled an industry. Two of the most successful brands ever are Lean Cuisine and Healthy Choice.
Clearly, such fads often have a scientific basis--the research on fat and heart disease is hard to dispute. Yet just as often, evidence for a particular dietary restriction is modified or eliminated by the next study, or turns out to have been exaggerated. More to the point, the psychological appeal of such diets has almost nothing to do with their nutritional benefits; eating the right foods is for many of us very satisfying--even if what's right may change with the next day's newspapers.
In truth, humans have been assigning moral values to foods and food practices forever. Yet Americans seem to have taken those practices to new extremes. Numerous studies have found that eating bad foods--those prohibited for nutritional, social, or even political reasons—can cause far more guilt than any measurable ill-effects might warrant, and not just for those with eating disorders. For example, many dieters believe they have blown their diets simply by eating a single bad food—irrespective of how many calories were ingested.
The morality of foods also plays a huge role in how we judge others. In a study by Arizona State University psychologists Richard Stein, Ph.D., and Carol Nemeroff, Ph.D., fictitious students who were said to eat a good diet—fruit, homemade wheat bread, chicken, potatoes—were rated by test subjects as more moral, likable, attractive, and in shape than identical students who ate a bad diet—steak, hamburgers, fries, doughnuts, and double-fudge sundaes.
Moral strictures on food tend to be heavily dependent on gender, with taboos against fatty foods strongest for women. Researchers have found that how much one eats can determine perceptions of attractiveness, masculinity, and femininity. In one study, women who ate small portions were judged more feminine and attractive than those who ate larger portions; how much men ate had no such effect. Similar findings turned up in a 1993 study in which subjects watched videos of the same average-weight woman eating one of four different meals. When the woman ate a small salad, she was judged most feminine; when she ate a big meatball sandwich, she was rated least attractive.
Given the power that food has over our attitudes and feelings for ourselves and others, it's hardly surprising that food should be such a confusing and even painful subject for so many, or that a single meal or a trip to the grocery store can involve such a blizzard of contradictory meanings and impulses. According to Noble & Associates, while just 12 percent of American households demonstrate some consistency in modifying their diets along health or philosophical lines, 33 percent exhibit what Noble's Chris Wolf calls "dietary schizophrenia": trying to balance their indulgences with bouts of healthy eating. "You'll see someone eat three slices of chocolate cake one day and just fiber the next," Wolf says.
With our modern traditions of abundance, convenience, nutrition science, and culinary moralizing, we want food to do so many different things that just enjoying food as food has come to seem impossible.
The New Pornography?
In this context, the welter of contradictory and bizarre food behaviors seems almost logical. We're bingeing on cookbooks, food magazines, and fancy kitchenware—yet cooking far less. We chase the latest cuisines, accord celebrity status to chefs, yet consume more calories from fast food. We love cooking shows, even though, Wolf says, most move too fast for us to actually make the recipe at home. Food has become a voyeuristic pursuit. Instead of simply eating it, says Wolf, "we drool over pictures of food. It's food pornography."
There is evidence, however, that our obsession with variety and novelty may be on the wane or at least slowing down. Studies by Mark Clemens Research show that the percentage of consumers who say they're "very likely" to try new foods has dropped from 27 percent in 1987 to just 14 percent in 1995—perhaps in response to the overwhelming variety of offerings. And for all that magazines like Martha Stewart Living lend to culinary voyeurism, they may also reflect a yearning for traditional forms of eating and the simpler meanings that go with them.
Where can these impulses lead us? Wolf has gone so far as to rework psychologist Abraham Maslow's "hierarchy of needs" to reflect our culinary evolution. At the bottom is survival, where food is simply calories and nutrients. But as our knowledge and income grow, we ascend to indulgence—a time of abundance, 16-ounce steaks, and the portly ideal. The third level is sacrifice, where we begin removing items from our diet. (America, says Wolf, is firmly on the fence between indulgence and sacrifice.) The final level is self-actualization: everything is in balance and nothing is dogmatically consumed or avoided. "As Maslow says, nobody ever really gets to be completely self-actualized—just in fits and starts."
Rozin, too, urges a balanced approach, particularly in our obsession with health. "The fact is, you can eat almost anything and grow and feel good," Rozin argues. "And no matter what you eat, you will eventually face deterioration and death." Rozin believes that in subordinating enjoyment to health, we've lost far more than we know: "The French have no ambivalence about food: it's almost purely a source of pleasure."
Columbia's Gussow wonders whether we simply think too much about our food. Tastes, she says, have become far too complex for what she calls "instinctive eating"—choosing foods we really need. In ancient times, for example, a sweet taste alerted us to calories. Today, it may indicate calories, or artificial sweetener; it may be used to hide fat or other flavors; it may become a kind of background flavor in nearly all processed foods. Sweet, salty, tart, spicy—processed foods are now flavored with incredible sophistication. One national brand of tomato soup is sold with five different flavor formulations for regional taste differences. A national spaghetti sauce comes in 26 formulations. With such complexities at work, "our taste buds are constantly being fooled," Gussow says. "And that forces us to eat intellectually, to consciously assess what we eat. And once you try to do that, you're trapped, because there's no way to sort through all these ingredients."
And how, exactly, are we to eat with more pleasure and instinct, less anxiety and less ambivalence, to regard our food less intellectually and more sensually? How can we re-connect with our food, and all the facets of life that food once touched, without simply falling prey to the next fad?
We can't--at least, not all at once. But there are ways of beginning. Kass, for example, has argued that even small gestures, such as consciously halting work or play to fully focus on your meal, can help recover an "awareness of the deeper meaning of what we're doing" and help mitigate the trend toward culinary thoughtlessness.
University of Maryland's Belasco has another strategy that begins with the simplest of tactics. "Learn to cook. If there is one thing you can do that is very radical and subversive," he says, "it is either starting to cook, or picking it up again." To create a meal from something other than a box or can requires reconnecting—with your cupboards and refrigerator, your kitchen utensils, with recipes and traditions, with stores, produce, and deli counters. It means taking time—to plan menus, to shop, and, above all, to sit and enjoy the fruits of your labors, and even invite others to share. "Cooking touches a lot of aspects of life," says Belasco, "and if you are really going to cook, then you're really going to have to rearrange a lot of the rest of how you live."