Do We Age in Stages?
Are there discrete and predictable stages of human development?
Posted Mar 12, 2015
For more than two millennia, metaphors have governed the way that philosophers, physicians, theologians, poets, playwrights, artists, and other authorities have conceived of the life course. The journey of life has been likened to a circle, a cycle, a pilgrimage, or, more recently, to a sequence of psychosexual or psychosocial stages or a series of passages, seasons, transitions, or transformations.
The ancient and medieval worlds divided the human life course into three-, four-, five-, six-, seven-, and twelve-part schemata, drawing analogies between the life stages and the Trinity or the three Magi; the four elements, humours, or seasons; the six days of creation; the seven days of the week or the seven planets; and the twelve months.
During the Renaissance, an astrological model divided the life span into seven ages, as in Shakespeare’s As You Like It, where the stages of life begin with “puking” infancy and end with “second childishness and mere oblivion, sans teeth, sans eyes, sans taste, sans everything.”
Not surprisingly, the Age of Discovery conceived of aging as a journey, and the Enlightenment, with its stress on progress, as a ladder. By the time Darwin published his Origin of Species, the nineteenth century had come to liken life to a game. Indeed, it is no accident that one of the first popular board games, which appeared in 1860, was called the Checkered Game of Life.
The twentieth century gave a scientific gloss to the notion that psychological development takes place through a series of fixed, universal stages.
According to Freud’s theory of psychosexual development, children progress from the oral stage through the anal and phallic stages and the latency period to the genital stage.
Arnold Gesell—the psychologist who coined the phrase “the terrible twos”—argued that there was a biological and neurological basis to children’s maturation, and that the development of their adaptive, cognitive, language, motor, and social skills followed a predictable sequence. It was Gesell’s theory of maturation that gave rise to the notions of developmental schedule and developmental milestones and to the idea of “school readiness”—that children were only capable of performing certain school tasks when they had achieved a certain level of biological and neurological development.
Jean Piaget argued that children passed through four stages of cognitive development—the sensorimotor, preoperational, concrete operational, and formal operational stages—and that at each stage children’s mental processes differed qualitatively from those at later stages.
Erik Erikson’s theory of psychosocial development extended the notion of universal stages across the entire life course. He posited eight stages of personality and identity development, during which an individual had to confront a series of developmental challenges: basic trust versus mistrust, autonomy versus shame and doubt, initiative versus guilt, industry versus inferiority, identity versus identity confusion, intimacy versus isolation, generativity versus stagnation, and integrity versus despair.
Erikson’s view that development continued through adulthood helped inspire psychologists and popular writers alike, including Roger Gould, Daniel J. Levinson, and Gail Sheehy.
In fact, the human life course is shaped less by innate, universal, biologically or neurologically rooted developmental stages, discrete psychological tasks, or predictable crises and passages than by historical, social, and cultural context, which shapes the timing, nature, and meaning of life course transitions.
To be sure, human development has a biological dimension—but even this dimension is more malleable than we sometimes think. The ages at which girls begin to menstruate and at which boys cease growing have varied widely over time.
That life stages are social and cultural constructions is evident in the sudden recognition (or invention) of new stages in the life course at particular moments in time. Take the example of adolescence. Although the word has Latin roots, it did not acquire its modern associations with puberty, psychological storm and stress, defiance, and risk-taking until 1904. That was when G. Stanley Hall, the first American to earn a Ph.D. in psychology, published a massive two-volume work entitled “Adolescence: Its Psychology and Its Relations to Physiology, Anthropology, Sociology, Sex, Crime, Religion, and Education.”
At the time, adolescence as we know it was confined to a small fraction of the population. Some 1.75 million children (18 percent of 10-to-14-year-olds) worked in factories, stores, or on city streets, while millions more toiled on farms. Just six percent of teens graduated from high school.
After the turn of the century, educators, jurists, psychologists, and youth workers—worried about teens trapped in dead-end jobs who spent their time in pool halls and dance halls or on street corners—radically reconstructed the adolescent experience. They restricted child labor, constructed a new juvenile court system, and expanded and extended schooling. On average, a new high school opened every day during the first 30 years of the 20th century.
The Great Depression made adolescence a normative experience, transcending class and ethnic lines. Out of a mixture of altruistic and selfish motives, child labor was finally outlawed. In 1936, for the first time, a majority of 17-year-olds attended high school.
Inside high schools and junior highs (a product of the 1920s), young people created their own distinctive peer culture, with its own styles, language, and customs, including dating, which first appeared during the 1910s. In 1941, the word “teenager” entered the language, referring to a distinctive culture and market rather than to a biological stage of life.
In our own time, psychologists, led by Jeffrey Jensen Arnett, and sociologists like Michael J. Rosenfeld have identified a new life stage. Emerging adulthood is a period of relative independence after the young leave the parental home but before they enter a steady career and establish families of their own. An outgrowth of such developments as the expansion of higher education, delayed marriage, and the growing acceptance of sex outside of marriage, emerging adulthood is, according to Arnett and Rosenfeld, a period of identity exploration, instability, self-focus, and self-discovery, as the young travel and experiment with a variety of jobs and relationships.
The importance of historical context is also apparent in the lives of many members of the “Greatest Generation.” The circumstances that shaped their early lives were the Great Depression and World War II, which forced young women and men to grow up quickly and to take responsibility for themselves and their family’s well-being. In their later lives, straitened economic conditions would encourage an emphasis on family and discourage conspicuous consumption.
The post-World War II economic boom allowed many members of this generation to leave central cities and establish intense, inward-turning, child-centered homes in the rapidly expanding suburbs. For many of these adults, marriage itself was defined less by intimacy, mutuality, conversation, and companionship than by self-sacrifice. Women were expected to sacrifice their individuality and self-fulfillment for the sake of their husband and children, while men were expected to devote themselves to the job that would support their family. Family roles were clearly delineated, with the husband as breadwinner and disciplinarian of last resort, and the wife as mother, homemaker, and epicenter of anything having to do with emotions and relationships.
Nowhere is the role of socio-economic and cultural context in shaping development clearer than in the shifting nature of the transition to adulthood. During the 1950s, school leaving was followed, in rapid succession, by male entry into a full-time job, early marriage, and childbearing. Over just a quarter century, this pattern, thought of as normal at the time, gave way to very different—and much less orderly—trajectories.
No longer is post-secondary school attendance confined to the years between 18 and 21; it often involves intermittent attendance over a larger number of years at a variety of institutions. Meanwhile, while some women bear children early, often outside of marriage, many others postpone marriage until their late twenties or thirties and delay childbearing into their mid- or late thirties.
Two lessons grow out of this brief history. The first is that human development does not end with adolescence or young adulthood but continues across the life span. And while individual development is shaped by particular sociological and economic circumstances and by culturally defined, age-specific values, roles, and expectations, these do not predetermine a person’s path through adulthood, which is a product of one’s personality, choices, opportunities, and good or bad luck.
A second and even more important lesson is that in recent years women and men have acquired far more control over their pathways through adulthood. The norms, roles, and expectations that defined the standard life course have eroded, leaving individuals freer than ever to choose how they wish to live. Choice has become more central to adulthood, as individuals have acquired ever greater freedom to decide whether to marry, cohabit, or live alone; whether to leave a union and start over; and whether or not to bear children.
Freedom can be a burden. It is, of course, easier, in many respects, to follow a prescribed, predictable life path. But it is far better, in the end, to act like a true grown-up and decide for oneself what kind of life one wants to lead.