
Self-Centered: The New Normal?

We are living in changing times, in an era of a poorly studied morality shift.

As a child and adolescent psychiatrist, when I listen to my young patients, I often hear personal tales of crisis, confusion, and even despair. Throughout my clinical work with these kids, I try mightily to remain balanced, pairing empathy and hopefulness with relentless work at removing any barriers, be they biological or psychosocial, that block their progress toward happier, healthier, and more productive lives in their communities. These humane and measured emphases are basic tenets of my profession as a physician, and they have remained so since time out of mind.

Yet what has not remained as firmly rooted in time as my professional traditions is the culture itself, the communities in which these kids live. Societal trends have drifted away from an emphasis on community and the common good and moved toward the need to take care of self, perfect oneself, even to the point of self-aggrandizement.

Employing a Google search tool that counts word usage in published writing from 1500 to the present, three academics, Jean Twenge, W. Keith Campbell, and Brittany Gentile, recently mapped out how words and terms appearing in print have drifted away from community-based usage toward more individualistically based ideas. That is, terms and phrases like "self," "unique," "I come first," and "I can do it myself" have appeared in print more frequently, while words like "collective," "share," "band together," and "common good" are receding in usage. Concurrently, words such as "virtue" and "conscience" appear less frequently in printed media, while others about self-betterment and self-strengthening arise more often.

In a real sense, the zeitgeist in the early 21st century whispers to our kids to take care of themselves and ignore the community at large. We are living in changing times, an era of a poorly studied morality shift.

Rather recently, academic researcher Dr. Audrey Longson, speaking at the annual meeting of the American Psychiatric Association in San Francisco, presented data on the interplay between reality TV shows and moral trends afoot in the lives of young people in our society. Specifically, she emphasized narcissism, which is characterized by pride, vanity, and a focus on oneself, even at the expense of others. Using a set of measurement tools, including the online survey platform SurveyMonkey, to reach a group of media-savvy participants, she uncovered a subtle correlation between immersion in both voyeuristic shows like "The Real Housewives of San Francisco" and more skill-based shows like "Survivor" and the development of certain narcissistic traits in their youthful audience. Tendencies toward exhibitionism, voyeurism, the drive to have power over others, and the need to see oneself as very, very special seem more pronounced in youths immersed in these shows than in those who are not. The distinction is even clearer when these reality show viewers are compared with those watching educationally directed shows.

The researcher sees her work as provisional and acknowledges that the question of cause and effect remains fuzzy. Do the shows themselves, in which the Kardashians go on a shopping spree at the mall or Snooki, lunching with a friend, disses her for being overweight, attract youths already prone to being self-centered? Or do they instill in a sensitive viewer a positive impression of these types of behavior, a normalizing of these self-centered acts?

Since my own work with teens and preteens transpires in a clinical setting, not a research one, my bias tips toward seeing both ideas as true. Certain especially vulnerable youths struggling with identity confusion are attracted to such narcissistically driven shows, and the shows may simply strengthen these trends. But many impressionable youths might also find that immersion in these shows sways their perceptions of the normal toward the narcissistic. Since adolescence typically involves confusion about how to see oneself in the context of the wider world, it is easy to see how even a relatively intact adolescent can experience the message delivered by these shows as not just entertaining and amusing but also appealing, even compelling.

If we expand this argument about self and narcissism versus community and the common good a bit further, we find ourselves pondering other large bodies of scientific evidence regarding, for instance, media violence, teen sexuality, the glamorization of drugs and alcohol, or the prevalence of a rather unglamorous trend, childhood obesity. Scientific literature in all these areas suggests that media is having potentially deleterious effects on young people prone to too much screen time. In every instance, the media, if only on a subliminal level, encourages certain behaviors and hence shapes the moral development of the child.

Let us take as examples gluttony, a traditional Christian vice, and moderation in all things, a classical virtue. Most parents see value in these moral imperatives. Yet when children sit for hours in front of screens and absorb a long row of commercials, they hear and see images that encourage them to seek out fatty, salty, and sugary foods. These messages contend with the classical and Christian virtues of moderation and abstemiousness. Children learn to gratify themselves and their senses in the moment, and so they risk obesity. And childhood obesity is reaching alarming levels.

Or, as the epidemic of bullying would suggest, kids now live in a broader culture, buttressed by a media culture, in which the opposites of compassion and empathy, the hallmarks of my therapeutic work, are shoved offstage. The norm has inched toward youths seeing the vulnerable child not as one to be protected, nurtured, and encouraged, but rather as a weak link to be made fun of, treated as a laughingstock, in need of public humiliation, even on Facebook.

I wonder where these trends will lead. In the present generation, which includes the troubled kids sitting in my office, many have become glued to screens, not books, at a level never experienced before in the history of childhood. Per a recent Kaiser Family Foundation study, average daily media use has risen to at least eight hours. With media immersion ascendant, children eschew time spent on friendships, schoolwork, and family. So they are certainly leaving themselves open, for many hours per day, to absorbing the feelings, ideas, and indeed the morality of the media world to which they tune in. And they are diminishing contact with friends, teachers, and family who might offer them radically different perspectives.

In short, the question the researcher at the APA meeting poses regarding cause and effect has a fairly clear answer in the realm of social reality. Hours spent with any attractive and appealing "friend" on TV, in the movies, or on the Internet influence the child's moral development, just as hours spent every day with a good tennis coach strengthen the child's tennis game. Or as close contact with an intellectually curious teacher strengthens a child's desire to learn. Or as hours in close connection with a wise parent enhance a child's sense of right and wrong.

Dr. George Drinka is a child and adolescent psychiatrist and the author of The Birth of Neurosis: Myth, Malady and the Victorians (Simon & Schuster). His new book, When the Media Is the Parent, is a culmination of his work with children, his scholarly study of works on the media and American cultural history, and his dedication to writing stories that reveal the humanity in us all.
