In the late ‘80s, Terri was jealous of the other moms. Their toddlers sat quietly playing with toys while older siblings practiced soccer. Her toddler, Laura, ran up and down the sidelines—corralled from the field over and over by her frazzled mom.*

In grade school, Laura often visited the school nurse with stomachaches—a malady that disappeared as soon as school was over. In high school, stomachaches gave way to anxieties and moodiness. But no one in the mid 1990s pointed out that Laura might have ADHD: It was mainly a boys’ disorder then. Finally, in college, Laura was diagnosed and prescribed a stimulant. For about a year, she gave the diagnosis and its treatment a try. But, after the experiment and some thought, she quit the drugs and the driven and irritable personality they imposed on her. Looking back, both Laura and Terri feel lucky to have avoided the mass uptake of ADHD diagnosis and treatment that characterized the late 1990s and the 2000s—despite Laura’s bumpy childhood and adolescence.

As a philosopher who has spent a decade studying the meshing of medical, social, and scientific influences into the phenomenon we call “ADHD,” I have come to appreciate that Terri and Laura’s relief is well placed. Today’s predominant understanding of ADHD packs social, educational, and family problems; clinical needs for efficiency; and a biological model of mental disorders into a single and ever-expanding package. Some aspects of the packaging and growth have been intentional and overt: direct-to-consumer marketing by drug companies, and broadened diagnostic criteria in the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders. Other pressures—medical expense, strapped schools, social stresses, science funding—have also helped create the diagnostic steamroller: 11% of US children and teens—20% of boys—are now said to have ADHD, and rates among adults are rising fast.

A subtler phenomenon has also been at work. Gradually, the idea that people can be better behaved, higher achieving, and all-around nicer if they are treated for ADHD has hardened into a social, educational, and medical dictum: one ought to be treated, or get one’s children treated, for ADHD. This “ought” shows up in social pressure to get a child evaluated, to give a child a leg up, or to make oneself more productive. The “ought” gets reinforced because in some ways, ADHD is easy to see—at least, a popularized version is easy to see. We see a fidgety boy or a daydreaming girl as having ADHD—largely because this is what we’ve been taught to see. We’re attuned as well to seeing how medication changes the picture: treated, the boy and girl march through their worksheets, confirming what we’ve been taught: ADHD diagnosis and treatment work, so the “ought” makes sense.

But there are several problems with this received “ought.” First, despite our folk observations, little evidence suggests that standard ADHD diagnosis and treatment—which is treatment with stimulant medication and little else—does in fact provide long-term benefits. ADHD-diagnosed children certainly can become socially adept and high-achieving adults (like Laura), but not clearly because of diagnosis and treatment. Second, the “ought” itself is questionable. Why should we embrace the particular view of behavior, niceness, and achievement that “ADHD” and its treatment embed? What other ways might there be to help a child—or oneself—fit in or succeed? Or, more radically, to be happy with not fitting in or succeeding in standard ways? Finally, following through on the “ought” comes with a significant downside: stigma. It’s common today to say that biologizing mental disorder removes stigma—but it doesn’t. It changes stigma. In a biologized view, whatever traits and behaviors are labeled disordered aren’t anyone’s fault, but they are a part of a person for life. So when the traits or behaviors are things society at large looks down on—like the fidgeting and lack of certain types of productivity packed into “ADHD”—choosing the diagnosis also means choosing a negative stereotype.

For Terri and Laura, avoiding and then rejecting the ADHD model worked, in large part because Laura was able to put herself in school and work that played to her strengths. But embracing alternatives to ADHD is not everyone’s best choice: some people find tremendous benefit in diagnosis and treatment. I suggest that when making the choice for oneself or one’s children, it is important to understand that today’s expansive “ADHD” is not an uncontestable fact. Instead, ADHD is just one possible explanation for a wide range of problems, and drugs are one possible solution. ADHD diagnosis and treatment have a history and an outlook one can often reasonably reject.

*The story is true; the names have been changed by request.

About the Author
Susan C. C. Hawthorne, Ph.D., is an associate professor of philosophy at St. Catherine University.

You are reading

Questions of Medical Ethics
