Diagnosing ADHD in the Coming Decades
Part One of Three: ADHD in the 22nd Century.
Posted April 25, 2021 | Reviewed by Lybi Ma
- The definition of ADHD has changed and will change over time.
- New diagnostic tools will evolve to help us understand the biological underpinnings of ADHD.
- Cultural and environmental changes will influence diagnosis as well.
The most pressing question for some new patients coming to the office: “Do I have ADHD?” They may have been ruminating over this question for months, years, or even decades before finally seeking a professional opinion. After a series of questions about their current symptoms, personal history, and family history, we might say “Yes...” Or, “No... probably.” Sometimes we might ask the patient and a significant other or family member to complete a structured ADHD symptom questionnaire, and/or a computerized test of attention and impulsivity (continuous performance test or CPT). After we render our august opinion on their diagnosis, we sometimes get a quizzical look from the patient and a follow-up question: “How did that tell you whether I have ADHD? I thought I was going to get a brain scan, DNA test, or something.”
The truth is, in 2021, asking questions to get a patient's story remains the most useful element in diagnosing ADHD. This is not unique to ADHD; it's true for every other psychiatric disorder and many neurological ones, because we don't have reliable, clinically useful tests for any of them. We don't do blood tests, brain scans, or genetic tests to diagnose depression, anxiety, schizophrenia, or bipolar disorder, and we don't do them for ADHD either, because right now, there aren't any that work. The objective tests we do have, such as a CPT, are unreliable and of little value taken in isolation, as even the makers and marketers of these tests are careful to admit. And so we're left with the longstanding art of medicine: taking a good history, making observations, and then combining that with our clinical experience and expertise to draw conclusions.
We ask the patient: Do you get distracted easily? Have trouble staying on task? Procrastinate? Feel overwhelmed by your “to-do list?” Does this happen so much that you routinely don’t get anything done? Do you have trouble listening or getting organized? Have a tendency to interrupt other people and talk over them? Bring up unrelated topics in the middle of a conversation? Does this happen so pervasively that people have commented negatively about it?
Do you have trouble sitting still in meetings? Fidget with your hands and feet a lot? Make impulsive decisions that surprise other people? Skydive much? And now that we’ve discussed all that, what about the past? Was any of this true for you as a kid? Were your parents always afraid about the next time you’d come home with a broken bone, or with another note from the teacher? Did you daydream all day in class? You seem pretty smart, how were your grades? Did you turn in your homework?
We rarely get through even a fraction of this list, because the patient jumps in and tells us the rest on their own: “Procrastinate? I am the Queen of procrastination.” “The King of distraction.” “I lose things every day.” “OMG, I constantly interrupt people. My friends get super annoyed.” If the patient’s story is convincing and consistent, if there is no better psychiatric explanation, and if symptoms have gone on long enough (for at least six months, with at least some dating back to childhood), then they have ADHD based on current diagnostic criteria.
But notice the phrase “current diagnostic criteria” in the last sentence. These are the criteria we use now, in 2021. But the criteria have changed since the concept of ADHD first started to emerge in the early 1900s. Back then, what we now call ADHD was considered a “moral disorder” (bad kids with bad parenting). Over the next few decades, a notion gained traction that differences in such behavior could relate to brain anatomy and physiology. Still, until the 1960s, there remained no formal recognition by psychiatry that ADHD existed as a clinical entity. In some ways, the ADHD community is still grappling with this legacy.
In the 1960s, doctors started taking ADHD more seriously, but still only in children. At the time, it was called Hyperkinetic Reaction of Childhood, emphasizing the hyperactive and behaviorally disruptive features that were the main concerns of parents and grade-school teachers. Over the next decade, some clinicians began to prefer the term Minimal Brain Dysfunction (to distinguish it from more severe childhood brain disorders, such as cerebral palsy). This new way of thinking had a side benefit: clinical researchers began to consider other possible effects of 'minimal brain dysfunction,' including emotional and attentional features.
In the early 1980s, the DSM (Diagnostic and Statistical Manual of Mental Disorders, the official “bible” of psychiatric diagnosis in the US) changed the name of the disorder to highlight its inattentive features, calling it Attention Deficit Disorder (ADD) with (or without) hyperactivity. The DSM is periodically revised based on new research and clinical experience, and so there have been iterative changes since then, most recently in DSM-5 (published in 2013), which calls the entire condition “ADHD” with added specifiers for presentations that are predominantly inattentive, predominantly hyperactive/impulsive, or combined. For the first time, it also fully embraced the typical continuance of these symptoms and challenges from childhood into adulthood.
That was then. For ideas on the future directions of ADHD diagnosis, check out our post next week for Part Two.
1. https://www.additudemag.com/cpt-not-accurate-adhd-assessment-for-adults…, https://journals.sagepub.com/doi/abs/10.1177/1087054718822060, https://www.karger.com/Article/FullText/508041
2. South Med J 1976 May;69(5):642-53. doi: 10.1097/00007611-197605000-00047