Diagnosing ADHD in the Coming Decades
The future diagnostic criteria for ADHD.
Posted May 5, 2021 | Reviewed by Jessica Schrader
- Differences in diagnostic techniques, and in daily life and expectations, all can play a role in the diagnosis of ADHD.
- When ADHD was first described, people did not have to cope with the many distractions, such as electronics, that we face today.
- It would be ideal to have biological markers for ADHD, but for now we must rely on our necessarily limited diagnostic criteria.
This post is Part 2 in a series.
In our last post, we described the clinical diagnosis of ADHD based on behavioral criteria. But rating behavior is never objective; it’s always an opinion. It is slanted by the perspective and attitudes of the observer, whether that is the patient themselves (their self-evaluation) or others, as well as by the diagnostic biases, training, and experience of the evaluating clinician. Reports of behavior—including the specific words chosen and their connotations—are necessarily influenced by the observer’s definitions of what is “OK.”
Personality differences, expectations based on the observer’s upbringing and belief system, and cultural differences all play a role. “Tough graders”—including really strict teachers, but also other people (parents, spouses) who tend to be behaviorally rigid, demanding, and judgmental—may exaggerate their reports, making it sound like a kid is a perfect hellion in class, or a wreck because they “should” be getting A’s instead of B’s. Or they may opine that their (possibly soon-to-be-ex) spouse is the devil incarnate because of impulsivity and inattentiveness. We might be far less impressed by the severity of these symptoms during the office visit (or the reverse could be true). At the same time, we’re not the ones teaching or raising this child, and we’re not the ones living with this adult—so we may underestimate their level of impairment in the well-defined and self-limited context of a medical consultation.
Building on our discussion in the last post, the point is this: The definition of ADHD has radically changed since before it had its current name, and in the span of about a century it has gone from being considered a “moral deficiency” limited to school-age children to an “attentional deficit” that can span a lifetime. The disorder was first recognized in children who couldn’t perform or behaviorally conform in a rigid classroom setting, and only within the past decade has the psychiatric community officially recognized (via DSM criteria) that kids with ADHD don’t simply “grow out of it”—their symptoms continue, but they develop some coping mechanisms and generally adjust their lives to suit their ADHD. For example, they gravitate away from slower-paced, quiet work with delayed rewards—kids with ADHD don’t usually grow up to become librarians or government administrators awaiting their pension to kick in. Instead, they choose stimulating, fast-paced careers with short-term payoffs and reward systems: sports professionals, salespeople, firefighters, musicians, visual artists, professional chefs, entertainers, retail workers, plus the occasional high-powered serial entrepreneur.
Even with a suitable career, they may have issues at work that recall issues from their years in the classroom. In fact, the practical consequences of ADHD can be worse in adults than in children (as a kid, missing homework results in lower grades or getting grounded; as an adult, not paying attention could cause a fatal car accident, missing work deadlines could get you fired, and not listening could ruin a marriage).
And what about our new technological age? Compared to the early 1900s, when the condition was first noted by a few pediatric specialists, or even the 1960s, when ADHD began to be more widely recognized in mainstream clinical circles, the world is faster paced and more filled with distractions (mainly electronic) than ever. Fewer and fewer of us have an agrarian lifestyle or live surrounded and paced by the rhythms of nature. This is likely to affect how easily distracted we are, and how easily distracted we perceive ourselves and others to be. Diagnostic criteria for ADHD have not been updated fast enough to keep up with this ever-accelerating environmental shift. And what will influence focus in the next decades?
In the future, how will doctors define and diagnose ADHD? Someday, there may be a “brain scan, DNA test, or something” that works for diagnosing ADHD, as some of our patients expect now. Unfortunately, MRIs can currently show only the general structure of the brain. Although they can see into the brain in three dimensions, their resolution is still roughly that of the naked eye—an MRI is not a microscope. Current brain scans can’t reveal subtle differences in brain structure that might underlie ADHD and other behavioral and emotional disorders. Even when our ability to image the brain does reach a much finer level of resolution, just being able to see detailed anatomy will still not be sufficient to tell a doctor how the brain WORKS—i.e., the fine wiring and activity levels of brain cells, let alone what the chemicals whizzing between them are doing. Similarly, while we can sequence a patient’s genome right now, it will be a while before we can “read” a DNA sequence and reliably predict what it means clinically—i.e., which genetic differences, especially in combination, predictably lead to differences in attention (or other psychiatric issues).
Maybe someday there will be a test a doctor can order to get all the information needed to make a diagnosis, but today is not that day. Maybe someday clinicians will look back and laugh at us and at today’s diagnostic criteria—that’s quite likely, almost certain—but for now, the art of medicine and diagnostic criteria remain the best avenue to understanding and diagnosing ADHD.
In Part 3: How we envision future diagnostic criteria, for the not-so-distant future and beyond.