Certain notable epochs are lucky or unlucky enough to be the bearers of elegant, self-zeroing equations, or culturally apt perfect storms. The hubris and entitlement of the late Roman Empire begged for barbarian invasions. The feudal dissolution of the Middle Ages was just asking for a plague for culling and a printing press for democratic restructuring. At hand, two converging fronts, the Digital Age and ADD, seem to betoken a crisis, or less dramatically, a phenomenon, in need of a solution, or again less dramatically, some understanding.
First, to define some terms. The Digital Age is already at risk, or past the risk, of becoming a trite throw-away trope. Beyond a pat moniker for the time of gadgets, web, and smartphones, it should be considered in its most literal form as a crossroads between facile reductionism (all stimuli parsed or built from numerical code) and infinite possibilities (using that coding potential toward a new informational, intellectual, aesthetic creativity). Which path we take will be determined by a struggle between personal and cultural (internal and external) values. To be fair, those values will likely continue to be shaped by the historically big players (greed, lust, need for validation, and hopefully, need for community and engagement).
ADD (Attention Deficit Disorder) is a constellation of symptoms consistent with inattention, distractibility, and trouble listening, with hyperactive and impulsive subtypes showing additional symptoms in those realms. In this thesis, maybe I’m talking more about ASAD (Acquired Situational Attention Disorder), freshly coined here. I’m throwing this out there as a construct because it’s likely that most ADD is not really ADD. In the early ’70s, the prevalence of ADD jumped from 1-2% to 6-7%. Since it is unlikely that the government or a maleficent cell is tweaking our water supply with neurotoxins, it is more likely that we have culturally (which drives medically) expanded the inclusion criteria. Not to mention the exponential hordes who would claim ADD without meeting even the expanded criteria of DSM-5. So, perhaps ASAD can be used to include the broader, gestalt criterion: my attention, focus, and concentration seem at least subjectively, and maybe objectively, inadequate to excel in the career, culture, or any other dynamic that I have chosen to pursue. This might help account for shortfalls like not being able to read or think as fast as your laptop or iPhone can spew, or not being able to watch and comment on as many YouTube videos as, well, there are.
But the sum of the parts is the scary, or less dramatically, interesting part. If in reducing information and impressions to rapidly reproducible code, the Digital Age has infinitely increased our possibilities and choices, must it not necessarily decrease the valence, or value, of any previously more highly esteemed construct (like flooding the economy with printed money)? And if ASAD is rapidly decreasing our predisposition, if not full-on capacity, to focus, concentrate, and, most importantly, discriminate in making choices and decisions, wouldn’t it follow that piling on more devalued choices amounts to gas on the fire? N.B. I'm not for censorship or controlling the means of production, but there's probably a difference between democratization and a tech-novelty-fueled info-frenzy.
Historically, this hyperacceleration in tech and culture seems to have a few ways to go. Scary Apocalypse Mode (late Industrial Revolution into Modernism–Boom–WWI). Adoption/Assimilation Mode (Printing Press–Renaissance–Age of Reason). My fear is that the Digital Age ASAD combo could go a little something like this: infinite but universally devalued choices, in the hands of subjects who have abdicated the valuation of and drive toward personally identified, meaningful (values-based) constructs, start forming constructs based on trends, economics, or nonsense. A volcano blows. We all die.
One parting example: climate change. Infinite data points, anecdotal or trending, are sprawled out before us. No doubt there exists an internally consistent methodology to predict risks, benefits, and futures (as our patient, analog farmers have done). But, possibly, has our acceptance of our own ASAD undermined the value of a good, likely-to-be-true story, to the point where a crazy-pants, hand-waving, look-there’s-a-squirrel story, like “the data’s not in yet,” can slip in seamlessly? Yep… GD