The SMART Act
I doubt that it will help with our obsessive smartphone behaviors.
Posted Aug 08, 2019
Senator Josh Hawley recently introduced the Social Media Addiction Reduction Technology (SMART) Act to deal with, in his words, “social media addiction.” The Act is pretty straightforward, and yet I believe it will neither garner support in the Senate nor reduce social media addiction. Here are the positives.
In the “Findings” section, the Act asserts that (1) the business model for many Internet companies—particularly social media companies—is to capture users’ attention; (2) to capture attention, some companies design their platforms to exploit brain physiology and human psychology; and (3) these design choices interfere with users’ free choice. I can’t argue with any of these findings.
The goal of the attention economy is to use any and all tools to grab a person’s attention and keep their eyeballs glued to a company’s screens, starting with the choice of icon image and colors and extending to behavioral psychology tools such as streaks, infinite scrolling, self-loading media, and in-game purchases, among many others. I watch my young grandchildren play their “free” games, only to hear them ask their dad if they can play longer and whether they can pretty please just buy this tool that will help them get to the next level.
One place where I disagree is in placing all the blame on social media companies and in focusing on screen time as the manifestation of the addiction. Some of my concerns stem from what the SMART Act actually stipulates. First, I am concerned about framing this as a case of social media “addiction” that can be solved by reducing screen time.
I encourage the reader to check out Dimitri Christakis’ op-ed in JAMA, where he discusses the challenges of understanding digital addiction in children and comments:
“It is not as simple as time spent on a device or activity but rather how that time is spent that matters. Therein lies the challenge. Disentangling the complicated effects of media usage to establish guidelines that can inform public policy and industry regulations requires a fine level of granularity. Even drawing distinctions between social media vs. passive viewing or gaming is inadequate. For example, 1 hour of social media usage could be spent in an online support group. For an LGBTQ (lesbian, gay, bisexual, transgender, queer) teen, such a community can be an invaluable and otherwise unavailable supportive resource, but for a teen with an eating disorder that social media exposure may normalize and even encourage the behavior.”
The SMART Act specifies that social media companies must stop using any of the following tools: (1) infinite scroll or auto-refill, (2) elimination of natural stopping points, (3) autoplay, and (4) badges and other awards for using the site. Again, I don’t disagree that these are all tools that can keep someone glued to a screen for a period of time that disrupts relationships and might harm well-being. I say “might” because the research is not completely convincing one way or the other, with some studies showing effects on anxiety, stress, and depression while others demonstrate no impact. And most of these studies are correlational, which precludes attribution of causality.
The SMART Act goes on to stipulate that a user should be “allowed” to set a time limit to block his or her access (with an allowance for exceeding the time limit in increments of one minute at a time). According to the Act, the platform must limit use to 30 minutes a day unless “... the user elects to adjust or remove the time limit.” The Act further requires devices to provide usage data across days, weeks, and months for each platform, as well as a pop-up every 15 minutes to alert the user to the time spent on that platform. The Act also requires platforms to display opt-in and opt-out choices in identical color, size, shape, font, etc. Finally, the Act requires the creation of a commission to investigate these issues and report to Congress at least every three years.
There’s a lot more in the Act that you can read yourself, but here are my issues with it. First, it provides little or no research basis linking these tech-company practices, screen time, and social media addiction. At best, the research is controversial and correlational. Remember that smartphones are less than a dozen years old, and social media is about the same age. It takes years to establish the kind of paradigm shift the Act proposes, and even then there is always controversy.
Take video gaming as an example. Is it harmful? Do violent video games cause aggressive acts? If you peruse the literature, you will see two staunch camps that have been arguing these issues with a variety of research tools for decades, and yet there is still no consensus. This is why Internet gaming disorder did not make it into the main text of the most recent DSM-5 (Diagnostic and Statistical Manual of Mental Disorders), the bible for diagnosing psychiatric disorders. It did end up in the appendix, suggesting it may be considered for a future edition pending further convincing research.
My second issue is with the concept of “addiction” itself. Addiction has a specific psychiatric meaning that involves, among other symptomatology, needing more use over time to gain the same pleasure, dishonesty about usage, neglect of work or school, and neglect of relationships. I have never been convinced that social media use fits the criteria for addiction. Yes, some people might show the listed symptoms, but most do not.
What we find in our research is that anxiety plays a major role in why someone is constantly checking social media, sometimes referred to as FOMO (fear of missing out) or nomophobia (the fear of being without phone or Internet access). Anxiety-based disorders differ from addictive disorders in that the “obsessed” user checks in to, say, Instagram to reduce the anxiety of not knowing what has been going on since the last check-in.
In our models predicting sleep problems and academic performance, we find this anxiety is a prominent predictor but is also mediated by social media use, among other tech-related variables. That means the more technological anxiety you have, the more you use social media, and in turn the more sleep problems you have or the worse you do in class.
As I see it, problematic reactions to technology, particularly social media, are on a sliding scale from addiction at one end to obsession at the other. Sometimes we check Instagram and get pleasure out of seeing pictures of our family and want more of that feeling, putting us on the addiction side. Other times we check because we haven't checked for a while and it helps reduce our anxiety of not being on top of what is going on, putting us on the obsession side. Sometimes it's a bit of pleasure and a bit of obsession and we slide back and forth on the continuum.
My third issue is that the Act simply assumes that users, when shown their massive tech use, will slap their foreheads and set strict limits on their usage. This not only sounds ridiculous, but it is also false. In several of our studies, we had subjects install an app on their phones to monitor daily smartphone unlocks and minutes of use and send the data to us four times during the semester. When asked, “Did you look at the data before sending?” the answer was “yes.” When asked whether it was more, the same, or less than expected, the answer was “more.” When asked, “Did you do anything to reduce your phone usage?” the answer was “NO!”
We just concluded a study with teens and Millennials in which we provided participants with the same baseline app data but then allowed them to select two to four strategies in two of the following categories: enhance communication, enhance focus and attention, enhance well-being, and reduce sleep problems. We have data from more than 300 Millennials and 75 teens, including their self-reported daily technology use, their app-reported phone usage, other psychological measures (anxiety, stress, depression, boredom), and a written report on the impact of adopting these strategies. I hope to report on these data sometime in the next few months.