Technology Designed for Addiction
What are the dangers of digital feedback loops?
Posted Jan 04, 2018
One of the things that Wade L. Robison discusses in his book Ethics Within Engineering is the importance of avoiding “error-provocative” designs, in which the technological artifact not only allows for human fallibility but actively steers the user in the direction of harm. He spends a great deal of time on stove knobs, for example. I imagine most of us have had the frustration of thinking we are activating one burner, say the left front, only to find that another, the back right, has actually heated. It turns out that every stove manufacturer aligns the knobs to the burners slightly differently, and this design problem is surprisingly complex. Most of the time, the mismatch between knobs and burners causes mere frustration, but in some cases it can start a fire, and a house fire can reach out-of-control levels within a minute or two. Robison also discusses a plane crash in Colombia tied to confusion over the flight-automation software, and other instances of design problems leading to injury and death, like the GM ignition-switch defects recalled in 2014.
Tech companies now face similar criticism for harms rooted in design. Much recent discussion has focused on social media companies and the way they manipulate human psychology to keep eyeballs glued to the screen. There is the problem of the “time suck” of compulsively checking Facebook, but there are also the social problems these technologies appear to be exacerbating. Even some tech executives have joined the criticism of Silicon Valley giants, or have been forced onto the defensive after years of wide-eyed enthusiasm for social media. A video of a Stanford interview with former Facebook executive Chamath Palihapitiya has been making the rounds on the internet, in which Palihapitiya directs devastating criticism at the social media giants. “The short-term dopamine-driven feedback loops that we have created are destroying how society works,” Palihapitiya says, having grown jaded with the pursuit of venture-funded, short-term gains at the expense of quality of life and civic dialogue. “It literally is a point now where I think we have created tools that are ripping apart the social fabric of how society works. That is truly where we are. I would encourage all of you, as the future leaders of the world, to really internalize how important this is. If you feed the beast, that beast will destroy you.” That this criticism comes from a high-level tech insider makes it that much more shocking.
Think for a second about how some of the most powerful technology the world has ever seen is now being used not to solve world hunger or send astronauts to the moon, but to get people to click on ads and buy stuff on Amazon. The feedback loops of social media also drive political polarization and confirmation bias, as we are constantly pushed toward content that aligns with what we already believe and fits the demographic groups to which we already belong. As we grow accustomed to the creep of technology into our lives, this comes to seem completely normal. Those of us who remember life before the internet will become fewer and fewer, and eventually no one will know what life was like without constant access to the internet and social media. No one will remember what it was like to eat dinner without taking a picture of it or to have a conversation without referencing a meme. All of this “disruption” is driven by technologies purposely designed to be addictive.
Nir Eyal, a friend of mine from college, wrote a book called Hooked: How to Build Habit-Forming Products, in which he outlines, step by step, the operant conditioning tricks used to make an app addictive. The key is variable rewards: digital “treats,” like Reddit upvotes and gold, the gems and coins in various games, or the likes on Facebook, are distributed only intermittently, so the user comes to anticipate the slight rush of the fleeting reward. Because the reward is unreliable, a twitchy checking behavior is triggered: we feel we have to keep checking for messages, likes, and status updates. Eyal is a consultant for companies looking to build these habit-forming features into their products, but, to his credit, he does include a section on ethics in his book, titled “The Morality of Manipulation,” and he avoids the trap of fobbing off responsibility for addictive products onto the end user. He stresses that designers should take the good of the end user into account when offering a digital product or service. But he also implicitly recognizes that, with the advent of social media and digital advertising, we are conducting a massive, uncontrolled experiment on the human psyche.
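The variable-reward pattern Eyal describes is, at bottom, a variable-ratio reinforcement schedule from operant conditioning. Here is a minimal sketch of the idea in Python; the function name and the 30 percent reward probability are illustrative assumptions, not figures from the book:

```python
import random

def check_feed(reward_probability=0.3):
    """One 'check' of an app. On a variable-ratio schedule the
    reward (a like, an upvote, a gold award) arrives unpredictably,
    which is exactly what keeps the user checking again."""
    return random.random() < reward_probability

# Twenty checks: the user cannot predict which one will pay off,
# so every single check carries a small jolt of anticipation.
random.seed(0)
rewards = [check_feed() for _ in range(20)]
print(f"{sum(rewards)} rewards in {len(rewards)} checks")
```

Because the payoff is intermittent rather than guaranteed, the checking habit is far more resistant to extinction than it would be under a fixed, predictable schedule, the same principle slot machines exploit.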
In my experience, some people have the ability to override the addictive impulse, and others are less able to. All of us, to some extent, can say to ourselves, “Okay, if I scroll all the way to the bottom of this feed, it will just load a new batch of entries.” The continuous nature of feeds on Facebook and Reddit leaves no natural stopping point where it would make sense to just quit surfing. Apps that swipe left and right mean there are loops in all directions: up, down, and sideways. So we can know, logically, that the feed keeps going forever, but can we take the next step and disconnect from the platform? There are multiple sites dedicated to video game addiction, with some games, World of Warcraft being the most famous example, leading users to quit their jobs, neglect their children, and get divorced, all just to keep playing. As games become even more immersive, with augmented-reality and virtual-reality features combined with monetized incentives and built-in conditioning, the addictive pull seems likely to grow in the years ahead. So far, Facebook and Twitter use is less stigmatized than compulsive video game playing, but it is arguably just as addictive.
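The “no natural stopping point” design can be made concrete with a short sketch: an infinite-scroll feed is essentially a generator that never terminates. The function names here are hypothetical, not any platform’s real API:

```python
import itertools

def fetch_batch(offset, size):
    """Stand-in for a backend that always has more content to serve."""
    return [f"post {i}" for i in range(offset, offset + size)]

def endless_feed(batch_size=10):
    """Yield entries forever. Reaching the 'bottom' just triggers
    the next batch, so the feed never signals that it is done."""
    offset = 0
    while True:
        for item in fetch_batch(offset, batch_size):
            yield item
        offset += batch_size

# The only way to stop is for the reader to impose a limit themselves:
first_25 = list(itertools.islice(endless_feed(), 25))
```

A paginated design with an explicit last page would give the user a cue to stop; the endless generator deliberately removes that cue.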
You may be reading this and thinking, “Okay, so people are checking their phones all the time, posting pictures of themselves on Instagram, and getting addicted to dorky little games, but what’s the problem? Where’s the harm?” Well, starting with actual death: the CDC estimates that every day in the United States, nine people are killed and more than 1,000 injured as a result of distracted driving. Distracted driving is nothing new (Americans have been eating hamburgers and fries behind the wheel for decades), but texting while driving is particularly dangerous because it combines three types of distraction at once. Looking at a text takes the driver’s eyes off the road (visual distraction), at least one hand off the wheel (manual distraction), and the mind off the task (cognitive distraction). It’s easy to say that people should not text and drive, and many states and countries have passed laws to that effect. But the design problem, the “error-provocative” aspect of the technology, has not been addressed in a meaningful way. More accidents seem likely, such as the recent train derailment in Washington State, in which the engineer was distracted, possibly as a result of cellphone use, and three people were killed. A comprehensive solution would link the cellphone to the heavy machinery in such a way that the engine would not run until certain cellphone features were disabled.
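The interlock proposed at the end of the paragraph could be sketched as a simple check; the feature names and the idea of a mandatory phone-vehicle pairing are entirely hypothetical here, since no such standard exists:

```python
# Features that would have to be disabled before the engine will run.
DISTRACTING_FEATURES = {"texting", "social_media", "video"}

def engine_may_start(paired_phone_features):
    """Hypothetical interlock: refuse to start the engine while any
    distracting feature is still enabled on the paired phone."""
    return DISTRACTING_FEATURES.isdisjoint(paired_phone_features)

# A phone in 'driving mode' with only navigation enabled passes;
# one with texting still switched on does not.
print(engine_may_start({"navigation", "emergency_calls"}))  # True
print(engine_may_start({"texting", "navigation"}))          # False
```

The hard part is not this check but enforcement: pairing would have to be mandatory and tamper-resistant, which is why the fix is a matter of regulation and design, not of individual willpower.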
To shift to more intangible harms: wage stagnation has gotten a lot of headlines in recent years as income inequality grows. Meanwhile, productivity has soared. The excess productivity has gone toward increasing corporate profits and C-suite pay, while workers have been largely shut out of the monetary benefits. Some of those gains in productivity and efficiency, I suspect, are being channeled directly into social media use, in the absence of increased wages. Even as workers feel more stressed and undervalued than ever before, they are not working fewer hours. Instead, they channel job-related stress and dissatisfaction into social media and digital games, in a slowdown strike of massive proportions. Users in the United States now spend about five hours per day on their smartphones, and time spent on apps has outpaced time spent watching television. Instead of a shorter, more compressed workday, work is spread throughout the waking hours, with frequent “breaks” on social media. Of course, these “breaks” are actually unpaid work performed on behalf of the social media companies themselves. Every vacation photo, every email, every digital interaction is now a vector for big-data profiteering. This would not be a problem if life satisfaction and overall health were improving, but incomes are stagnating, and U.S. life expectancy has actually declined in recent years.
According to a paper by the economist Devrim Dumdalug in the International Review of Economics, income levels in the United States doubled in the postwar period, but self-reported happiness barely budged. Since the economic downturn of 2007-2008, productivity has risen while wages have stagnated. Again, this would not be a problem if workers were generally happy and healthy, but instead we have huge public health problems: obesity, diabetes, and high blood pressure, along with rising rates of depression and anxiety, which suggests that digital entertainment is not the best use of leisure time. Given the choice between a walk outside and twenty minutes on Facebook, the better choice for both mental and physical health is the walk. My main point here is that people are not making the healthier choice, because of the addictive features built into the technology. Rather than simply blaming the end user, we should hold technology companies accountable for the way they deliberately foster addictive behavior.
We need to stop using words like “addictive” and “disruptive” as though they were compliments, and demand that both corporations and governments take the public good into account when designing and regulating new products and services. Technology by itself doesn’t automatically improve our lives: it has to be used in conscious and deliberate ways to improve our wellbeing. Designers and engineers must stop building distraction into the systems we use, because our safety and wellbeing are at stake. In the meantime, it seems we are left to our own devices (no pun intended!) to stop scrolling and start living the sorts of lives we want to live. The first step is to realize that addiction is built right into the app and to become more conscious users of social media. Then we can begin to reclaim our time and live saner, more peaceful, and healthier lives. We don’t have to give up technology, even if that were possible, but we do have to be more careful in our use of it.