The medium is the message.
- Marshall McLuhan
In his new book, The Circle, Dave Eggers has imagined a near future that could be our own. Mae Holland, a twentysomething, has been hired by the Circle, a massive internet company in the Bay Area that has absorbed all other social networks and search engines. Mae thought she was in heaven, working for what seemed to be the most innovative, greenest, and most humane corporation on earth. One central innovation was TruYou: “one account, one identity, one password, one payment system, per person… no more passwords, no multiple identities… one button for the rest of your life.” Another technology, again not at all far off, was SeeChange, a network of millions of cheap HD digital cameras providing real-time video capture of every nook and cranny of planet Earth. The Circle celebrated this seemingly humanitarian form of surveillance with such credos as “All that happens must be known,” “Sharing is caring,” and “Privacy is theft.” Mae, who worked in “customer experience,” amassed thousands of “smiles” (likes), constantly under the surveillance of a complex technology infrastructure that rewarded increased sharing, transparency, and the erasure of any work-home boundary.
Orwellian allusions abound in The Circle, though Eggers has captured our uniquely modern preoccupations and anxieties in his fable. There are obvious references to Google. Some employees have “gone retinal” at their workstations, forgoing their archaic tablets and laptops. The architecture of the Circle’s campus features plate glass because it’s all about transparency. Of course, it isn’t merely transparency, as we discover, but surveillance and control. The glass of the Circle is more like the glass of an aquarium: glass that contains a prison of sorts, complete with edible plankton, small fish, and one large shark. At one point, an exasperated Mae proclaimed that “the volume of information, of data, of judgments, of measurements, was too much, and there were too many people, and too many desires of too many people and too many opinions…all of it constantly collated, collected, added and aggregated, and presented to her as if that all made it tidier and more manageable — it was too much.” Without giving anything away, we discover that increased transparency and data collection have their downsides. Which is another way of saying everything turns into a totalitarian nightmare.
As in the fictitious world of The Circle, we also live in a world where our personal information is being harvested more rapidly and more covertly than ever. Certainly, we should be alarmed by the NSA’s mass surveillance and the government secrecy surrounding it. One might wonder, though, whether we are sufficiently alarmed by the corporate ownership of our private information, data collected invisibly as we search, purchase, and play online. To narrow the focus a bit, we might also wonder how this dynamic among technology, commerce, and culture affects our psychological well-being and sense of identity. Are there psychological drawbacks to being datafied?
If we have embedded ourselves in the digital-consumer universe, our likes and dislikes have been tracked and our future purchasing behavior can be predicted. Amazon can tell us what we might buy next. Presumably, our purchasing behavior can also be predicted from the myriad other online choices we make. Nothing about this is particularly newsworthy and, of course, we are willing (if sometimes naïve) participants. As David Annand observed in the Telegraph, we’ll mooch rather than march our way into the digital future. The question worth asking, though, is what psychological effects such a relationship with commerce and technology has over time, both collectively and individually.
This is similar to the question the writer Nicholas Carr asked in his thoughtful Atlantic essay, “Is Google Making Us Stupid?” Carr, who later developed his thesis further in the book The Shallows, examined the effects of computer technology on how we think. He articulated what most of us have already experienced: that it feels as if our brains have been tinkered with by technology. Our attention wanders when the prose gets long. We are easily distracted by all we can do online and become fidgety when faced with deeper passages of reading.
The impact of technology on us may be far more far-reaching and elusive than we imagine. Consider our common view of how “likes” work in social media and online shopping: we make choices that shape how our technology behaves. Our internet searches narrow the field to what we think we want. The more people like particular products, people, and experiences, the more we desire them. The customer, after all, is always right. But could it actually be the other way around: could technology be shaping us? We make choices in response to the digital context in which we operate, and are thereby affected by it. My identity is not just an expression of my online choices but a result of them.
The psychoanalytic view is that we never quite know what we really want, and there is much about ourselves that defies quantification. Our desires often mask deep wishes and yearnings not readily obvious to us. We placate ourselves with routine, comfortable patterns of behavior that protect us from the possibly terrifying task of finding out what we really want. The various means of online surveillance take on a kind of redemptive mission, showing us what we’re like and easing the existential unease we face. In his essay “Circles,” Ralph Waldo Emerson wrote: “The one thing which we seek with insatiable desire is to forget ourselves, to be surprised out of our propriety, to lose our sempiternal memory, and to do something without knowing how or why; in short, to draw a new circle.”
Carr concluded his Atlantic essay by recounting the chilling scene in 2001: A Space Odyssey in which HAL is disassembled, one circuit after another. “I can feel it. I can feel it. I’m afraid.” HAL’s emotional outpouring is contrasted with the relative emotionlessness of the humans, who behave more robotically. Carr wrote: “In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.” Enamored as we are with being able to quantify ourselves and the world around us, we may fail to recognize how we have changed in the process.
© 2013 Bruce C. Poulsen, All Rights Reserved