Making credible predictions about new products and innovations in digital technology for 2014 is a tricky business. It’s easy to be found out when you get things wrong. But it’s also a very clichéd activity, because the idea of an upgrade is a perennial characteristic of high-tech marketing. It’s part of the banality of our digital lifestyle. The future is, ironically, always just behind us—locked into the last gadget we purchased, the last operating system we installed.
To envision the next big thing, we are supposed to forget the contemporary relations of people, money, regulation, and power that shape technology—the exploited workers, the toxic factories, the wasteful global supply system, the patent wars, the trade barriers, and the planned obsolescence.
Instead of facing the nitty-gritty challenges of this physical world and its political-economic-environmental parameters, we are supposed to revere technology’s capacity to augment our collective store of skills and intelligence, and be blown away by accelerating innovation. Only our credit card bills snap us back to the present, reminding us of the grim distance between our real economic future and the fantasy promised by the merchants of upgrade.
A further irony: an avowedly fictional form of futurism, science fiction, can disturb the religious one-dimensionality of technological prediction, because it frequently tells both utopian and dystopian stories. Mary Shelley’s Frankenstein is often identified as the first sci-fi novel, and the clues to futurism’s double-sided nature were already present there: an artificial creature who embodied the hopes and dreams, but also the fears and nightmares, of humanity remade by science.
Science fiction gives us the language to judge the dystopian aspects of our digital lifestyle when, for example, computers fail to do what they’re designed for, technical glitches are blamed for late paychecks, or the power goes out. Even the utopian versions are imbued with haunting portents. Consider the fantasy of the singularity, when technological advances converge and deliver us, via a final, qualitative leap, into a fugue state where all our worries, work, and needs are handled by machines. But what happens if the machines don’t like us (remember HAL in 2001)?
When we examined the mainstream media’s predictions for digital technology in 2014, we found mostly, shall we say, predictably Pollyannaish ideas—with hidden undersides.
We read that smartphones will continue on their trajectory to become the primary communications device for those who can afford them (and millions who cannot). Consumers are supposed to think only about the future; these gadgets are to be seen, reassuringly, as benign mobile terminals connected to a vast networked supercomputer that works just for us.
That also means the onward march to anytime/anywhere computing—what once was called ubiquitous computing or ubicomp. For consumers, this will be experienced as a boost to consuming and entertainment. For the digerati, it represents another step toward eliminating a human shortcoming that has stopped us from domesticating the vast and growing stock of scientific and social data.
We’re told that 2014 could mark the start of a revolution in medicine (genomics), and a continuation of technological breakthroughs in crime fighting, warfare, and transportation. The robot car has already become something of a poster child for 2014 innovation, offering a futuristic glimpse of a commuter who can sleep on her way to work.
There will be a surge in wearable devices, which are linked, of course, to mobile terminals and networks. The Google Glass applications store is set to open in 2014 to promote this trend, hoping for a killer app that will set off a frenzy of sales.
But do you really want to become a human device, monitored and diagnosed and pursued by Google? All this rhetoric forgets the working conditions in factories, where such devices will be assembled at breakneck speed by kids who’ve never heard of post-industrialism. It forgets the problem of waste when old technology is thrown out in favor of an upgrade. And it ignores the built-in function that allows companies and spies to track users’ communication, wherever they go.
Lastly, we are told that small, niche social media networks will continue to proliferate in 2014, as more of the digerati abandon the clueless cyber-plebeians who remain happy with the big social media sites. This forgets the business model behind these sites—it’s about advertising and market research—and the bespoke algorithms that make money by targeting ads and conducting non-stop surveillance of users. We are expected to believe that small, specialized sites are giant killers.
At least one prediction refused to indulge in the hype. The Economist’s Adrian Wooldridge predicted a “tech-lash” in 2014. He thinks cyber peasants will “revolt against the sovereigns of cyberspace. The Silicon elite will cease to be regarded as geeks who happen to be filthy rich and become filthy rich people who happen to be geeks.”1 This will be the year, he suggests, of Occupy2 and Anonymous.3 If that happens—and it’s a big if—more of us might actually glimpse the reality beyond the smoke and mirrors of the new-year high-tech futurology binge. Then again, the sovereigns of cyberspace are clever enough that they may find innovative ways to sell their future to us that borrow from none other than the smart pranks of Occupy, Anonymous—and anyone else they see in their path.