"You must stay drunk on writing so reality will not destroy you." - Ray Bradbury
According to my writer's group companions, not everyone in this world spends a lot of time fretting about the possibility of a robot apocalypse.
I find this a bit mystifying, because it is entirely clear to me that as we fork over more and more control and responsibility to artificial intelligence (AI) -- letting autonomous non-human agents drive our cars, fight our wars, and entertain our children -- the risk that they may someday become self-aware and decide they'd rather not be our servants increases in lockstep.
I'm not alone in these concerns. The tech-intellectual triumvirate of Stephen Hawking, Elon Musk, and Bill Gates has voiced similar worries. Hawking famously said: "Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last..."
Thankfully, several think tanks composed of scientists and other serious intellectuals are tackling the problem, including the Centre for the Study of Existential Risk (CSER) at the University of Cambridge. There are also numerous researchers addressing the issue from a more experimental perspective.
Alison Flood over at the Guardian recently posted a great overview of one such experimental approach. In this study, Mark Riedl and Brent Harrison from the School of Interactive Computing at the Georgia Institute of Technology attempted to give an AI an understanding of human values by exposing it to stories -- little vignettes of protagonists faced with various quandaries and adventures. The system was rewarded for decisions in line with human social norms and values -- that is, decisions that mirrored those of the human protagonist in the story.
I'll let you read Alison's words on the details, but Riedl, the lead author, sums it up so: "In theory, a collected works of a society could be fed into an AI and the values extracted from the stories would become part of its goals, which is equivalent to writing down all the 'rules' of society."
This research raises an interesting question: if feeding fiction to artificial intelligence can make it more understanding of human values and choices, could this same effect hold for regular intelligence? That is, human beings?
For in reading fiction, we are in some ways just like the computer program from Riedl and Harrison's study -- running through simulations of possible experiences in the world. We feel the reward of rushing headlong into a new love or of splashing into the crisp and sparkling sea some morning in August. We feel the punishing twist in our gut as we hurt a friend out of jealousy and the despair of losing a child to one neglectful moment. It certainly seems possible that routinely running through such simulations could increase our empathy, our ability to put ourselves in another person's shoes and understand their choices and perspectives.
Indeed, compelling evidence is piling up that this is the case, that frequent readers of fiction outscore infrequent readers on tests of interpersonal sensitivity. More importantly (from an experimental perspective), people randomly assigned to read narrative fiction experience a temporary boost in their empathic abilities compared to people assigned to read nonfiction sources that don't require perspective taking. Reading fiction is associated with reduced gender stereotyping and greater egalitarian feelings about gender. Reading narrative fiction that breaks down Arab-Muslim stereotypes and exposes the reader to Arab-Muslim culture can actually (at least temporarily) reduce scores on measures of prejudice.
While this is much harder to test, frequently reading narrative fiction may also help us learn to be more understanding of others in our real lives -- it could even reduce the number of disagreements we get into with strangers and friends. Knowing extensive backstories for so many missteps and bad decisions, we can't help but run through possible explanations for why that person in the Subaru just cut us off in traffic or why our best friend just abruptly cut our conversation short.
The type of fictional entertainment people enjoy varies quite a bit. One intriguing question is whether different personality types are drawn to different genres of fiction -- something I tackle over at Motherboard, in an article titled "What Kind of Person Loves Scary Movies?" A different, but related, question is whether some genres of stories have greater or more beneficial effects on one's psyche than others. That is, does cramming down the latest serial-killer-whodunit increase your empathy less than reading about how our world stumbles but then recovers its humanity following a cataclysmic event?
There are very few studies that test this question, but one correlational study indicated that out of all the genres, the one most associated with greater empathy was (can you guess it?) ... romance novels. Whether deeply empathic people are drawn to romance in the first place, or whether these books' focus on interpersonal relationships, emotions, and drama boosts empathy more than other genres do, is currently unknown.
Of course, antidotes are only effective if you take them, and outside of school settings we can't assign people to read diversity fiction. However, we can support our libraries, the reaffirmation of humanities education in our high schools and colleges, and funding for diverse authors in our communities. For reading fiction from multiple perspectives and backgrounds may broaden the mind and protect us from falling prey to divisive, fear-based narratives that rely on our inability to consider nuance and multiple perspectives.
And who knows, maybe if we can get potential Terminators to absorb our values through fiction reading, they may be less likely to, well, terminate us.