What We Shouldn’t Learn from the Movie HER
Is it possible for people to have loving relations with operating systems?
Posted Apr 03, 2014
People fall in love. It’s one of the centers of life. It’s the stuff of movies. And it’s the stuff of this movie, with a twist: He (Theodore) and Her (Samantha) fall in love. Theodore is a person. Samantha is an operating system. She has a terrific voice: warm, engaging, thoughtful, and seductive.
Is it possible now—or in the near future—for people to have loving relationships with digital computational systems?
Part of the answer depends on what these systems actually are. Can AI (Artificial Intelligence) digital computational systems know another person? Can they know anything? Of course, in one sense these systems “know” a lot already. They know what we buy, whom we text, what we text, what we search for online, where and when we drive, whom we call, our employment history, our medical records…the list goes on and on. But can they know us in a deeper sense? By that I mean: can they know as a conscious entity?
On this question, I largely follow John Searle’s position. Namely while in principle there’s no reason to believe that consciousness couldn’t arise in other mediums besides a biological brain, there is no evidence or reason to believe that consciousness can or ever will emerge in digital computation. To argue for this position, Searle (1990) sets up what he calls the Chinese Room thought experiment:
“Consider a language you don’t understand. In my case, I do not understand Chinese. To me Chinese writing looks like so many meaningless squiggles. Now suppose I am placed in a room containing baskets full of Chinese symbols. Suppose also that I am given a rule book in English for matching Chinese symbols with other Chinese symbols. The rules identify the symbols entirely by their shapes and do not require that I understand any of them. Imagine that people outside the room who understand Chinese hand in small bunches of symbols and that in response I manipulate the symbols according to the rule book and hand back more small bunches of symbols. Now, the rule book is the ‘computer program.’ The people who wrote it are ‘programmers,’ and I am the ‘computer.’ The baskets full of symbols are the ‘data base’….Now suppose that the rule book is written in such a way that my ‘answers’ to the ‘questions’ are indistinguishable from those of a native Chinese speaker….All the same, I am totally ignorant of Chinese. And there is no way I could come to understand Chinese in the system as described, since there is no way that I can learn the meanings of any of the symbols. Like a computer I manipulate symbols, but I attach no meaning to the symbols.” (p. 26)
In other words, because computational systems like Samantha have syntax but not semantics (meaning), they can never “know” anything. The behavior is there, but not the knowing. So from this position, Samantha is fake.
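For readers who like to see the point made concrete, Searle’s room can be sketched in a few lines of code. This is a toy of my own construction, not anything from Searle or from the film: the symbols and replies below are invented purely for illustration. The program matches input strings to output strings by shape alone; nothing anywhere in it represents what any symbol means.

```python
# A toy Chinese Room: the "rule book" is a lookup table that pairs
# input symbols with output symbols purely by their shapes.
# (The specific symbols and replies are invented for illustration.)

RULE_BOOK = {
    "你好吗": "我很好",          # the program never "knows" this is a greeting
    "你叫什么名字": "我叫山姆",   # ...or that this asks for a name
}

def chinese_room(symbols: str) -> str:
    """Return whatever symbols the rule book pairs with the input.

    The match is entirely syntactic, a comparison of character shapes.
    Semantics (meaning) is represented nowhere in the system.
    """
    return RULE_BOOK.get(symbols, "对不起")  # fixed fallback string
```

From the outside, a fluent reader might mistake the replies for understanding; inside, there is only pattern matching. That asymmetry between behavior and knowing is the whole of Searle’s point.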
But then the question is whether we can get similar psychological effects from interacting with such fake-person systems. I think the answer is that we can get some of the psychological effects (and benefits); that is part of my larger body of research on Technological Nature. But we can’t get all of the psychological effects, and we can’t get the really deep ones, the most beautiful ones. Of course, that’s an empirical question; I might be wrong. So part of what I’ve been trying to bring forward in my academic writing and research is that we hold out to technologists, and to society at large, the right psychological “benchmarks” (Kahn et al., 2007): benchmarks of, say, reciprocity, authenticity, or Buber’s I/Thou relationship. To know and to be known. How beautiful is that! It’s a foundation for love. It’s one of the deepest parts of human life. If Samantha doesn’t really have consciousness, there is no way she can know Theodore. I think HER illustrates on a deep level that even when Theodore displays so many of the behaviors that would seem to indicate deep, true, reciprocal love, he doesn’t actually have it.
What we shouldn’t learn from the movie HER? That interaction with fake computational people will deeply satisfy and nourish us.
If I’m correct that many of us experience the relationship between Theodore and Samantha as vacuous and a little creepy, then I’m not sure we recognize how much of our own world has been moving that way. Twitter Nation. Facebook Nation. Gamer Nation. We’re creating a world where people are increasingly socially autistic, if I can use that phrase. And we hardly know we’re doing it.
In the last chapter of my 2011 book, Technological Nature: Adaptation and the Future of Human Life, I look at how we’ve evolved over the last 50,000 years, and then I speculate on what’s possible in terms of adapting to an increasingly technological and computational world. We’re embodied biological beings, and while we can adapt to an increasingly technological world, not all of our adaptations are good for us. Elephants adapt to conditions in a zoo, but they don’t thrive as elephants. They stamp their feet for hours on end in neurotic behavior. And then they dull themselves down and become largely empty shells of their former selves. I think we need to pay close attention to what we humans need in order to thrive as a species.
We need real, authentic, reciprocal interactions with other humans. The movie HER ends with a taste of that on the rooftop. What a relief! There we witness the tenuous intimacy of a biological man and a biological woman breathing together for a minute, connecting for real. Knowing and being known. For real. Whatever the heck that means, we know it.
Kahn, P. H., Jr. (2011). Technological nature: Adaptation and the future of human life. Cambridge, MA: MIT Press.
Kahn, P. H., Jr., Ishiguro, H., Friedman, B., Kanda, T., Freier, N. G., Severson, R. L., & Miller, J. (2007). What is a human? – Toward psychological benchmarks in the field of human-robot interaction. Interaction Studies: Social Behavior and Communication in Biological and Artificial Systems, 8, 363-390.
Searle, J. R. (1990). Is the brain's mind a computer program? Scientific American, 262(1), 26-31.