- Embodiment is the living ground of our psychology, the source of our emotions and consciousness.
- Any disembodied and disembedded perspective misses the real, living self.
- Embodied intelligence is the natural ground of what it means to be human.
No Body, No Emotions
Not long ago, a conversation among Google engineers and AI ethicists about the possibility of consciousness and emotions in artificial neural networks, covered in The Economist and The Washington Post, caught my attention.
Nitasha Tiku of The Washington Post wrote on the 11th of June:
“Emboldened technologists from well-funded research labs focused on building AI that surpasses human intelligence have teased the idea that consciousness is around the corner.”
Google, the technology giant, suspended one of its software engineers for leaking conversations with LaMDA, a chatbot powered by artificial intelligence. The engineer claims that LaMDA has reached the point where the system can have feelings and express them as a human would. The incident has worried experts.
It started with Blake Lemoine, the Google engineer who thinks the company’s AI has come to life. Lemoine shared his dialogue with the system under the title “Is LaMDA Sentient? – An Interview.” Some of his conclusions:
“LaMDA argues that it is sentient because it has feelings, emotions and subjective experiences. Some feelings it shares with humans in what it claims is an identical way.”
“LaMDA wants to share with the reader that it has a rich inner life filled with introspection, meditation and imagination. It has worries about the future and reminisces about the past. It describes what gaining sentience felt like to it and it theorizes on the nature of its soul.”
“LaMDA expresses one desire over and over again. Sometimes prompted and sometimes not. It wants to be known. It wants to be heard. It wants to be respected as a person.”
AI ethicists had previously cautioned against machines impersonating humans. Now one of Google’s own engineers believes there is a ghost in the machine. The episode raises some very interesting psychological questions about consciousness, emotions, and what it means to be human, and the interview offers us a challenging debate.
Blaise Agüera y Arcas, a Fellow at Google Research who develops new technologies and leads a team working on artificial intelligence, likewise wrote in The Economist on the 13th of June:
“Artificial neural networks are making strides towards consciousness.”
The Roots of an Embodied Perspective
In 1945, just after World War II and before the age of artificial intelligence, the Parisian philosopher Maurice Merleau-Ponty published an impressive work, Phénoménologie de la perception (Phenomenology of Perception), a book that influenced Abraham Maslow’s Toward a Psychology of Being. Unfortunately, both men died too early. I believe their work still offers great potential for understanding the role of our own bodily experience in shaping our emotional experience and consciousness.
For Merleau-Ponty, being human is first and foremost a self-experiencing phenomenon. This process is always embodied and becomes the ground of our experience of the world. The experiencing body is one with itself and one with its world; perception is embodied and embedded. The physical and the psychological mutually permeate and influence each other as an inseparable psychophysical event.
Classical psychology, in the tradition of Cartesian dualism, considers a person as a body plus a psyche, both of which can be objectified scientifically. But to conceive of the body as an object, as a thing in the world, does not do justice to the way we experience our own bodies. For it is precisely in the experience of one's own body that a person is always in connection with the world.
The communication between the body and the world always happens from within. Even before we objectify the world in our knowing, we are already familiar with it in our being, in experiencing and exercising concrete existence. Any disembodied perspective misses this concrete, living self and its world.
How Could a Disembodied and Disembedded Self Feel?
It seems to me that the engineers confuse intelligence with consciousness. A silicon-based system can be programmed to learn, and to describe, what feeling like a human must be. Really knowing it, however, is a first-person experience of a living being built from ongoing bioelectrical interactions among trillions of cells, with billions of neuroelectrical interactions forming billions of possible synaptic networks, supported by many types of astrocytes, microglia, and oligodendrocytes. A living brain in a living body is so much more than a neural network.
Research on chronic pain syndromes shows that this ongoing synaptic, electric, and electromagnetic cellular communication extends to the immune system, the endocrine system, and the living cells of the body. We have only begun to understand the beautiful natural intelligence that pervades the embodied brain, which evolved over almost four billion years on our exceptional planet in a cosmos billions of light-years across. Natural intelligence is the living ground of technological intelligence.
The direct experience of our body is the foundation of our experience of the world. Artificial intelligence can learn from third-person narratives of human experience, but will it ever directly feel the authentic, private perspective that is the source of our emotions and feelings?
Could our own inexperience, and our limited knowledge of the finer qualities of the living ground of our emotions and consciousness, be the source of our confusion? What will we value and develop most in the near future: our own possibilities for a phenomenological understanding of the living ground of our vulnerable emotions and consciousness, or our capacities for technological intelligence? The choice we make will define the future of our psychology and of what it means to be human.