Animating Dreams and the Future of Dream Recording
Recent research uses body signals during sleep to power an animated avatar.
Posted Sep 18, 2017
This past June, I attended the 34th Annual Conference of the International Association for the Study of Dreams in Anaheim, California. It was my fifth time attending the five-day conference, which brought together an inspiring and eclectic mix of researchers and clinicians working in the field of dreams. (The 2018 Conference is set for June 16 to 20 in Arizona and is accepting proposal submissions until December 15.)
What goes on at a dream research conference? Fellow PT blogger and dream researcher Kelly Bulkeley wrote about some of the exciting research that was presented at the 2016 conference. I'd now like to share a particularly fascinating piece of research that I saw at this year’s event.
Daniel Oldis, lucid dreaming author and Social Dreaming Advisor at DreamsCloud, presented "Animating Dreams and Future Dream Recording." The intent of the project was to establish a method for capturing and simulating motor behavior in dreams, and the presentation included a demonstration of an iPad avatar animation that was created from real data of a sleeping participant and thought to reflect the movements of the participant's dreaming body.
A great deal of research has established that body movement in dreams is associated with real signals in the muscles of the sleeping body, signals that appear in the same muscles that would be involved in the dreamed behavior. Even though we do not see the body moving during REM sleep, imperceptible twitches and activity occur at an electrical level in the muscle fibers. For example, walking or running in a dream sends signals and twitches to the muscles of the legs and feet, while waving or grasping sends impulses to the arm, wrist, and hand. Because of this, researchers can use electromyography (EMG) sensors to measure and track bodily movements occurring during REM sleep, and presumably during dreams. Once the EMG signals are collected from various muscles, software can then be used to decipher and reconstruct the dreamed movements they represent.
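To give a sense of how software might pick dreamed movements out of such recordings, here is a minimal sketch. It is not the study's actual pipeline; the channel, window size, and threshold are all hypothetical. It simply flags stretches where the rectified EMG signal rises well above a resting baseline, which is the basic idea behind treating muscle twitches as markers of dream movement.

```python
# Illustrative sketch (not the study's actual pipeline): detecting
# bursts of dreamed movement from rectified EMG samples by comparing
# a short-window average against a resting baseline. The window size
# and threshold factor here are hypothetical.

def detect_bursts(samples, baseline, window=5, factor=3.0):
    """Return (start, end) index ranges where the mean rectified EMG
    over a sliding window exceeds `factor` times the resting baseline."""
    bursts = []
    start = None
    for i in range(len(samples) - window + 1):
        avg = sum(abs(s) for s in samples[i:i + window]) / window
        active = avg > factor * baseline
        if active and start is None:
            start = i                      # burst begins
        elif not active and start is not None:
            bursts.append((start, i))      # burst ends
            start = None
    if start is not None:                  # burst ran to end of recording
        bursts.append((start, len(samples) - window + 1))
    return bursts

# A quiet leg channel with one twitch-like burst in the middle.
leg_emg = [0.0] * 20 + [1.0] * 5 + [0.0] * 20
print(detect_bursts(leg_emg, baseline=0.1))  # [(17, 24)]
```

In a multi-channel setup like the one described below, running a detector of this kind on each muscle's channel would indicate which limb "moved" in the dream and when.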
For this proof-of-concept study, data were collected at the University of Texas at Austin Cognitive Neuroscience Lab in March 2016, under the direction of David Schnyer, and funded by DreamsBook, Inc. Two participants slept in the laboratory for one night each, and their sleep was monitored to determine when they were in REM sleep. A total of seven REM cycles were recorded. EMG sensors were placed on the right and left quadriceps of the upper legs, on the upper arm (triceps area), and on the chin. Electrooculography (EOG) sensors recorded eye movements.
The recorded signals thus covered the eyes, chin, arm, and legs. Initially, the leg and arm data points were loaded into the corresponding muscle-data columns of the OpenSim software in order to visualize simple leg and arm movements.
To move beyond this limited approach, David Oldis, an iOS programmer, was enlisted to create a full-body simulation of dream movement. To produce an animation that could be played on an iPad or iPhone, he used Apple's 3D rendering framework, SceneKit. He drove the animated avatar with the dream-muscle EMG signals by converting the muscle values from the eyes, chin, right arm, and both legs into body-feature angles of a human avatar model. (You can see a clip of the avatar here.)
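The core of that conversion step can be sketched as follows. This is hypothetical, not the project's code (SceneKit itself is a Swift/Objective-C framework, and the joint names and angle ranges below are invented for illustration): each channel's activation level is mapped onto a joint angle by interpolating between a rest pose and a maximum-flexion pose.

```python
# Illustrative sketch (hypothetical joints and ranges, not the
# project's actual code): mapping normalized EMG activation per
# channel onto avatar joint angles by linear interpolation between
# a rest pose and a maximum-flexion pose.

# Hypothetical rest and maximum-flexion angles (degrees) per joint.
JOINT_RANGES = {
    "right_elbow": (10.0, 140.0),   # driven by triceps-area EMG
    "left_knee":   (5.0, 120.0),    # driven by left quadriceps EMG
    "right_knee":  (5.0, 120.0),    # driven by right quadriceps EMG
    "jaw":         (0.0, 20.0),     # driven by chin EMG
}

def emg_to_angles(activations):
    """Convert per-channel activations in [0, 1] to joint angles
    by linear interpolation within each joint's range."""
    angles = {}
    for joint, level in activations.items():
        lo, hi = JOINT_RANGES[joint]
        level = min(max(level, 0.0), 1.0)  # clamp to the valid range
        angles[joint] = lo + level * (hi - lo)
    return angles

# One animation frame: a half-flexed elbow and a quarter-flexed knee.
frame = emg_to_angles({"right_elbow": 0.5, "left_knee": 0.25})
print(frame)  # {'right_elbow': 75.0, 'left_knee': 33.75}
```

Feeding a sequence of such frames to a rendering engine's skeleton, one per time step, is what turns raw muscle data into a moving avatar.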
This project is only an initial step toward dream animation, and is a relatively simple prototype, but it is intended to be a proof of concept for dream movement simulation. The goal is to develop more complex and precise methods of digitally recording and reconstructing dream imagery, especially dream motor behavior.
With any luck, we will see some advancements at next year's conference.
Oldis, D. (2017). Animating Dreams and Future Dream Recording. Poster presented at the 34th Annual Conference of the International Association for the Study of Dreams, Anaheim, CA.