Live Science reports on the project:
The use of the EEG data is a bit more complex [than the use of rapid eye movements]. Running it through a machine learning algorithm, we identified several patterns from a sample of the data set (both REM and non-REM events). We then associated preprogrammed robot behaviors to these patterns. Using the patterns like filters, we process the entire data set, letting the robot act out each behavior as each pattern surfaces in the signal. Periods of high activity (REM) where [sic] associated with dynamic behaviors (flying, scared, etc.) and low activity with more subtle ones (gesturing, looking around, etc.). The “behaviors” the robot demonstrates are some of the actions I might do (along with everyone else) in a dream. [LiveScience]
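The pipeline the quote describes — learn pattern classes from a sample of the recording, then sweep those patterns over the full data set and trigger a mapped behavior whenever one surfaces — can be sketched roughly like this. Everything here is an illustrative assumption (the simulated signal, the activity threshold, the behavior names), not the project's actual code:

```python
import random
import statistics

random.seed(0)

def make_epoch(scale, n=64):
    """Simulated EEG epoch: n amplitude samples at a given noise scale."""
    return [random.gauss(0, scale) for _ in range(n)]

# Three REM-like (high-activity) epochs followed by three quiet ones.
epochs = [make_epoch(5.0) for _ in range(3)] + [make_epoch(0.5) for _ in range(3)]

HIGH_ACTIVITY = 2.0  # assumed threshold separating the two pattern classes

# Preprogrammed behaviors mapped to pattern classes, following the quote:
# dynamic behaviors for REM-like activity, subtle ones for low activity.
BEHAVIORS = {
    "high": ["flying", "scared"],
    "low": ["gesturing", "looking_around"],
}

def act_out(epochs):
    """Sweep the data set, emitting one behavior per epoch as its
    pattern class surfaces in the signal."""
    performed = []
    for i, epoch in enumerate(epochs):
        level = "high" if statistics.pstdev(epoch) > HIGH_ACTIVITY else "low"
        performed.append(BEHAVIORS[level][i % len(BEHAVIORS[level])])
    return performed

print(act_out(epochs))
```

In the real project the "patterns" were learned by a machine learning algorithm rather than a fixed variance threshold, but the overall shape — classify each stretch of signal, then look up a preprogrammed behavior — is the same.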
And here’s a video of it, dancing away [Alt].
The project is the brainchild of Fernando Orellana and Brendan Burns, who used the equipment of the Albany Regional Sleep Disorder Center in New York to record the data.
A robot dancing your dreams. Can’t help but feel inspired by that idea.