2 Heads—and a Brain-Computer Interface—Are Making Waves in the Art World

In a quiet performance space, artist Eva Lee and designer Aaron Trocola face about 70 spectators, eyes closed. They are silent, but the audience watches their brainwaves. Lee and Trocola are neurally linked, wearing Trocola’s custom 3D-printed headsets, whose eight electrodes monitor their neural patterns as well as their heartbeats. A large cable connects them, recalling fantastical movies in which characters swap bodies.

The audience is experiencing Dual Brains—not a scientific experiment, but a performance-art piece Lee first conceived in 2016. For all its high-tech trappings, the work grew from a series of drawings with a deeply human theme.

“I’m most interested in how humans express empathy toward one another,” Lee says. “About our neural connectedness and how we choose, with our individual will, to help each other.”

Lee and Trocola are in the vanguard of artists who use brain-computer interface (BCI) technology. Also called “neural control interfaces,” BCIs create a communication pathway between a wired brain and an external sensor device. In this case, amplified brainwaves, processed by software, generate artistic experiences that connect with audiences on both intellectual and emotional levels.

The most recent performance of Dual Brains was in March in New York, in the Special Projects section of the seventh annual Spring/Break Art Show, produced by Harvestworks and ThoughtWorks Arts, partner organizations supporting collaborations among artists, makers, technologists, and scientists. ThoughtWorks also sponsors Art-A-Hack, an incubator for this sort of work. Its cofounder, Dr. Ellen Pearlman, is an artist working with BCI to create immersive performances.

3D rendering of the headsets used in Dual Brains. Courtesy Aaron Trocola.

Artists have long been influenced by their contemporaries in the sciences and vice versa. With BCI, artists may realize creative works before engineers can perfect everyday applications, but science and technology provide the tools. There is a flourishing market for BCI headsets, from vendors such as Emotiv, OpenBCI, and Neurable—each with early-adopting customers looking to create art and music.

The concept reaches back to the 1960s, when experimental composers such as Alvin Lucier, David Rosenboom, Richard Teitelbaum, and John Cage experimented with precursors to BCIs. In 1965, Lucier introduced his “Music for Solo Performer,” in which his amplified brainwaves indirectly played percussion instruments. Seven years later, Rosenboom conceived of a performance in which four people’s brainwaves would power a musical experiment, and in 2015, he finally performed it at the opening of the new Whitney Museum of American Art in downtown New York.

These were novel experiments using neuroelectrical data—essentially readings of voltage fluctuations in the scalp—as raw signals for processing. What artists did with the signals, which were data-rich but basically random, was rooted more in aesthetic interests than in science. Artists such as Lee are committed to aesthetics but are also directly influenced by scientists.

Dual Brains, for example, draws from research on stress and empathy by Dr. James Coan at the University of Virginia, Lee says. Coan developed a “social baseline theory,” showing that individuals thrown together under duress develop an interneural dependence: They help each other on a neural level. Coan studied couples under artificially created stress and ultimately showed that holding hands—with the subject’s spouse or even a stranger—significantly reduced the experimental stress.

Lee teamed up with Trocola, a designer working with OpenBCI, who created special gear for Dual Brains. Wearing headsets connected through a custom signal processor, Lee and Trocola sit quietly together for three minutes. For the first minute, they set a baseline for their brainwaves. For the second minute, each recalls a disturbing event, concentrating on it to arouse anxiety. During the third minute, they join hands. “We’re trying to demonstrate the empathy interaction,” Lee says.

Aaron Trocola and Eva Lee during a performance of Dual Brains. Courtesy Pat Shiu.

Behind them, the brainwaves are projected as a complex, moving visualization. “The visuals are driven by the data,” Lee says. “We made aesthetic decisions, adding color and so forth, but we really were trying to just let it be. In the third segment where we held hands, the patterns change very visibly, and there is another pattern we thought of as a cloud or a fountain that develops between us as the performance progresses.”
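The article doesn’t describe the artists’ actual pipeline, but the general technique of letting EEG data drive a visual parameter can be sketched roughly as follows: take a short window of raw voltage samples, estimate power in a frequency band (alpha, 8–12 Hz, is a common choice), and map the result to something visible, such as brightness. The sampling rate and band edges here are illustrative assumptions, not details from the performance.

```python
import numpy as np

def band_power(samples, fs, low, high):
    """Estimate signal power in [low, high) Hz for one window of raw EEG.

    samples: 1-D array of voltage readings; fs: sampling rate in Hz.
    """
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    mask = (freqs >= low) & (freqs < high)
    return spectrum[mask].sum()

def brightness_from_alpha(samples, fs=250):
    """Map relative alpha-band (8-12 Hz) power to a 0-1 brightness value."""
    alpha = band_power(samples, fs, 8, 12)
    total = band_power(samples, fs, 1, 40)
    return alpha / total if total > 0 else 0.0

# Synthetic one-second window: a 10 Hz "alpha" oscillation plus noise,
# standing in for a real electrode reading.
fs = 250
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
window = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)
b = brightness_from_alpha(window, fs)
```

Because the synthetic window is dominated by a 10 Hz tone, the resulting brightness value lands near 1; a noisier, flatter signal would drive it toward 0.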

Pearlman’s work with BCIs is focused on the nature of consciousness and on what she refers to as “the Quantified Society”: cultures transformed by data-centric technologies (biometrics, big data, machine learning, and artificial intelligence). “Art predates science,” Pearlman says. “It’s not the other way around.”

In 2016, she first performed Noor, her “brain opera.” Noor’s brooding storyline takes place in World War II. It is immersive theater, performed in a circular space with five screens on walls 15 feet high. A performer wears an Emotiv headset, which processes brainwaves using a proprietary smoothing algorithm to read four states: frustration, interest, excitement, and meditation. As the performer experiences those states, corresponding colored bubbles are animated on the projection screens, along with databanks of allegorical, emotionally themed storytelling imagery.

Pearlman is inspired by the work of Drs. Jack Gallant and Alex Huth of the University of California, Berkeley. Gallant and Huth developed ways to monitor image processing in the visual cortex and reconstruct images viewed by human subjects with good fidelity. In the future, what Pearlman simulates with curated images may be possible with actual neurological signal data.

Aaron Trocola scans Dual Brains’ main programmer Pat Shiu using the 3D Systems Sense Scanner. Courtesy Eva Lee.

In medicine, BCIs show great promise to facilitate communication for those with locked-in syndrome or other conditions that impair speech; control Parkinson’s symptoms and epileptic seizures; improve treatment for paralysis; or even project images into the minds of the visually impaired. Elon Musk and Facebook are reportedly interested in applications from neural implants to “typing with your mind.” The possibilities seem endless—fertile ground for both science and art.

The potential next phase for BCI-based art is to feed brainwave data into machine-learning algorithms (AI systems based on pattern recognition) to extract literal meaning from complex signals—signals that artists currently present mainly as visual abstractions.

Trocola foresees BCI-driven collaborative creative projects involving multitudes. “You could imagine a large number of people wearing headsets, measuring their emotional states and uploading the data,” he says. “A machine-learning system could be interpreting the patterns in real time, and the group could express something about what is going on in a neighborhood or a city that can offer deeper insight than analysis of written communication on the Internet.”