Technological advancements in music-making have soared over the last few years. Among the newest projects is one run by researchers at Plymouth University and the Royal Hospital for Neuro-disability, who have given patients unable to walk or talk the ability to use electrical signals from their brains to select musical passages, which are then played by musicians in real time. This gives the disabled musicians a way to express their musical creativity despite being unable to speak or physically interact with instruments.
It all came together when Professors Eduardo Reck Miranda and Joel Eaton created a Brain-Computer Music Interface that measures brain activity in the visual cortex by combining an EEG with a computer. According to Discover Magazine, “The computer displays four musical sequence options, and each option has a corresponding matrix of flashing lights. The four musicians choose the desired sequence by concentrating on the light matrix that corresponds to their choice. The choice is then sent to another musician who physically plays the part. The members of the ensemble could even modulate the volume by changing the intensity of their concentration.” These advancements are simply incredible.
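The selection mechanism described above is a classic steady-state visual evoked potential (SSVEP) design: each option flickers at its own rate, and concentrating on one boosts that frequency's power in the visual-cortex EEG. The article does not publish the researchers' code, so the following is only a minimal illustrative sketch of that general technique, with assumed flicker rates, sampling rate, and synthetic data standing in for a real EEG recording.

```python
import numpy as np

# Hypothetical parameters, not from the article: each of the four
# musical options flashes at a distinct rate, and the sampling rate
# is a typical consumer-EEG value.
STIMULUS_HZ = [6.0, 7.5, 10.0, 12.0]  # assumed flicker rates for the 4 options
SAMPLE_RATE = 250                      # assumed EEG sampling rate (Hz)

def select_option(eeg: np.ndarray) -> int:
    """Return the index of the flicker frequency with the most spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / SAMPLE_RATE)
    # Power at the bin nearest each stimulus frequency.
    powers = [spectrum[np.argmin(np.abs(freqs - hz))] for hz in STIMULUS_HZ]
    return int(np.argmax(powers))

# Demo: synthesize one second of "EEG" dominated by the 10 Hz flicker,
# as if the user were concentrating on the third option.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.3 * rng.standard_normal(t.size)
print(select_option(eeg))  # → 2 (the 10 Hz option)
```

The quoted volume control could be sketched the same way: rather than just taking the winning frequency's index, the magnitude of its spectral peak (how strongly the user concentrates) would be mapped to a loudness value.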
Kendall Deflin (liveforlifemusic.com) / March 18, 2016