Amyotrophic lateral sclerosis patient uses communication device to request playback of Tool, “loud”.
The sentence followed “I love my cool son” and “Goulash soup and sweet pea soup.”
Amyotrophic lateral sclerosis, or ALS, is a condition that causes patients to lose control of their muscles, rendering communication all but impossible and leading to extreme isolation. According to research reported this week, however, it may be possible for patients to select letters and form complete sentences with the help of an implanted device that reads brain signals.
One participant in a study to measure the effectiveness of the invasive implant decided to put the device to use by requesting the music of his favourite 90s progressive metal band: Tool.
The man, now 36, began working with the research team at the University of Tübingen in 2018 while he could still move his eyes, saying he wanted the implant to maintain communication with his family and his son. With the written consent of his wife and sister, researchers inserted two square electrode arrays, each 3.2 millimetres wide, into a part of the brain that controls movement.
The initial phase of the study proved frustratingly unsuccessful – until the team decided to try neurofeedback, which Kelly Servick, writing for Science.org, explains as the process by which “a person attempts to modify their brain signals while getting a real-time measure of whether they are succeeding.”
This process could then be used to adjust the pitch of an audible tone based on the speeding up and slowing down of electrical firing of neurons – not unlike a real-time synthesizer oscillator. The participant was asked to alter the pitch any way he could.
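The feedback loop described above can be sketched in a few lines of code. This is a purely illustrative toy, not the Tübingen team's actual signal processing: the firing-rate and pitch ranges, the linear mapping, and the tolerance value are all assumptions chosen for clarity.

```python
# Hypothetical sketch of a neurofeedback loop: faster neural firing raises
# an audible tone, slower firing lowers it. Ranges and the linear mapping
# are illustrative assumptions, not the study's actual method.

def firing_rate_to_pitch(rate_hz, rate_range=(5.0, 50.0),
                         pitch_range=(220.0, 880.0)):
    """Map a neural firing rate (spikes/s) to a tone frequency (Hz).

    The rate is clamped to rate_range, then scaled linearly onto
    pitch_range, so speeding up firing raises the tone's pitch.
    """
    lo, hi = rate_range
    p_lo, p_hi = pitch_range
    clamped = max(lo, min(hi, rate_hz))
    fraction = (clamped - lo) / (hi - lo)
    return p_lo + fraction * (p_hi - p_lo)

def matches_target(rate_hz, target_pitch_hz, tolerance_hz=20.0):
    """Neurofeedback 'success' check: is the produced tone near the target?"""
    return abs(firing_rate_to_pitch(rate_hz) - target_pitch_hz) <= tolerance_hz
```

With these made-up ranges, a mid-range firing rate of 27.5 spikes/s lands on a 550 Hz tone, and the participant's task — matching a target pitch — reduces to driving `matches_target` to true in real time.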
Initially he could only shift the tone's pitch, but by day 12 he could match a target pitch, allowing the team to tune the device to his most responsive neurons. Soon the man could answer "yes" or "no", and subsequently could specify individual letters. Alongside forming early sentences like "I love my cool son" and "Goulash soup and sweet pea soup," he also took pains to declare, "I would like to listen to the album by Tool, loud."
“He eventually explained to the team that he modulated the tone by trying to move his eyes,” writes Servick, “but he did not always succeed. Only on 107 of 135 days reported in the study could he match a series of target tones with 80 per cent accuracy, and only on 44 of those 107 could he produce an intelligible sentence.”
The technology is still some way off being widely available. “We’re nowhere near getting this into an assistive technology state that could be purchased by a family,” says Melanie Fried-Oken, a brain-computer interface researcher at Oregon Health & Science University.
Nevertheless, the future has never looked brighter for music fans suffering from ALS who still want to turn it up.