For decades, one of the most troubling difficulties in treating patients with certain neurological disorders has been doctors' complete inability to communicate with them in any fashion. Modern technology has helped millions of physically or mentally disabled persons express themselves or better communicate with caretakers, but locked-in syndrome has remained extremely difficult to treat. Now researchers from Europe have used a brain-computer interface (BCI) to communicate with four completely locked-in ALS patients.
Locked-in syndrome means that patients can’t communicate, even though they’re still “in there.” One patient in this report had been locked in since 2014. Another had been stricken by early-onset ALS at just 23, and the terrible march of the disease had taken just two years from diagnosis to being completely locked in. It seems obvious: can’t we just look at brain waves to see what people are trying to say? But brain-computer interfaces are still in their relative infancy. Even though DARPA is pouring money into various BCI projects, they’re still mostly bespoke, small-scale applications.
Designed by lead author Niels Birbaumer, a neuroscientist working at the Wyss Center for Bio and Neuroengineering in Geneva, this particular device fits on a person’s head like a beanie. It uses an EEG to measure changes in electrical activity in the brain, and also monitors blood flow using a technique known as near-infrared spectroscopy (NIRS).
After obtaining informed consent, Birbaumer’s team asked patients to work with them to calibrate the BCI by thinking “YES” or “NO” (and yes, thinking in caps lock, because that’s louder than regular thinking). The researchers wanted the patients to think that one single thought, yes or no, as loudly, clearly, and single-mindedly as they could. [Who knew all the depictions of psionics I read in bad fantasy were accurate? — Ed] Where possible, the researchers also asked the patients to attempt whatever communication techniques they’d been able to use before, hoping to combine the EEG results with a neuromuscular map to refine the results.
These people had been locked in for years, but they were still cogent and capable, and they all worked patiently with the researchers. They wore the EEG caps and answered pair after pair of related yes-no questions (“Paris is the capital of France”—”yes”; “Paris is the capital of Germany”—”no”). At the end, each patient had a personal, carefully calibrated model of what their brain looked like when they were thinking “yes” or “no.”
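That calibration step amounts to training a tiny, per-patient binary classifier: labeled “yes” and “no” trials go in, and a personal model comes out. Here’s a minimal sketch of the idea, assuming synthetic feature vectors as stand-ins for real EEG/NIRS readings; the feature count, the simulated data, and the nearest-centroid classifier are all illustrative assumptions, not the authors’ actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
N_TRIALS, N_FEATURES = 100, 8  # e.g. band power / blood-oxygenation channels

# Synthetic calibration data: "yes" trials cluster around one mean,
# "no" trials around another (stand-ins for real recordings).
yes_trials = rng.normal(loc=+0.5, scale=1.0, size=(N_TRIALS, N_FEATURES))
no_trials = rng.normal(loc=-0.5, scale=1.0, size=(N_TRIALS, N_FEATURES))

def calibrate(yes, no):
    """Build the patient's personal model: one centroid per answer."""
    return {"yes": yes.mean(axis=0), "no": no.mean(axis=0)}

def decode(model, trial):
    """Classify a new trial by whichever centroid it lands closest to."""
    return min(model, key=lambda label: np.linalg.norm(trial - model[label]))

model = calibrate(yes_trials, no_trials)
probe = rng.normal(loc=+0.5, scale=1.0, size=N_FEATURES)  # a fresh trial
print(decode(model, probe))
```

Because the noise never fully separates the two clusters, even a calibrated decoder misclassifies some trials, which is one plausible reason real-world answers wouldn’t be 100% reliable.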
Then the researchers started asking questions in earnest. Asking about the important stuff. “Your husband’s name is Joachim.” (“Yes.”) “You have back pain.” (“No.”) “You are happy to be alive.” (“Yes.”) Birbaumer told the MIT Technology Review that “the relief was enormous” for the families, after hearing the affirmative responses from relatives they’d expected never to be able to communicate with again.
The technique the researchers used wasn’t foolproof: even though many of the questions had known answers, respondents gave the “correct” answers only roughly 70% of the time, and it’s not clear why. Some may have had attention-span issues; at least one was partially blind; and there may have been concurrent problems with other neurons, confounding results. Some issues may also remain to be worked out on the technology platform itself. But it won’t be long before we’re seeing recent advances in brain mapping represented in applications like this, with attendant leaps in quality and capability. Who’s ready to start taking bets on how long it’ll be before we have an all-external method of brain mapping with sub-1mm resolution?
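For context on that 70% figure: whether it beats blind guessing depends on how many questions were asked, a detail not given here. A quick sketch, assuming a purely hypothetical run of 100 yes/no questions, of how likely 70-or-more correct answers would be by coin-flipping alone:

```python
# Hypothetical check: is ~70% correct on yes/no questions better than
# chance? The question count (100) is an assumption for illustration.
from math import comb

def binom_tail(n: int, k: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p): chance of k-plus correct guesses."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

print(f"P(>=70/100 by chance) = {binom_tail(100, 70):.1e}")
```

Under that assumption the chance probability comes out well below one in a thousand, so 70% would be statistically meaningful even if frustratingly imperfect for actual conversation.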
Source : https://www.extremetech.com/extreme/243778-locked-als-patients-get-voice-thanks-new-mind-reading-machine