Advance

A communications platform for people with complete locked-in syndrome

Neural data in NeuroKey software
Brain activity is recorded from implanted microelectrode arrays. Each square shows the neural signals recorded simultaneously on one microelectrode, or channel.

Laying the foundations for future assistive devices

Wyss Center scientists and engineers are working in collaboration with clinicians and faculty to develop a system for people with complete locked-in syndrome that allows consistent, fluent communication over periods of months to years, in the home environment.

The question
When people are completely paralyzed, and cannot move or speak, can they use their brain signals to communicate?

We want to answer fundamental questions about the ability of people to communicate following complete paralysis caused by amyotrophic lateral sclerosis (ALS). The results will lay the foundations for the development of future assistive devices and methods of communication for people who are completely locked-in.

ALS is a progressive neurodegenerative disorder in which deterioration of the nervous system responsible for voluntary movement eventually leads to paralysis, but in which the cognitive parts of the brain often continue to function normally. ALS is also known as Lou Gehrig's disease and motor neurone disease (MND).

Some people with ALS progress into a complete locked-in state (CLIS) in which they lose the use of all muscles and survive with artificial ventilation and feeding. In CLIS, people have no way to communicate.

When people can no longer speak, but still have some remaining movement ability, they often use assistive communication devices to express themselves through a computer. Such devices include eye trackers, switches that detect muscle activity, and sip-and-puff devices that measure the air pressure of inhaled or exhaled breath.

There is evidence that people with ALS can have a high sense of well-being, even in a locked-in state, and so the ability to communicate is important to ensure continued high quality of life and appropriate care.

Jonas Zimmermann
Jonas Zimmermann, PhD, Wyss Center Senior Neuroscientist
“The participant we work with has a particularly fast progressing form of the disease and loss of communication was imminent when he was enrolled. While he was still able to move his eyes to communicate, he expressed his wish to take part in this case study.”

As part of a single patient case study, the Wyss Center team is working with a participant with ALS, his family, the departments of neurology and neurosurgery at the München Klinik Bogenhausen in Germany and the University of Utrecht, to determine whether people with advanced ALS, who can no longer use assistive devices to communicate, can voluntarily form words and sentences with the help of an implanted brain-computer interface (BCI) system.

Microelectrode array in front of 5 centime coin for scale
These tiny, 3.2 x 3.2 mm, microelectrode arrays are inserted into the surface of the motor cortex – the part of the brain responsible for movement. Each array has 64 needle-like electrodes that record neural signals, allowing patients to use a computer to select letters and ultimately spell words and sentences.

The technology
An implanted brain-computer interface system including a neural signal decoder and an auditory feedback speller

Two microelectrode arrays, placed on the surface of the user’s brain, detect neural signals. A wired connection sends the neural data to a computer for processing. The Wyss Center’s NeuroKey software decodes the data and runs an auditory feedback speller that prompts the user to select letters to form words and sentences. The user learns how to alter their own brain activity according to the audio feedback they receive.

1 Microelectrode arrays
Detection of neural signals
2 Computer
Recording of neural signals
3 NeuroKey
Real-time processing of neural signals
4 Auditory feedback
Audio feedback and brain signal regulation
5 Communication
Forming words and sentences
The NeuroKey speller dashboard.
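The closed-loop cycle described above (record, decode, give audio feedback, select letters) can be sketched roughly as follows. This is an illustrative sketch only: the function names, the single decoded feature, and the threshold rule are hypothetical stand-ins, not the actual NeuroKey API or decoding method.

```python
import random  # stands in for live neural data in this sketch

LETTERS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
THRESHOLD = 0.5  # hypothetical decision boundary on the decoded signal


def read_neural_sample():
    """Placeholder for one decoded feature (e.g. a normalized firing rate).

    In the real system this value comes from the microelectrode arrays
    via real-time decoding; here it is random noise for illustration.
    """
    return random.random()


def select_letters(max_letters=5):
    """Announce letters one at a time; the user 'selects' a letter by
    modulating brain activity above THRESHOLD while it is announced.

    In the real system, auditory feedback tells the user which letter is
    active and how close their signal is to the selection threshold.
    """
    message = []
    for letter in LETTERS:
        if len(message) == max_letters:
            break
        signal = read_neural_sample()
        if signal > THRESHOLD:
            message.append(letter)
    return "".join(message)
```

The key design point this sketch captures is that the user never moves a cursor or presses a switch: the only "input device" is a decoded brain signal compared against a learned threshold, with audio feedback closing the loop.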

The project is laying the foundations for brain-to-computer speech and other assistive tools, such as emergency alarms, for people who cannot ask for help in any other way.


Nick Ramsey 01
Nick Ramsey, Professor of Cognitive Neuroscience at the Brain Center of the University Medical Center of Utrecht (NL) and Senior Scientific Advisor at the Wyss Center
“My research focuses on decoding brain signals so that implantable BCI devices can enable operation of a speech computer. The Wyss Center team and my lab are working together towards the same goal of restoring communication for people locked-in because of ALS.”

NeuroKey software platform
Enabling real-time brain-computer communication

NeuroKey is medical-grade software, suitable for use with implanted devices, and optimized for real-time processing of high-channel-count, high-frequency data. Its flexible and modular programming interface allows the team to rapidly change how the data is processed.

The system in numbers

128
channels of brain signals
30 kHz
brain signal sampling frequency
24/7
availability
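The figures above imply a substantial raw data rate. A back-of-the-envelope calculation, assuming 16-bit samples (a typical resolution for such recording systems, not confirmed by the source):

```python
N_CHANNELS = 128          # channels of brain signals
SAMPLE_RATE_HZ = 30_000   # 30 kHz brain signal sampling frequency
BYTES_PER_SAMPLE = 2      # assumed 16-bit sample resolution

# Raw throughput the software must handle continuously, 24/7.
bytes_per_second = N_CHANNELS * SAMPLE_RATE_HZ * BYTES_PER_SAMPLE

print(f"Raw data rate: {bytes_per_second / 1e6:.2f} MB/s")
print(f"Per day: {bytes_per_second * 86_400 / 1e12:.2f} TB")
```

Under these assumptions the system must sustain roughly 7.68 MB/s, or about two thirds of a terabyte per day, which is why real-time, high-channel-count processing is a core design requirement rather than an optimization.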
David Ibáñez Soria
David Ibáñez Soria, PhD, Wyss Center Brain Machine Interface Scientist
“We developed NeuroKey because there was no software on the market that would allow us to process brain signals in ways that are specific to the user, or in ways that had never been tried before. NeuroKey is very flexible; it makes it easy to quickly try new ways to process a user’s brain signals.”

The software also has a simple user interface that allows family members or caregivers to easily launch communication apps, such as a speller or a quick yes/no question app, and to recalibrate the system when needed.

If you are interested in using NeuroKey for your real-time neural signal processing applications, please contact us at: info@wysscenter.ch

Future vision
Advances in algorithm development and a fully implantable brain-computer interface

The results of the project guide the development of future assistive communication tools. Such tools are key for the CLIS ALS population, but also have the potential to help people affected by other conditions that impair movement or communication, including stroke, spinal cord injury, late-stage multiple sclerosis, late-stage Parkinson’s disease and severe cerebral palsy.

Our team is continuously working to improve the NeuroKey software. We are building increasingly accurate and robust brain signal decoders, improving the predictive text of the speller and exploring the integration of other assistive devices, including home automation systems.

We are also developing ABILITY, a fully implantable brain-computer interface for the restoration of movement and communication, to bring cutting-edge technology closer to safe, long-term daily use at home.

Wyss Center implantable neurotechnology ABILITY device
The leads and electrodes connected to this prototype ABILITY device are similar to those currently used in this project. In the future, the fully implantable, wireless ABILITY device will replace the cable that connects the electrodes to the computer.

The Wyss Center welcomes the opportunity to connect with groups around the world to address the data challenges associated with real-time brain signal recording and the development of user-friendly assistive devices to help people with disorders such as ALS.

If you are interested in finding out more about this project, please contact Jonas Zimmermann, Senior Neuroscientist: jonas.zimmermann@wysscenter.ch 

The research system used in this project is exclusively for clinical investigation.

Team

Wyss Center team with ABILITY implant

We welcome new opportunities to exchange ideas and to explore collaborations

Collaborate with us

We are searching for innovative and driven people to make a difference

Join our team

Follow us