Long-term Sight on the Horizon

The scientists will investigate how electrical stimulation in the visual cortex can be used to create artificial visual impressions. Illustration: Maria Asplund and BrainLinks-BrainTools

Around 354 million people worldwide are visually impaired and around 40 million are blind. Only about half of blind patients benefit from current treatment methods. Neuroprosthetics may be a way to restore vision, but technology that can provide meaningful visual percepts over longer periods has so far been lacking. Working within an international research consortium, Dr. Maria Asplund and Dr. Patrick Ruther of the University of Freiburg’s Department of Microsystems Engineering and the excellence cluster BrainLinks-BrainTools are aiming to develop a neuroprosthetic implant that uses machine learning to improve vision over longer time periods. Starting on 1 September 2020, the European Commission will support the “NeuraViPeR” project through its “Horizon 2020” funding program. The project will run for four years with a total grant of four million euros, of which the project team at Freiburg is to receive 692,000 euros.

Today’s neuroprostheses have too few electrodes, and their lifespan of just a few weeks is not sufficient to stimulate and record neuronal activity in the long term. In addition, they monitor neither the efficacy of the stimulation nor the state of the patient’s brain. The goal of the “NeuraViPeR” project is to establish the basic parameters for a long-lived implant for brain stimulation and recording. Using adaptive deep-learning algorithms (an automated form of machine learning), the consortium aims to develop a new brain-computer interface technology that will work safely and effectively for decades.

The algorithms convert camera images into electrical stimulation patterns for the visual cortex, the part of the cerebral cortex that makes sight possible. In a closed-loop system, they will additionally use recorded brain states and eye movements to improve perception. In this way, the algorithms will make it possible for blind people to recognize objects and facial expressions and to find their way in unfamiliar surroundings. To make the system lightweight, reliable, and wearable, the researchers want to run the software algorithms on energy-saving hardware with low latency.
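The closed-loop principle described above can be sketched in a few lines of Python: an input frame is converted into a stimulation pattern, a response is measured, and the stimulation is adjusted until the response matches a target. This is a toy illustration only; all function names, the proportional update rule, and the numbers are hypothetical stand-ins, not the project's actual algorithms, and the real system would read cortical activity rather than a simulated percept.

```python
# Toy closed-loop sketch (illustrative assumptions, not NeuraViPeR code).

def frame_to_pattern(frame, gain):
    """Map normalized pixel intensities (0..1) to per-electrode amplitudes."""
    return [min(1.0, gain * px) for px in frame]

def perceived(pattern):
    """Stand-in for a measured percept; a real loop reads neural activity."""
    return sum(pattern) / len(pattern)

def closed_loop(frame, target=0.3, steps=20):
    """Adapt the stimulation gain until the simulated percept hits the target."""
    gain = 1.0
    for _ in range(steps):
        pattern = frame_to_pattern(frame, gain)
        error = target - perceived(pattern)   # feedback signal
        gain += 0.5 * error                   # simple proportional update
    return gain, frame_to_pattern(frame, gain)

gain, pattern = closed_loop([0.2, 0.4, 0.6, 0.8])
```

Here the feedback drives the gain toward the value at which the mean stimulation equals the target, mimicking in miniature how recorded responses could steer stimulation parameters over time.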

The neuroprosthesis being developed in Freiburg comprises thousands of coated microelectrodes that will be electrically connected to highly integrated computer chips using modern assembly technologies and implanted into the visual cortex. The electrodes should cause minimal damage to the surrounding tissue and remain stable despite repeated, long-term electrical stimulation. With the aid of the novel computer chips, the international research team will direct stimulating currents to the electrodes and monitor neural activity across larger areas of the brain.

Prof. Dr. Shih-Chii Liu of the University of Zurich is directing the project. Other members come from the Netherlands Institute for Neuroscience of the Royal Netherlands Academy of Sciences in Amsterdam, the spin-off company Phosphoenix in Amsterdam, the “Stichting Universiteit” in Nijmegen in the Netherlands, the Miguel Hernández University in Elche, Spain, and the Interuniversity Microelectronics Centre in Leuven, Belgium. The team brings together expertise in information technology, systems and clinical neuroscience, materials technology, microsystems engineering, and deep learning. Maria Asplund and Patrick Ruther will work on the project in the new “Intelligent Machine-Brain Interfacing Technology” (IMBIT) research building. IMBIT is expected to open in early 2021 and provide researchers at Freiburg with optimal infrastructure for investigating brain-machine interfaces.

Press release of the University of Zurich

Intelligent Machine-Brain Interfacing Technology (IMBIT)
