
Optical techniques key to bidirectional brain-machine interfaces

01 Mar 2017

Researchers at the University of Geneva, combining two-photon microscopy and optogenetics, say the discovery could benefit prosthetic limb control.

For more than 40 years, scientists have been developing brain-machine interfaces, the main objective being neural control of prosthetic limbs in paralyzed patients or amputees. A prosthetic limb directly controlled by brain activity can partially recover the lost motor function. This is achieved by decoding neuronal activity recorded with electrodes and translating it into robotic movements.

Such systems, however, have limited precision because they lack sensory feedback from the artificial limb. Neuroscientists at the University of Geneva (UNIGE), Switzerland, have been investigating the possibility of transmitting this missing sensation back to the brain by optically stimulating neural activity in the cortex. They have discovered that not only is it possible to create an artificial sensation of neuroprosthetic movements, but that the underlying learning process occurs very rapidly.

Imaging and optical stimulation

These findings, just published in the journal Neuron, were obtained using modern imaging and optical stimulation tools, which offer an innovative alternative to the classical electrode approach.

The article's summary notes, "Simultaneous two-photon imaging and real-time optogenetic stimulation were used to train mice to activate a single neuron in the motor cortex, while continuous feedback of its activity level was provided by proportionally stimulating somatosensory cortex."

Conventionally, brain-machine interfaces work by relying largely on visual perception: the robotic arm movement is controlled by looking at it. So the direct flow of information between the brain and the machine remains unidirectional. However, movement perception is not only based on vision but mostly on proprioception, the sensation of where the limb is located in space.

Daniel Huber, professor in the Department of Basic Neurosciences of the Faculty of Medicine at UNIGE, explained, “We have therefore asked whether it was possible to establish a bidirectional communication in a brain-machine interface: to simultaneously read out neural activity, translate it into prosthetic movement and re-inject sensory feedback of this movement back into the brain.”

Artificial sensation

In contrast to invasive approaches using electrodes, Huber’s team specializes in optical techniques for imaging and stimulating brain activity. Using two-photon microscopy, they routinely measure the activity of hundreds of neurons with single-cell resolution.

“We wanted to test whether mice could learn to control a neural prosthesis by relying on an artificial sensory feedback signal,” said Mario Prsa, a researcher at UNIGE and the first author of the Neuron study. “We imaged neural activity in the motor cortex. When the mouse activated a specific neuron, the one chosen for neuroprosthetic control, we simultaneously applied stimulation proportional to this activity to the sensory cortex with blue light.”

Neurons of the sensory cortex were rendered photosensitive to this light, allowing them to be activated by a series of optical flashes and thus integrate the artificial sensory feedback signal. The mouse was rewarded upon every above-threshold activation, and after 20 minutes, once the association had been learned, the rodent was able to generate the correct neuronal activity more frequently.
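
The closed-loop logic described above, reading out a single motor-cortex neuron by two-photon imaging, stimulating somatosensory cortex in proportion to its activity, and rewarding above-threshold activations, can be summarized in a short sketch. The code below is purely illustrative: all function names, thresholds and gain values are assumptions rather than details from the Neuron paper, and a real system would talk to microscope and laser hardware instead of simulating signals.

```python
# Minimal sketch of a bidirectional BMI training loop, assuming hypothetical
# acquisition/stimulation stubs; names and parameter values are illustrative only.
import random

def read_target_neuron_activity() -> float:
    """Stand-in for the two-photon calcium signal (dF/F) of the single target neuron."""
    return max(0.0, random.gauss(0.5, 0.3))  # simulated fluorescence change

def stimulate_sensory_cortex(intensity: float) -> None:
    """Stand-in for delivering blue-light optogenetic feedback to somatosensory cortex."""
    print(f"  optogenetic feedback pulse, relative intensity = {intensity:.2f}")

def deliver_reward() -> None:
    """Stand-in for the reward given to the mouse."""
    print("  reward delivered")

# Illustrative training parameters (not values from the study).
REWARD_THRESHOLD = 1.0    # activity level the target neuron must exceed
MAX_STIM_INTENSITY = 1.0  # cap on feedback stimulation
GAIN = 0.8                # proportionality between neuron activity and feedback

def run_trial(trial: int) -> None:
    activity = read_target_neuron_activity()
    # Feedback is proportional to the target neuron's activity, mirroring the
    # "proportionally stimulating somatosensory cortex" scheme quoted above.
    feedback = min(MAX_STIM_INTENSITY, GAIN * activity)
    print(f"trial {trial}: target-neuron activity = {activity:.2f}")
    stimulate_sensory_cortex(feedback)
    # Reward only when the chosen neuron's activity crosses the threshold.
    if activity >= REWARD_THRESHOLD:
        deliver_reward()

if __name__ == "__main__":
    for t in range(1, 21):
        run_trial(t)
```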

Huber says that this means the artificial sensation was not only perceived, but successfully integrated as feedback on the prosthetic movement. In this way, the brain-machine interface functions bidirectionally. In the future, this type of bidirectional interface could allow robotic arms to be moved more precisely, touched objects to be felt, or the force needed to grasp them to be perceived.

The neuroscientists at UNIGE are working on how to produce a more efficient sensory feedback system. They can currently provide such feedback for a single movement, but want to find out whether multiple feedback channels can be provided in parallel. They say that this research is preparing the groundwork for developing a new generation of more precise, bidirectional neural prostheses.

Huber added, “We know that millions of neural connections exist. However, we discovered that the animal activated only the one neuron chosen for controlling the prosthetic action, and did not recruit any of the neighboring neurons. This is a very interesting finding since it reveals that the brain can home in on and specifically control the activity of a single neuron.”
