17 Feb 2009
A brain-computer interface based on near-infrared spectroscopy can decode a person's preference from their thoughts alone.
A brain-computer interface (BCI) that can decode thought processes could enable people with severe or multiple disabilities to communicate and control external devices via thought alone. Bringing such a system one step closer, Canadian researchers have developed a way to use optical imaging to decode preference by measuring the intensity of near-infrared light absorbed in brain tissue (J. Neural Eng. 6 016003).
The system is based on the use of near-infrared spectroscopy (NIRS) to study cerebral haemodynamics during the decision-making process. NIRS has been investigated before as a non-invasive tool for reading thoughts, but previous NIRS-BCI set-ups required user training. For example, in order to indicate "yes" to a question, a subject would need to perform a specific unrelated task, such as a mental calculation.
The key difference in this latest system - developed by researchers at the Bloorview Research Institute and the University of Toronto - is that the BCI is trained to directly decode the neural signatures corresponding to a specific decision. As no secondary task is required to indicate preference, the design should be more intuitive to use - decreasing the cognitive load required to operate the interface and removing the need to train the user.
"This is the first system that decodes preference naturally from spontaneous thoughts," said Sheena Luu, the University of Toronto PhD student who led the study, under the supervision of Tom Chau, Canada Research Chair in paediatric rehab engineering.
The NIRS-BCI was tested on nine subjects, who were asked to mentally evaluate two possible drinks displayed sequentially on a computer monitor and decide which they preferred. The participants wore a custom-designed headband fitted with fibre optics that emit light into the prefrontal cortex of the brain. The headband contained 16 sources (arranged in pairs, emitting at 690 and 830 nm) and three detectors, giving a 48-channel configuration.
During the task, frequency-domain NIRS was used to image each subject's prefrontal cortex. In total, 60 trials were collected per participant, with the first four considered practice runs and the remaining 56 used for off-line classification. After teaching the computer to recognize the unique pattern of brain activity associated with preference for each subject, the researchers could predict the preferred drink in each trial with an average accuracy of 80%.
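The paper does not spell out the classifier in this summary, but the off-line protocol described above - learn each subject's pattern of prefrontal activity, then predict the preferred drink trial by trial - can be illustrated with a minimal, hypothetical sketch. Everything here is a stand-in: the simulated "signals", the nearest-mean classifier, and the effect size are illustrative assumptions, not the study's actual features or method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 56 usable trials x 48 channels of mean
# haemodynamic change during the decision period (the real study used
# frequency-domain NIRS measurements, not simulated values).
n_trials, n_channels = 56, 48
labels = rng.integers(0, 2, n_trials)            # 0/1 = which drink was preferred
signals = rng.normal(0.0, 1.0, (n_trials, n_channels))
signals[labels == 1] += 0.8                      # assume preference shifts activation

def train_nearest_mean(X, y):
    """Learn one mean activation pattern per class (a toy classifier)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(model, x):
    """Assign a trial to the class whose mean pattern is closest."""
    return min(model, key=lambda c: np.linalg.norm(x - model[c]))

# Leave-one-trial-out evaluation, mirroring the off-line setting:
# train on all other trials, then predict the held-out trial.
correct = 0
for i in range(n_trials):
    mask = np.arange(n_trials) != i
    model = train_nearest_mean(signals[mask], labels[mask])
    correct += predict(model, signals[i]) == labels[i]

accuracy = correct / n_trials
print(f"leave-one-out accuracy: {accuracy:.0%}")
```

On this synthetic data the toy classifier separates the classes easily; the study's reported 80% average came from real, far noisier recordings.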
"When your brain is active, the oxygen in your blood increases and, depending on the concentration, it absorbs more or less light," Luu explained. "In some people, certain parts of their brains are more active when they don’t like something, and in some people they're more active when they do like something."
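The physics Luu describes is usually formalized as the modified Beer-Lambert law: the change in optical density at each wavelength is a weighted sum of the concentration changes of oxy- and deoxy-haemoglobin, so measurements at two wavelengths (here 690 and 830 nm, matching the headband) can be inverted for both. The numbers below - extinction coefficients, pathlength, and the measured densities - are rough illustrative values, not the study's calibration.

```python
import numpy as np

# Illustrative extinction coefficients (1/(mM*cm)) for oxy- and
# deoxy-haemoglobin; rough textbook-order values, not study data.
eps = np.array([[0.28, 2.05],    # 690 nm: [HbO2, Hb] - deoxy dominates
                [0.97, 0.69]])   # 830 nm: [HbO2, Hb] - oxy dominates

# Effective optical path: source-detector separation (cm) times an
# assumed differential pathlength factor for scattering tissue.
path = 3.0 * 6.0

# Assumed measured changes in optical density at the two wavelengths.
delta_od = np.array([0.012, 0.021])

# Modified Beer-Lambert law: delta_od = (eps * path) @ delta_conc,
# solved as a 2x2 linear system for the concentration changes (mM).
delta_conc = np.linalg.solve(eps * path, delta_od)
print(f"dHbO2 = {delta_conc[0]*1000:+.3f} uM, dHb = {delta_conc[1]*1000:+.3f} uM")
```

Because deoxy-haemoglobin absorbs more strongly at 690 nm and oxy-haemoglobin more strongly at 830 nm, the two equations are well conditioned, which is why NIRS instruments pair wavelengths on either side of the ~800 nm isosbestic point.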
Luu envisions creating a portable NIR sensor that rests on the forehead and relies on wireless technology, with the ultimate goal of "opening up the world of choice" to children who can't speak or move. "Preference is the basis for everyday decisions," she explained. When children with disabilities can't speak or gesture to control their environment, they may develop a learned helplessness that impedes development.
Future work will explore the possibility of decoding preference for other items besides drinks and the classification of more abstract preferences, such as activities or letters of the alphabet. It may be possible to generalize the interface to select from more than two items, allowing the user to choose a preferred item out of a number of alternatives.
Luu notes that the brain is too complex to ever allow decoding of a person's random thoughts. "However, if we limit the context - limit the question and available answers, as we have with predicting preference - then mind-reading becomes possible."