12 Sep 2017
Adaptive optics helps reveal how color signals are conveyed on a cellular level.
A project from the University Eye Hospital Bonn, University of California, Berkeley, and University of Alabama at Birmingham has now completed a study of how the topography of individual wavelength-sensitive cells shapes perceptual sensitivity, a step towards understanding how color signals are conveyed at the cellular level. The work is published in the Journal of Neuroscience.
The project employed adaptive optics scanning laser ophthalmoscopy (AOSLO), a technique in which a laser is scanned horizontally and vertically across an area of the retina while adaptive optics correct aberrations in the optical pathway caused by astigmatism, eye movement, or other sources.
Using the ophthalmoscope to both examine and stimulate individual photoreceptor cells in living subjects, the project investigated how human color vision emerges from the three independent receptor types known to operate within the retina.
"We demonstrate how the precise topography of the long (L), middle (M), and short (S) wavelength-sensitive cones in the parafoveal region of the retina shapes perceptual sensitivity," said the team in its published paper. "We used adaptive optics microstimulation to measure psychophysical detection thresholds from individual cones whose spectral types had been classified independently by absorptance imaging."
The AOSLO platform allowed the generation of three independent narrow-band input channels for illuminating the retina: one in the infrared at 842 nanometers for retinal imaging and wavefront sensing, and two stimulation channels in the green and red visible spectrum, at 543 and 710 nanometers, respectively.
Cell-by-cell analysis
Having first mapped the pattern of L, M, and S cones in the retina by measuring each cell's absorptance at different wavelengths, the team could then determine the detection threshold of each cone by steadily lowering the intensity of the stimulation light.
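The threshold procedure described above can be sketched as an adaptive staircase, a standard psychophysical method in which stimulus intensity is lowered after each detection and raised after each miss. The sketch below is illustrative only: the function names, step size, and simulated observer are assumptions, not the study's actual protocol.

```python
import random

def measure_threshold(detects, start_intensity=1.0, step=0.05,
                      reversals_needed=8):
    """Estimate a detection threshold with a simple 1-up/1-down staircase.

    `detects(intensity)` runs one trial and returns True if the stimulus
    was seen. Intensity is lowered after each detection and raised after
    each miss; the threshold estimate is the mean intensity at the points
    where the staircase reverses direction.
    """
    intensity = start_intensity
    last_seen = None
    reversal_intensities = []
    while len(reversal_intensities) < reversals_needed:
        seen = detects(intensity)
        if last_seen is not None and seen != last_seen:
            reversal_intensities.append(intensity)  # direction change
        last_seen = seen
        intensity = max(0.0, intensity - step if seen else intensity + step)
    return sum(reversal_intensities) / len(reversal_intensities)

# Hypothetical noisy observer with a true threshold of 0.4.
random.seed(1)
def observer(intensity, true_threshold=0.4, noise=0.05):
    return intensity + random.gauss(0, noise) > true_threshold

estimate = measure_threshold(observer)  # converges near 0.4
```

The staircase concentrates trials near the threshold itself, which is what makes single-cone measurements practical within a limited number of stimulations.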
"This is important because we could use the sensitivity of each cell to determine how overall perception is governed by the contribution of individual cones," commented Wolf Harmening of the University Eye Hospital Bonn.
This led to a potentially significant finding about how an individual cell's sensitivity depends on its immediate neighbors. The brain, it seems, does not receive raw data from individual photoreceptors, but rather a retinal signal that has already undergone a degree of pre-processing depending on the receptor's surroundings.
"If a cone sensitive to red light is surrounded by cells that are more sensitive to green, this cone is more likely to behave like a green cone," explained Harmening. "Spatial and color information of individual cones is modulated in the complex network of the retina, with lateral information spreading between receptors through what are known as horizontal cells."
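The lateral spreading Harmening describes can be illustrated with a toy model in which a cone's effective spectral tuning is a weighted mix of its own tuning and the average of its neighbors. Everything here is a hypothetical sketch: the `coupling` weight and the (L, M, S) sensitivity triples are illustrative assumptions, not values from the study.

```python
def effective_tuning(own, neighbors, coupling=0.3):
    """Toy model of lateral modulation via horizontal cells.

    `own` and each neighbor are (L, M, S) sensitivity triples; `coupling`
    is an assumed lateral-spread weight (0 = no neighbor influence),
    not a measured quantity.
    """
    n = len(neighbors)
    avg = tuple(sum(nb[i] for nb in neighbors) / n for i in range(3))
    return tuple((1 - coupling) * own[i] + coupling * avg[i]
                 for i in range(3))

# An L ("red") cone surrounded by six M ("green") cones shifts toward
# green, as in Harmening's example.
l_cone = (1.0, 0.0, 0.0)
m_ring = [(0.0, 1.0, 0.0)] * 6
shifted = effective_tuning(l_cone, m_ring)  # M component now nonzero
```

The point of the sketch is only that the signal leaving the retina already blends information across neighboring receptors, which is why a cone's measured behavior depends on its local environment.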
While the full implications of this discovery remain to be explored, the project has already demonstrated the value of cellular-level investigations such as those AOSLO makes possible. Conventional vision tests use stimuli that necessarily activate hundreds to thousands of photoreceptor cells at once, whereas computation at the scale of single retinal cells has important implications for basic and clinical research.
"What’s new is that we can now study vision on the most elementary level, cell by cell," said Harmening. "When the basis of vision is better understood, we open avenues for new diagnoses and treatments in cases of retinal disease. The novel single-cell approach offers access to new findings in ophthalmology."