24 Nov 2021
Virtual histology of skin allows rapid diagnosis of malignant disease.
Progress towards virtual histopathology, whereby optical imaging in situ reveals the vital clinical information that would normally require staining of an excised tissue sample, has involved parallel breakthroughs in both imaging and data processing to meet real-world clinical requirements.
An example of the imaging advances being investigated for the task is a recent University of Alberta project using ultraviolet-wavelength photoacoustic spectroscopy to produce colorized images of cells in tissue samples.
A project from the lab of Aydogan Ozcan at UCLA has now made a new breakthrough in the data processing dimension, developing a deep-learning framework to transform non-invasive confocal microscopy data from unstained skin into virtually-stained images.
Reported in Light Science & Applications, the new framework may permit more rapid diagnosis of malignant skin neoplasms and reduce invasive skin biopsies, in line with the current clinical drive towards diagnosing skin disease non-invasively.
"This process bypasses several standard steps typically used for diagnosis, including skin biopsy, tissue fixation, processing, sectioning and histochemical staining," commented Ozcan. "Images appear like biopsied, histochemically stained skin sections imaged on microscope slides."
Deep learning-based approaches have already enabled the development of algorithms to learn image transformations between different microscopy modalities, but the new study is the first to apply virtual histology to intact unbiopsied tissue, according to the UCLA team.
A new age of digital dermatology
The new framework uses image data from reflectance confocal microscopy (RCM), which detects backscattered photons and generates grayscale images of tissue, with contrast arising from relative variations in the refractive indices and sizes of cellular components. The UCLA team trained its deep convolutional neural network on RCM images of excised skin tissue with and without nuclear contrast staining, until the framework could transform in vivo RCM data into virtually stained 3D microscopic images.
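The paper's network is far larger and is trained adversarially on co-registered stained/unstained image pairs, but the core operation it learns is an image-to-image mapping from a single-channel RCM patch to a pseudo-colored output. As a minimal sketch of that idea only (the layer sizes, random weights, and activations below are illustrative assumptions, not the authors' architecture), a two-layer convolutional network mapping a grayscale patch to an RGB patch might look like:

```python
import numpy as np

def conv2d(img, kernels, bias):
    """Valid 3x3 convolution: img (H, W, C_in) -> (H-2, W-2, C_out)."""
    H, W, _ = img.shape
    C_out = kernels.shape[0]  # kernels: (C_out, 3, 3, C_in)
    out = np.zeros((H - 2, W - 2, C_out))
    for i in range(H - 2):
        for j in range(W - 2):
            patch = img[i:i + 3, j:j + 3, :]  # (3, 3, C_in) window
            out[i, j] = np.tensordot(kernels, patch,
                                     axes=([1, 2, 3], [0, 1, 2])) + bias
    return out

rng = np.random.default_rng(0)
rcm = rng.random((16, 16, 1))                    # stand-in for a grayscale RCM patch
k1 = rng.normal(scale=0.1, size=(8, 3, 3, 1))    # untrained, randomly initialized weights
k2 = rng.normal(scale=0.1, size=(3, 3, 3, 8))

hidden = np.maximum(conv2d(rcm, k1, np.zeros(8)), 0)        # ReLU feature maps
rgb = 1 / (1 + np.exp(-conv2d(hidden, k2, np.zeros(3))))    # sigmoid -> pseudo-RGB in [0, 1]
print(rgb.shape)  # (12, 12, 3): three output channels mimic an H&E-like color image
```

In training, the weights would be optimized so that the RGB output matches the histochemically stained ground truth for each unstained input patch; here they are random, so the output is only structurally correct (one grayscale channel in, three color channels out).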
Trials on images of normal skin structure, basal cell carcinoma and other clinically important samples showed that the approach could successfully reveal similar morphological features to those shown with standard H&E (hematoxylin and eosin) staining.
Once the neural network has been trained on many tissue samples, it can run on a standard computer or network, so the project's ultimate goal is to develop virtual histology technology that can be built into any imaging device or combined with other optical-imaging systems.
This could include interfacing this digital, biopsy-free approach with whole-body imaging and electronic medical records, which could then lead to "a new age of digital dermatology," according to UCLA.
"The current standard for diagnosing skin diseases, including skin cancer, relies on invasive biopsy and histopathological evaluation, which can be costly for patients and the health care system," said Philip Scumpia of the David Geffen School of Medicine at UCLA. "Our approach potentially offers a biopsy-free solution, providing images of skin structure with cellular-level resolution."