22 Sep 2006
Optical coherence tomography is fast becoming one of the most exciting areas in biomedical photonics as developers get to grips with high-speed, high-resolution Fourier domain detection schemes. James Tyrrell speaks with three experts in the field to discover more about the breakthroughs, the applications and the firms involved.
From its beginnings as an offshoot of telecoms, optical coherence tomography (OCT) has developed into a multimillion-dollar industry at the forefront of medical imaging.
Carl Zeiss Meditec dominates the market thanks to its early success in the retinal imaging sector, but for how long? High-performance Fourier methods are driving a wave of activity in the sector as firms big and small vie to turn this extra imaging capability into profit, targeting applications from cancer detection through to cardiology.
In fact, the surge of interest in the technique has led many components suppliers to change their strategy. "In the past we were constantly going to telecoms firms to convince them that their products could be useful for OCT," Stephen Boppart of the Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, US, told OLE. "However, today there has been a switch and photonics companies are now making light sources specifically for OCT."
OCT can be thought of as an optical version of ultrasound imaging. "It grew out of a technique called OCDR, optical coherence domain reflectometry, which was developed in the late 1980s for finding faults and measuring the characteristics of telecoms components," commented Joseph Izatt of Duke University, US.
Pioneers of the technique, such as James Fujimoto from MIT, US, and Adolf Fercher from the University of Vienna, Austria, soon began to realize the potential of OCT for medical diagnosis. "OCT is very good at high-resolution morphological imaging of biological tissue and it fills a niche that no other kind of modality occupies," explained Boppart, a former member of Fujimoto's group. "With a resolution ranging from around 2–5 µm up to approximately 15–20 µm, it sits somewhere between ultrasound and the very high resolution microscopies such as confocal or multiphoton microscopy."
How it works
OCT provides structural information by exploiting differences in refractive index within a sample. The set-up resembles a Michelson interferometer with reference and sample arms, a partially coherent light source and a photodetector. "As light propagates through the sample, portions of the beam are reflected back [to the detector]," said Boppart. "OCT can localize the origin of those reflections and reconstruct an image based on the optical backscatter from within the tissue."
Initially, instruments were configured in so-called time-domain (TD) mode. "You move the reference arm of the interferometer through a range of positions and whenever it matches the position of a reflector in the sample, you get a little fringe burst on the detector," explained Izatt, who worked with Fujimoto in the early 1990s. "The envelope of those fringe bursts becomes a map of the reflectivity in the sample and by scanning the light source in either one or two additional dimensions you can build up a two- or three-dimensional image."
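The fringe burst that Izatt describes can be sketched numerically. The snippet below is an illustration only, with assumed numbers (centre wavelength, coherence length and reflector depth are not from the article): as the reference arm sweeps, interference fringes appear on the detector only where its path length matches a reflector to within the source coherence length, and the envelope of that burst localizes the reflector.

```python
import numpy as np

# Illustrative numbers (assumed, not from the article).
wavelength = 830e-9       # centre wavelength (m)
coherence_len = 15e-6     # source coherence length (m)
reflector_depth = 200e-6  # single reflector in the sample arm (m)

# Sweep the reference arm through a range of path-length differences.
z = np.linspace(0, 400e-6, 8000)   # reference-arm position (m)
delta = z - reflector_depth        # path mismatch with the reflector

# Interference fringes appear only while the path mismatch lies within
# the coherence length -- the "fringe burst" on the detector.
envelope = np.exp(-(delta / coherence_len) ** 2)
fringes = envelope * np.cos(4 * np.pi * delta / wavelength)

# Demodulating the burst (here we use the known envelope directly)
# localizes the reflector in depth.
located = z[np.argmax(envelope)]
print(f"reflector located at {located * 1e6:.1f} um")  # -> 200.0 um
```

Scanning the beam laterally and repeating this depth measurement at each position builds up the two- or three-dimensional image Izatt refers to.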
It turns out that the depth resolution is mainly determined by the spectral bandwidth of the light source. "15 years ago, we were using semiconductor lasers centred at the long end of the visible spectrum that had about 3 nm bandwidth, which gave us roughly 150 µm of depth resolution," said Wolfgang Drexler, formerly of the University of Vienna group and now based at Cardiff University, UK. "Soon afterwards superluminescent diodes (SLDs) came out and gave us 20 or 30 nm [of bandwidth], which really improved the resolution by a factor of 10."
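The scaling Drexler describes can be checked against the standard coherence-length estimate for a Gaussian-spectrum source, l_c ≈ (2 ln 2/π)·λ₀²/Δλ. The centre wavelength below is an assumption, and the exact figure depends on wavelength and spectral shape, but the key point survives: ten times the bandwidth gives ten times finer depth resolution.

```python
import math

def axial_resolution_um(center_wl_nm, bandwidth_nm):
    """Round-trip coherence length of a Gaussian-spectrum source,
    l_c = (2 ln 2 / pi) * lambda0^2 / delta_lambda, in micrometres."""
    lc_nm = (2 * math.log(2) / math.pi) * center_wl_nm ** 2 / bandwidth_nm
    return lc_nm / 1000.0  # nm -> um

# A ~3 nm semiconductor laser vs a ~30 nm SLD (830 nm centre assumed).
print(axial_resolution_um(830, 3))   # on the order of 100 um
print(axial_resolution_um(830, 30))  # ~10x better, on the order of 10 um
```

Broadening the source further, for example with Ti:sapphire or photonic crystal fibre sources spanning well over 100 nm, is what pushes resolution into the few-micrometre regime mentioned earlier.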
Another breakthrough occurred in the mid-1990s when Fercher and his co-workers published a paper that is very much shaping the industry today. Rather than encoding the back reflections in time, the researchers showed that the same measurement could be performed by operating in the spectral, or Fourier, domain (FD).
"The basic physical concept behind the spectral domain approach is that instead of moving one of the arms of the interferometer (the reference arm) you can obtain that same degree of freedom by manipulating the wavelength of the light source," commented Izatt.
Switching from TD-OCT to FD-OCT has been shown to give a 20–30 dB advantage in terms of sensitivity. "You get much better signal-to-noise because you are essentially utilizing all of the light that is coming back from a single-depth column in the tissue," said Boppart. "Whereas with the time-domain systems you are only detecting a single point in three-dimensional space."
Back in the lab, there are two ways to implement FD-OCT. One approach is to continue to use a broadband light source, such as a Ti:sapphire laser, SLDs, amplified spontaneous emission or even photonic crystal fibre sources, and to replace the single photodetector with a spectrometer and detector array. The other option is to keep the single photodetector and exchange the broadband beam for a tunable or swept light source.
Pros and cons
Naturally, the OCT community is keen to establish the most effective approach. "The great advantage of the swept source technique is that the interferometer becomes very simple – you have a single-channel detector and all of the complexity goes into the source," said Izatt. "Whereas in the spectrometer-based systems you split the complexity – you need a broad bandwidth light source, but you also require a high-throughput, high-efficiency and high readout rate spectrometer."
Another factor is wavelength. "At 800 nm, which is the best wavelength for ophthalmic applications, there are very good CCD sensors out there driven by the digital photography market. So for 800 nm, I think that the spectrometer approach combined with a broad bandwidth light source is a very pertinent choice," commented Drexler. "However, if you go to non-transparent tissue – so from ophthalmic to other tissues like gastro-intestinal or gynaecology or skin – then the 1300 or 1050 nm wavelength [region] is much more attractive and so I think at 1300 nm, FD-OCT based on swept sources [and a single photodetector] is probably the better approach."
While all of this activity makes for a buoyant research area, scientists such as Boppart and his colleagues are anxious for the technology to stabilize. "Up until just recently the technology was changing so dramatically that it was very difficult for products to enter clinical trials for evaluation and assessment by end-users," he explained. "Definitive trials take years to perform and the technology has to be at a stable point before you can begin studies with hundreds or thousands of patients."
OCT's most mature sector is ophthalmology. "There is no other technique out there that can deliver that richness of morphological information about the retina," said Drexler. "With ultrasound you really have to get in contact with the eye and run the risk of infection, whereas OCT is a non-invasive technique."
Retinal imaging was OCT's first success and offered patients a more sophisticated diagnosis. However, it could ultimately be the front section of the eye that provides firms with their biggest reward. "The anterior segment is where refractive surgery happens and if a great application that combined OCT and refractive surgery were to be found then this would actually dominate the retinal diagnostic market," said Izatt, who is also involved with Bioptigen, a US-based OCT start-up working in the ophthalmology sector. "Here, the predictions are based on future generations of LASIK and related surgeries such as phakic intraocular lenses."
Unlike LASIK, which involves reshaping the eye, Phakic refractive surgery is based on inserting a flexible lens, which unfolds to fill the iris, through a small incision in the cornea. "It has been shown that OCT can measure the width of the eye very accurately to establish where to secure the lens," explained Izatt. "The lenses can be replaced as your prescription changes, so they are suitable for children, whose vision changes as they age."
The transparency of the eye makes it a natural choice for OCT, but by using light-guiding catheters, the technique can be used to study other areas of the body such as the heart. It can help prevent sudden death or serious injury by allowing surgeons to pinpoint unstable plaque. "Currently people do this by injecting dyes and fluoroscopy or they use intravascular ultrasound," said Boppart. "However, neither of those techniques can match OCT in terms of resolution or information content." LightLab Imaging, co-founded by Fujimoto, is one of the companies leading the way here.
It turns out that OCT has the potential to spot very subtle changes within the body, especially if used in tandem with novel nano-sized contrast agents. "The buzz words are 'molecular imaging' and you can modify these agents with tags to label specific molecules," revealed Boppart, who leads a group working in this area. "This enhances the ability of OCT to locate tumours or other areas of interest."
The wealth of applications in combination with the growing expertise in FD-OCT is capturing the imagination of investors. "It is undeniable that this revolution in technology has led to a proliferation of commercialization efforts," observed Izatt.
Izatt's start-up, Bioptigen, is one of a number of firms now developing next-generation OCT systems. "In Germany there is a company, ISIS Optronics GmbH, that is focused on skin imaging," added Boppart. "And Thorlabs is now carrying OCT products." Other firms include Optopol of Poland, Topcon Europe and Imalux, US, to name just a few players in the OCT market. What's more, if the rumours are to be believed, then additional "stealth" start-ups are coming soon, primarily with products for the ophthalmology sector.
With high-speed OCT imagers on the horizon that can operate at hundreds of frames per second, data management is looming as the next challenge facing OCT. "We are now pushing the bottleneck from photonic technology back onto computing and processing power," said Boppart. "Today, you can get gigabytes of data from a single sample and so it is important to have some kind of automated diagnosis technique or algorithm that can extract the most relevant information."
Drexler shares Boppart's concerns. "You want to be able to collect data in a reasonable amount of time, because patients don't want to be bothered for too long and there are issues about long waiting times," he said. "Fortunately, PCs and frame-grabber cards continue to move towards faster speeds and higher data-acquisition rates, which means that it is not a fundamental problem, just a [technical] challenge that we have to face from a hardware and software point of view."