06 Mar 2007
A user-friendly set of optical tweezers that uses the position of the operator’s hands to move trapped particles is broadening the appeal of the technology. Jacqueline Hewett reports.
Help really is "at hand" when it comes to bringing optical tweezers to the wider scientific community, thanks to work being carried out in the UK to create a user-friendly set of optical tweezers with a range of simple interfaces.
Graham Gibson and Miles Padgett of the University of Glasgow are part of the research effort. In their system, the position of the user’s fingertips defines the x, y and z location of the optical traps that make up a microhand capable of manipulating micron-sized objects, ranging from metallic particles to red blood cells (Optics Express 14 12497).
“We wanted to create an intuitive and natural interface,” Gibson told OLE. “The microhand is our first attempt at developing this completely new type of interface. The novelty of this system is that the trap locations are controlled by the position of the operator’s fingertips.”
All about holograms
Basic optical tweezers pass a laser beam through an objective lens to form a tight focus on a microscope coverslip. Particles held in a solution on the coverslip are attracted to, and trapped in, the high-intensity part of the beam. Holographic optical tweezers use a diffractive optical element, in this case a programmable spatial light modulator (SLM), to shape the incident beam and generate multiple optical traps that can be moved independently and simultaneously.
As Padgett explains, the microhand uses some clever software, written in-house, to recreate the position of the user’s fingertips as optical traps on the microscale.
“The user puts on a pair of black gloves with white beads attached to the fingertips,” Padgett told OLE. “A webcam looks down on the gloves and we use LabView’s pattern-recognition software to assess where the four beads are. We then use an algorithm to design holograms and pass this information on to the SLM to produce the optical traps.”
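The bead-tracking step can be sketched in a few lines. The group uses LabVIEW's pattern-recognition tools; the version below is a minimal Python stand-in that simply thresholds a greyscale webcam frame to pick out the white beads against the black glove and returns the centroid of each bright blob. The threshold value and the `find_beads` name are illustrative assumptions, not details from the Glasgow system.

```python
import numpy as np
from scipy import ndimage

def find_beads(frame, threshold=200):
    """Locate bright bead markers in a greyscale frame.

    Hypothetical sketch: the white fingertip beads appear as the only
    bright regions against the black glove, so a fixed threshold
    (an assumed value) isolates them. Connected bright regions are
    labelled and their centroids returned as (row, col) pairs.
    """
    mask = frame > threshold
    labels, count = ndimage.label(mask)          # one label per bead
    return ndimage.center_of_mass(mask, labels, range(1, count + 1))
```

In practice each centroid would then be mapped from camera pixels to a trap coordinate, with one finger's vertical position (or bead size) supplying the z component.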
The hologram-designing algorithm essentially adds up the phase holograms of basic optical elements to produce a hologram that defines an individual trap. For example, a diffraction grating produces a lateral shift and a lens causes an axial shift.
“One way of imagining it is that the program says ‘if I want to make a hologram that puts a trap where the first bead is, then it’s this diffraction grating’. It goes through beads one to four, adds the holograms together and displays the result on the SLM,” explained Padgett. “One laser beam hits the SLM and four beams leave it, giving four traps.”
This is where the processing power of modern computers comes into play. “We can calculate 8–10 frames per second so we can see where the fingers are, calculate the four gratings, add them together and display it 10 times per second – enough to follow what is happening in real time,” continued Padgett. “Our holograms have 512 x 512 pixels.”
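The “gratings and lenses” recipe Padgett describes can be sketched as follows. This is a minimal illustration, not the group’s code: each trap gets a blazed-grating phase (lateral shift) plus a Fresnel-lens phase (axial shift), the single-trap fields are superposed, and only the phase of the sum is kept, since that is all a phase-only SLM can display. The coordinate scaling is an assumed, arbitrary steering unit.

```python
import numpy as np

def single_trap_phase(x, y, z, n=512):
    """Grating-plus-lens phase for one trap at (x, y, z).

    Illustrative scaling only: x and y set the grating period
    (lateral trap position), z sets the lens power (axial position).
    """
    j = np.arange(n) - n // 2
    X, Y = np.meshgrid(j, j)
    grating = 2 * np.pi * (x * X + y * Y) / n    # blazed grating -> lateral shift
    lens = np.pi * z * (X**2 + Y**2) / n**2      # Fresnel lens -> axial shift
    return grating + lens

def combined_hologram(traps, n=512):
    """Superpose the complex fields of the single-trap holograms and
    keep only the resulting phase, wrapped to [-pi, pi] for the SLM."""
    field = np.zeros((n, n), dtype=complex)
    for x, y, z in traps:
        field += np.exp(1j * single_trap_phase(x, y, z, n))
    return np.angle(field)
```

A 512 x 512 version of this calculation, repeated for four fingertip positions, is what the team refreshes at 8–10 frames per second.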
The team’s system is built around an inverted microscope with a 1.3 NA objective lens. A diode-pumped Yb:YAG laser emitting 3 W at 515 nm is expanded to fill the SLM’s aperture, and the SLM plane is then imaged onto the back aperture of the objective lens to produce the traps.
On the coverslip, one 5 µm-diameter silica bead is drawn into each of the four optical traps and can be moved at will by the user. One facet of the microhand is that it can manipulate objects, such as opaque particles, that cannot be tweezed by traditional means. For example, the researchers have successfully used the microhand to control chrome particles measuring 8 µm.
It’s not just metallic particles that can be moved by the microhand. The technique is also ideal for picking up particles such as red blood cells. The fragile nature of this biological material means careful consideration has to be given to the size of the silica beads and the trapping laser.
“5 µm beads are a good compromise,” explained Gibson. “2 µm beads are easier to trap but the beam is closer to the trapped object. You isolate the trapped object better as the bead gets bigger. You would like to use 10 µm beads, but they become too difficult to trap. We have also switched to a Ti:sapphire laser emitting between 810 and 830 nm because there is less absorption by the cells.”
The team is also interested in introducing a level of feedback either to the glove or to the images shown on screen. “One of the things we have tried, but haven’t succeeded in, is squashing a red blood cell,” said Padgett. “Making the system tactile is something we are interested in.”
Having ironed out all the teething problems with the microhand, the technology will be transferred to the University of Bristol where researchers Mervyn Miles and Daniel Robert have a number of applications in mind. One idea is to use it to assemble even smaller tools. “It might simply be a set of chopsticks that you could build something smaller with,” commented Miles.
Other assembly tasks could include building a photonic crystal lattice structure, or electronic devices such as LEDs. “We could introduce a defect in the photonic structure exactly where we want it,” said Miles. “We’ll do this with the microhand first to learn how to manipulate things, and then we need to automate the process. One of the big challenges is to develop algorithms that allow you to assemble larger structures. In terms of an LED, each trap could contain a component and you could use the microhand to pick them up and assemble them in the right order.”
According to Miles, biological applications are also on the agenda. “If you trap a particle and put it inside a cell, gel, or other 3D structure, you can watch its random walk as it explores its environment,” he said. “This is called a photonic force microscope. Increasing the number of traps, we could have many particles and track their behaviour in real time. The trapped particle could also contain an enzyme to trigger events. You could have a tool to dissolve something when the beam is turned off, for example.”
Changing the interface
Having successfully demonstrated the microhand, the Glasgow team is also putting the finishing touches to an interface that uses a joystick to manipulate the traps. Described in an article to be published in the New Journal of Physics, the optically controlled gripper uses the fire button on the joystick as a means of positioning the silica spheres around the trapped particle.
Given the simplicity of both interfaces, scientists who are unfamiliar with the technology may now be willing to look at it in a different light. Handing direct control back to the user could be just the change that they are looking for.
• This article originally appeared in the February 2007 issue of Optics & Laser Europe magazine.