08 May 2014
Machine vision and dextrous hand technology combined in system designed for remote operation.
by Ford Burkhart in Baltimore
A robot named Robo-Sally won hearts on the DSS 2014 exhibition floor all week. She shook hands (with a firm, pleasant grip), handed over a pen, looked you in the eye, and did everything but dance.
Her operators said she can turn a doorknob and even enter code numbers to unlock a door, allowing her, say, to enter danger zones during a Fukushima-style nuclear disaster.
She can’t quite insert and turn a key yet, but they are working on that. Sally hasn’t yet been called on in an emergency, but she’s ready to go at the lab, about 20 miles north of Washington, DC, said Colin Taylor, one of her software engineers.
The engineers from the Applied Physics Laboratory (APL) team at Johns Hopkins University had a serious purpose. Sally demonstrated what her ‘arms,’ developed at the APL, can do for an amputee: modular prosthetic limbs with near-human dexterity and intuitive feedback controls.
One engineer stood behind Sally and moved his hands and arms, sending signals over wires to the robot. Sally precisely replicated those motions, including reaching out and shaking hands, her hand sensors applying just the right firmness to the grip. In the field, it could all be done remotely, untethered, from up to half a mile away.
The demonstration showed what a prosthetic device could do for an amputee, activated by the wearer’s own brain signals. The robot could also carry out tasks requiring mission-level solutions under dangerous conditions, such as bomb disposal or checking chemical leaks. It is part of a larger Navy-sponsored program that includes unmanned ground vehicles for disposing of explosive ordnance.
APL began its work on “human capabilities projection” in 2007 to separate humans from hazards. The team drew upon the Revolutionizing Prosthetics program, funded by the Defense Advanced Research Projects Agency, applying upper limb and dextrous hand technology. The hand has ten degrees of freedom and the arm has an additional seven, roughly equivalent to a human hand and arm.
The robot knows how to apply just the right force for sensitive actions, and with its machine vision it can pick up a tool or move a rock.
Sally is mounted on a four-wheel Synbotics platform about four feet long, and uses a Telefactor Robotics sensorized camera head.