11 March 2009—A few years from now, young Jane, returning home from a school trip to the Smithsonian, might excitedly tell her mom, “Guess what? I touched the Hope Diamond today,” while brother John shouts, “That’s nothing! I touched a Neanderthal skull.” These are the kind of scenarios Hiroaki Yano, associate professor in the department of intelligent interaction technologies at the University of Tsukuba, Japan, is working to realize.
Yano, 39, has spent the past 15 years (including obtaining his Ph.D. in engineering) working in the field of haptics, the study of sensing and manipulation of objects through touch. For the past year he has led a research group developing a handheld haptic device for sensing unreachable or untouchable objects and is due to report details of the device next week at the IEEE-sponsored 2009 World Haptics Conference in Salt Lake City.
The prototype system employs a laser range finder to determine the distance to a given object. The data are fed to a computer, where an algorithm calculates the motor torque necessary to move a lever backward and forward in real time on a haptic interface device; the degree of movement corresponds to the changing distance the laser beam measures as it tracks across the surface of the object. By pressing a thumb against the moving lever, the user can feel the reaction force generated by the object and so get a sense of its shape, even down to changes in surface depth as fine as 0.1 millimeter. The device works for a distance up to 1 meter and through glass.
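The control loop described above can be sketched in a few lines of Python. This is not Yano's actual implementation, only a minimal illustration of the idea: each distance sample from the range finder is mapped to a target lever position, and a simple proportional controller computes a motor torque to track it. Every constant, function name, and interface here is a hypothetical assumption for illustration.

```python
# Hypothetical sketch of the distance-to-force loop (not the actual device code).

RANGE_MAX_M = 1.0        # the device works at distances up to ~1 m (per the article)
LEVER_TRAVEL_M = 0.02    # assumed total lever travel of 2 cm
KP = 40.0                # assumed proportional gain (torque units per meter of error)

def distance_to_lever_target(distance_m: float) -> float:
    """Map a measured distance to a lever position.

    Nearer surfaces push the lever farther out, so the thumb feels a raised
    feature; a simple linear mapping is assumed here.
    """
    d = min(max(distance_m, 0.0), RANGE_MAX_M)
    return (1.0 - d / RANGE_MAX_M) * LEVER_TRAVEL_M

def control_torque(target_m: float, actual_m: float) -> float:
    """Proportional torque command driving the lever toward its target."""
    return KP * (target_m - actual_m)

# With a linear mapping, even a 0.1 mm change in surface depth shifts the
# lever target, which is what would let such fine features be felt.
flat = distance_to_lever_target(0.5000)    # a point 500.0 mm away
bump = distance_to_lever_target(0.4999)    # a point 0.1 mm closer
```

In a real system the gain would be tuned (and likely supplemented with damping) so the lever tracks the scanned surface smoothly as the beam sweeps across it.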
To evaluate how well the device works, Yano had six participants use it to see if they could identify which of four unseen objects was placed in front of them: a forward-facing cube, a sphere, a cylinder, or a cube with only one edge facing forward. Each participant was tested four times with each object, for a total of 16 tries. The experiment was carried out with a glass barrier placed in front of the objects.
“The result was amazing,” says Yano. “All participants got the sphere and the cube plane correct every time. Only one participant failed one time to identify the cylinder, and there were two failures in identifying the cube placed edge forward. That’s a 96 percent success rate.”
While Yano is gratified by the success of the tests, there is a lot of work still to be done. He admits the technology is “not suitable for measuring the hardness and weight of an object,” though he speculates that certain remote-scanning technologies, such as the hyperspectral imaging used by the mining industry to identify minerals, might be adapted to overcome such deficiencies. Yano wants to refine the haptic interface’s algorithms and mechanical precision to provide more accurate feedback force, and he’d also like to add a stabilization mechanism to eliminate haptic noise caused by movement of the hands. Furthermore, he points out, the device cannot be used to sense the back side of an object.
In addition, the device is a little awkward. It’s roughly as large as a brick and weighs 900 grams, so two hands are required to hold it, and it must be connected to the computer by cable.
“Ideally, we’d like to get it down to the size of a small flashlight,” says Yano. He’d like to make it wireless or use an embedded computer.
The prototype device has just one degree of freedom, so Yano is considering adding a stereo camera sensor, which would enable haptic feedback for several fingers at once. That addition would make the device more attractive to industry and other fields, he says. However, it would also add to the device’s complexity and cost.
“Now we are not using any special technology,” says Yano. “Should a company be interested in commercializing the device, it would be able to produce a practical prototype in a few months and, I would think, a commercial product in three to five years.”
If that can be done for around $500, Yano thinks that museums and art galleries could replace the notices admonishing patrons not to touch with signs that say “Please Touch the Exhibits.”
About the Author
John Boyd writes about science and technology from Japan. In July 2008 he profiled Atsuo Takanishi, an engineer whose flute-playing robot is just one part of a robotic orchestra he hopes to build.