Robots that can recognize objects through tactile sensations instead of visual cues have a wide range of potential applications, including surgical procedures and rescue operations. Researchers at the Intelligent System Control Laboratory at Nara Institute of Science and Technology (NAIST) in Japan have developed algorithms that enable a dexterous robotic hand equipped with sensors to identify objects from pressure, vibration, and temperature data. The object recognition algorithms employ an active learning approach: the hand performs an action such as rubbing, squeezing, or pulling the object, obtaining tactile information that it uses to plan its next action.
NAIST researchers used MATLAB® and Robotics System Toolbox™ to develop two algorithms for tactile object recognition. The first uses machine learning techniques to develop a probabilistic model from observed sensor data. The second uses the learned model to recognize different objects.
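The two-phase approach described above can be illustrated with a small sketch. Note that this is not the NAIST team's actual MATLAB implementation: the action names, object classes, and likelihood values below are invented toy numbers, and the action-selection rule (minimizing expected posterior entropy) is one common way to realize active tactile recognition with a learned probabilistic model.

```python
import math

# Illustrative sketch of active tactile object recognition (assumed
# design, not NAIST's code): the hand keeps a belief over object
# classes, picks the exploratory action expected to reduce uncertainty
# the most, then updates the belief from the tactile observation.

ACTIONS = ["rub", "squeeze", "pull"]
OBJECTS = ["sponge", "bottle", "rope"]

# P(observation | object, action) -- a stand-in for the model learned
# from sensor data. Observations are discretized tactile outcomes
# {0: "soft" response, 1: "hard" response}; all numbers are invented.
LIKELIHOOD = {
    "rub":     [[0.8, 0.2], [0.3, 0.7], [0.6, 0.4]],
    "squeeze": [[0.9, 0.1], [0.1, 0.9], [0.5, 0.5]],
    "pull":    [[0.5, 0.5], [0.4, 0.6], [0.1, 0.9]],
}

def entropy(p):
    """Shannon entropy of a probability vector (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def update_belief(belief, action, obs):
    """Bayes' rule: posterior is proportional to prior x likelihood."""
    post = [b * LIKELIHOOD[action][i][obs] for i, b in enumerate(belief)]
    z = sum(post)
    return [p / z for p in post]

def expected_entropy(belief, action):
    """Average posterior entropy over the possible observations."""
    h = 0.0
    for obs in (0, 1):
        p_obs = sum(b * LIKELIHOOD[action][i][obs]
                    for i, b in enumerate(belief))
        if p_obs > 0:
            h += p_obs * entropy(update_belief(belief, action, obs))
    return h

def select_action(belief):
    """Pick the action whose expected posterior entropy is lowest."""
    return min(ACTIONS, key=lambda a: expected_entropy(belief, a))

belief = [1 / 3] * 3                           # uniform prior over objects
action = select_action(belief)                 # most informative action
belief = update_belief(belief, action, obs=1)  # hand felt a "hard" response
```

Each pass through `select_action` and `update_belief` corresponds to one exploratory touch: the belief sharpens with every action, and recognition terminates once one object class dominates.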
“MATLAB let us focus on conducting our research rather than writing code,” says Takamitsu Matsubara, assistant professor at NAIST. “Usually when we are working with a new robot there is a lengthy code writing phase, but MATLAB and Robotics System Toolbox enabled us to minimize this phase and concentrate on improving active tactile object recognition.”