The most complete archive of touch-based data available to future robots is ready. Developed at the Massachusetts Institute of Technology (MIT), it stores in an artificial intelligence system the sensations recorded through gloves equipped with special sensors. The details of the archive, published in the journal Nature, will make it possible to build robots with a sense of touch similar to the human one and to design wearable computers and prostheses with greater sensitivity.
The starting point of the project, coordinated by Subramanian Sundaram, was a hi-tech glove studded with about 550 sensors, tested on 26 objects of different shapes and weights, from scissors to tennis balls, from cans to pens. In this way, a unique sensory archive was obtained, which will allow artificial intelligence systems to recognize and classify objects by touch alone, without having to observe images of them.
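The recognition step described above can be pictured as matching a new pressure reading against per-object tactile "signatures". The sketch below is a hypothetical toy illustration of that idea (a nearest-signature classifier on synthetic data), not the actual model used in the study; only the approximate sensor count comes from the article.

```python
import math
import random

random.seed(0)

NUM_SENSORS = 550  # roughly the sensor count reported for the glove


def signature(readings):
    """Average several pressure frames for one object into a tactile signature."""
    n = len(readings)
    return [sum(frame[i] for frame in readings) / n for i in range(NUM_SENSORS)]


def classify(frame, signatures):
    """Return the label whose signature is nearest (Euclidean) to the frame."""
    def dist(sig):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(frame, sig)))
    return min(signatures, key=lambda label: dist(signatures[label]))


# Toy data: two hypothetical objects with distinct pressure patterns.
def fake_frame(base):
    return [base[i] + random.uniform(-0.05, 0.05) for i in range(NUM_SENSORS)]


ball_base = [0.8 if i % 2 == 0 else 0.1 for i in range(NUM_SENSORS)]
pen_base = [0.1 if i % 2 == 0 else 0.8 for i in range(NUM_SENSORS)]

signatures = {
    "tennis ball": signature([fake_frame(ball_base) for _ in range(5)]),
    "pen": signature([fake_frame(pen_base) for _ in range(5)]),
}

print(classify(fake_frame(ball_base), signatures))  # expect "tennis ball"
```

The actual system used a deep neural network trained on the full archive of recordings, which is what allowed it to generalize across grasps and object orientations.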
In the first tests performed at MIT, the touch archive allowed an artificial intelligence system to identify objects with an accuracy of 76%, predicting their correct weight with a margin of error of about 60 grams. “This information,” the authors of the study concluded, “will help robots identify and manipulate objects, and aid the design of future robotic prostheses.”