Robotic fingers are beginning to feel with human-like sensitivity

Believe it or not, robotic fingers may soon be able to feel with a level of sensitivity similar to that of humans. Researchers at the University of California San Diego’s Jacobs School of Engineering are developing a touch perception system for flexible actuators that could give robots human-like fingers with human-like sensitivity.

Conducted within the Bioinspired Robotics and Design Lab, in association with the larger Contextual Robotics Institute, the project is headed by Professor Michael Tolley. The team’s work has been published in a recent issue of the journal Science Robotics.

“The idea was to develop an embedded model of touch perception using machine learning. The team used a motion detection system to provide feedback to train a recurrent neural network and then disconnected it when the training was completed. The process could be compared to teaching students to play the piano with the lights on, then turning out the lights, and having them demonstrate the ability to rely solely on their touch,” Benjamin Shih, one of the team members and a mechanical engineering doctoral student at UC San Diego, told Design News.
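
As a rough illustration of that training setup (a sketch, not the team’s actual code), the example below trains a small recurrent network in PyTorch to estimate an actuator’s state from embedded sensor readings, using externally measured positions as training labels, then runs it with that external feedback “disconnected.” The sensor count, state dimension, network size, and use of synthetic data are assumptions made purely for the example.

    # Sketch of the "lights on, then lights off" training idea: an external
    # measurement system supplies ground-truth labels during training only.
    import torch
    import torch.nn as nn

    N_SENSORS = 4        # assumed number of embedded soft sensors
    STATE_DIM = 2        # assumed label: planar tip position from external tracking
    SEQ_LEN, BATCH = 50, 32

    class TouchProprioception(nn.Module):
        def __init__(self, hidden=64):
            super().__init__()
            self.rnn = nn.GRU(N_SENSORS, hidden, batch_first=True)
            self.head = nn.Linear(hidden, STATE_DIM)

        def forward(self, sensor_seq):
            out, _ = self.rnn(sensor_seq)   # (batch, seq, hidden)
            return self.head(out)           # predicted state at each time step

    model = TouchProprioception()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Synthetic stand-ins for logged data: sensor readings plus external labels.
    sensors = torch.randn(BATCH, SEQ_LEN, N_SENSORS)
    tracked_labels = torch.randn(BATCH, SEQ_LEN, STATE_DIM)

    # Training: external motion measurements provide the ground truth ("lights on").
    for _ in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(sensors), tracked_labels)
        loss.backward()
        optimizer.step()

    # Deployment: external tracking disconnected ("lights off"); the model
    # estimates the actuator's state from its embedded touch sensors alone.
    with torch.no_grad():
        estimated_state = model(torch.randn(1, SEQ_LEN, N_SENSORS))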

With this approach, the researchers don’t explicitly compute the actuator’s kinematics in real time, but the system also doesn’t learn from scratch every time. To draw a parallel with biology, it’s like how people learn to feel and recognize objects, or some combination of the two, Shih added.

Experiments were conducted with the actuator deployed in free space as well as in contact with an object along its length and at its tip, allowing one and two degrees of freedom. Shih said the project gave him a new appreciation for human skin’s “incredible resolution.”