Robot Hands So Sensitive They Can Grab a Potato Chip

A new type of robotic hand developed at The University of Texas at Austin demonstrates such sensitive touch that it can grasp objects as fragile as a potato chip or a raspberry without crushing them. The technology, called Fragile Object Grasping with Tactile Sensing (FORTE), combines advanced tactile sensing with soft robotics. The breakthrough could improve robot performance when a light touch is needed, such as in health care and manufacturing.

"Right now, robotics is starting to be able to do large motions around the house, but struggles with really fine and delicate movements," said Siqi Shang, lead author of a new paper published in IEEE Robotics and Automation Letters and a doctoral student in the Cockrell School of Engineering's Chandra Family Department of Electrical and Computer Engineering. "Robots can fold a shirt but may struggle to carefully pick up your glasses or unpack fruit from your groceries. "We believe sensing signals will give robots a sense of touch to handle these objects carefully."

The fingers at the heart of this technology were inspired by the fin-ray effect, a design principle derived from the natural structure of fish fins. These fingers are made using advanced 3D-printing techniques and feature internal, empty air channels that act as tactile sensors.

When the fingers deform to grasp an object, the air channels inside them shift as well, causing changes in air pressure. These pressure changes are detected by small, off-the-shelf sensors that provide real-time force feedback to the robot and let it know whether the object is slipping.
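The idea of reading slip from pressure signals can be sketched in a few lines. This is an illustrative example only, not the FORTE implementation: the `detect_slip` function, its threshold, and the sample readings are all assumptions chosen for demonstration, on the premise that a sudden pressure drop in a finger's air channel suggests the object is sliding out of the grip.

```python
# Illustrative sketch (not the published FORTE algorithm): flag a possible
# slip whenever the air pressure in a finger's channel drops sharply between
# consecutive samples. Units and the threshold value are arbitrary.

def detect_slip(pressures, threshold=-0.5):
    """Return sample indices where the pressure fell faster than `threshold`
    between consecutive readings, suggesting the object may be slipping."""
    slips = []
    for i in range(1, len(pressures)):
        delta = pressures[i] - pressures[i - 1]
        if delta < threshold:  # sharp drop: contact force on the object fell
            slips.append(i)
    return slips

# Example: steady grip, then a sudden pressure drop at sample 4
readings = [2.0, 2.1, 2.0, 2.1, 1.2, 1.1]
print(detect_slip(readings))  # -> [4]
```

A real controller would run such a check continuously and respond by tightening the grip only when a slip is flagged, which is what allows delicate objects to be held with minimal force.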

The researchers tested the grippers on 31 objects, including fragile items such as raspberries and potato chips, slippery items such as jam jars and billiard balls, and everyday items such as soup cans and apples. The system achieved a 91.9% success rate in single-trial grasping experiments, outperforming traditional grippers that rely solely on visual feedback.

The system recognized 93% of slips with 100% precision, meaning it never falsely identified a slip event. This high level of precision ensures that the robot adjusts its grip only when necessary, avoiding excessive force that could damage an object.

"Humans pick up objects with just the right amount of force; too much and you'll crush it, but too little and it'll slip out of your hand," said Lillian Chin, associate professor of electrical and computer engineering at UT. "Most current force sensors aren't fast or accurate enough to provide that Goldilocks level of detail. In particular, our sensors operate closer to the timescales of human hand sensors."

In addition to speed and accuracy, these fingers have a longer lifespan than other devices under development. Because the sensors are 3D printed, they can be easily customized to a variety of shapes.

The slip-sensing ability is what really distinguishes them. Very few robotic gripping technologies have slip detection at all, and those that do can't match FORTE's reaction time.

FORTE is a significant milestone in the quest to create robot hands with dexterity similar to that of humans, and it could affect many industries:

  • In food processing, where handling fragile items such as fruits, vegetables and baked goods is a daily challenge, more sensitive machinery could reduce waste and improve efficiency.
  • In health care, robots could handle medical instruments or fragile biological samples with precision.
  • In manufacturing, the technology could be used to handle delicate components, such as electronics or glassware.

The researchers have publicly released the hardware designs and algorithms to encourage other scientists and engineers to build upon their work. They're still fine-tuning the technology, and the next steps include making the sensors less sensitive to temperature changes and improving the ability to catch objects that are slipping.

The project team also includes Yuke Zhu, associate professor in the College of Natural Sciences' Department of Computer Science, and doctoral student Mingyo Seo. The research was supported by the Texas Robotics Industrial Affiliate Program, the National Science Foundation, the Office of Naval Research, the DARPA TIAMAT program, and South Korea's Institute of Information & Communications Technology Planning & Evaluation.

/Public Release. This material from the originating organization/author(s) may be of a point-in-time nature and has been edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).