Soft robotic hand can pick up and identify a wide array of objects

Team from Computer Science and Artificial Intelligence Lab develops silicone rubber gripper and advanced object-identification algorithms.
Photo caption: Three fingers on a new soft robotic gripper each have special sensors that can estimate the size and shape of an object accurately enough to identify it from a set of multiple items. (Photo: Jason Dorfman/CSAIL)

Photo caption: “We want to ... give [robots] ‘sight’ without them actually being able to see,” says MIT grad student Robert Katzschmann. (Photo: Jason Dorfman/CSAIL)

Robots have many strong suits, but delicacy traditionally hasn’t been one of them. Rigid limbs and digits make it difficult for them to grasp, hold, and manipulate a range of everyday objects without dropping or crushing them.

Recently, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have discovered that the solution may be to turn to a substance more commonly associated with new buildings and Silly Putty: silicone.

At a conference this month, researchers from CSAIL Director Daniela Rus’ Distributed Robotics Lab demonstrated a 3-D-printed robotic hand made out of silicone rubber that can lift and handle objects as delicate as an egg and as thin as a compact disc.

Just as impressively, its three fingers have special sensors that can estimate the size and shape of an object accurately enough to identify it from a set of multiple items.

“Robots are often limited in what they can do because of how hard it is to interact with objects of different sizes and materials,” Rus says. “Grasping is an important step in being able to do useful tasks; with this work we set out to develop both the soft hands and the supporting control and planning systems that make dynamic grasping possible."

The paper, which was co-written by Rus and graduate student Bianca Homberg, PhD candidate Robert Katzschmann, and postdoc Mehmet Dogar, will be presented at this month’s International Conference on Intelligent Robots and Systems.

The hard science of soft robots

The gripper, which can also pick up such items as a tennis ball, a Rubik's cube and a Beanie Baby, is part of a larger body of work out of Rus’ lab at CSAIL aimed at showing the value of so-called “soft robots” made of unconventional materials such as silicone, paper, and fiber.

Researchers say that soft robots have a number of advantages over “hard” robots, including the ability to handle irregularly shaped objects, squeeze into tight spaces, and readily recover from collisions.

“A robot with rigid hands will have much more trouble with tasks like picking up an object,” Homberg says. “This is because it has to have a good model of the object and spend a lot of time thinking about precisely how it will perform the grasp.”

Soft robots represent an intriguing new alternative. However, one downside to their extra flexibility (or “compliance”) is that they often have difficulty accurately measuring where an object is, or even if they have successfully picked it up at all.

That’s where the CSAIL team’s “bend sensors” come in. When the gripper homes in on an object, the fingers send back location data based on their curvature. Using this data, the robot can pick up an unknown object and compare it to the existing clusters of data points that represent past objects. With just three data points from a single grasp, the robot’s algorithms can distinguish between objects as similar in size as a cup and a lemonade bottle.
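
To make the idea concrete, here is a minimal sketch (not the CSAIL team’s actual code) of how three bend-sensor readings from one grasp could be matched against clusters of readings from previously grasped objects. The example data, object names, and the nearest-centroid rule are illustrative assumptions.

```python
import numpy as np

# Hypothetical bend-sensor readings (one value per finger) from past grasps.
past_grasps = {
    "cup":             np.array([[0.62, 0.60, 0.61], [0.64, 0.59, 0.63]]),
    "lemonade bottle": np.array([[0.55, 0.54, 0.56], [0.53, 0.55, 0.54]]),
    "egg":             np.array([[0.71, 0.70, 0.72], [0.70, 0.69, 0.71]]),
}

# One centroid (mean reading) per known object.
centroids = {name: readings.mean(axis=0) for name, readings in past_grasps.items()}

def identify(grasp_reading: np.ndarray) -> str:
    """Return the known object whose cluster centroid is closest to this grasp."""
    return min(centroids, key=lambda name: np.linalg.norm(grasp_reading - centroids[name]))

print(identify(np.array([0.63, 0.61, 0.60])))  # -> "cup"
```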

“As a human, if you’re blindfolded and you pick something up, you can feel it and still understand what it is,” says Katzschmann. “We want to develop a similar skill in robots — essentially, giving them ‘sight’ without them actually being able to see.”

The team is hopeful that, with further sensor advances, the system could eventually identify dozens of distinct objects, and be programmed to interact with them differently depending on their size, shape, and function.        

How it works

Researchers control the gripper via a series of pistons that push pressurized air through the silicone fingers. The pistons cause little bubbles to expand in the fingers, spurring them to stretch and bend.
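
As a rough illustration of closing the loop between the pistons and the bend sensors, the sketch below uses a plain proportional control law; the gains, stroke limits, and the assumption of a linear relationship are simplifications, not the team’s actual pneumatic model.

```python
def update_piston(target_bend: float, measured_bend: float, stroke_mm: float,
                  gain: float = 5.0, max_stroke_mm: float = 50.0) -> float:
    """One control step: advance or retract the piston, pushing air into or out
    of the finger, until the bend-sensor reading matches the commanded curvature."""
    stroke_mm += gain * (target_bend - measured_bend)
    return min(max(stroke_mm, 0.0), max_stroke_mm)

# Example: the finger is straighter than commanded, so the piston advances.
print(update_piston(target_bend=0.6, measured_bend=0.4, stroke_mm=20.0))  # -> 21.0
```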

The hand can grip using two types of grasps: “enveloping grasps,” where the object is entirely contained within the gripper, and “pinch grasps,” where the object is held by the tips of the fingers.
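
A simple way to picture the choice between the two grasps is a size-based rule of thumb, sketched below; the threshold and gripper dimensions are hypothetical, and the actual planner described in the paper is more involved.

```python
from enum import Enum

class GraspType(Enum):
    ENVELOPING = "enveloping"  # object fully contained within the curled fingers
    PINCH = "pinch"            # object held only by the fingertips

def choose_grasp(object_width_mm: float, gripper_opening_mm: float = 80.0) -> GraspType:
    """Thin or flat items (a CD, a sheet of paper) get a pinch grasp; anything
    bulky enough to fill the gripper's opening gets an enveloping grasp."""
    if object_width_mm < 0.25 * gripper_opening_mm:
        return GraspType.PINCH
    return GraspType.ENVELOPING

print(choose_grasp(2.0).value)    # a CD -> "pinch"
print(choose_grasp(65.0).value)   # a tennis ball -> "enveloping"
```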

Outfitted for the popular Baxter manufacturing robot, the gripper significantly outperformed Baxter’s default gripper, which was unable to pick up a CD or piece of paper and was prone to completely crushing items like a soda can.

Like Rus’ previous robotic arm, the fingers are made of silicone rubber, chosen because it is stiff enough to hold its shape yet flexible enough to expand under pressure from the pistons. Meanwhile, the gripper’s interface and exterior finger-molds are 3-D-printed, which means the system will work on virtually any robotic platform.

In the future, Rus says, the team plans to improve the existing sensors and add new ones so that the gripper can identify a wider variety of objects.

“If we want robots in human-centered environments, they need to be more adaptive and able to interact with objects whose shape and placement are not precisely known,” Rus says. “Our dream is to develop a robot that, like a human, can approach an unknown object, big or small, determine its approximate shape and size, and figure out how to interface with it in one seamless motion.”

This work was done in the Distributed Robotics Laboratory at MIT with support from The Boeing Company and the National Science Foundation.

Press Mentions

BBC News

In this video, the BBC’s LJ Rich reports on the 3-D printed, soft robotic hand developed by researchers at the MIT Computer Science and Artificial Intelligence Lab. Rich explains that the robotic hand can “handle objects as delicate as an egg and as thin as a compact disk.”

CNBC

CNBC reporter Robert Ferris writes about how MIT researchers have developed a soft robotic hand that can identify and safely grasp delicate objects. Ferris explains that the researchers designed a “soft silicone ‘hand’ with embedded sensors that they can train to recognize different things.” 

Popular Science

Writing for Popular Science, Mary Beth Griggs reports on the soft robotic gripper developed by researchers at MIT CSAIL. “The silicone fingers are equipped with sensors that analyze the object they are touching and compare it to other items in its database,” Griggs writes. 

BetaBoston

MIT CSAIL researchers have developed a silicone gripper that allows robots to grasp a wide variety of items, reports Nidhi Subbaraman for BetaBoston. Subbaraman explains that the hand expands “to accommodate a shape, and grasps radially – surrounding an object instead of picking it up with pincers.”

The Washington Post

Washington Post reporter Rachel Feltman writes that MIT researchers have designed a new robotic hand with soft, 3-D printed fingers that can identify and lift a variety of objects. Prof. Daniela Rus explains that her group’s robotic hand operates in a way that is “much more analogous to what we do as humans."
