Robots just got a whole lot smarter thanks to MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL).

Ted Adelson leads the group that first wowed the robotics world eight years ago when his team presented a new sensor technology called GelSight. The sensor provides an incredibly detailed 3D map of any object it comes into physical contact with.

Now, by mounting GelSight sensors on the grippers of robotic arms, two MIT teams have given robots greater sensitivity and dexterity, and with them a far richer understanding of the objects they handle. The researchers presented their work in two papers at the International Conference on Robotics and Automation last week.

In one paper, Adelson’s group uses the data from the GelSight sensor to enable a robot to judge the hardness of the surfaces it touches, a crucial ability if household robots are to handle more and more everyday objects. The other paper shows how a robot with GelSight sensors installed on its arms was able to grasp and manipulate small objects.

A low-tech solution to a high-tech problem

In the complex world of robotics, the GelSight sensor can seem like a low-tech solution to a high-tech problem. GelSight consists of a small block of clear rubber, or ‘gel’ as its creators dub it, one face of which is coated with reflective metallic paint. Behind the opposite face sit three colored lights and a camera. When the painted face is pressed against an object, the gel molds to the object’s shape, and the metallic coating gives the surface a uniform reflectance that makes its geometry much easier for computer-vision algorithms to interpret. From the indentation and the way light reflects off it, the computer reconstructs the three-dimensional shape of whatever the sensor is touching.
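The three colored lights and camera hint at a classic computer-vision trick known as photometric stereo: if you know the direction of each light, the brightness a patch of surface shows under each one tells you which way that patch is facing. The sketch below illustrates the idea in miniature; the light directions and the Lambertian shading model are assumptions for this toy example, not details of MIT's actual pipeline.

```python
import numpy as np

# Toy photometric-stereo sketch. Three lights with assumed, known
# directions illuminate a uniformly painted surface; the three per-pixel
# brightness readings let us solve for the surface normal.
L = np.array([
    [0.0,  0.7,   0.7],   # assumed direction of light 1
    [0.6, -0.35,  0.7],   # assumed direction of light 2
    [-0.6, -0.35, 0.7],   # assumed direction of light 3
])
L /= np.linalg.norm(L, axis=1, keepdims=True)  # make rows unit vectors

def estimate_normal(intensities):
    """Recover a surface normal from three brightness readings.

    Under a Lambertian model, reading k is I_k = albedo * dot(l_k, n).
    Stacking the three readings gives I = L @ (albedo * n), so we solve
    the 3x3 system and renormalise to drop the albedo factor.
    """
    g = np.linalg.solve(L, np.asarray(intensities, dtype=float))
    return g / np.linalg.norm(g)

# Simulate readings for a flat patch facing straight up (n = +z) and
# check that the solver recovers that normal.
flat_readings = L @ np.array([0.0, 0.0, 1.0])
print(estimate_normal(flat_readings))  # ≈ [0, 0, 1]
```

Repeating this per pixel yields a normal map, which can then be integrated into the detailed 3D height map the article describes.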

Good grip

The robots used in the tests had GelSight sensors installed on their flat gripper ‘hands,’ allowing them to pick up objects and probe surfaces. For an autonomous robot, it is essential to know what materials it is engaging with: knowing how hard or soft a material is, and how it will behave, helps the robot perform tasks safely and efficiently. The GelSight sensors also help robots distinguish the material properties and behavior of objects that otherwise look very similar.

Understanding the human touch

Before GelSight, robots typically determined a material’s properties by the primitive method of ‘poking’ it and sensing how much ‘give’ it had. The MIT researchers instead reconsidered how humans judge materials through touch. We generally infer a material’s properties from the contact area between the object and our fingers, and from how that area changes as we press. Softer objects, for instance, flatten more, increasing the contact area between finger and material.
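That contact-area cue can be made concrete with a small sketch. This is a toy model of the idea, not MIT's actual method: press with steadily increasing force, record how large the contact patch is at each step, and score hardness by how slowly the patch grows.

```python
# Toy hardness estimate from a sequence of contact-patch areas
# (e.g. pixel counts from the sensor image) recorded while pressing
# with steadily increasing force. Hypothetical helper for illustration.
def hardness_score(contact_areas):
    """Higher score = harder. A soft object's contact patch balloons
    as force increases; a rigid object's barely changes. We score
    hardness as the inverse of the mean area growth per press step."""
    growth = [b - a for a, b in zip(contact_areas, contact_areas[1:])]
    mean_growth = sum(growth) / len(growth)
    return 1.0 / mean_growth

sponge = [10, 40, 80, 130]  # soft: contact area grows quickly
wood   = [10, 12, 14, 16]   # hard: contact area barely changes

print(hardness_score(sponge) < hardness_score(wood))  # True
```

The real systems learn far richer mappings from the sensor images, but the underlying signal is the same one our fingertips use.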

These exciting discoveries by the MIT team mean robots are closer than ever to becoming a part of our everyday lives.