YOU CAN’T HAVE a conversation with your microwave or refrigerator—unless, of course, you’re on acid. And that’s all right, because these machines serve their purpose just fine as-is. They can afford to be shy.


But the robots that will one day move into your home can’t. To be truly useful, they’ll need to speak human language and understand human gestures. Which makes a repurposed Baxter industrial robot, renamed Iorek, all the more remarkable: It not only recognizes an object a human being is pointing at and talking about, but asks questions to clarify what they mean. Iorek is limited to trafficking in specific objects, sure, but the robot is a big deal for the budding field of human-robot interaction.


The robot—from researchers at Brown University—works like so. A human wearing a headset stands in front of the machine, which sits on a table with six objects in front of it. The human points at, say, a bowl, and asks, “Can I have that bowl?” A Microsoft Kinect atop the robot’s head tracks the movement of the hand to determine which object the subject means and combines that data with the vocal command.


Sometimes, though, two bowls are sitting right next to each other, and Iorek can’t be sure which one the human is after. So it hovers an arm over the bowl it thinks the human wants and asks: “This one?” If the subject says no, the robot determines that its master seeks the other.


That may seem like a simple interaction, something a child could do. But this is huge for a robot because the system solves a nasty problem: uncertainty. “The real innovation in what we’re doing is what we call social feedback,” says Brown University’s Stefanie Tellex, co-creator of Iorek. “So what we’re trying to do is not just listen and watch and then act, but assess the robot’s own certainty about what the person wants it to do.”
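The loop Tellex describes can be sketched in a few lines. To be clear, this is a toy illustration, not the Brown team's actual system: the scoring model, the 1-D object positions, the `confirm` callback, and the confidence threshold are all invented for the sake of the example.

```python
# Hypothetical sketch of a "social feedback" loop: fuse a gesture cue and a
# speech cue into per-object confidence scores, act only when confident, and
# otherwise ask "This one?" about the best guess. All names and numbers here
# are illustrative assumptions, not details of the Brown system.

def score_objects(objects, pointing_target, spoken_word):
    """Combine gesture and speech evidence into normalized per-object scores.

    objects: dict mapping object name -> position (simplified to 1-D here).
    """
    scores = {}
    for name, position in objects.items():
        gesture = 1.0 / (1.0 + abs(position - pointing_target))  # nearer = higher
        speech = 1.0 if spoken_word in name else 0.1             # word match bonus
        scores[name] = gesture * speech
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}

def resolve_reference(objects, pointing_target, spoken_word,
                      confirm, threshold=0.7):
    """Return the referent, asking a clarifying question when uncertain."""
    scores = score_objects(objects, pointing_target, spoken_word)
    best = max(scores, key=scores.get)
    if scores[best] >= threshold:
        return best                      # certain enough to act directly
    if confirm(best):                    # hover over the best guess, ask "This one?"
        return best
    # A "no" rules out the best guess; fall back to the runner-up.
    remaining = {k: v for k, v in scores.items() if k != best}
    return max(remaining, key=remaining.get)
```

With two bowls sitting close together and an ambiguous point between them, the top score falls below the threshold, so the robot asks before acting; a "no" sends it to the other bowl. A lone cup pointed at directly clears the threshold and needs no question.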


After all, communication is all about certainty. And you want robots in your home to be very, very certain. Consider a robot that one day helps the elderly—cleaning up clutter, lifting them into and out of bed, etc. And say your grandmother asks it to put her down on the sofa, but the robot hears “drop me down the stairs.” Perhaps a rare situation, but you get the point.


The big question now is: How do humans want to communicate with the machines? One idea is to use EEGs to actually have them read our minds. (Also demonstrated with a Baxter robot, by the way. Baxter is quite popular in robotics labs.) That’s far off, of course. But Iorek shows how good machines are getting at recognizing vocal commands and gestures. That’s how the vast majority of people communicate, so sights and sounds will be key to human-robot interaction. More challenging, though, is adapting robots to get along with the deaf and blind.


Whatever that solution ends up being, chances are roboticists will have Iorek to thank for getting them there. So take a bow, Iorek.


No no no. A bow, not a bowl.


Source: Wired