
Team builds 'sociable' robot

Dr. Cynthia Breazeal plays with Kismet, the robot that mimics and responds to human emotions. (Photo / Donna Coveney)

"Hello, Kismet," said Cynthia Breazeal in a singsong voice. Leaning closer to the object of her attention, she asked, "Are you going to talk to me?"

The exchange is probably familiar to any parent, but Kismet is not a child. It's a robotic head that can interact with humans in a human-like way via myriad facial expressions, head positions and tones of voice. "The goal is to build a socially intelligent machine that learns things as we learn them, through social interactions," said Dr. Breazeal, a postdoctoral associate at the Artificial Intelligence Laboratory and leader of the Kismet team.

Building a sociable machine, she believes, is also key to building a smarter machine. Most current robots are programmed to be very good at a specific task -- say, navigating a room -- but they can't do much more. "Can we build a much more open-ended learning system?" asked Dr. Breazeal.

"I'm building a robot that can leverage off the social structure that people already use to help each other learn. If we can build a robot that can tap into that system, then we might not have to program in every piece of its behavior," she said.

INSPIRED BY KIDS

The work, which began in 1997, is heavily inspired by developmental psychology. "The robot starts off in a rather helpless and primitive condition, and requires the help of a sophisticated and benevolent caretaker to learn and develop," Dr. Breazeal said. Even Kismet's physical features -- which include big blue eyes, lips, ears and eyebrows -- are patterned after features known to elicit a care-giving response from human adults.

The eyes, in particular, house cameras that allow the robot to glean information from its environment, such as whether something is being jiggled next to its face. Kismet can then respond to such stimuli -- by moving its head back if an object comes too close, for example -- and communicate a number of emotion-like processes (such as happiness, fear and disgust).

A human wears a microphone to talk to the robot, which also has microphones in its ears. The latter will eventually be used for sound localization.

The robot's features, behavior and "emotions" work together so it can "interact with humans in an intuitive, natural way," Dr. Breazeal said. For example, if an object is too close for the robot's cameras to see well, Kismet backs away. "This behavior, by itself, aids the cameras somewhat by increasing the distance between Kismet and the human," she said. "But the behavior can have a secondary and greater effect through social amplification. A withdrawal response is a strong social cue for the human to back away."
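In outline, the withdrawal response Dr. Breazeal describes is a simple threshold rule sitting on top of the robot's perception loop. Here is a minimal sketch in Python; every name in it (Head, lean_back, the distance values) is a hypothetical illustration, not part of Kismet's actual software:

```python
# Sketch of a threshold-based withdrawal behavior, as described in the
# article. All names and values here are illustrative assumptions,
# not Kismet's actual code.

COMFORT_DISTANCE_CM = 40.0  # assumed closest distance the cameras see well


class Head:
    """Stand-in for the robot's head actuators."""

    def lean_back(self, amount_cm: float) -> None:
        print(f"Leaning back {amount_cm:.1f} cm")


def withdrawal_behavior(head: Head, object_distance_cm: float) -> None:
    """Back away when a stimulus is too close for the cameras.

    The motion itself only buys a little viewing distance; the larger
    effect is social: a person reads the withdrawal as a cue to back off.
    """
    if object_distance_cm < COMFORT_DISTANCE_CM:
        head.lean_back(COMFORT_DISTANCE_CM - object_distance_cm)


withdrawal_behavior(Head(), object_distance_cm=25.0)
```

The point of the rule is the "social amplification" the article mentions: a small physical adjustment doubles as a legible signal that recruits the human to do most of the work.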

Kismet, she noted, is the exact opposite of HAL, the menacing computer in the movie 2001: A Space Odyssey. "HAL is simply a glowing red light with no feedback as to what the machine is thinking. That's why it's so eerie. Kismet, on the other hand, both gives and takes feedback to communicate," she said.

"I think people are often afraid that technology is making us less human. Kismet is a counterpoint to that; it really celebrates our humanity. This is a robot that thrives on social interactions."

MAKING IT LIFELIKE

To make Kismet as lifelike as possible, Dr. Breazeal and colleagues have not only incorporated findings from developmental psychology, but have also invited the comments of cartoon animators. "How do you make something that's not alive appear lifelike? That's what animators do so well," she said.

The proverbial wizard behind the curtain (or in this case, the wall) is a bank of some 15 computers. They run the software that allows the robot to perceive its environment, analyze what it finds and react.

In experiments over the last year or so, the researchers have been exploring how the robot interacts with people who aren't familiar with it. Are Kismet's actions and emotions understandable? Do people use those actions as feedback to adjust their own responses? Conversely, is the robot correctly "reading" its visitors?

Results to date are encouraging. For example, many of the people who've met Kismet have told Dr. Breazeal that the robot has a real presence. "It seems to really impact them on an emotional level, to the point where they tell me that when I turn Kismet off, it's really jarring. That's powerful. It means that I've really captured something in this robot that's special. That kind of reaction is also critical to the robot's design and purpose," she said.

Once the robot's social skills are optimized, "we can move on to other forms of learning," Dr. Breazeal said. In early work to that end, the researchers are teaching Kismet how to use its voice to negotiate the social world. "We want it to be able to get people to do things for it, much like a very young child."

The algorithms that are crucial to this will allow Kismet to "learn" by trial and error. When it first attempts a task, it won't be very good. The robot will "remember" its mistakes, however, and make incremental improvements as it goes along. It can then apply what it's learned to completing the same task under different conditions.
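Trial-and-error learning of this kind broadly resembles reinforcement learning: try an action, score the outcome, and bias future choices toward what worked. The following Python sketch shows the general idea with an epsilon-greedy loop over made-up vocal actions; it is an illustration of the technique, not the team's actual algorithm, and all the names and success rates are assumptions:

```python
import random

# Illustrative trial-and-error learner: the agent tries vocal actions,
# remembers how well each worked, and gradually prefers the best one.
# Not Kismet's actual learning algorithm.

actions = ["soft_coo", "sharp_chirp", "rising_babble"]  # hypothetical vocalizations
value = {a: 0.0 for a in actions}   # running estimate of each action's success
count = {a: 0 for a in actions}     # how often each action has been tried


def try_action(action: str) -> float:
    """Stand-in for the world: 1.0 if the caretaker responds helpfully."""
    assumed_success_rate = {"soft_coo": 0.8, "sharp_chirp": 0.2, "rising_babble": 0.5}
    return 1.0 if random.random() < assumed_success_rate[action] else 0.0


for trial in range(200):
    # Mostly exploit the best-known action; occasionally explore others.
    if random.random() < 0.1:
        chosen = random.choice(actions)
    else:
        chosen = max(actions, key=value.get)

    reward = try_action(chosen)
    count[chosen] += 1
    # Incremental average: remember mistakes and successes alike.
    value[chosen] += (reward - value[chosen]) / count[chosen]

print("Preferred action:", max(actions, key=value.get))
```

Early trials are mostly misses, but the running estimates improve with experience, which matches the article's description of incremental improvement; transferring those estimates to new conditions is the harder step the team alludes to.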

CONTINUALLY EVOLVING

Kismet is continually evolving with new sensors and software. "It's not like we can develop one more learning algorithm and say we're done," said Dr. Breazeal. "We're trying to develop the first robotic creature that takes an active interest in its world and learns and develops over time, becoming more and more capable. And that's complicated!" She foresees "several more doctoral dissertations" related to the work.

Dr. Breazeal's colleagues on the work, all graduate students in the Department of Electrical Engineering and Computer Science (EECS), are Paul Fitzpatrick, Paulina Varchavskaia and Lijin Aryananda. The team reports to Professor Rodney Brooks, director of the AI Lab. Professor Brooks invented Cog, another robot in development at the lab that is closely related to Kismet. Cog, with a head, torso and arms, allows its researchers to explore additional aspects of movement and their relation to learning. EECS graduate students Aaron Edsinger and Brian Scassellati have developed vision software used by both Cog and Kismet.

The Kismet team has published several articles on the work, most recently in the July/August 2000 issue of IEEE Intelligent Systems. For more information, including video of the robot and technical papers, go to http://www.ai.mit.edu/projects/humanoid-robotics-group/kismet/kismet.html.

Kismet is sponsored by the Office of Naval Research, the Department of Defense's Advanced Research Projects Agency, and Nippon Telegraph and Telephone Corp.

A version of this article appeared in MIT Tech Talk on February 14, 2001.
