
MIT developing first humanoid personal assistant

Una-May O'Reilly, co-principal investigator for MIT's newest robot, watches as Cardea lets itself out the door of the lab to go for a stroll down the hall.

The Roman goddess of thresholds is getting 21st-century attention as the namesake for an MIT robot that could become the world's first humanoid personal assistant. Among many other things, Cardea the robot can open doors--literally and perhaps figuratively.


"Just as personal computers have enabled tremendous information-processing productivity gains for individuals, we believe that building a physical cognitive assistant that can do physical things in the world will enable tremendous productivity gains for the individual, office worker or factory worker," said Rodney Brooks, director of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL).


Cardea and 14 other robots from universities across the country recently demonstrated their capabilities in Washington, D.C., for officials from the Defense Advanced Research Projects Agency (DARPA). The robots have one feature in common: the wheeled base of a Segway Human Transporter. Earlier this year, DARPA invited proposals from universities interested in receiving such a wheeled base. Each team was to describe a robot they'd develop using the base, and what that robot would demonstrate. The only hitch: a functioning robot was due in about four months.


Since receiving its Segway base in June, the MIT team "has been working virtually nonstop," said Una-May O'Reilly, a CSAIL research scientist and co-principal investigator for the project with Brooks.


Ultimately Cardea will be about five feet tall with a torso, three arms, a variety of sensors, and a human-like head with expressive features and vision, all atop the Segway base. Why three arms? They'll allow extra dexterity, especially since each will have a different kind of hand, perhaps a pincer, gripper, flipper or paddle, explained O'Reilly. For example, the robot could hold a container with two hands and unscrew its top with the third.


Coupled with onboard computers, these features will allow the robot "to locally navigate, cooperate with people using social cues, take instruction from people, recognize location context, recognize classes of objects, manipulate objects and learn from experience," O'Reilly said.


For their September DARPA demonstration, however, the researchers had more modest goals. They demonstrated Cardea approaching a door, opening it and going through the doorway. One arm and "hand" were attached to the front of the robot; the other two arms and hands are still in development. Similarly, Cardea's head is far from complete. Currently it consists of a single camera on a metal "neck" about a foot and a half long.


The work is a natural extension of past work in Brooks' lab. Cog and Kismet, the lab's first humanoid robots, both employed a variety of novel systems for interacting with people.


Conventional robot arms, for example, are programmed to move to a certain position with a certain force, and if they hit an object that's in the way, they'll just keep pushing. That could either burn out the robot's motors or damage the object (perhaps a person) that's in the way. Cog's arms, in contrast, are reactive to their environment. Thanks to a patented physical spring system and "virtual spring" software, if something gets in the way of an arm, it will stop moving. Depending on the task at hand, the arms can also be made rigid or floppy.
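The "virtual spring" idea described above can be sketched as a simple joint controller. This is an illustrative sketch only, not the lab's actual patented system; the function name, gains and torque limit are assumptions:

```python
# Minimal sketch of a "virtual spring" joint controller.
# Illustrative only -- names, gains and limits are assumptions,
# not Cog's or Cardea's actual software.

def virtual_spring_torque(target_angle, angle, velocity,
                          stiffness=2.0, damping=0.5, max_torque=1.5):
    """Command torque toward target_angle like a damped spring.

    Lowering `stiffness` makes the arm "floppy"; raising it makes
    it rigid. Saturating the torque means an obstacle stops the arm
    rather than being pushed with ever-increasing force.
    """
    torque = stiffness * (target_angle - angle) - damping * velocity
    return max(-max_torque, min(max_torque, torque))
```

In this sketch, an arm far from its target simply presses with the capped torque, so anything blocking it halts the motion safely instead of burning out a motor.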


Cardea's arms will be the next step toward "developing a system that's even more like a human's, which has this wonderful springiness and can react to surprises in the world," said O'Reilly, moving her arms in expressive gestures.


Thanks to Kismet, the researchers also have a good background in how to build a robotic head that can interact with humans via myriad facial expressions, head positions and tones of voice.


Both earlier robots "learned" using a behavior-based approach to intelligence that Cardea also will employ. With perceptual abilities such as vision and software programs that mimic development, the robots can interact with their environment in human-like ways. The idea is that simple interactions will lead to simple behaviors that in turn build on each other to make more complicated behaviors easier. That's how children learn.


Cardea will extend this approach for the first time to manipulation. In other words, the robot will "explore the world by manipulating objects," O'Reilly said. Humans do this all the time, especially in situations where vision isn't useful (like navigating through a dark house or feeling for an object that's fallen behind the couch).


In their planning document, the researchers write that they expect this approach "will lead to a manipulation renaissance in robots that can physically interact with their world," much as behavior-based robotic navigation led to commercially available robots for reconnaissance and cleaning.


The Cardea research team is the definition of diverse, with 11 men and women from Russia, the United States, Ireland, England, Ecuador, Australia, Canada and Indonesia. Members in addition to Brooks and O'Reilly are undergraduates Alexander Moore and Alana Lafferty, graduate students Aaron Edsinger, Lijin Arayananda, Paulina Varshavskaya, Jessica Banks and Eduardo Torres-Jara, and postdoctoral associate Paul Fitzpatrick. The team also includes an adjunct member, Kathleen Richardson, who is studying the researchers and others in the Brooks lab for her Ph.D. in anthropology from Cambridge University.


"I want to look at what kind of ideas go into the making of human-like robots," Richardson said. "What kinds of behaviors are the scientists trying to give the robots? How do they do so? How do they philosophize about their work?


"In my opinion," she concluded, "the interactive humanoid robots at MIT are the tip of the iceberg. I think we're going to see more of them in the future."


