
Evolving human-machine relations explored at EECS 100th celebration

John V. Guttag, left, head of the Department of Electrical Engineering and Computer Science, plays "Robot John" for AI Lab director Rodney Brooks to illustrate how a robot learns tasks. (Photo: Donna Coveney)

Prosthetic legs that bend naturally at the knee, computers that can communicate orally with human beings, and tiny machines are among the future devices in store for society, thanks to MIT's Department of Electrical Engineering and Computer Science.

Four MIT researchers gave hundreds of EECS alumni and friends a taste of the future of human-machine relationships on Friday in Kresge Auditorium, on the first day of a two-day celebration of the department's centennial.

Using EECS department head John V. Guttag as a stand-in robot, Rodney Brooks, the Fujitsu Professor of Computer Science and Engineering and director of MIT's Artificial Intelligence Laboratory, demonstrated how much people rely on visual cues when teaching, learning from one another or working with robots. Guttag joined him on stage to show how a human and a robot must read each other's nonverbal cues to interact effectively.

That's why Brooks creates robots that not only see, but have visible "eyes," so those who interact with them can tell where they're looking. Brooks said the humanoid form is not necessarily the best one for the job, except for robots that will work in homes and other spaces designed for humans. He is building a three-armed robot that will ply the halls of EECS's new home, the Stata Center complex, on a Segway.

The AI Lab is also making progress on robots that walk. Artificial limbs that adjust to the user's speed, gait and walking surface will provide a vast improvement over current models.

Teaching robots and other computer systems how to learn is the focus of Leslie Kaelbling, professor of computer science and engineering and associate director of the AI Lab. Robots built to explore outer space or shop in a grocery store end up in unpredictable environments, so they must be able to make predictions about situations they have never seen before.

"The central question in machine learning is 'what is the right hypothesis'" for the machine to use in a given situation, Kaelbing said. "We trade off making compact things that get things wrong and complicated things that get things right."

Her ambitious goal is to create an "enduring personal cognitive assistant," a computer that would be the perfect secretary: organizing files, reminding you of meetings, and generally helping you navigate daily life and do your job better.

Kaelbling's and Brooks' projects sometimes incorporate the work of Victor Zue, the Delta Electronics Professor of Electrical Engineering and Computer Science and director of MIT's Laboratory for Computer Science, who works on speech recognition and speech synthesis.

Zue creates computer systems that can hold a conversation and provide spoken information in a natural way. His systems can, for instance, recommend restaurants and provide information about weather and airline flights. He, too, wants to build systems that can learn, adapting to new speakers and acquiring new phrases when necessary.

Martin Schmidt, professor of electrical engineering and director of MIT's Microsystems Technology Laboratory, said "tiny technologies" such as postage-stamp-sized rockets, shirt-button-sized turbine engines, microchips that can be embedded in the body and tiny microphones will be the way of the future. For one thing, he said, making them is fun. Other reasons include military and medical applications as well as possible new energy sources. He described a tiny neural probe that can be embedded in the brain to measure neural activity for research.

"I think we're going to see many more direct connections between our neurons and robotic devices," Brooks said. "People and robots are going to become much more intimate in interesting ways."
