Imagine a room that responds to your verbal commands for information by projecting that information onto a wall display, or even verbally answering your request. What if the room even volunteers related information you hadn't thought of?
Move over, Star Trek. MIT scientists are going beyond fiction to develop such an intelligent room. Their prototypes -- a conference room and a personal office -- can already do the tasks described above (and more) for specific scenarios. In one demo, the room serves as a "command center" for hurricane disaster relief; in another, it helps a student give a virtual tour of the Artificial Intelligence Laboratory, where the room is being developed.
The idea is to make computers (which run the room) "not only genuinely user-friendly but also essentially invisible to the user," the researchers say. Thus, rather than typing on a keyboard, a user interacts with the room by talking and pointing. "You come as you are -- no keyboards, no mice, no fancy gadgetry," said Howard Shrobe, associate director of the AI Lab.
For example, in the hurricane command center demo, the intelligent room responds to a user's verbal request for a layout of the Virgin Islands by projecting a map on the wall. Other examples of what the room is capable of are given below, as reported in a paper by Michael H. Coen to appear in the Spring Symposium on Intelligent Environments of the American Association for Artificial Intelligence (March 23-25). Mr. Coen is a graduate student who is developing the room for his doctoral project.
User: "Computer,
[The room will now listen for utterances without requiring they be prefaced by the word Computer.]
User: "Show me the Virgin Islands."
Room: "I'm showing the map right next to you." [Room shows the map on the video display closest to the user.]
User: [pointing at St. Thomas] "Zoom in. How far away is Hurricane Marilyn?"
Room: "The distance between Hurricane Marilyn and the city of Charlotte Amalie located in St. Thomas is 145 miles."
User: "Where's the nearest disaster field office?"
[Room highlights its location on the map.]
Room: "The St. Thomas disaster field office is located one mile outside of Charlotte Amalie.
User: "Yes, show me the satellite image."
Such interactions are possible because the intelligent room incorporates a variety of artificial-intelligence technologies, such as speech recognition programs and machine vision. These programs in turn receive raw data from cameras and other devices embedded in the ceiling and attached to the walls, and use it to determine, say, where the user is standing.
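One way such pieces can fit together is for each perception subsystem to report its results to a coordinator that combines them -- for instance, using the tracked position of the user to pick the "display right next to you." The sketch below is purely illustrative; the names and structure are assumptions, not the lab's actual design:

    from dataclasses import dataclass

    @dataclass
    class Display:
        name: str
        position: tuple  # (x, y) in room coordinates

    class RoomCoordinator:
        """Hypothetical hub that fuses vision and speech subsystem reports."""
        def __init__(self, displays):
            self.displays = displays
            self.user_position = None  # updated by the vision/tracking subsystem

        def on_person_tracked(self, position):
            # Called by the machine-vision subsystem with the user's (x, y).
            self.user_position = position

        def on_command(self, utterance):
            # Called by the speech-recognition subsystem with a recognized command.
            if "show" in utterance and self.user_position is not None:
                nearest = min(
                    self.displays,
                    key=lambda d: (d.position[0] - self.user_position[0]) ** 2
                                  + (d.position[1] - self.user_position[1]) ** 2,
                )
                print(f"Projecting on {nearest.name}, the display nearest the user.")

    room = RoomCoordinator([Display("wall-left", (0, 2)), Display("wall-front", (3, 0))])
    room.on_person_tracked((2.5, 0.5))
    room.on_command("show me the virgin islands")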
"The most challenging part of this work is taking academic systems that are essentially research systems, combining them and getting the overall system to work," Mr. Coen said.
To hear verbal commands, the room currently requires that the user wear a microphone. That's temporary, however, said Dr. Shrobe. Based in part on work from other schools, the researchers are developing a "microphone beam" that will allow the room to pick up sounds from the person of interest without the aid of a device worn on the body.
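The "microphone beam" idea refers to microphone-array beamforming: signals from several fixed microphones are delayed so that sound arriving from one chosen direction lines up, then summed, reinforcing the speaker of interest over the rest of the room. A minimal delay-and-sum sketch, with illustrative parameters rather than anything taken from the project:

    import numpy as np

    def delay_and_sum(signals, mic_positions, direction, fs, c=343.0):
        """Steer a linear microphone array toward `direction` (radians from
        broadside) by delaying each channel and summing the results.

        signals: (n_mics, n_samples) array of synchronized recordings
        mic_positions: positions along the array axis, in meters
        """
        n_mics, n_samples = signals.shape
        out = np.zeros(n_samples)
        for m in range(n_mics):
            # Delay (in samples) of a plane wave arriving from `direction`.
            delay = mic_positions[m] * np.sin(direction) / c * fs
            out += np.roll(signals[m], int(round(delay)))
        return out / n_mics

    # Illustrative use: 4 mics spaced 5 cm apart, 16 kHz audio, steered 30 degrees.
    fs = 16000
    mics = np.arange(4) * 0.05
    recordings = np.random.randn(4, fs)  # stand-in for real microphone data
    focused = delay_and_sum(recordings, mics, np.radians(30), fs)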
The researchers are also developing an advanced version of the software that coordinates the intelligent room; it is being implemented in the intelligent office. Mr. Coen named the software Hal after the computer in the movie "2001: A Space Odyssey." He notes that "our version of Hal is intended to have a sunnier disposition" than the movie version, which ultimately tried to destroy its human creators. (Coincidentally, Dr. Shrobe and the movie version of Hal share the same birthday: January 12.)
One of the more developed components of the room is its speech recognition technology, which can recognize several orders of magnitude more sentences than commercial software programs can. The new software, developed by Mr. Coen and colleagues, expands the capabilities of software provided by Dragon Systems, a commercial provider of such programs. In an example of technology transfer, the new software will be used in an Advanced Research Projects Agency (ARPA) project called the Command Post of the Future.
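One way a recognizer's coverage can grow by orders of magnitude is to define commands as a compositional grammar rather than a fixed list, so a handful of rules expands into a very large sentence set. The toy grammar below is invented for illustration; it is not the lab's or Dragon Systems' actual format:

    from itertools import product

    # Toy command grammar: each slot lists its alternatives, and the full
    # sentence set is the cross product of the slots.
    grammar = {
        "verb": ["show", "display", "bring up"],
        "object": ["a map of", "a satellite image of", "the weather near"],
        "place": ["the Virgin Islands", "St. Thomas", "Charlotte Amalie"],
    }

    sentences = [" ".join(words) for words in product(*grammar.values())]
    print(len(sentences))   # 27 sentences from just 9 alternatives
    print(sentences[0])     # "show a map of the Virgin Islands"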
Mr. Coen compares the intelligent room to indoor plumbing. "I'm hoping that this will cause the same kind of revolution in the way people live," he said. "I'm intensely serious about getting these technologies into people's homes within the next decade or so."
Other researchers involved in the work include Rodney A. Brooks, the Fujitsu Professor of Computer Science and Engineering and director of the AI Lab, and Tomás Lozano-Pérez, the Cecil H. Green Professor of Computer Science and Engineering. At any one time, four MIT graduate students and about 10 undergraduates are also working on the project.
Major funding for the work is from ARPA.
A version of this article appeared in MIT Tech Talk on March 11, 1998.