
Flight of fancy

Using an autonomous mini-helicopter, an MIT team stunned the Association for Unmanned Vehicle Systems International by solving one of its notoriously tough challenges on the first try.
A small, autonomous helicopter, programmed by MIT students under the direction of Professor Nicholas Roy, passes through a simulated window as part of a competition held over the summer. (Image courtesy of Nicholas Roy)

In its first 18 years, the Association for Unmanned Vehicle Systems International's annual aerial-robotics competition posed four successive challenges, each of which robotics researchers had to meet using entirely autonomous aerial vehicles (no remote control allowed). The first challenge, which stood for three years, was to move a metal disc from one end of an arena to the other. The fourth, which stood for eight years, was to travel three kilometers and find a way into a specific building. But this summer, for the first time in the competition's history, a challenge fell in its first year, to a team of students representing MIT's Robust Robotics Group.

The competition presented a scenario mimicking the aftermath of a nuclear meltdown. The aerial robot had to navigate its way through a window and into a maze simulating the hallways of an evacuated building, locate the control room, identify a gauge ostensibly indicating radiation levels, photograph it, and transmit the photo to a base station over a radio connection. Unlike the fourth challenge, the fifth denied the vehicles access to GPS data.

Kyle Snyder, AUVSI’s senior technical director, says the MIT team’s feat came as “a pleasant surprise” to members of his organization and to other industry experts.

“I talked to some of the industry folks that attended the competition,” Snyder says, “and they said that there’s no way they could have done what the MIT and Georgia Tech teams were doing. Especially the MIT team — to actually pull together the sensors, the platform, and the understanding of what it was going to take to complete that mission. There’s nobody out there that could have done it.”

Competitors could use any type of aerial vehicle they chose, although, of course, it had to be small enough to operate indoors. Georgia Tech used a helicopter with two parallel rotors, and Embry-Riddle Aeronautical University used an innovative vehicle with a single spinning wing, like the pod of a maple seed. But the MIT team, which consisted of graduate students Abraham Bachrach, Ruijie He, and Sam Prentice and undergrads Anton de Winter and Garrett Hemann, used a battery-powered, off-the-shelf robot called a quad helicopter. The quad helicopter — or “quad” for short — is about two feet across and has a rotor at each of its four corners.

According to Nicholas Roy, the associate professor in the Department of Aeronautics and Astronautics who directed the students’ work, arming the quad with the software necessary to navigate a hallway required addressing “a fundamental research question.” The quad’s information about its immediate environment comes from a laser rangefinder, which shoots out beams of light and gauges how long their reflections take to return. Because the beams all lie in a single plane, they allow the quad to construct a two-dimensional map of its surroundings: a cross-section as seen from above. But the quad is continually buffeted by disturbances in the air, and if it tilts slightly, the map can change dramatically.

The quad, however, is also equipped with gyros and accelerometers that can measure its motion in three dimensions, so its twisting and tilting can be correlated with the changes in its environmental map. The MIT team developed algorithms that can use those correlations to give the quad some three-dimensional information about its surroundings.
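The correlation Roy describes can be sketched in a few lines of code. The snippet below is an illustrative sketch of the general technique, not the team's actual software: it uses roll and pitch readings of the kind the quad's gyros and accelerometers provide to rotate a planar laser scan into a gravity-aligned frame, so that tilt no longer distorts the two-dimensional map. The function and variable names, and the rotation convention, are assumptions made for the example.

```python
import numpy as np

def scan_to_level_frame(ranges, bearings, roll, pitch):
    """Rotate a planar laser scan into a roll/pitch-compensated frame.

    ranges      : (N,) measured distances from the rangefinder, in meters
    bearings    : (N,) beam angles within the sensor plane, in radians
    roll, pitch : vehicle attitude from the IMU, in radians
    Returns an (N, 3) array of points in a gravity-aligned frame.
    """
    # Each return starts as a point in the sensor's own (possibly tilted)
    # plane, so its z-coordinate is zero by construction.
    pts = np.stack([ranges * np.cos(bearings),
                    ranges * np.sin(bearings),
                    np.zeros_like(ranges)], axis=1)

    # Rotations about the body x-axis (roll) and y-axis (pitch).
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])

    # Undo the tilt: map the scan into the level frame.
    return pts @ (Ry @ Rx).T
```

Once a scan is expressed in a level frame, apparent changes in the map that come purely from tilting can be separated from real changes in the environment, which is where the three-dimensional information comes from.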

Because the twisting and tilting points the rotors in unanticipated directions, it causes the quad to drift, so the algorithms also had to be fast: they had to be able to build their maps before the quad’s position changed too drastically. But in robotic control systems, gains in processing speed usually come at the expense of accuracy. To figure out exactly how much accuracy they could afford to give up, the MIT researchers tested their system in the Computer Science and Artificial Intelligence Laboratory’s motion capture studio, a large room with regularly spaced cameras along the tops of its walls. They were thus able to compare the quad’s own sense of its position with very precise external measurements, and they determined that, if the quad’s onboard computer was performing well, it could gauge its position with an accuracy of about five centimeters — exactly the margin of error that it had when passing through the window at the beginning of the competition.
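A minimal sketch of that kind of accuracy check, assuming the onboard estimates and the motion-capture measurements have already been time-aligned into matching arrays (the names here are hypothetical):

```python
import numpy as np

def position_error_stats(onboard_xy, mocap_xy):
    """Mean and worst-case planar error, in meters, between the vehicle's
    own position estimates and motion-capture ground truth."""
    err = np.linalg.norm(np.asarray(onboard_xy) - np.asarray(mocap_xy), axis=1)
    return err.mean(), err.max()

# A mean error near 0.05 m would match the five-centimeter figure above.
```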

The location-mapping software ran onboard the quad’s own processor, but to assemble a higher-level map of the entire maze, the quad radioed its measurements to a nearby base station that ran what Roy calls the “planning algorithm.” “One of our successes has been a planning algorithm that takes into account the fact that the sensor is limited,” Roy says. He points out, for instance, that the laser rangefinder has a 120-degree blind spot and a range of only 30 meters. “When we fly down long hallways,” Roy says, “the hallway may be longer than the maximum range. So down the corridor you see nothing, and you can build up a lot of speed very quickly without realizing it.” The planning algorithm thus keeps the quad oriented so that the rangefinder’s blind spot is directed at one of the side walls, so the quad can gauge its velocity by reference to the back wall — or any other obstacle with a fixed position — until the approaching wall comes into view.
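The velocity estimate Roy describes can be illustrated with a toy calculation (a sketch of the idea, not the planning algorithm itself): differencing consecutive range measurements to an obstacle known to be stationary, such as the back wall, gives the quad's speed along that line of sight.

```python
def speed_along_beam(prev_range, curr_range, dt):
    """Speed away from a stationary obstacle along one laser beam, in m/s.

    prev_range, curr_range : distances to the obstacle on consecutive scans (m)
    dt                     : time between the two scans (s)
    """
    return (curr_range - prev_range) / dt

# Example: if the back wall reads 3.85 m and then 4.00 m one scan (0.05 s)
# later, the quad is drifting forward at roughly 3 m/s even though the
# corridor ahead, beyond the rangefinder's 30-meter reach, still "looks" empty.
print(speed_along_beam(3.85, 4.00, 0.05))  # ~3.0
```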

Eric Johnson, the faculty advisor to the Georgia Tech team that entered the competition, says that while the MIT team developed a “fantastic system,” “there’s plenty of work to be done to make that kind of system practical and usable.” He points out, for instance, that “there’s a lot that can be done to make the system more robust and faster,” and that “another big detail to tackle is the 3-D aspect of it: although their system certainly could handle some aspects in three dimensions, I don’t think it had what would be necessary to, say, go up and down stairs.”

Roy agrees that “to do more three-dimensional operations — to be able to find a desk and land on a desk — the camera is clearly the right kind of sensor for that.” In fact, the quad that completed the AUVSI challenge was equipped to process data from its camera, “but we ended up not using it because we did not need it,” he says. “In order to minimize points of failure, you turn off the things you don’t need.” But in his group’s ongoing research, Roy says, “we are moving more and more toward integrating the camera and the laser.”

