
UAVs learn to fly solo

Nick Roy develops unmanned aerial vehicles that can operate autonomously in constrained spaces and unmapped environments.
Nicholas Roy (Photo: David Sella)

The popular term “drone,” which conjures images of remote-controlled flying zombies, is becoming less and less descriptive of the latest unmanned aerial vehicles (UAVs). New applications are requiring more autonomy and intelligence from UAVs.

“When people think about drones, they largely think of big military assets that are flying high in the sky where there’s not a whole lot of anything to hit,” says Nick Roy, director of the Robust Robotics Group at MIT’s Computer Science and Artificial Intelligence Laboratory. “But there are a lot of applications for smaller scale UAVs working closer to the ground that require more autonomy, such as agricultural monitoring, package delivery, and situational awareness for first responders.”

Teaching UAVs and other robots to think for themselves is the central mission of the Robust Robotics Group. “We want UAVs to be able to operate in urban environments, to get useful things done, and interact with people,” says Roy, who is also an associate professor of aeronautics and astronautics. “We want them to become as intelligent as they need to be for the task at hand.”

Roy has recently focused more on UAVs than terrestrial robots, although many of the principles and algorithms are similar. UAVs will require more autonomy to avoid collisions and crashes, as well as to understand what’s happening around them. Some level of reliable autonomous operation will be essential if the FAA is to fully permit commercial applications in the United States.

“It’s not just about avoiding obstacles, but about understanding the environment and what’s safe and unsafe,” Roy says. “UAVs need to understand their own behavior in terms of reliability and performance, and also to understand how people want them to do things.”

Project Wing and hybrid fixed-wing UAVs

In 2012, Roy accepted a sabbatical position at Google X to help launch Project Wing, an effort to demonstrate the viability of package delivery using UAVs. After Roy and his team completed a prototype in August 2014, Google was convinced it was time to move to the product phase. Roy returned to MIT last year but continues to consult on the project, while MIT alumnus Dave Vos PhD '96 helps steer Project Wing to the next level.

The Project Wing vehicle is a hybrid aircraft rather than the typical quadrotor design that has dominated academic research and the consumer UAV market. Although it does use four rotors, the rotors normally perform like airplane propellers. When the craft reaches its target to drop a package, it tilts upward so it can hover like a quadrotor.

This “tail sitter” design is a revision of an old idea that has yet to be proven commercially feasible. “Hybrid vehicles like tail sitters, tilt rotors, tilt props, or vehicles with two propulsion systems, have been explored throughout the history of aviation,” Roy says. “But enough things have changed to make them worth trying again. Our ability to manufacture small vehicles and put computation and modern control systems onboard means the things that once were hard are relatively easy now.”

Compared to quadrotors, conventional fixed-wing craft have obvious limitations, including the need for a runway and a minimum speed to remain airborne, Roy says. Yet, “fixed-wing craft are a lot more efficient in flight and can stay up much, much longer,” he adds. Meanwhile, the new hybrid designs promise to combine the best of both technologies.

Although Roy is focused more on software than hardware, he must keep up with the latest technologies, especially sensors, which help shape the way the UAV thinks. Spurred on by the need to reduce weight and power consumption, for example, some UAV researchers are aiming to use lightweight, low-cost cameras for navigation, rather than requiring LIDAR equipment or 3-D cameras.

“Passive cameras give you an understanding of the scene that I think will be important in the future,” Roy says. “Pure vision-based navigation has yet to work reliably, but the field has progressed a lot. I’m excited about how we might use passive cameras to help UAVs navigate on their own.”

In the meantime, no single sensor technology is the right answer, Roy says. “GPS has issues in urban environments and cameras have issues especially at night,” he says. “A lot of my group’s recent research has focused on accurate ranging, whether it’s a laser range finder or a 3-D camera. Those sensors are heavy and don’t work in every domain. The right answer will probably lie in a fusion of sensors.”
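
The fusion Roy describes is often framed as a filtering problem: each sensor's reading is weighted by how much it can be trusted at that moment. Below is a minimal one-dimensional Kalman-style sketch of that idea; the sensor names and noise figures are illustrative assumptions, not values from Roy's work.

```python
# A one-dimensional Kalman-style update: each measurement is weighted by
# how much we trust it relative to the current estimate. The sensors and
# noise figures below are invented for illustration.

def fuse(estimate, variance, measurement, meas_variance):
    """Fold one measurement into the running estimate."""
    gain = variance / (variance + meas_variance)   # trust ratio
    new_estimate = estimate + gain * (measurement - estimate)
    new_variance = (1.0 - gain) * variance         # uncertainty shrinks
    return new_estimate, new_variance

# Example: estimating altitude from two imperfect sensors.
altitude, var = 10.0, 4.0                        # prior: ~10 m, uncertain
altitude, var = fuse(altitude, var, 12.1, 9.0)   # GPS fix: noisy in cities
altitude, var = fuse(altitude, var, 11.4, 0.25)  # laser ranging: precise
print(f"fused altitude: {altitude:.2f} m (variance {var:.3f})")
```

Because the gain scales with relative uncertainty, a noisy GPS fix nudges the estimate only slightly while the precise laser measurement dominates, which is the complementarity Roy points to: in open sky the balance would tip the other way.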

The world through our eyes

Delivery UAVs like Project Wing or Amazon’s prototype will need more autonomy and intelligence than a typical UAV used for crop monitoring or filming commercials. This is especially true if the UAV is expected to drop off and pick up packages in urban environments.

“The UAV will need to be smart enough to reason about its own performance and impending failures,” Roy says. “Autonomy is the biggest challenge facing integration in the airspace. Vehicles need autonomy in order to recover from failures, and to see other aircraft and not hit them. They need autonomy to interact with air traffic control and play nicely in the national airspace.”

Researchers at MIT and elsewhere have focused on imbuing robots with object recognition, but that’s only the beginning. A greater challenge is to bridge the gap between the fundamentally different ways in which people and robots think.

“Robots think about the world in terms of very low level geometry,” Roy says. “They don’t think of walls as walls, but rather as pixels they can’t drive through. To work with people, robots must understand what things are for. To ask a robot to collect a box or load a truck, it needs the semantic understanding of what these objects are.”

Roy is focused less on object recognition itself than on helping robots “understand how objects are distributed and how they can interact with them,” he says. “Once you have object detection or scene understanding, you can move to the next step: showing the robot how to use this understanding to make decisions.”
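
To make that distinction concrete, here is a toy sketch of the two map layers Roy contrasts: a geometric occupancy set that only says where the robot cannot go, and a semantic layer that says what those cells are, which is what a command like "collect the box" requires. The labels and coordinates are invented for illustration.

```python
# Geometric layer: the robot's native view, just cells it cannot traverse.
occupied_cells = {(3, 4), (3, 5), (3, 6), (8, 2)}

# Semantic layer: the same cells annotated with what they *are*.
semantic_labels = {
    (3, 4): "wall", (3, 5): "wall", (3, 6): "wall",
    (8, 2): "box",
}

def find_objects(label):
    """Return cells whose semantic label matches the requested object."""
    return [cell for cell, name in semantic_labels.items() if name == label]

# The geometric map can answer "can I drive through (8, 2)?" (no),
# but only the semantic layer can answer "where is the box?".
print(find_objects("box"))  # [(8, 2)]
```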

One of Roy’s students, for example, is attempting to improve UAVs’ understanding of wind patterns in the urban environment. The UAVs could then use that knowledge to avoid turbulence or choose minimum-energy routes.
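
As a rough illustration of how such knowledge might be used, the sketch below runs a standard Dijkstra search in which edge costs are energy rather than distance, with a headwind penalty making some legs expensive. The graph, wind values, and energy model are invented for this example, not taken from the student's project.

```python
import heapq

def wind_energy_cost(distance, headwind):
    """Toy energy model: base cost per meter plus a headwind penalty."""
    return distance * (1.0 + max(0.0, 0.2 * headwind))

# edges: node -> list of (neighbor, distance_m, headwind_mps)
edges = {
    "A": [("B", 100, 5.0), ("C", 140, -2.0)],
    "B": [("D", 100, 5.0)],
    "C": [("D", 120, -1.0)],
    "D": [],
}

def min_energy_route(start, goal):
    """Dijkstra search minimizing energy instead of distance."""
    frontier = [(0.0, start, [start])]
    settled = {}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if settled.get(node, float("inf")) <= cost:
            continue
        settled[node] = cost
        for nbr, dist, wind in edges[node]:
            step = wind_energy_cost(dist, wind)
            heapq.heappush(frontier, (cost + step, nbr, path + [nbr]))
    return None

print(min_energy_route("A", "D"))
# The shorter A->B->D leg fights a 5 m/s headwind and costs 400 units;
# the longer A->C->D route rides a tailwind and wins at 260.
```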

Despite the continuing advance of computer miniaturization, the weight and power limitations of UAVs will continue to challenge their ability to process information quickly enough to make timely decisions. Rapidly fusing and integrating data from multiple sensors poses “computational challenges that are outside the scope of real-time systems like UAVs,” Roy says. “A lot of my research involves finding useful approximations that involve getting very good answers at the cost of a little accuracy and precision.”

These approximation algorithms were put to work in Roy’s recent experiments, in which a fixed-wing vehicle carrying a laser range finder flew at speed around the tightly constrained environment of a parking garage. “If you were to try to incorporate the laser range finder into the full-state estimate of the vehicle’s 12 degrees of freedom, the computation would get intractable,” Roy says. “But if you break the problem apart into the bits that the laser range finder can see at any one time, you can still get the right answer, but much more efficiently than if you ask the laser to ‘reason about’ the entire system at once.”
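
A minimal sketch of that decomposition, with an invented state layout and measurement model, appears below: instead of one update over all 12 degrees of freedom, the filter updates only the sub-block of states the current scan constrains. A real system would also propagate cross-covariances with the unobserved states, which this toy omits.

```python
import numpy as np

state = np.zeros(12)   # [x, y, z, roll, pitch, yaw, and their velocities]
cov = np.eye(12)       # full 12x12 covariance

def partial_update(state, cov, observed_idx, z, H_sub, R):
    """Kalman update restricted to the observable sub-block.

    observed_idx : indices of states this scan actually constrains
    z            : measurement vector
    H_sub        : measurement Jacobian over just those states
    R            : measurement noise covariance
    """
    idx = np.ix_(observed_idx, observed_idx)
    P = cov[idx]                               # small sub-covariance
    S = H_sub @ P @ H_sub.T + R
    K = P @ H_sub.T @ np.linalg.inv(S)         # gain on the sub-block only
    state[observed_idx] += K @ (z - H_sub @ state[observed_idx])
    cov[idx] = (np.eye(len(observed_idx)) - K @ H_sub) @ P
    return state, cov

# E.g., a horizontal scan against a wall mainly constrains x, y, and yaw:
z = np.array([1.9, 0.4, 0.05])
state, cov = partial_update(state, cov, [0, 1, 5], z,
                            np.eye(3), 0.01 * np.eye(3))
print(state[[0, 1, 5]])
```

Inverting a 3x3 matrix per scan instead of a 12x12 one is the kind of trade Roy describes: a small, quantifiable loss of precision in exchange for real-time tractability.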

Asking the right questions

Teaching UAVs to recognize objects and process sensor data in order to make real-time decisions will help them avoid collisions even in complex environments, including offices. Yet additional autonomy and intelligence are required when UAVs work closely together with people. Beyond ensuring the safety of humans, the algorithms need to be sophisticated enough to enable UAVs to take instructions from people or collaborate with them to get things done.

“We need to teach robots how to interact with people as seamlessly as people work with each other,” Roy says. “They need things like semantic maps to help them think about the world the same way people do. They also need to understand what people want and how they behave. We’re looking at things like natural language interfaces, and connecting human speech with the things the robot sees.”

The Robust Robotics Group has made some progress in teaching robots to understand directions and instructions. Now, the group is working on dialog management: teaching the robot and human how to converse.

“The challenge for the robot is not just how to know it needs to ask a question, but how to ask the question in a way that can return a useful answer,” Roy says. “If the robot says, ‘I don’t understand,’ the human will probably get annoyed and abandon the robot. This technology has to mature substantially before we’re really ready to have robots become part of our everyday lives.”
