
In Profile: Missy Cummings

Former U.S. Naval fighter pilot aims to improve how humans and computers interact.
Mary (Missy) Cummings, professor in the Department of Aeronautics and Astronautics and the Engineering Systems Division, holds a quad-rotor.
Photo: Patrick Gillooly

Mary (Missy) Cummings was exhilarated the first time she landed a fighter jet aboard an aircraft carrier in 1989, but the young pilot's elation didn't last long. Seconds later, a close friend died while attempting the same landing on the back of the carrier.

“I can't tell you how many friends died because of bad designs,” says Cummings, recalling the crash that occurred on the U.S.S. Lexington in the Gulf of Mexico. “After spending so much time as a pilot, I found it incredibly frustrating to work with technology that didn’t work with me.”

It wasn’t until Cummings left the Navy after 10 years and chose to pursue a PhD in systems engineering that she realized she could help improve the severely flawed designs of the technological systems she used as a pilot — from confusing radar screens and hand controls to the nonintuitive setup of cockpits — by making an impact at the research level.

Today, she is an associate professor at MIT with appointments in the Department of Aeronautics and Astronautics and in the Engineering Systems Division, and she directs the Humans and Automation Laboratory (HAL). Her work focuses on “human factors” engineering — specifically, how to develop better tools and technology to help people like pilots and air traffic controllers make good decisions in high-risk, highly automated environments. The field has burgeoned in recent years with the explosion of automated technology, which has shifted humans out of direct manual control and into the role of supervisors of complex automatic control systems, such as nuclear reactors or air traffic control systems.

But one consequence of these automated domains controlled by humans — known as “humans-in-the-loop” systems — is that the level of required cognition has moved from that of well-rehearsed skill execution and rule-following to higher, more abstract levels of knowledge synthesis, judgment and reasoning.

A novel application

Nowhere has this change been more apparent than in the military, where pilots are increasingly being trained to operate unmanned aerial vehicles (UAVs), or drones, to perform certain cognitive tasks, such as getting a closer look at potential snipers. Prompted by the success of drones in Iraq and Afghanistan, U.S. Defense Secretary Robert Gates announced last year that UAV technology would become a permanent part of the defense budget.

But as UAV technology becomes more prominent, Cummings wants to make it easier for humans to control portable robots in time-sensitive situations. Her goal is to lower the cognitive overhead for the user, who may not have a lot of time to change complicated menu settings or zoom and pan a camera, so that he or she can focus on more critical tasks.

“It’s about offloading skill-based tasks so that people can focus specifically on knowledge-based tasks, such as determining whether a potential sniper is a good or bad guy by using the UAV to identify him,” Cummings says. The technology could also help responders search more efficiently for victims after a natural disaster.

Over the past year, Cummings and her students have designed an iPhone application that can control a small, one-pound UAV called a quad-rotor — a tiny helicopter with four propellers and a camera attached to it. When the user tilts the iPhone in the direction he or she wants the UAV to move, the app sends GPS coordinates to the UAV to help it navigate an environment using built-in sensors. The UAV uses fast-scanning lasers to create rapid, electronic models of its environment and then sends these models back to the iPhone in the form of easy-to-read maps, video and photos. Although the military and Boeing are funding the research, the technology could be used for nonmilitary purposes, such as for a police force that needs a device to help monitor large crowds.
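The article does not describe the app's actual control math, but the tilt-to-waypoint idea can be sketched roughly. In this hypothetical illustration (all function names, the step size and the exact mapping are invented for clarity), a phone's pitch and roll are turned into a small north/east offset from the UAV's current GPS position:

```python
import math

# Hypothetical sketch: map a phone tilt (pitch/roll, in radians) to a new
# GPS waypoint a fixed step away from the UAV's current position. This is
# not the HAL app's actual algorithm, only an illustration of the concept.
EARTH_RADIUS_M = 6_371_000.0

def tilt_to_waypoint(lat, lon, pitch, roll, step_m=2.0):
    """Return a new (lat, lon) waypoint offset in the direction of tilt."""
    north_m = step_m * math.sin(pitch)  # tilt forward -> move north
    east_m = step_m * math.sin(roll)    # tilt right   -> move east
    # Convert metre offsets to degrees of latitude/longitude.
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon

# A level phone commands no movement.
print(tilt_to_waypoint(42.36, -71.09, 0.0, 0.0))  # → (42.36, -71.09)
```

In a real system the waypoint would then be handed to the vehicle's onboard navigation, which closes the loop with its own sensors — consistent with the article's point that the operator only gestures, rather than flying the aircraft directly.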

The app is designed so that anyone who can operate a phone can fly a UAV: the easy-to-use design means it takes only three minutes to learn how to use the system, whereas military pilots must undergo thousands of hours of costly training to learn how to fly drones. “This is all about the mission — you just need more information from an image, and you shouldn’t have to spend $1 million to train someone to get that picture,” she says.

The project is valuable for teaching because it represents a “classic scenario” in systems engineering in which a need is conceptualized, a system is designed to address that need and experiments are conducted to test the system, Cummings explains.

The HAL group recently conducted experiments with the app in which participants located in one building flew the UAV inside a separate building, positioning it in front of an eye chart so they could read the images the camera captured. Some achieved the equivalent of 20/30 vision, which Cummings says is “pretty good,” pointing out that, more importantly, the device never crashed. As Cummings and her students continue to refine the technology, their next step will be experiments in the real world, where the UAV could reach an altitude of 500 feet. Although the group is working with several government agencies and companies on the design, there are no plans to deploy the app just yet.

Learning from boredom

Cummings began flying planes after graduating from the U.S. Naval Academy in 1988 and received her master’s degree in space systems engineering from the Naval Postgraduate School in 1994. When the Combat Exclusion Law was repealed in 1993, meaning that women could become fighter pilots for the first time in U.S. history, Cummings had already established herself as an accomplished pilot and was selected to be among the first group of women to fly the F/A-18 Hornet, one of the most technologically advanced fighter jets.

Although she loved the thrill of flying, Cummings left the military when her resentful male colleagues became intolerable. “It’s no secret that the social environment wasn’t conducive to my career. Guys hated me and made life very difficult,” she recalls. Cummings details this experience in her book Hornet’s Nest (Writer’s Showcase Press, 2000).

But what is most enduring about Cummings’ military experience is that it fueled her desire to improve how humans and computers can work together in complex technical systems. She focuses on how design choices, such as display-screen layouts and controls, affect supervisory-control factors, such as attention span, when humans operate complex systems.

“In order to build a $1 billion air traffic control system, you can’t just do it by rule of thumb; you need to use models that take into account human factors, such as that people get bored by advanced automation systems,” Cummings says. The HAL group is currently developing models of human boredom to help design systems that prevent the people who monitor gauges and dials in automated systems from becoming bored — which is what happened last fall when two Northwest Airlines pilots overshot their destination by 150 miles because they weren’t paying attention.

In addition to boredom, other human factors that Cummings studies are what kinds of strategies people use to carry out certain tasks, whether they feel confident or frustrated when using a technology and the level of trust they have toward a system. These factors are tested through a variety of methods, such as analyzing eye movement, the number of times someone clicks or scrolls a mouse, or whether the user responds to alert messages. The iPhone project aims to analyze cognitive workload, and how easily the users are able to carry out certain cognitive tasks.
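To make the measurement idea concrete, such interaction data could be folded into a single workload proxy. The sketch below is purely illustrative — the metric names, normalizing ceilings and weights are invented, not drawn from the HAL group's actual studies:

```python
# Hypothetical sketch: combine interaction measures (mouse activity, eye
# fixations, missed alerts) into a rough 0-1 cognitive-workload score.
# All thresholds and weights here are invented for illustration only.
def workload_score(clicks_per_min, eye_fixations_per_min,
                   missed_alerts, total_alerts):
    """Return a score in [0, 1]; higher suggests heavier operator workload."""
    if total_alerts <= 0:
        raise ValueError("total_alerts must be positive")
    miss_rate = missed_alerts / total_alerts
    # Normalize each component against an assumed ceiling, capped at 1.
    click_load = min(clicks_per_min / 60.0, 1.0)
    fixation_load = min(eye_fixations_per_min / 200.0, 1.0)
    # Weighted average; missed alerts weigh most, as a sign of overload.
    return 0.25 * click_load + 0.25 * fixation_load + 0.5 * miss_rate

# Moderate activity, one missed alert out of ten.
print(round(workload_score(30, 100, 1, 10), 3))  # → 0.3
```

A study like the iPhone project would compare such scores across interface designs: if the same mission produces a lower score with one design, that design is presumably leaving more cognitive capacity free for knowledge-based tasks.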

Yossi Sheffi, director of MIT’s Engineering Systems Division, thinks Cummings’ work is extremely important because technology by itself cannot be the answer when designing large-scale systems. “Her research tying the human operator to technology is crucial — both to the design of the technology itself, but also to the operation of the system as a whole, in order to ensure that it operates efficiently and effectively,” he says.

But the work also matters to civilians operating increasingly complex small-scale systems like cell phones and remote controls, according to MIT Professor of Computer Science and Engineering Randall Davis, who has worked with Cummings on several projects and praises her understanding of how people process information. “As we are increasingly surrounded by technology of all sorts, it becomes increasingly important for someone to understand how to design this stuff so that it’s easy to use; otherwise, we’ll be surrounded by incomprehensible technology,” he says.

Cummings’ ability to infuse the quantitative methods of hard science with the human behavior models of soft science like psychology is what makes her unique, according to HAL Research Specialist Dave Pitman, a former graduate student of Cummings who is working on the iPhone project. “She has a very good understanding of both sides of the coin, and because her background is in hard science, she tries to bring this rigor more into the soft sciences, which allows for better research,” he explains.
