
“Blind” Cheetah 3 robot can climb stairs littered with obstacles

Improved design may be used for exploring disaster zones and other dangerous or inaccessible environments.
Image: MIT’s Cheetah 3 robot can climb stairs and step over obstacles without the help of cameras or visual sensors. Credit: Courtesy of the researchers.

MIT’s Cheetah 3 robot can now leap and gallop across rough terrain, climb a staircase littered with debris, and quickly recover its balance when suddenly yanked or shoved, all while essentially blind.

The 90-pound mechanical beast — about the size of a full-grown Labrador — is intentionally designed to do all this without relying on cameras or any external environmental sensors. Instead, it nimbly “feels” its way through its surroundings in a way that engineers describe as “blind locomotion,” much like making one’s way across a pitch-black room.

“There are many unexpected behaviors the robot should be able to handle without relying too much on vision,” says the robot’s designer, Sangbae Kim, associate professor of mechanical engineering at MIT. “Vision can be noisy, slightly inaccurate, and sometimes not available, and if you rely too much on vision, your robot has to be very accurate in position and eventually will be slow. So we want the robot to rely more on tactile information. That way, it can handle unexpected obstacles while moving fast.”

Researchers will present the robot’s vision-free capabilities in October at the International Conference on Intelligent Robots and Systems, in Madrid. In addition to blind locomotion, the team will demonstrate the robot’s improved hardware, including an expanded range of motion compared with its predecessor, Cheetah 2, which allows the robot to stretch backwards and forwards and twist from side to side, much like a cat limbering up to pounce.

Within the next few years, Kim envisions the robot carrying out tasks that would be too dangerous for humans, or in environments humans cannot reach.

“Cheetah 3 is designed to do versatile tasks such as power plant inspection, which involves various terrain conditions including stairs, curbs, and obstacles on the ground,” Kim says. “I think there are countless occasions where we [would] want to send robots to do simple tasks instead of humans. Dangerous, dirty, and difficult work can be done much more safely through remotely controlled robots.”

Making a commitment

The Cheetah 3 can blindly make its way up staircases and through unstructured terrain, and can quickly recover its balance in the face of unexpected forces, thanks to two new algorithms developed by Kim’s team: a contact detection algorithm, and a model-predictive control algorithm.

The contact detection algorithm helps the robot determine the best time for a given leg to switch from swinging in the air to stepping on the ground. For example, if the robot steps on a light twig versus a hard, heavy rock, how it reacts — and whether it continues to carry through with a step, or pulls back and swings its leg instead — can make or break its balance.

“When it comes to switching from the air to the ground, the switching has to be very well-done,” Kim says. “This algorithm is really about, ‘When is a safe time to commit my footstep?’”

To pick that transition time, the algorithm constantly calculates three probabilities for each leg: the probability of the leg making contact with the ground, the probability of the force generated once the leg hits the ground, and the probability that the leg is in midswing. It calculates these probabilities using data from gyroscopes, accelerometers, and the joint positions of the legs, which record each leg’s angle and height with respect to the ground.

If, for example, the robot unexpectedly steps on a wooden block, its body will suddenly tilt, shifting the angle and height of the robot. That data will immediately feed into calculating the three probabilities for each leg, which the algorithm will combine to estimate whether each leg should commit to pushing down on the ground, or lift up and swing away in order to keep its balance — all while the robot is virtually blind.
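The fusion described above can be sketched in a few lines of code. This is a toy illustration only: the sensor models, thresholds, and the simple product used to combine the cues are assumptions for clarity, not the researchers’ actual estimator.

```python
import math

def gaussian(x, mean, std):
    """Probability density of x under a normal distribution."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def contact_probability(leg_height, ground_force, phase):
    """Fuse three illustrative cues into one contact estimate.

    leg_height   -- estimated foot height above expected ground (m)
    ground_force -- estimated normal force on the foot (N)
    phase        -- gait phase in [0, 1); in this toy model, contact
                    is scheduled for the second half of the cycle
    """
    # 1) Position cue: how close is the foot to the expected ground?
    p_height = gaussian(leg_height, 0.0, 0.02) / gaussian(0.0, 0.0, 0.02)

    # 2) Force cue: a smooth step that saturates once a clearly
    #    nonzero contact force is felt.
    p_force = 1.0 / (1.0 + math.exp(-(ground_force - 20.0) / 5.0))

    # 3) Gait-schedule prior: is the leg expected to be in stance?
    p_stance_phase = 0.0 if phase < 0.5 else 1.0

    # Combine the cues (here: a simple product, assuming independence).
    return p_height * p_force * p_stance_phase

def should_commit(leg_height, ground_force, phase, threshold=0.5):
    """Decide whether the leg should commit to pushing on the ground."""
    return contact_probability(leg_height, ground_force, phase) >= threshold
```

For instance, a foot at ground level feeling a 100-newton force late in its cycle commits to the step, while a foot 10 centimeters up and feeling nothing keeps swinging.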

“If humans close our eyes and make a step, we have a mental model for where the ground might be, and can prepare for it. But we also rely on the feel of touch of the ground,” Kim says. “We are sort of doing the same thing by combining multiple [sources of] information to determine the transition time.”

The researchers tested the algorithm in experiments with the Cheetah 3 trotting on a laboratory treadmill and climbing on a staircase. Both surfaces were littered with random objects such as wooden blocks and rolls of tape.

“It doesn’t know the height of each step, and doesn’t know there are obstacles on the stairs, but it just plows through without losing its balance,” Kim says. “Without that algorithm, the robot was very unstable and fell easily.”

Future force

The robot’s blind locomotion was also partly due to the model-predictive control algorithm, which predicts how much force a given leg should apply once it has committed to a step.

“The contact detection algorithm will tell you, ‘this is the time to apply forces on the ground,’” Kim says. “But once you’re on the ground, now you need to calculate what kind of forces to apply so you can move the body in the right way.”

The model-predictive control algorithm calculates the predicted positions of the robot’s body and legs a half-second into the future, given a certain force applied by any leg as it makes contact with the ground.

“Say someone kicks the robot sideways,” Kim says. “When the foot is already on the ground, the algorithm decides, ‘How should I specify the forces on the foot? Because I have an undesirable velocity on the left, I want to apply a force in the opposite direction to kill that velocity. If I apply 100 newtons in this opposite direction, what will happen a half second later?’”

The algorithm is designed to make these calculations for each leg every 50 milliseconds, or 20 times per second. In experiments, researchers introduced unexpected forces by kicking and shoving the robot as it trotted on a treadmill, and yanking it by the leash as it climbed up an obstacle-laden staircase. They found that the model-predictive algorithm enabled the robot to quickly produce counter-forces to regain its balance and keep moving forward, without tipping too far in the opposite direction.  
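The kick-recovery reasoning Kim describes can be sketched as a miniature version of this predict-then-choose loop. The single-rigid-body model, the candidate force grid, and the 41-kilogram mass (roughly 90 pounds) are all simplifying assumptions for illustration; the real controller optimizes full body and leg states, not a single lateral velocity.

```python
MASS = 41.0     # ~90-pound robot, in kilograms
HORIZON = 0.5   # predict half a second into the future

def predict_velocity(v0, force, horizon=HORIZON, mass=MASS):
    """Predict sideways body velocity after `horizon` seconds,
    assuming a constant lateral foot force (toy model: one rigid
    body, F = m*a, no friction limits)."""
    return v0 + (force / mass) * horizon

def choose_lateral_force(v_undesired, candidates=range(-200, 201, 10)):
    """From a grid of candidate forces (N), pick the one whose
    predicted half-second-ahead velocity is closest to zero.
    This decision would be recomputed every 50 ms (20 times per
    second), as the article describes."""
    return min(candidates,
               key=lambda f: abs(predict_velocity(v_undesired, f)))
```

A kick that imparts 1 meter per second of unwanted sideways velocity would lead this sketch to select roughly 80 newtons in the opposite direction, since that nearly cancels the velocity over the half-second horizon.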

“It’s thanks to that predictive control that can apply the right forces on the ground, combined with this contact transition algorithm that makes each contact very quick and secure,” Kim says.

The team has already added cameras to the robot to give it visual feedback of its surroundings. This will help in mapping the general environment, and will give the robot a visual heads-up on larger obstacles such as doors and walls. But for now, the team is working to further improve the robot’s blind locomotion.

“We want a very good controller without vision first,” Kim says. “And when we do add vision, even if it might give you the wrong information, the leg should be able to handle (obstacles). Because what if it steps on something that a camera can’t see? What will it do? That’s where blind locomotion can help. We don’t want to trust our vision too much.”

This research was supported, in part, by Naver, Toyota Research Institute, Foxconn, and Air Force Office of Scientific Research.

Press Mentions

NESN

NESN Clubhouse visits the lab of Prof. Sangbae Kim to learn more about his work developing a robotic cheetah that can run at a speed of approximately 13 miles per hour, jump over obstacles, climb up stairs and execute tight turns. Kim explains that the cheetah could run from home plate to first base in about 15 seconds. 

NBC News

In this video, NBC Mach highlights the robotic cheetah developed by MIT researchers that can navigate without cameras or sensors. While most robots require light to explore their surroundings, the “Cheetah 3 will be able to feel its way through light-less situations such as caves or mines.”

Newsweek

MIT researchers have updated their robotic cheetah to allow it to move without relying on external vision sensors, reports Lisa Spear for Newsweek. Spear explains that, “an algorithm helps the mechanical creature determine the best time to transition a leg between a swing and a step, by constantly calculating the probabilities of each legs' movement.”

Axios

Axios reporter Kaveh Waddell writes about the Cheetah 3 robot, which navigates its environment without cameras. Waddell explains that, “the researchers measure the force on each of the Cheetah's legs straight from the motors that control them, allowing it to move fast — at 3 meters per second, or 6.7 miles an hour — and jump up onto a table from a standstill.”

CNN

This CNN video profiles the new Cheetah 3 robot, which can avoid obstacles and climb stairs without using external visual sensors. CNN notes that the cheetah, “relies on ‘feel’ in place of cameras or sensors, using ‘blind locomotion.’”

Reuters

In this video, Reuters reporter Roselle Chen spotlights the Cheetah 3 robot, which utilizes two algorithms to run across rough terrain and maintain its balance without using cameras or sensors. Chen explains that the robot navigating without cameras or sensors is like a human walking around while it’s pitch black out.

ABC News

ABC News reporter Bopha Phorn writes about the latest iteration of a robotic cheetah developed by MIT researchers. Phorn explains that the researchers hope the cheetah will eventually be able to, “help some work that’s impossible for humans to do,” like search and rescue operations.

Boston Herald

MIT researchers have unveiled the latest iteration of their robotic cheetah that can navigate without the use of cameras or sensors and could be used for disaster response, reports Jordan Graham for The Boston Herald. “We’re mostly thinking about sending robots instead of humans where potential hazards like toxicity or radiation or dangers can be,” explains Prof. Sangbae Kim.

TechCrunch

TechCrunch reporter Brian Heater writes that the robotic cheetah developed by Prof. Sangbae Kim and his research group can now run up stairs and walk over debris without the use of cameras or sensors. Heater explains that the robot, “utilizes a pair of new algorithms — contact detection and model-predictive control — which help it recover its balance in the case of slippage.”

The Verge

Verge reporter Rachel Becker writes that MIT researchers have developed a robotic cheetah that can run up the stairs and navigate without the use of cameras. Becker explains that the Cheetah 3 robot navigates its environment by touch, which could allow it to, “venture where humans can’t — like deep inside power plants for inspections.”

Popular Mechanics

Writing for Popular Mechanics, Eric Limer highlights how the updated Cheetah 3 robot can navigate by feeling its way around its environment and can leap up onto tables. Limer explains that the robotic cheetah is, “able to rear back on its hind legs, leap into the air, and make a solid landing on a platform much taller than it is.”
