
Monkey controls robotic arm using brain signals sent over Internet

Press Contact:
Elizabeth A. Thomson
Phone: 617-258-5563
MIT Resource Development

Image caption: James Biggs, postdoctoral associate in the Research Lab of Electronics (left), Professor Mandayam Srinivasan, director of MIT's Touch Lab, and mechanical engineering graduate student Jung Kim observe the movement of a robotic arm driven by signals from a monkey at Duke University. Photo: Donna Coveney

Monkeys in North Carolina have remotely operated a robotic arm 600 miles away in MIT's Touch Lab -- using their brain signals.

The feat is based on a neural-recording system reported in the November 16 issue of Nature. In that system, tiny electrodes implanted in the animals' brains detected their brain signals as they controlled a robot arm to reach for a piece of food.

According to the scientists from Duke University Medical Center, MIT and the State University of New York (SUNY) Health Science Center, the new system could form the basis for a brain-machine interface that would allow paralyzed patients to control the movement of prosthetic limbs.

The Internet experiment "was a historic moment, the start of something totally new," Mandayam Srinivasan, director of MIT's Touch Lab, said in a November 15 story in the Wall Street Journal.

The work also supports new thinking about how the brain encodes information, by spreading it across large populations of neurons and by rapidly adapting to new circumstances.

In the Nature paper, the scientists described how they tested their system on two owl monkeys, implanting arrays of as many as 96 electrodes, each less than the diameter of a human hair, into the monkeys' brains.

The technique allows large numbers of single neurons to be recorded separately, then combines their information using a computer coding algorithm. The scientists implanted the electrodes in multiple regions of the brain's cortex, including the motor cortex, which controls movement. They then recorded the output of these electrodes as the animals learned reaching tasks, including reaching for small pieces of food.

Analyzing brain signals

To determine whether it was possible to predict the trajectory of the monkeys' hands from the signals, the scientists fed the mass of neural signal data generated during many repetitions of these tasks into a computer for analysis. In this analysis, the scientists used simple mathematical methods and artificial neural networks to predict hand trajectories in real time as the monkeys learned to make different types of hand movements.

"We found two amazing things," said Miguel Nicolelis, associate professor of neurobiology at Duke. "One is that the brain signals denoting hand trajectory show up simultaneously in all the cortical areas we measured. This finding has important implications for the theory of brain coding, which holds that information about trajectory is distributed really over large territories in each of these areas even though the information is slightly different in each area.

"The second remarkable finding is that the functional unit in such processing does not seem to be a single neuron," Professor Nicolelis said. "Even the best single-neuron predictor in our samples still could not perform as well as an analysis of a population of neurons. So this provides further support to the idea that the brain very likely relies on huge populations of neurons distributed across many areas in a dynamic way to encode behavior."

Over the Net

Once the scientists demonstrated that the computer analysis could reliably predict hand trajectory from brain signal patterns, they used the brain signals from the monkeys as processed by the computer to allow the animals to control a robot arm moving in three dimensions. They even tested whether the signals could be transmitted over a standard Internet connection, controlling a similar arm in MIT's Laboratory for Human and Machine Haptics, informally known as the Touch Lab.

"When we initially conceived the idea of using monkey brain signals to control a distant robot across the Internet, we were not sure how variable delays in signal transmission would affect the outcome," said Dr. Srinivasan. "Even with a standard TCP/IP connection, it worked out beautifully. It was an amazing sight to see the robot in our lab move, knowing that it was being driven by signals from a monkey brain at Duke. It was as if the monkey had a 600-mile-long virtual arm."

The researchers will soon begin experiments in which movement of the robot arm generates tactile feedback signals in the form of pressure on the animal's skin. Also, they are providing visual feedback by allowing the animal to watch the movement of the arm.

Such feedback studies could also potentially improve the ability of paralyzed people to use such a brain-machine interface to control prosthetic appendages, said Professor Nicolelis. In fact, he said, the brain could prove extraordinarily adept at using feedback to adapt to such an artificial appendage.

Augmenting the body

"If such incorporation of artificial devices works, it would quite likely be possible to augment our bodies in virtual space in ways that we never thought possible," said Dr. Srinivasan, a principal research scientist in mechanical engineering and the Research Laboratory of Electronics. "In fact, the robot that was controlled by the monkey brain signals is a haptic interface -- a device that is part of a multisensory virtual-reality system in our lab. It enables us to touch, feel and manipulate virtual objects created solely through computer programs, just as computer monitors enable us to see synthesized visual images and speakers enable us to hear synthesized sounds.

"In our experiment at using brain signal patterns to control the robot arm over the Internet, if we extended the capabilities of the arm by engineering different types of feedback to the monkey -- such as visual images, auditory stimuli and forces associated with feeling textures and manipulating objects -- such closed-loop control might result in the remote arm being incorporated into the body's representation in the brain," Dr. Srinivasan continued.

"Once you establish a closed loop that is very consistent, you're basically telling the brain that the external device is part of the body representation. The major question in our minds now is: what is the limit of such incorporation? For example, if we program the virtual objects such that they do not follow the laws of physics of our so-called real world, how will they be represented in the brain?" he said.

Besides experimenting with feedback systems, the scientists are planning to increase the number of implanted electrodes, with the aim of achieving 1,000-electrode arrays. They are also developing a "neurochip" that will greatly reduce the size of the circuitry required for sampling and analysis of brain signals.

In addition to Drs. Nicolelis and Srinivasan, other co-authors of the paper were Johan Wessberg, Christopher Stambaugh, Jerald Kralik, Pamela Beck and Mark Laubach of Duke; graduate student Jung Kim of mechanical engineering and postdoctoral associate James Biggs of the Research Lab of Electronics (MIT); and John Chapin of SUNY.

The work is supported by the National Institutes of Health, the National Science Foundation, the Defense Advanced Research Projects Agency and the Office of Naval Research.

A version of this article appeared in MIT Tech Talk on December 6, 2000.
