How can you use information to control a physical system that is spiraling into disorder?
In a paper published February 7 in Physical Review Letters, the journal of the American Physical Society, Seth Lloyd, an associate professor of mechanical engineering, and graduate student Hugo Touchette said information is the key to exerting some level of control over arbitrary systems.
They found that controlling a system becomes possible once one acquires enough information about it and applies that information to keep uncertainties in the system's properties at manageable levels. Each bit of information acquired cuts the remaining uncertainty in half.
A batter with one more bit of information about the trajectory of the ball coming toward home plate increases his control over the ball by a factor of two. If he is able to decrease the uncertainty of the angle with which the ball springs off the bat, he has a better chance of determining whether he has a pop fly or a home run.
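The factor-of-two claim can be sketched numerically. A minimal illustration (the starting figure of 16 degrees is invented for the example, not taken from the paper):

```python
# Illustrative only: each bit of information about the pitch halves
# the batter's uncertainty. The initial 16-degree figure is made up.
uncertainty_deg = 16.0
for bits in range(1, 5):
    uncertainty_deg /= 2.0  # one bit -> factor-of-two reduction
    print(f"{bits} bit(s): {uncertainty_deg} degrees of uncertainty")
```

After four bits the uncertainty has shrunk by a factor of 2^4 = 16, from 16 degrees to 1 degree.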
Or to control the population growth of rabbits, you would need to know how many predators to introduce to make a dent in the size of the group. Or you might need information that would allow you to predict the future population based on the current population size. Without a minimum amount of information, the system would remain chaotic and would continue to spiral out of control.
TAMING CHAOS
Chaotic systems, in which tiny changes produce huge effects, grow increasingly unpredictable and are particularly challenging. Chaos theory pioneer Edward Lorenz's well-known example asks whether the flap of a butterfly's wings in Brazil can set off a tornado in Texas. Chaos theory has been applied to biology, economics, chemistry, engineering, fluid mechanics and physics, among other fields.
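This sensitivity to initial conditions is easy to demonstrate with a standard textbook chaotic system, the logistic map (an example of ours, not one drawn from the paper):

```python
def trajectory(x, steps):
    """Iterate the logistic map x -> 4x(1 - x), which is chaotic
    at this parameter value."""
    points = [x]
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
        points.append(x)
    return points

a = trajectory(0.2, 50)
b = trajectory(0.2 + 1e-9, 50)  # perturbed by one part in 200 million
gap = max(abs(p - q) for p, q in zip(a, b))
print(gap)  # of order 1: the tiny perturbation is amplified enormously
```

A perturbation far too small to measure grows exponentially until the two trajectories bear no resemblance to each other, which is why long-range prediction of chaotic systems fails.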
"To control a chaotic system, you have to get a certain minimum amount of information at a certain rate to get the system in a desired state. If you get less than that, you're not going to be able to control it," Professor Lloyd said.
Controlling a system's entropy, or tendency toward disorder, is key. From the perspective of thermodynamics, controlling an object means reducing its entropy. In this view, entropy measures the number of possibilities open to the system, so the more that is known about the system, the less entropy it has.
(To get an idea of how quickly a system can become complex, note that entropy is the base-2 logarithm of the number of possibilities: two possibilities correspond to one bit, four possibilities to two bits, eight possibilities to three bits, and so on. Doubling the number of possibilities adds one bit to the entropy.)
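The bookkeeping described above is just the base-2 logarithm:

```python
import math

# Entropy in bits = log2(number of equally likely possibilities).
for possibilities in (2, 4, 8, 1024):
    print(f"{possibilities} possibilities -> {math.log2(possibilities):g} bits")
```

Doubling the count of possibilities always adds exactly log2(2) = 1 bit, so even a modest 1,024 possibilities amounts to only 10 bits of entropy.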
"Control is about information; getting information, processing it and feeding it back into the system. With more information, you can reduce uncertainty by a certain degree," Professor Lloyd said. "How much is the information worth to allow me to do better? This is a very basic question and the answer turns out to be quite simple."
HELP FOR QUANTUM COMPUTERS
It seems natural that you need at least one bit of information to improve your level of control by one bit, but no one had previously put such statements about curbing complexity on a quantitative footing. These information-theoretic methods of control apply to arbitrary systems, not just chaotic ones.
In quantum computing, for instance, researchers must figure out how to control the interaction of qubits (quantum bits) that are the basis of the computer. Quantum bits interact with each other on many levels, which makes them powerful computing devices but also makes them difficult to control.
Professor Lloyd is designing an experiment that uses one qubit as a sensor to collect information about another qubit in the system. The sensor then feeds the information back to reduce uncertainty about the system. He expects that this may help tame the currently unwieldy approach to error correction, which involves many redundant checks to ensure that the computer is on course. One qubit of information translates into one qubit of extra control.
This work is partly funded by the Natural Sciences and Engineering Research Council of Canada and by a grant from the d'Arbeloff Laboratory for Information Systems and Technology.
A version of this article appeared in MIT Tech Talk on February 16, 2000.