May-June 2009

COMPUTING SCIENCE

Everything Is Under Control

Can control theory save the economy from going down the tubes?

Brian Hayes

Running Hot and Cold

[Figure: Graph shows response to a shift in demand.]

Proportional, integral and derivative control (together known as PID) are basic tools of control theory. In designing a control system, an engineer sets the “gain” of each type of feedback—the amount of correction applied for a given error magnitude. High gains yield a sensitive controller that promptly detects and corrects any disturbance. But a controller that responds too vigorously risks destabilizing the system, magnifying departures from the set point rather than suppressing them.
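In code, the idea fits in a few lines. Here is a minimal sketch of a discrete-time PID controller in Python; the particular gains, set point and time step are invented for illustration rather than drawn from any real controller.

```python
# A minimal discrete-time PID controller. The gains (kp, ki, kd), the
# set point and the time step are illustrative values, not taken from
# any particular system.

class PID:
    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd   # proportional, integral, derivative gains
        self.setpoint = setpoint                 # the value we want to hold
        self.dt = dt                             # time between measurements
        self.integral = 0.0                      # running sum of past errors
        self.prev_error = 0.0                    # last error, for the derivative term

    def update(self, measurement):
        error = self.setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # The correction is a weighted sum of the present error (P),
        # the accumulated past error (I) and the error's rate of change (D).
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: hold a speed of 60 with modest gains.
controller = PID(kp=0.8, ki=0.2, kd=0.05, setpoint=60.0, dt=0.1)
correction = controller.update(measurement=55.0)
```

Raising any of the three gains makes the controller more aggressive; as the next paragraph explains, too much aggression is its own hazard.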

The hazard of controller-induced instability is most acute when there are delays, or time lags, built into the feedback circuit. The nature of this problem is familiar in everyday life. You step into the shower and find that the water is too cool, so you twist the temperature-control valve counterclockwise. Nothing happens for a few seconds, and so you turn the valve a little more. When the hot water finally makes its way to the shower head, you find you’ve gone too far. You dial the valve back a little, but the water continues to get hotter, so you turn the control further clockwise. Soon, you’re shivering. The temperature oscillations can keep growing until the shower is alternately emitting the hottest and the coldest water available. (In this situation the average temperature might be just right, but no one would count that a success of the control system.)
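The runaway shower can be imitated in a few lines of simulation. In the sketch below, with every number invented purely for illustration, a bather applies an impatient correction to a temperature reading that reflects the valve setting of several seconds ago; the printed temperatures eventually swing between the hottest and coldest water the plumbing can supply.

```python
# A rough simulation of the runaway shower: a correction is applied to a
# temperature reading that arrives several time steps late. All numbers
# here are invented for illustration.

from collections import deque

setpoint = 38.0          # comfortable water temperature, degrees C
gain = 1.2               # how hard the bather twists the valve per degree of error
delay_steps = 5          # time steps before a valve change reaches the shower head
cold, hot = 15.0, 55.0   # the coldest and hottest water available

valve = 0.0
pipeline = deque([20.0] * delay_steps)   # water already in the pipe, still cool

for step in range(40):
    felt = pipeline.popleft()            # temperature at the shower head right now
    valve += gain * (setpoint - felt)    # correction based on stale information
    entering = min(hot, max(cold, 20.0 + valve))   # new water entering the pipe
    pipeline.append(entering)
    print(f"step {step:2d}   felt {felt:5.1f} C")
```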

Cruise control and a shower valve are examples of control systems that regulate a single variable, such as speed or temperature. An aircraft autopilot, in contrast, might have to maintain a constant altitude and heading as well as controlling motion around the roll, pitch and yaw axes. All of these variables are coupled; a change in one affects others. Similarly, a controller for a distilling column in an oil refinery might need to regulate temperature, pressure and several flow rates. Again, the variables cannot be considered separately; turning up the heat alters pressures and flows.
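A toy model shows what coupling means in practice. In the two-variable linear system sketched below, with made-up matrices standing in for any real plant, a step in the single "heat" input moves both state variables, so neither can be regulated in isolation.

```python
# A toy illustration of coupling: a two-variable linear system in which a
# single input ("heat") moves both temperature and pressure. The matrices
# are invented for illustration, not a model of any real equipment.

import numpy as np

A = np.array([[-0.5,  0.2],     # temperature decays but is pushed up by pressure
              [ 0.3, -0.4]])    # pressure decays but is pushed up by temperature
B = np.array([[1.0],
              [0.6]])           # heat drives temperature directly and pressure indirectly

x = np.zeros(2)                 # state: [temperature deviation, pressure deviation]
u = np.array([1.0])             # hold the heat input at a constant step
dt = 0.1

for _ in range(50):
    x = x + dt * (A @ x + B @ u)   # simple Euler integration

print("after 5 seconds of heating:", x)   # both variables have moved, not just one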

Solving such multivariable control problems was difficult and tedious with early design methods, which are now characterized as classical control theory. Those methods assess the performance of any given control law but leave to the intuition of the engineer the task of choosing which laws to test. Beginning in the 1960s, modern control theory introduced a new computer-intensive methodology that not only evaluates given laws but also searches for the best attainable laws under stated constraints. This collection of techniques, known as optimal control, identifies the control law that comes closest to satisfying a given criterion.
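The best-known instance of optimal control is the linear-quadratic regulator, in which the stated criterion is a quadratic cost weighing deviations of the state against expenditure of control effort. The sketch below uses SciPy's Riccati-equation solver to compute the optimal feedback gain for a made-up two-variable plant; the matrices are placeholders, not a model of anything in particular.

```python
# A sketch of optimal control in its textbook form, the linear-quadratic
# regulator (LQR): given a linear model and a quadratic cost, solve for
# the single best feedback gain. The model matrices are placeholders.

import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[ 0.0,  1.0],
              [-2.0, -0.5]])        # plant dynamics: dx/dt = A x + B u
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])            # penalty on state deviation
R = np.array([[0.1]])               # penalty on control effort

# Solve the algebraic Riccati equation, then form the optimal gain K.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ P      # the optimal control law is u = -K x

print("optimal feedback gain:", K)
```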

A number of further variations have grown out of optimal control. Robust control finds laws that deliver reasonable performance even if the real system differs somewhat from the mathematical model that represents it. Stochastic control tolerates noise or errors in the measurements of the system’s state. Adaptive control applies the feedback principle to the control laws themselves, allowing the controller to continue working as the system evolves.



