cybernetics (coined by Norbert Wiener in 1947 from Greek kubernetes, ‘helmsman’), the study of the communication and manipulation of information in service of the control and guidance of biological, physical, or chemical energy systems. Historically, cybernetics has been intertwined with mathematical theories of information (communication) and computation. To describe the cybernetic properties of systems or processes requires ways to describe and measure information (the reduction of uncertainty) about events within the system and its environment. Feedback and feedforward, the basic ingredients of cybernetic processes, involve information – as what is fed forward or backward – and are basic to processes such as homeostasis in biological systems, automation in industry, and guidance systems. Of course, their most comprehensive application is to the purposive behavior (thought) of cognitively goal-directed systems such as ourselves.
Feedback occurs in closed-loop, as opposed to open-loop, systems. Actually, ‘open-loop’ is a misnomer (involving no loop), but it has become entrenched. The standard example of an open-loop system is that of placing a heater with constant output in a closed room and leaving it switched on. Room temperature may accidentally reach, but may also dramatically exceed, the temperature desired by the occupants. Such a heating system has no means of controlling itself to adapt to required conditions.
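The open-loop heater can be sketched in a few lines of Python. The heater output, leak rate, and temperatures below are illustrative assumptions, not values from the entry; the point is only that the heater never consults the room temperature.

```python
def simulate_open_loop(start_temp, heater_output, leak_rate, steps):
    """Open-loop heating: the room gains heater_output each step and
    loses heat in proportion to its temperature (leak_rate); the
    heater never reads the room temperature, so there is no loop."""
    temp = start_temp
    for _ in range(steps):
        temp += heater_output - leak_rate * temp
    return temp

# Wherever the room ends up is fixed by the balance of output and
# leakage, not by anything the occupants desire.
final = simulate_open_loop(start_temp=15.0, heater_output=5.0,
                           leak_rate=0.1, steps=200)
```

With these (assumed) numbers the room settles at 50 degrees regardless of any desired temperature, illustrating why such a system cannot adapt to required conditions.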
In contrast, the standard closed-loop system incorporates a feedback component. At the heart of cybernetics is the concept of control. A controlled process is one in which an end state that is reached depends essentially on the behavior of the controlling system and not merely on its external environment. That is, control involves partial independence for the system. A control system may be pictured as having both an inner and an outer environment. The inner environment consists of the internal events that make up the system; the outer environment consists of events that causally impinge on the system, threatening disruption and loss of system integrity and stability. For a system to maintain its independence and identity in the face of fluctuations in its external environment, it must be able to detect information about those changes in the external environment. Information must pass through the interface between inner and outer environments, and the system must be able to compensate for fluctuations of the outer environment by adjusting its own inner environmental variables. Otherwise, disturbances in the outer environment will overcome the system, bringing its inner states into equilibrium with the outer states, so that the system loses its identity as a distinct, independent system. This is nowhere more certain than with the homeostatic systems of the body (for temperature or blood sugar levels). Control in the attainment of goals is accomplished by minimizing error. Negative feedback, or information about error, is the difference between the activity a system actually performs (output) and the activity it is its goal to perform (input). The standard example of control incorporating negative feedback is the thermostatically controlled heating system.
The actual room temperature (system output) carries information to the thermostat that can be compared (via a goal-state comparator) to the desired temperature for the room (input) as embodied in the set-point on the thermostat; a correction can then be made to minimize the difference (error) – the furnace turns on or off. Positive feedback tends to amplify the value of the output of a system (or of a system disturbance) by adding the value of the output to the system input quantity. Thus, the system accentuates disturbances and, if unchecked, will eventually pass the brink of instability. Suppose that as room temperature rises it causes the thermostatic set-point to rise in direct proportion to the rise in temperature. This would cause the furnace to continue to output heat (possibly with disastrous consequences). Many biological maladies have just this characteristic. For example, severe loss of blood impairs the heart’s ability to pump effectively, which causes loss of arterial pressure, which, in turn, causes reduced flow of blood to the heart, further reducing pumping efficiency.
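The contrast between negative and positive feedback can be sketched as two update rules. This is a minimal illustration, not a model of a real thermostat; the gain, set-point, and starting temperature are assumptions for the sake of the example.

```python
def thermostat_step(temp, set_point, gain):
    """Negative feedback: output a correction proportional to the
    error (set-point minus actual temperature), opposing the error
    and driving it toward zero."""
    error = set_point - temp          # information about error
    return temp + gain * error        # correction minimizes the difference

def runaway_step(temp, set_point, gain):
    """Positive feedback: the deviation is added back into the input,
    so each step amplifies the disturbance instead of damping it."""
    error = temp - set_point
    return temp + gain * error        # correction accentuates the error

temp_neg = temp_pos = 25.0            # room starts above a 20-degree set-point
for _ in range(50):
    temp_neg = thermostat_step(temp_neg, 20.0, 0.2)
    temp_pos = runaway_step(temp_pos, 20.0, 0.2)
```

Under negative feedback the temperature converges on the set-point; under positive feedback the same initial 5-degree disturbance grows without bound, passing the brink of instability.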
Cognitively goal-directed systems are also cybernetic systems. Purposive attainment of a goal by a goal-directed system requires (at least): (1) an internal representation of the goal state of the system (a detector for whether the desired state is actual); (2) a feedback loop by which information about the present state of the system can be compared with the goal state as internally represented and by means of which an error correction can be made to minimize any difference; and (3) a causal dependency of system output upon the error-correction process of condition (2) (to distinguish goal success from fortuitous goal satisfaction).
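The three conditions can be made concrete in a small sketch. The class name, numeric state, gain, and tolerance below are hypothetical choices for illustration; the entry itself specifies only the three abstract conditions.

```python
class GoalDirectedSystem:
    def __init__(self, goal_state, tolerance=0.5):
        self.goal_state = goal_state      # (1) internal representation of the goal
        self.tolerance = tolerance

    def goal_reached(self, current_state):
        # (1) detector for whether the desired state is actual
        return abs(current_state - self.goal_state) <= self.tolerance

    def error(self, current_state):
        # (2) feedback: compare the present state with the goal
        # as internally represented
        return self.goal_state - current_state

    def act(self, current_state, gain=0.5):
        # (3) output causally depends on the error-correction
        # process, so success is not merely fortuitous
        return current_state + gain * self.error(current_state)

state = 0.0
system = GoalDirectedSystem(goal_state=10.0)
while not system.goal_reached(state):
    state = system.act(state)
```

Because the output at each step is produced by the error-correction process itself, reaching the goal here counts as goal success rather than fortuitous goal satisfaction.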
See also COMPUTER THEORY, INFORMATION THEORY, SYSTEMS THEORY. F.A.