
entrenchment See GOODMAN.

entropy, in physics, a measure of disorder; in information theory, a measure of ‘information’ in a technical sense. In statistical physics the number of microstates accessible to the various particles of a large system of particles, such as a cabbage or the air in a room, is represented as W. Accessible microstates might be, for instance, energy levels the various particles can reach. One can greatly simplify the statement of certain laws of nature by introducing a logarithmic measure of these accessible microstates. This measure, called entropy, is defined by the formula S (entropy) =df k(ln W), where k is Boltzmann’s constant. When the entropy of a system increases, the system becomes more random and disordered, in the sense that a larger number of microstates become available for the system’s particles to enter.
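A minimal sketch, in Python, of how the Boltzmann formula is evaluated; the microstate count used below is an arbitrary illustrative figure.

```python
import math

# Boltzmann's constant, in joules per kelvin.
K_BOLTZMANN = 1.380649e-23

def boltzmann_entropy(num_microstates: float) -> float:
    """Entropy S = k ln W for a system with W accessible microstates."""
    return K_BOLTZMANN * math.log(num_microstates)

# Illustrative only: a system with 10**20 accessible microstates.
print(boltzmann_entropy(1e20))  # roughly 6.4e-22 J/K
```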
If a large physical system within which exchanges of energy occur is isolated, exchanging no energy with its environment, the entropy of the system tends to increase and never decreases. This result of statistical physics is part of the second law of thermodynamics. In real, evolving physical systems effectively isolated from their environments, entropy increases and thus aspects of the system’s organization that depend upon there being only a limited range of accessible microstates are altered. For example, a cabbage totally isolated in a container would decay as complicated organic molecules eventually became unstructured in the course of ongoing exchanges of energy and attendant entropy increases.
In information theory, a state or event is said to contain more information than a second state or event if the former state is less probable and thus in a sense more surprising than the latter. Other plausible constraints suggest a logarithmic measure of information content. Suppose X is a set of alternative possible states, xi, and p(xi) is the probability of each xi ∈ X. If state xi has occurred, the information content of that occurrence is taken to be −log2 p(xi). This function increases as the probability of xi decreases. If it is unknown which xi will occur, it is reasonable to represent the expected information content of X as the sum of the information contents of the alternative states xi weighted in each case by the probability of the state, giving: H(X) = −Σi p(xi) log2 p(xi). This is called the Shannon entropy. Both Shannon entropy and physical entropy can be thought of as logarithmic measures of disarray. But this statement trades on a broad understanding of ‘disarray’. A close relationship between the two concepts of entropy should not be assumed. See also INFORMATION THEORY, PHILOSOPHY OF SCIENCE. T.H.
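A minimal sketch, in Python, of the Shannon entropy of a discrete distribution; the probabilities below are illustrative (a fair coin and a heavily biased one).

```python
import math

def shannon_entropy(probabilities) -> float:
    """Expected information content H(X) = -sum p(xi) * log2 p(xi), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin toss carries one bit of expected information;
# a heavily biased coin carries considerably less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.47
```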
