
Entropy and the Laws of Thermodynamics

http://pespmc1.vub.ac.be/ENTRTHER.html Author: J. de Rosnay, Date: Jul 3, 1998

The principal energy laws that govern every organization are derived from two famous laws of thermodynamics. The second law, known as Carnot's principle, is governed by the concept of entropy. Today the word entropy is as much a part of the language of the physical sciences as it is of the human sciences. Unfortunately, physicists, engineers, and sociologists indiscriminately use a number of terms that they take to be synonymous with entropy, such as disorder, probability, noise, random mixture, and heat; or they use terms they consider synonymous with antientropy, such as information, negentropy, complexity, organization, order, and improbability. There are at least three ways of defining entropy:

1. in terms of thermodynamics (the science of heat), where the names of Mayer, Joule, Carnot, and Clausius (1865) are important;
2. in terms of statistical theory, which establishes the equivalence of entropy and disorder, as a result of the work of Maxwell, Gibbs, and Boltzmann (1875);
3. in terms of information theory, which demonstrates the equivalence of negentropy (the opposite of entropy) and information, as a result of the work of Szilard, Gabor, Rothstein, and Brillouin (1940-1950).

The two principal laws of thermodynamics apply only to closed systems, that is, entities with which there can be no exchange of energy, information, or material. The universe in its totality might be considered a closed system of this type; this would allow the two laws to be applied to it. The first law of thermodynamics says that the total quantity of energy in the universe remains constant. This is the principle of the conservation of energy. The second law of thermodynamics states that the quality of this energy is degraded irreversibly. This is the principle of the degradation of energy.

The first principle establishes the equivalence of the different forms of energy (radiant, chemical, physical, electrical, and thermal), the possibility of transformation from one form to another, and the laws that govern these transformations. This first principle considers heat and energy as two magnitudes of the same physical nature. Around 1850 the studies by Lord Kelvin, Carnot, and Clausius of the exchanges of energy in thermal machines revealed that there is a hierarchy among the various forms of energy and an imbalance in their transformations. This hierarchy and this imbalance are the basis of the formulation of the second principle. In fact, physical, chemical, and electrical energy can be completely changed into heat. But the reverse (heat into physical energy, for example) cannot be fully accomplished without outside help or without an inevitable loss of energy in the form of irretrievable heat. This does not mean that the energy is destroyed; it means that it becomes unavailable for producing work. The irreversible increase of this nondisposable energy in the universe is measured by the abstract quantity that Clausius in 1865 called entropy (from the Greek entrope, change).

The concept of entropy is particularly abstract and by the same token difficult to present. Yet some scientists grasp it intuitively; they need only refer mentally to actual states such as disorder, waste, and the loss of time or information. But how can degraded energy, or its hierarchy, or the process of degradation be truly represented? There seems to be a contradiction between the first and second principles. One says that heat and energy are two quantities of the same nature; the other says they are not, since potential energy is degraded irreversibly to an inferior, less noble, lower-quality form: heat. Statistical theory provides the answer. Heat is energy; it is kinetic energy that results from the movement of molecules in a gas or the vibration of atoms in a solid. In the form of heat this energy is reduced to a state of maximum disorder, in which each individual movement is neutralized by statistical laws. Potential energy, then, is organized energy; heat is disorganized energy. And maximum disorder is entropy. The mass movement of molecules (in a gas, for example) will produce work (drive a piston). But when motion is ineffective on the spot and headed in all directions at the same time, energy will be present but ineffective. One might say that the sum of all the quantities of heat lost in the course of all the activities that have taken place in the universe measures the accumulation of entropy. One can generalize further. Thanks to the mathematical relation between disorder and probability, it is possible to speak of evolution toward an increase in entropy by using one or the other of two statements: "left to itself, an isolated system tends toward a state of maximum disorder" or "left to itself, an isolated system tends toward a state of higher probability" (this is illustrated most simply by the example of the box with two compartments). These equivalent expressions can be summarized:

Potential energy -> entropy
Ordered energy -> disorganized energy (heat)
High-quality energy -> heat (low-grade energy)
Order -> disorder
Improbability -> probability
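
The relation between disorder and probability mentioned above can be written down compactly. The following lines are a minimal sketch in LaTeX, not part of the original article; Boltzmann's standard notation is assumed (k_B for Boltzmann's constant, W for the number of microscopic configurations compatible with a macroscopic state):

    % Boltzmann's relation: entropy grows with the number of microstates W
    % that realise a given macrostate.
    \[
      S = k_B \ln W
    \]
    % Since every microstate of an isolated system is equally likely, a
    % macrostate's probability is proportional to its W; "more probable"
    % and "higher entropy" (more disordered) are therefore the same thing:
    \[
      \frac{P_1}{P_2} \;=\; \frac{W_1}{W_2} \;=\; e^{(S_1 - S_2)/k_B}
    \]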

The concepts of entropy and irreversibility, derived from the second principle, have had a tremendous impact on our view of the universe. By breaking the vicious circle of repetitiveness in which the ancients were trapped, and by confronting us with biological evolution generating order and organization, the concept of entropy indirectly opens the way to a philosophy of progress and development (see: the direction of evolution). At the same time it introduces the complementarity between the "two great drifts of the universe" described in the works of Bergson and Teilhard de Chardin. The image of the inexorable death of the universe, as suggested by the second principle, has profoundly influenced our philosophy, our ethics, our vision of the world, and even our art. The thought that by the very nature of entropy the ultimate and only possible future for man is annihilation has infiltrated our culture like a paralysis. This consideration led Leon Brillouin to ask, "How is it possible to understand life when the entire world is ordered by a law such as the second principle of thermodynamics, which points to death and annihilation?"

Entropy increase: the box with two compartments


http://pespmc1.vub.ac.be/ENTROBOX.html

Author: F. Heylighen, Date: May 9, 2003

Fig. 1: a box containing air molecules, divided into two compartments with a hole in the dividing wall. The arrows represent the direction and speed of movement of the molecules. Since there are many more molecules in the left compartment, there will also be many more molecules crossing the divide from left to right.

The second law of thermodynamics states that in an isolated system the entropy will increase. In the statistical definition of entropy according to Boltzmann, this means that the system will evolve to its most probable state, that is, the one with the most homogeneous probability distribution. Let us explain this idea with a classical example: a closed box with two compartments (Fig. 1). The left compartment contains air, the right one is empty. Suppose that we make a hole in the wall separating the two compartments. An intuitive description of what happens is that air will be "sucked" out of the full compartment into the vacuum of the empty one, until both compartments contain equal amounts of air. This effect is easy to understand when you look at the individual air molecules in the two compartments. The molecules are continuously flying in all directions, colliding with each other and with the walls of the compartment. A molecule bumping against the central wall separating the full compartment from the empty one will be reflected and move back towards the middle of the full one. However, if the molecule lands on the hole rather than on the separating wall, it will not be reflected but will continue its journey rightwards into the empty compartment. Since many molecules are thus continuously flying from the left to the right, the number of molecules in the right compartment will quickly increase. These molecules too will collide and be scattered in all directions. Some of them will move to the left, into the hole, and thus go back to the left compartment. However, as long as there are fewer molecules in the right compartment, there will also be fewer that move through the hole from the right and thus rejoin their erstwhile companions. As long as the concentration of molecules on the left is larger, there will be more movement from left to right than from right to left. Only when the concentration of molecules in the two compartments is uniform will there be an equal flow through the hole in both directions. This configuration, where the distribution is homogeneous, is the one with maximum entropy.
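
The mechanism described above is easy to reproduce numerically. The short Python script below is an illustration of my own (an Ehrenfest-style urn model, not code from the article): at each step one molecule, picked at random, happens to be at the hole and crosses over, so the chance of a left-to-right crossing is proportional to the number of molecules currently on the left. Starting with all molecules on the left, the two counts rapidly approach the half-and-half equilibrium.

    import random

    def simulate_box(n_molecules=10_000, n_steps=60_000, seed=1):
        """Ehrenfest-style model of the two-compartment box.

        At every step one molecule, chosen uniformly at random, happens to be
        at the hole and crosses to the other compartment, so the probability
        of a left-to-right crossing equals the fraction of molecules currently
        on the left.
        """
        random.seed(seed)
        left = n_molecules              # start: all molecules in the left compartment
        snapshots = []
        for step in range(n_steps + 1):
            if step % 10_000 == 0:
                snapshots.append((step, left, n_molecules - left))
            if random.random() < left / n_molecules:
                left -= 1               # a molecule crosses left -> right
            else:
                left += 1               # a molecule crosses right -> left
        return snapshots

    if __name__ == "__main__":
        for step, left, right in simulate_box():
            print(f"step {step:6d}: left = {left:5d}, right = {right:5d}")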

Now, it is in principle possible that by some coincidence more molecules would move to the left than to the right. This creates a difference in concentration between the two compartments, and therefore a decrease in entropy. However, since the number of molecules in either compartment is so astronomically large, a very large number of them must move "countercurrent" to produce a noticeable difference. The probability that such a vast number of molecules would all move together in the same direction is vanishingly small. It is so small, in fact, that we can assume that it will never happen. Because of the law of large numbers, the larger the number of coincidences needed for an effect to occur, the smaller the probability that the effect will occur. And with numbers as large as the number of molecules in a gas, the resulting probability is as close to zero as you can practically get. Therefore, although entropy could decrease spontaneously, this is so improbable that the non-decrease of entropy has acquired the status of a law of thermodynamics. While we cannot predict the movement of the microscopic particles, the evolution towards homogeneity on the macroscopic scale is perfectly predictable.
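
How improbable such a fluctuation is can be estimated with a few lines of Python. The sketch below is my own illustration (it uses the normal approximation to the binomial distribution, an assumption not made in the text): it prints the probability that, with the molecules distributed at random, one compartment holds an excess of just 1% of them.

    import math

    def prob_excess(n, delta=0.01):
        """Probability that the left compartment holds at least a fraction
        0.5 + delta of n randomly distributed molecules, using the normal
        approximation to the binomial distribution:
        P ~ 0.5 * erfc(delta * sqrt(2 * n)).
        """
        return 0.5 * math.erfc(delta * math.sqrt(2.0 * n))

    if __name__ == "__main__":
        for n in (100, 10_000, 1_000_000, 10**20):
            # For macroscopic n the value underflows to 0.0: the probability is
            # smaller than anything a floating-point number can represent.
            print(f"n = {n:>22}: P(1% excess) ~ {prob_excess(n):.3e}")

Already for a million molecules the probability is of the order of 10^-89; for macroscopic numbers of molecules it is, for all practical purposes, zero.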

Generalization
The reasoning we applied to molecules diffusing to evenly fill a box can easily be generalized to the homogeneous diffusion of heat: just replace molecules moving to another region by molecules transmitting their energy to another region. In all cases the mechanism underlying the growth of entropy is the same: if particles move in different directions with the same probability, they will spread evenly over all directions and erase any differences in local density. This formulation of the second law highlights the paradox underlying statistical mechanics. The equal probability of movement follows from classical mechanics, where all directions are considered equivalent. A particle might move backwards or forwards: there is no preference. Therefore, the average number of particles moving in a particular direction will depend only on the number of particles present, not on the direction. On average, more particles will move out of a high density region than out of a low density one. Therefore, high density regions will become relatively less dense, and the differences in density will disappear. On the microscopic level of the particle, all directions of change have the same probability. On the macroscopic level of the gas, however, one direction of change, towards greater homogeneity, has a much larger probability. This preferred direction creates an "arrow of time". While classical mechanics does not really distinguish between past and future, statistical mechanics does. Time always flows in the direction of entropy increase, of greater homogeneity. This is progress of a kind, pushing evolution forward without ever allowing it to go back. But entropy increase is not what we usually call improvement. It is rather associated with the dissipation of energy, with wear and tear, with things running down and getting disorganized. Thus, thermodynamics has introduced irreversible change into the scientific world view. Yet it still fails to explain the development of new organization which we normally associate with evolution.
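
The generalization to heat or density differences can be checked with the same kind of toy model. The Python sketch below is my own illustration: an uneven density profile on a ring of cells relaxes by letting each cell pass a fixed fraction of its content to its neighbours, and the statistical entropy H = -SUM_i p_i log p_i of the normalised profile rises step by step towards its maximum, log N, as the profile becomes homogeneous.

    import math

    def diffusion_step(density, d=0.25):
        """One diffusion step on a ring: each cell keeps a fraction (1 - 2d)
        of its content and passes a fraction d to each of its two neighbours."""
        n = len(density)
        return [(1 - 2 * d) * density[i]
                + d * density[(i - 1) % n]
                + d * density[(i + 1) % n]
                for i in range(n)]

    def entropy(density):
        """Statistical entropy H = -sum_i p_i log p_i of the normalised profile."""
        total = sum(density)
        return sum(-(x / total) * math.log(x / total) for x in density if x > 0)

    if __name__ == "__main__":
        cells = [1.0] + [0.0] * 19      # all the "heat" starts in a single cell
        for step in range(0, 201, 40):
            print(f"step {step:3d}: H = {entropy(cells):.4f} (maximum = {math.log(20):.4f})")
            for _ in range(40):
                cells = diffusion_step(cells)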

Entropy and Information


Statistical entropy is a probabilistic measure of uncertainty or ignorance; information is a measure of a reduction in that uncertainty

Entropy (or uncertainty) and its complement, information, are perhaps the most fundamental quantitative measures in cybernetics, extending the more qualitative concepts of variety and constraint to the probabilistic domain. Variety and constraint, the basic concepts of cybernetics, can be measured in a more general form by introducing probabilities. Assume that we do not know the precise state s of a system, but only the probability P(s) that the system is in state s. Variety V can then be expressed as entropy H (as originally defined by Boltzmann for statistical mechanics):
H(P) = - SUM_s P(s) log P(s)

H reaches its maximum value if all states are equiprobable, that is, if we have no indication whatsoever to assume that one state is more probable than another state. Thus it is natural that in this case entropy H reduces to variety V. Like variety, H expresses our uncertainty or ignorance about the system's state. It is clear that H = 0, if and only if the probability of a certain state is 1 (and of all other states 0). In that case we have maximal certainty or complete information about what state the system is in. We define constraint as that which reduces uncertainty, that is, the difference between maximal and actual uncertainty. This difference can also be interpreted in a different way, as information, and historically H was introduced by Shannon as a measure of the capacity for information transmission of a communication channel. Indeed, if we get some information about the state of the system (e.g. through observation), then this will reduce our uncertainty about the system's state, by excluding, or reducing the probability of, a number of states. The information I we receive from an observation is equal to the degree to which uncertainty is reduced: I = H(before) - H(after). If the observation completely determines the state of the system (H(after) = 0), then information I reduces to the initial entropy or uncertainty H. Although Shannon came to disavow the use of the term "information" to describe this measure, because it is purely syntactic and ignores the meaning of the signal, his theory came to be known as Information Theory nonetheless. H has been vigorously pursued as a measure for a number of higher-order relational concepts, including complexity and organization. Entropies, correlates to entropies, and correlates to such important results as Shannon's 10th Theorem and the Second Law of Thermodynamics have been sought in biology, ecology, psychology, sociology, and economics.
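
These definitions are easy to make concrete. The Python snippet below is an illustration of my own (the four-state distribution is invented for the example): it computes the entropy H of a probability distribution and the information I = H(before) - H(after) gained from an observation that excludes some of the states.

    import math

    def entropy(probabilities):
        """Statistical entropy H = -sum_s P(s) * log2 P(s), in bits."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Prior uncertainty: four equiprobable states, so H = log2(4) = 2 bits
    # (the maximum, corresponding to the variety of the state space).
    before = [0.25, 0.25, 0.25, 0.25]

    # An observation excludes two of the four states; the remaining
    # probabilities are renormalised.
    after = [0.5, 0.5, 0.0, 0.0]

    information = entropy(before) - entropy(after)   # I = H(before) - H(after)

    print(f"H(before) = {entropy(before):.2f} bits")
    print(f"H(after)  = {entropy(after):.2f} bits")
    print(f"I         = {information:.2f} bits")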

We also note that there are other methods of weighting the state of a system which do not adhere to probability theory's additivity condition that the sum of the probabilities must be 1. These methods, involving concepts from fuzzy systems theory and possibility theory, lead to alternative information theories. Together with probability theory these are called Generalized Information Theory (GIT). While GIT methods are under development, the probabilistic approach to information theory still dominates applications.

Reference: Heylighen F. & Joslyn C. (2001): "Cybernetics and Second Order Cybernetics", in: R.A. Meyers (ed.), Encyclopedia of Physical Science & Technology, Vol. 4 (3rd ed.), Academic Press, New York, p. 155-170.

Mathematical Modeling of Evolution


http://pespmc1.vub.ac.be/MATHME.html Author: V.G. Red'ko, Date: Feb 17, 1999 (modified), Apr 27, 1998 (created)

[Node to be completed] Biological evolution is a very complex process. Using mathematical modeling, one can try to clarify its features. But to what extent can that be done? For the case of evolution, it seems unrealistic to develop a detailed and fundamental description of phenomena as is done in theoretical physics. Nevertheless, what can we do? Can mathematical models help us to systematize our knowledge about evolution? Can they provide us with a more profound understanding of the particularities of evolution? Can we imagine (using mathematical representation) some hypothetical stages of evolution? Can we use mathematical models to simulate some kind of artificial "evolution"? In order to clarify such questions, it is natural to review the already developed models in a systematic manner. In evolutionary modeling one can distinguish the following branches:

- Models of the origin of molecular-genetic systems have been constructed in connection with the origin-of-life problem. Quasispecies and hypercycles by M. Eigen and P. Schuster and sysers by V.A. Ratner and V.V. Shamin are the best known. These models describe mathematically some hypothetical evolutionary stages of prebiological self-reproducing macromolecular systems.
- General models of evolution describe some informational and cybernetic aspects of evolution. The neutral evolution theory by M. Kimura and S. Kauffman's automata are profound examples of such models.
- Artificial life evolutionary models are aimed at understanding the formal laws of life and evolution. These models analyze the evolution of artificial organisms living in computer-program worlds.
- Applied evolutionary models are computer algorithms which use evolutionary methods of optimization to solve practical problems. The genetic algorithm by J.H. Holland and evolutionary programming, initiated by L. Fogel et al., are well-known examples of this line of research (a minimal sketch of a genetic algorithm is given after this list).
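
To give a flavour of the last branch, here is a minimal genetic-algorithm sketch in Python. It is an illustration of mine, in the general spirit of Holland's scheme rather than a reconstruction of any model cited above; the fitness function (counting 1s in a bit string) and all parameter values are invented for the example.

    import random

    def evolve(n_bits=32, pop_size=60, generations=40, mutation_rate=0.01, seed=0):
        """Minimal genetic algorithm maximising the number of 1s in a bit string."""
        random.seed(seed)

        def fitness(individual):
            return sum(individual)

        # Random initial population of bit strings.
        population = [[random.randint(0, 1) for _ in range(n_bits)]
                      for _ in range(pop_size)]

        for generation in range(generations):
            def pick_parent():
                # Selection by 2-way tournament: the fitter of two randomly
                # chosen individuals gets to reproduce.
                a, b = random.sample(population, 2)
                return a if fitness(a) >= fitness(b) else b

            next_population = []
            for _ in range(pop_size):
                mother, father = pick_parent(), pick_parent()
                cut = random.randrange(1, n_bits)            # one-point crossover
                child = mother[:cut] + father[cut:]
                child = [bit ^ 1 if random.random() < mutation_rate else bit
                         for bit in child]                   # point mutations
                next_population.append(child)
            population = next_population

            best = max(fitness(individual) for individual in population)
            if generation % 5 == 0 or best == n_bits:
                print(f"generation {generation:3d}: best fitness = {best}/{n_bits}")
            if best == n_bits:
                break

    if __name__ == "__main__":
        evolve()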

The analysis carried out in the child nodes demonstrates that the relations between evolutionary models and experiments are rather abstract. The evolutionary models are mainly intended to describe general features of the evolutionary process rather than concrete experiments. Only particular models (e.g. some models of mathematical genetics) are used to interpret certain experimental data. Moreover, some branches of evolutionary modeling (life origin models, artificial life evolutionary models) move to a more abstract level and describe imaginary evolutionary processes: not the processes as-we-know-them, but the processes as-they-could-be. This abstractness is understandable: because the biological world is very complex and diversified, we first try to generalize a large body of experiments and only then interpret this generalized representation in mathematical models.

Historically, profound experimental research has stimulated the creation of evolutionary theories. For example, the mathematical theories of population genetics by R.A. Fisher, J.B.S. Haldane, and S. Wright were based on experimental genetic investigations performed in the first half of the 20th century. The outstanding achievements of molecular biology attained in the 1950s-1960s constituted the underlying background for the life origin models by M. Eigen et al. and the models of regulatory genetic systems by S.A. Kauffman. Currently, evolutionary models are intensively developed in close connection with computer science research, especially Artificial Life investigations. There is an obvious tendency towards modeling the evolution of cybernetic, computer-like, intelligent features of biological organisms. Current evolutionary models actively incorporate such notions as learning, neural networks, and adaptive behavior. Nevertheless, many problems concerning the evolution of animal cognitive abilities remain to be investigated. Let us outline these problems briefly. Biological evolution was able to create complex, harmonious, and very effective biocybernetic control systems, which govern animal behavior. But how do these cybernetic systems operate? How did they emerge through evolution? What kinds of information processing and memory structures are used in animal control systems? How did animal cognitive abilities evolve? What kinds of "internal models" of the environment emerge in animal "minds"? How are these "models" used in animal behavior? What were the transitional stages between animal cognitive abilities and human intelligence? In order to investigate such a wide spectrum of problems, it is natural to use a certain evolutionary strategy and to analyze animal control systems and the emergence of animal "intelligent" features step by step, taking the biological evolutionary process as the underlying background. Such a field of investigation can be called "evolutionary biocybernetics". The conceptual background for investigating the evolution of animal cognitive abilities was described in the first chapters of "The Phenomenon of Science" by V.F. Turchin [1]. Some approaches towards the development of evolutionary biocybernetics were outlined in the paper [2].

Conclusion. The mathematical modeling of evolution has been profoundly elaborated in several directions: life origin models, mathematical population genetics, models of the evolution of genetic regulatory systems, and artificial life evolutionary models. These models provide us with a better understanding of biological evolutionary phenomena; they also give generalized descriptions of biological experiments. Some models provide more abstract pictures: they describe artificial evolutionary processes, not the processes as-we-know-them, but the processes as-they-could-be. Thus, mathematical modeling of evolution is a profound, well-elaborated, and intensively developing field of theoretical investigation. Nevertheless, there are serious problems to be analyzed: the problems of the evolution of cybernetic, computer-like, "intelligent" features of biological organisms. The theoretical investigation of these problems could constitute the subject of a future scientific discipline, "evolutionary biocybernetics".

References:
1. Turchin, V.F. The Phenomenon of Science. A Cybernetic Approach to Human Evolution. Columbia University Press, New York, 1977.
2. Red'ko, V.G. Towards the evolutionary biocybernetics // Proceedings of The Second International Symposium on Neuroinformatics and Neurocomputers, Rostov-on-Don, 1995, pp. 422-429.

The Direction of Evolution


http://pespmc1.vub.ac.be/DIREVOL.html Author: F. Heylighen, Date: Jun 13, 1997 (modified), Aug 6, 1996 (created)

although evolution is chaotic and unpredictable, it moves preferentially in the direction of increasing fitness

A fundamental criticism of the idea of increasing complexity, formulated among others by Stephen Jay Gould (1994), is that such an increase implies a preferred direction for evolution, a continuing "progress" or advance towards more sophisticated forms. Recent advances in evolutionary theory (such as the theory of punctuated equilibrium) and observation of evolutionary phenomena seem to indicate that evolution is a largely unpredictable, chaotic and contingent series of events, where small fluctuations may lead to major catastrophes that change the future course of development. At first sight, this seems inconsistent with any constant "direction". Yet, an example will show that there is no necessary contradiction. Consider a rock that rolls down from the top of a steep mountain. Given that the slightest irregularity in the terrain may be sufficient to make the rock fall into one or another of a host of downward slopes or valleys, the exact path of the rock will be virtually impossible to predict. Repeated experiments are likely to produce final resting positions that are miles apart. Yet, one thing will be certain: the final position will be lower than the initial position at the top. Although we cannot know the direction of movement in the horizontal dimensions, we do know that there is only one possible sense in which it can move along the vertical dimension: downward. To apply this metaphor to evolution, we need to discover the equivalent of the "vertical" dimension; in other words, we need to define a variable that can only increase during evolution (like vertical distance from the top). Entropy plays the role of such a variable for thermodynamic systems, but this seems hardly useful to describe complexification. Fisher's (1958) fundamental theorem of natural selection has shown that another such variable exists for populations of living systems: average fitness. This follows straightforwardly from the fact that fit individuals by definition will become more numerous, while the proportion of less fit individuals will decrease. This reasoning can be generalized to cover non-biological systems too (cf. the principle of asymmetric transitions). It might be objected that fitness is a relative notion: what is fit in one type of environment may no longer be fit in another environment. Thus, the inexorable increase of fitness only holds in invariant environments (which seem wholly atypical if one takes into account co-evolution). Gould proposes the following example: the evolution from hairless elephant to woolly mammoth is due merely to a cooling down of the climate. If the climate becomes warmer again the woolly variant will lose its fitness relative to the hairless one, and the trend will be reversed. Yet, there are ways to increase "absolute" fitness. First, the system may increase its internal or intrinsic fitness, by adding or strengthening bonds or linkages between its components. This is typically accompanied by an increase of structural complexity. Second, the system may increase
its fitness relative to its environment, by increasing the variety of environmental perturbations that it can cope with, and thus its functional complexity. This may be illustrated through the climate change example: though the warm-blooded, woolly mammoth is only relatively fitter than its hairless cousin, it is absolutely fitter than a cold-blooded reptile, which would never have been able to adapt to a cold climate, with or without hair. Warm-bloodedness means temperature control, i.e. the capacity to internally compensate for a variety of fluctuations in outside temperature. The appearance of control is the essence of a metasystem transition, which can be seen as a discrete unit of evolutionary progress towards higher functional complexity. All other things being equal, a system that can survive situations A, B and C is absolutely fitter than a system that can only survive A and B. Such an increase in absolute fitness is necessarily accompanied by an increase in functional complexity. Thus, evolution will tend to irreversibly produce increases in functional complexity. This preferred direction must not be mistaken for a preordained course that evolution has to follow. Though systems can be absolutely ordered by their functional complexity, the resulting relation is not a linear order but a partial order: in general, it is not possible to determine which of two arbitrarily chosen systems is the more functionally complex. For example, there is no absolute way in which one can decide whether a system that can survive situations A, B and C is more or less complex or fit than a system that can survive C, D and E. Yet, one can state that both systems are absolutely less fit than a system that can survive all of A, B, C, D and E. Mathematically, such a partial order can be defined by the inclusion relation operating on the set of all sets of situations or perturbations that the system can survive. This also implies that there are many, mutually incomparable ways in which a system can increase its absolute fitness. For example, the first-mentioned system might add either D or E to the set of situations it can cope with. The number of possibilities is infinite. This leaves evolution wholly unpredictable and open-ended. For example, though humans are in all likelihood absolutely more functionally complex than snails or frogs, evolution might well have produced a species that is very different from humans, yet is similarly at a much higher level of functional complexity than the other species. In perhaps slightly different circumstances, the Earth might have seen the emergence of a civilisation of intelligent dogs, dolphins or octopuses. It is likely that analogous evolutions are taking place or have taken place on other planets. Though humanity seems to have reached the highest level of functional complexity in the part of evolution that we know, more intelligent and complex species may well exist elsewhere in the universe, or may appear after us on Earth. The conclusion is that a preferred direction for evolution in the present, generalized sense does not in any way support the ideology of anthropocentrism.
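
Fisher's point that average fitness can only increase in a fixed environment is easy to verify numerically. The Python sketch below is an illustration of my own (the fitness values are arbitrary): it iterates discrete replicator dynamics, in which each type's share of the population grows in proportion to its fitness relative to the current population average, and the printed mean fitness never decreases.

    def replicator_step(shares, fitness):
        """Discrete replicator dynamics: each type's share grows in proportion
        to its fitness relative to the current population average."""
        mean_fitness = sum(x * f for x, f in zip(shares, fitness))
        return [x * f / mean_fitness for x, f in zip(shares, fitness)]

    if __name__ == "__main__":
        fitness = [1.0, 1.2, 1.5]        # fixed environment: constant fitness values
        shares = [0.80, 0.15, 0.05]      # initial composition of the population

        for generation in range(0, 31, 5):
            mean_fitness = sum(x * f for x, f in zip(shares, fitness))
            print(f"generation {generation:2d}: "
                  f"shares = {[round(x, 3) for x in shares]}, "
                  f"mean fitness = {mean_fitness:.3f}")
            for _ in range(5):
                shares = replicator_step(shares, fitness)

As the text stresses, this monotone rise holds only while the fitness values stay fixed; if the environment (and with it the fitness vector) changes, as in Gould's mammoth example, the trend can be reversed, which is why the argument then turns to "absolute" fitness.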

References:
Heylighen F. (1997): "The Growth of Structural and Functional Complexity during Evolution", in: F. Heylighen & D. Aerts (eds.): "The Evolution of Complexity" (Kluwer, Dordrecht). (in press)
Gould S.J. (1994): "The Evolution of Life on Earth", Scientific American 271 (4), p. 62-69.
Fisher R.A. (1958): The Genetical Theory of Natural Selection, 2nd edition, Dover Publications, New York.
See also: Wilkins, J.: Progress and Teleology (in the context of Evolution and Philosophy); Progress: the very idea, an excellent and extensive list of links on all aspects of evolutionary (and social) progress.

THERMODYNAMICS
That branch of physics which is concerned with the storage, transformation and dissipation of energy (including the flow of heat, from which the term is derived). Its first law, or the conservation law, states that energy can neither be created nor destroyed. This law provides the basis for all quantitative accounts of energy, regardless of its form, and makes energy the most important concept in physics. Its second law, or the entropy law, states that in all processes some of the energy involved irreversibly loses its ability to do work and is degraded in quality. The degraded portion is called thermodynamic entropy; its extreme form is dispersed heat, manifested in a uniform temperature distribution. Another statement of this second law is that in any process entropy never decreases. The irreversibility of physical processes implicit in this law makes the entropy law probably the most important law for understanding terrestrial processes, including living organisms and social forms. The third law of thermodynamics, or the asymptotic law, states that all processes slow down as they operate closer to thermodynamic equilibrium, making it difficult to reach that equilibrium in practice. This law suggests that the powerful and fast changes which are typical of technology and characteristic of living forms of organization are bound to occur only at levels far removed from thermodynamic equilibrium. (Krippendorff)

ENTROPY
unavailable energy or molecular disorder. Entropy is at a maximum when the molecules in a gas are at the same energy level. Entropy should not be confused with uncertainty. Uncertainty is at a minimum when all elements are in the same category. (Umpleby)

THERMODYNAMIC ENTROPY
The quantity of energy no longer available to do physical work. Every real process converts energy into work, or a condensed form of energy, and waste. Some waste may be utilized in processes other than those generating it (see recycling), but the ultimate waste, which can no longer support any process, is energy in the form of dispersed heat (see second law of thermodynamics). All physical processes, despite any local and temporal concentration of energy they may achieve, contribute to the increased overall dispersion of heat. Entropy therefore irreversibly increases in the known universe. (Krippendorff)

STATISTICAL ENTROPY
A measure of variation or diversity defined on the probability distribution of observed events. Specifically, if P_a is the probability of an event a, the entropy H(A) for all events a in A is: H(A) = - SUM_a P_a log_2 P_a. The quantity is zero when all events are of the same kind (P_a = 1 for any one a of A) and is positive otherwise. Its upper limit is log_2 N, where N is the number of categories available (see degrees of freedom) and the distribution is uniform over these (P_a = 1/N for all a of A) (see variety, uncertainty, negentropy). The statistical entropy measure is the most basic measure of information theory. (Krippendorff)


SECOND LAW OF THERMODYNAMICS


elements in a closed system tend to seek their most probable distribution; in a closed system entropy always increases. The paraphrases below were compiled by Heinz von Foerster.
1. Clausius (1822-1888): It is impossible that, at the end of a cycle of changes, heat has been transferred from a colder to a hotter body without at the same time converting a certain amount of work into heat.
2. Lord Kelvin (1824-1907): In a cycle of processes, it is impossible to transfer heat from a heat reservoir and convert it all into work without at the same time transferring a certain amount of heat from a hotter to a colder body.
3. Ludwig Boltzmann (1844-1906): For an adiabatically enclosed system, the entropy can never decrease. Therefore, a high level of organization is very improbable.
4. Max Planck (1858-1947): A perpetual motion machine of the second kind is impossible.
5. Caratheodory (1885-1955): Arbitrarily near to any given state there exist states which cannot be reached by means of adiabatic processes.
From Sears and Zemansky: 100% conversion of heat into mechanical work is not possible by any form of engine (p. 342). There is a tendency in nature to proceed toward a state of greater molecular disorder. This one-sidedness of nature produces irreversible processes (p. 347).

SOCIAL ENTROPY
A measure of the natural decay of the structure or of the disappearance of distinctions within a social system. Much of the energy consumed by a social organization is spent to maintain its structure, counteracting social entropy, e.g., through legal institutions, education, the normative consequences of television. Anomie is the maximum state of social entropy. (Krippendorff)

ANOMIE
Literally, without law; a condition of disintegration of a society into individual components resulting from the absence of conventions, shared perceptions and goals. A social system describable as a mere aggregate (see aggregation) is in the state of maximum social entropy (Etzioni). (Krippendorff)

SOCIAL SYSTEM
In cybernetics, a system involving its observers. Such a system is constituted (see constitution) by communication among observers who participate within that system by drawing distinctions and creating relations within it (see analysis, second-order cybernetics). This contrasts sharply with the use of the same term in the structural-functional school of sociology, where it denotes a pattern of social acts in pursuit of individual and collective goals and governed by the need of the "social system" to maintain its own structure. (Krippendorff)


NEGENTROPY
A non-recommendable near synonym for information. The term has created considerable confusion, suggesting that information processes negate the second law of thermodynamics by producing order from chaos. The history of the confusion stems from the merely formal analogy between Boltzmann's thermodynamic expression for entropy, S = k log W, and the Shannon-Wiener expression for information, H = -log_2 p_a. The only motivation for the negative sign in the latter is that it yields positive information quantities (the logarithm of a probability is always negative). The probability p_a of an event a and the thermodynamic quantity W, including Boltzmann's constant k, measure entirely different phenomena. A meaningful interpretation of negentropy is that it measures the complexity of a physical structure in which quantities of energy are invested, e.g., buildings, technical devices, organisms, but also atomic reactor fuel or the infrastructure of a society. In this sense organisms may be said to become more complex by feeding not on energy but on negentropy (Schroedinger). (Krippendorff)

