
COMPLEXITY THEORY

BASIC CONCEPTS AND APPLICATION TO SYSTEMS THINKING


March 27, 1994
John Cleveland
Innovation Network For Communities
A Short Introduction to Complex Adaptive Systems

On the Edge of Chaos
The field of complex adaptive systems theory (also known as complexity theory)
seeks to understand how order emerges in complex, non-linear systems such as
galaxies, ecologies, markets, social systems and neural networks. Complexity
scientists suggest that living systems migrate to a state of dynamic stability they call the
edge of chaos. Mitchell Waldrop provides a description of the edge of chaos in his
book, Complexity:
The balance point -- often called the edge of chaos -- is where the components of a
system never quite lock into place, and yet never quite dissolve into turbulence
either. . . The edge of chaos is where life has enough stability to sustain itself and
enough creativity to deserve the name of life. The edge of chaos is where new ideas
and innovative genotypes are forever nibbling away at the edges of the status quo,
and where even the most entrenched old guard will eventually be overthrown. The
edge of chaos is where centuries of slavery and segregation suddenly give way to
the civil rights movement of the 1950s and 1960s; where seventy years of Soviet
communism suddenly give way to political turmoil and ferment; where eons of
evolutionary stability suddenly give way to wholesale species transformation. The
edge is the constantly shifting battle zone between stagnation and anarchy, the one
place where a complex system can be spontaneous, adaptive and alive.
Systems on the edge are notable for a hunger for novelty and disequilibrium that
distinguishes them from rigidly ordered systems. At the same time, however, they also
possess a deep underlying coherence that provides structure and continuity, and
distinguishes them from chaotic systems. Theorists use words like integrity, identity,
persistent structure and self-reference to describe this opposite characteristic.
Systems that evolve along the edge of chaos periodically re-integrate into structures
with temporary stability, which bear recognizable resemblance to the string of
predecessor structures. They are free enough to change, but stable enough to stay
recognizable.
Complexity scientists have identified several characteristics that distinguish edge of
chaos systems from systems that are either locked in rigid order, or too chaotic for any
stability to emerge. These include:
Autonomous agents. Like a swarm of bees, a flock of birds, or a healthy market,
these systems are made up of many individual actors who make choices about how
to act based on information in their local environment. All the agents make choices
simultaneously (parallel processing), both influencing and limiting each other's
actions.
Networked structure. The agents don't act randomly. They share some common
rules about how they decide what to do next. At the level of matter, these common
rules are the laws of nature (gravity, electromagnetism, etc.). At the level of
conscious actors, these are decision-making rules (preferences, interests, desires,
etc.). These rules connect the agents together and allow a global coherence to
emerge without any central source of direction -- the swarm has velocity, shape,
direction and density that do not reside in any individual agent. The rules used by
agents evolve based on their successfulness in the changing environment. The
connections between agents in edge of chaos systems are moderately dense --
not so interconnected that the system freezes up, and not so disconnected that it
disintegrates into chaos.
Profuse experimentation. These edge of chaos systems are full of novelty and
experimentation. They have a quality of dynamic stability that is characterized by
occasional rapid and unpredictable shifts in shape and direction. They can react to
small changes in big and surprising ways (rumors fly like lightning; a mob forms; the
market crashes; the hive swarms). Such systems can communicate almost
instantaneously, experiment with dozens of possible responses if they encounter a
roadblock, and rapidly exploit solutions when one is found.
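The interplay of local rules and global coherence described above can be sketched in a few lines of code. The following toy model (illustrative only; the agent count, neighborhood radius and weight are arbitrary assumptions, not from the text) gives each agent in a ring a random "velocity" and one purely local rule: drift toward the average of your nearest neighbors. No agent knows the group's direction, yet a shared direction emerges.

```python
import random

def step(velocities, radius=5, weight=0.3):
    """One parallel update: every agent simultaneously nudges its velocity
    toward the average of its local neighborhood (no central controller)."""
    n = len(velocities)
    new = []
    for i, v in enumerate(velocities):
        neighborhood = [velocities[(i + d) % n] for d in range(-radius, radius + 1)]
        local_avg = sum(neighborhood) / len(neighborhood)
        new.append(v + weight * (local_avg - v))  # purely local rule
    return new

random.seed(1)
flock = [random.uniform(-1.0, 1.0) for _ in range(50)]
initial_spread = max(flock) - min(flock)
for _ in range(300):
    flock = step(flock)
final_spread = max(flock) - min(flock)
# The group settles on a common "velocity" that no individual agent chose.
```

The emergent group velocity is a property of the whole swarm: it is (approximately) the mean of the initial velocities, which no single agent ever computes.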
In describing the edge of chaos, complexity scientists have documented and analyzed
qualities that humans have sought in their systems for some time. A vibrant democracy
is an edge of chaos form of governance; a healthy market is an edge of chaos form
of economics; a flexible and adaptive organization is an edge of chaos institution; and
a mature, well-developed personality is an edge of chaos psyche.
In many of our systems, however, we have created forms of organization that are
locked in rigid order and incapable of adaptable evolution (e.g. bureaucracies,
monopolies, dictatorships). These forms of social control were often responses to
situations that were previously too chaotic. In multiple sectors of society, we now see a
migration from both extremes of incoherent chaos and rigid order towards the middle
edge of chaos where systems have the capacity to grow, learn and evolve.
The attached materials describe some of the basic concepts of complex adaptive
systems theory.
1. TYPES OF SYSTEMS
The Basic Concept:
Scientists and others use many different labels to describe different kinds of systems.
These terms can be confusing to the non-specialist. It is helpful to understand the
"taxonomy" of system types in order to understand what complex adaptive systems are
and are not.
Discussion:
These are the most commonly used terms for different kinds of systems. They are
loosely listed in order from least complex to most complex system.
Entropy -- No System. The condition of entropy is a condition where there is no usable
energy in the system, no connections between the elements of the system, and no
observable structure.
Closed Systems. A closed system is a system that does not import or export energy
across its boundaries. The only truly closed system is the universe as a whole.
Traditional physics deals with systems that are presumed to be closed systems. In
closed systems, the final state of the system is determined by its initial conditions. All
closed systems move to a condition of equilibrium and maximum entropy.
Open Systems. The term "open system" is a general term given to any system which
exchanges matter, energy or information across its boundaries, and uses that exchange
of energy to maintain its structure. All living systems, including all complex adaptive
systems, are open systems. Not all open systems, however, are complex and adaptive.
(For instance, a burning candle is an open system.)
Self-Organizing Systems. The term "self-organizing" refers to the spontaneous
emergence of new forms of order. Self-organization is distinguished by the fact that
there is no external agent that designs, constructs or maintains the system. Structure
freely emerges from the internal interactions of the system itself. Many open systems
display qualities of self-organization. The phenomenon of self-organization only occurs
in systems that are "far from equilibrium" -- where there is continuous flux and vigorous
exchange of energy between the parts of the system. (See Self-Organization.)
Dissipative Structures. This is the term the chemist Ilya Prigogine gave to self-
organizing systems. The terms "self-organizing system" and "dissipative structure"
mean essentially the same thing. The term "dissipative" refers to the fact that these
systems consume energy and "dissipate" it into the environment (thereby creating
entropy.) Dissipative structures maintain a form of global structure and stability by a
constant pattern of internal fluctuations. Through the process of autocatalysis, small
fluctuations are often magnified into large disturbances that either cause the system to
disintegrate, or to reorganize into a new form. (See Feedback.)
Autopoietic Systems. The term "autopoietic" is used to refer to any system that renews
itself and regulates the renewal process in such a way that its overall structure is
preserved. The terms "autopoietic", "self-organizing" and "dissipative" are generally
meant to mean the same thing when referring to systems. An autopoietic system
(whose only purpose is self-preservation and renewal) can be distinguished from a
machine, whose purpose is geared toward the production of a particular output.
Natural Systems. This is the term that the general system theorist Ervin Laszlo gives to
self-organizing systems. The four characteristics of natural systems as Laszlo identifies
them are: 1) they are wholes, with irreducible properties; 2) they maintain themselves in
a changing environment; 3) they create and recreate themselves in response to the
challenges of the environment; and 4) they mediate interaction between the
subsystems that make them up, and the larger "supra-systems" of which they are a
part. Again, the term "natural system" can, for all practical purposes, be seen as
synonymous with open, self-organizing, dissipative and autopoietic systems.
Classes of Systems. The complexity scientists have adopted a classification of
systems based on the work of the physicist Stephen Wolfram. The four classes are
based on the behavior of the system: Class I systems move quickly to a single point,
and stay there (vaguely equivalent to a closed system moving to equilibrium); Class II
systems oscillate between a limited number of end states; Class III systems are
"boiling" -- totally chaotic, with no stability or structure; Class IV systems are "on the
edge of chaos", "alive" -- they have enough structure to create patterns, but the
patterns never really settle down. Class IV systems are what is meant by "complex
adaptive systems." (See Classes of Systems.)
Complex Adaptive Systems. Complex adaptive systems are open, self-organizing
systems that have the added capacity to conserve and process high levels of
information. They live on the "edge of chaos" where the system maintains enough
structure to process information, but fluctuates enough that new information (in the
form of new patterns and structures) is always being created. (See Complex Adaptive
Systems.)
Relevance for Thinking About Complex Systems:
The terms and jargon used to talk about systems are often confusing for three
reasons: authors often do not clearly define the terms that they are using; many terms
that mean basically the same thing are developed by different authors in different
disciplines; and the distinctions between different kinds of systems are often vague and
imprecise. Throughout these materials, our focus is on complex adaptive systems,
which are differentiated from other systems by their capacity for learning.
2. CLASSES OF SYSTEMS
The Basic Concept
Systems fall into various "classes" of behavior. One classification used by some
complexity scientists puts systems into four categories (Class I, II, III, and IV) according
to the nature of their global dynamics, and the shape of their attractor. (See Strange
Attractors). In this scheme, complex adaptive systems are referred to as "Class IV"
systems.
Discussion:
Stephen Wolfram of the Institute for Advanced Study developed four "universality
classes" to describe the various kinds of rules governing the behavior of cellular
automata (computer creations that are used to model the behavior of complex
systems). Chris Langton of the Santa Fe Institute used this classification system to
categorize systems according to their global behaviors:
CLASSES OF SYSTEMS

Class  Behavior                                       Attractor               Description
I      Always returns to a single equilibrium state   Single point attractor  Ordered
II     Oscillates between a limited number of states  Periodic attractor      Ordered
III    Never establishes a coherent pattern --        Strange attractor       Chaotic
       "boiling"
IV     Growing and changing patterns that never       Strange attractor       Complex
       settle down -- regions of connected order                              (Edge of Chaos)
       in a sea of chaos
Other terms have been used for some of these classes. Among them:
* Static equilibrium. This is another term for Class I systems that quickly return to
a state of equilibrium and high entropy.
* Dynamic equilibrium. This is another term for Class II systems, where there is
a lot of local fluctuation, but the global behaviors oscillate between a limited
number of end states.
* Near-equilibrium. This is another version of Class II systems, but with a wider
range of end states.
* Far from equilibrium. This is another term for chaotic Class III systems.
Langton noted the similarity between the three basic categories of dynamic systems
(ordered, complex and chaotic) and the categorization of other physical phenomena,
including phase transitions in matter, the capacity for computation in computers, and
the emergence of life-like behaviors in cellular automata systems:
Phenomena          Categorization
Cellular Automata  Class I, II    Class IV                   Class III
Dynamical Systems  Ordered        Complex (Edge of Chaos)    Chaotic
Matter             Solid          Phase Transitions          Fluid
Computation        Halting        Undecidable                Non-halting
Life               Too static     Life/Intelligence          Too noisy
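Wolfram's universality classes can be observed directly in elementary (one-dimensional, two-state) cellular automata. The sketch below is illustrative code, not part of the original text; it runs a few rules from a single "on" cell. Rule 0 freezes immediately (Class I), rule 30 is a commonly cited chaotic rule (Class III), and rule 110 is the standard Class IV, "edge of chaos" example.

```python
def ca_step(cells, rule):
    """One synchronous update of an elementary cellular automaton.
    `rule` is Wolfram's 0-255 rule number; the row wraps at the edges."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(rule, width=31, steps=12):
    row = [0] * width
    row[width // 2] = 1  # start from a single "on" cell
    history = [row]
    for _ in range(steps):
        row = ca_step(row, rule)
        history.append(row)
    return history

# Rule 0 (Class I) dies at once; rule 30 (Class III) "boils" chaotically;
# rule 110 (Class IV) builds structured patterns that never settle down.
for rule in (0, 30, 110):
    print("Rule", rule)
    for row in run(rule):
        print("".join("#" if c else "." for c in row))
```

Printing the three histories side by side makes the class distinctions visible at a glance: blank space, irregular "boiling," and persistent interacting structures.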
Relevance for Thinking About Human Experience:
The "classes of systems" categorization is a useful way to understand the different
kinds of behaving systems in our world. It gives us a framework for looking at the
phenomena around us. This categorization again emphasizes that complex, life-like
behavior occurs "on the edge of chaos" where order and chaos are sufficiently
intermingled to create coherent patterns, but never to let them "freeze" or "boil away."
3. COMPLEX ADAPTIVE SYSTEMS
The Basic Concept:
The phenomena of life and evolution can be understood as the phenomena of complex
adaptive systems. These systems are complex (they have many parts interacting
with each other in many different ways); self-organizing (they spontaneously emerge,
without being designed from the outside); adaptive (they change their behavior based
on experience); dynamic (they are poised on "the edge of chaos" -- stable enough to
maintain their structure, but sensitive enough to external changes that they can undergo
rapid and unpredictable periods of change); and co-evolving (they evolve together with
the systems that they interact with).
Discussion:
Complex Adaptive Systems Distinguished from Other Systems. Complex adaptive
systems are distinguished by their capacity to conserve and process information, and
their ability to evolve new forms of behavior based on that information. A weather
system, for instance, is a complex system, but not a complex adaptive system. (See
Types of Systems.)
Complexity. "Complexity" is a difficult concept to precisely define. In the context of
complex adaptive systems, it generally means that there are a large number of
elements interacting in many diverse ways in the system.
Self-Organizing. Complex adaptive systems are "emergent" phenomena. (See
Emergence.) The particular form of their structure emerges from the patterns of
interaction between the elements making up the system. They are not designed from
the "outside" (in contrast to a machine and some other human systems), and you
cannot determine the shape of the system from the characteristics of the elements.
(Just as knowing the characteristics of bricks doesn't tell you whether they will be used
to build a wall or a cathedral.)
Adapting and Learning. Because of their ability to conserve, process and create
information, complex adaptive systems have the ability to change their behavior and
adapt to new relationships with the environment. They display the basic elements of a
learning process: rules that govern their relations with the environment; feedback loops
that tell them about how the rules are performing; and the ability to form new rules from
combinations of old rules and new information from the environment.
Dynamic. While they maintain stability in the midst of fluctuation, complex adaptive
systems are sensitive enough to the environment that they can undergo rapid and
unpredictable transformations as they adjust to internal and external fluctuations.
Co-evolving. Complex adaptive systems both change and are changed by their
environments. They and their environments co-evolve. The patterns of relationship
between themselves and other systems form a "fitness landscape" that is constantly
changing as they change.
4. THE EDGE OF CHAOS
The Basic Concept:
Systems fall into three broad categories of behavior: ordered, complex and chaotic.
The term "edge of chaos" is used to describe complex systems that lie in the region
between ordered and chaotic systems. Living systems migrate to the edge of chaos
because that is where the opportunity for information processing is maximized.
Discussion:
The image of "life on the edge" has an intrinsic appeal to human beings who live in a
complex, turbulent and unpredictable world. The term is more than just a fanciful image,
however -- it is also a precise scientific description of the condition of all living systems.
Scientists who study whole system behavior have observed that systems tend to fall
into four categories. (See Classes of Systems.) Two of these classes are ordered
systems, where the behavior of the system rapidly moves to a predictable and repetitive
cycle of behavior. Another class of systems is chaotic and never really settles down into
any observable pattern. The final class of systems (Class IV systems) are complex
systems that have many areas of order, but also many areas of flux and chaos. Chaos
and order weave together in a complex and always changing dance of connections and
fluctuations. There is enough stability for the system to store information, but also
enough fluidity for rapid and intense communication to occur. This region is called the
"edge of chaos."
Systems on the edge of chaos have the ability to learn. Areas of stability allow for the
storage of information, and areas of flux and change allow for communication. The
combination allows the system to store information, receive communication "signals",
process information and act in response to it. (See The Nature of Information.)
Because this region is the region most favorable to life, living systems tend to adjust
their parameters so that they remain on the edge of chaos. Chris Langton, one of the
pioneers of complexity science, expresses his vision of "...life as eternally trying to keep
its balance on the edge of chaos, always in danger of falling off into too much order on
the one side, and too much chaos on the other." (Complexity, P. 235.) The process of
evolution is the process of systems adjusting their parameters (e.g. reproduction rate,
environmental temperature, etc.) to keep themselves on the edge of chaos.
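The migration between order and chaos as a parameter is tuned can be seen in even the simplest nonlinear systems. The sketch below is an illustrative analogue, not drawn from the text: it iterates the logistic map x -> r*x*(1-x). Low r gives a single fixed point (ordered), higher r gives a repeating cycle, and r = 3.9 gives chaos; the narrow transition region near r ≈ 3.57 is the map's "edge of chaos."

```python
def logistic_attractor(r, transient=500, sample=64):
    """Iterate x -> r*x*(1-x) past a transient, then count how many
    distinct values the orbit visits (rounded, so cycles are detected)."""
    x = 0.4
    for _ in range(transient):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(sample):
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return len(seen)

# Ordered (Class I-like): the orbit collapses to one fixed point.
print(logistic_attractor(2.8))
# Ordered (Class II-like): the orbit repeats a short cycle.
print(logistic_attractor(3.2))
# Chaotic (Class III-like): the orbit never settles into a pattern.
print(logistic_attractor(3.9))
```

Sweeping r continuously from 2.8 to 3.9 would show the period-doubling cascade that separates the ordered regime from the chaotic one.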
Relevance for Thinking About Complex Systems:
The edge of chaos is the desired state for systems that want to avoid the two kinds of
death -- death by freezing into rigid patterns, and death by "boiling alive" in the fire of
chaos. The edge of chaos is where life occurs, so we should seek those "rules" of
interaction (complexity scientists call them "parameters") that will produce an emergent
structure that lives on the edge of chaos.
5. SELF-ORGANIZATION
The Basic Concept:
Self-organization refers to the spontaneous emergence of order in a system. This
occurs when the conditions in the system allow for the formation of steady patterns of
relationships between elements of the system. Self-organization is another way of
looking at the emergence of order out of chaos.
Discussion:
The idea of self-organization, or the spontaneous emergence of self-maintaining order,
is in direct contrast to a machine orientation of the world, which assumes that the
appearance of order requires the imposition of a design from outside the system, just
as an engineer designs and builds a machine. Examples of self-organization can be
found in chemical systems, weather systems, natural systems, and human systems.
Increasingly, the phenomenon of self-organization is seen (along with the process of
natural selection) as one of the two primary forces underlying all natural life processes
in our known universe: "...there is a natural tendency for self-organized wholes to form.
The wholes retain their identities, return to maximum stability after they have been
disturbed, and even to a certain degree regenerate their form when these have been
fractured." (The Unfinished Universe, P. 41.)
A self-organizing structure emerges when many different parts form a steady pattern of
relationships over time. This might be molecules in a chemical solution; species in an
eco-system; cells in an organ; stars in a galaxy, or human beings in an organization. In
all cases of self-organization, you can observe:
* A very large number of elements that are richly connected with each other
  (obviously, without any connections, no stable patterns of interaction can develop);
* Various forms of rapid feedback and iteration occurring between the parts of the
  system (particularly the phenomenon of autocatalysis, where an element duplicates
  itself through interaction with other parts of the system);
* Sufficient sources of negative feedback that allow for the maintenance of a
  condition of stability (referred to as "homeostasis"); and
* Rich exchange of matter, energy and information with the external environment.
The self-organized (or "self-ordered") system is a stable structure that exists in a
condition of "far from equilibrium". The external, or "global" stability is maintained by
the patterns of fluctuation within the system. Rhythmic movement creates stable
structure -- structure "emerges" from the interactions of the parts of the system. The
form of the structure is impossible to predict from the characteristics of the individual
parts, but is instead a function of the patterns of interrelationship that they engage in.
Relevance for Thinking About Complex Systems:
The phenomenon of self-organization is a key characteristic of complex adaptive
systems. The capacity for self-organization suggests that we should seek to work with
the self-ordering properties of complex systems rather than attempting to impose
structure on them from the outside.
6. EMERGENCE
The Basic Concept:
Structure in complex systems is an emergent phenomenon, meaning that it arises out
of the interactions within the system, rather than being imposed on it from the outside.
Discussion:
We are used to thinking about structure as something that is "designed into" a system,
and that is regulated by some kind of central control function. In contrast, in complex
systems, structure emerges from the many interactions of the elements in the system.
The phenomenon of emergence has several dimensions:
Bottom-Up. The structure that we observe in complex systems arises out of the
"local rules" that parts of the system use to guide their interactions with each
other. Following these rules leads to recurring patterns of relationships
between the parts in the system. These recurring relationships are what
constitute the "structure" of the system. (See Feedback.)
Complexity From Simplicity. Very complicated structure can emerge from a
small number of elements, and only a few rules.
Unpredictability. It is not possible to predict what structure (if any) will emerge in
complex systems from any given set of rules. The only way to find out is to let
the system play itself out. This reality is what makes reductionist analysis of
limited utility in understanding the behavior of complex systems. It is not
possible to understand the characteristics of higher levels of organization from the
characteristics of the smallest parts. Understanding the nature of bricks doesn't
tell you whether the building is a cathedral or a bus depot.
Stratified Autonomy. Emergence occurs in layers. Individual elements interact
and form integrated wholes; these wholes then interact with each other and form
large wholes, and so on. At each scale, the parts retain high levels of autonomy,
but also enter into patterns of relationships with other parts. Each layer includes
all the layers below it, and leaves their individual functions intact. The part is not
controlled by the system that it is a part of. (See Stratified Autonomy.) Note
that these "nested" systems have the fractal structure that is characteristic of
complex systems. (See Fractals.)
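A classic demonstration of these dimensions of emergence is Conway's Game of Life (an illustrative example, not discussed in the text): each cell lives or dies based only on its eight immediate neighbors, yet coherent "organisms" arise at the level of the pattern. The glider below is five cells whose local interactions recreate the whole shape one square diagonally every four steps; the motion belongs to the pattern, not to any individual cell.

```python
from collections import Counter

def life_step(live):
    """One step of Conway's Game of Life. `live` is a set of (x, y)
    coordinates; every rule is local to a cell's eight neighbors."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next step if it has exactly 3 live neighbors,
    # or 2 live neighbors and is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = life_step(cells)
# cells is now the same glider, shifted one square down and to the right
```

Two local rules and five cells are enough for "complexity from simplicity"; whether a given starting pattern produces gliders, still lifes, or nothing at all can only be discovered by letting the system play itself out.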
Relevance for Thinking About Complex Systems:
As we better understand the nature of emergent structure in complex systems, we can
begin to replace our machine-oriented ideas about top-down design with a more natural
approach that allows ever higher levels of integration to emerge from the bottom up.
7. FEEDBACK
The Basic Concept:
The term "feedback" is used to describe the information a part of a system (or a whole
system) gets back from its environment about something that it has done. Feedback
connects elements in a system to each other, and is their means of "communication."
There are several different kinds of feedback, and the combination present in any one
system will determine the overall structure and dynamics of that system.
Discussion:
What is Feedback? Feedback is information that is "fed back" to a system or
subsystem from outside its boundaries. A "feedback loop" is the full cycle of:
1) Inputs -- taking in matter, energy or information;
2) Processing -- transforming the matter, energy or information in a way that is
useful to the system;
3) Output -- affecting the environment in some way; and
4) Feedback -- having the output affect the new inputs the system processes.
In a strict sense, the term "feedback" refers exclusively to inputs from the environment
that have been influenced by or stimulated in some way by the system's own activity.
Terms like "stimulus" or "external forces" are used to describe external influences on
the system that the system does not affect in any way (e.g. the force of gravity on an
object.) However, the more densely connected the parts of a system are, the more
vague this distinction becomes. The concept of co-evolution, for instance, postulates
that microsystems create their own macrosystem environments, and co-evolve with
them. (See Evolution and Coevolution.) The quantum physics theory of non-local
reality also postulates that all particles in the universe are in constant contact with all
other particles in the universe, therefore blurring the distinction between feedback and
stimulus. (See Non-Local Reality -- Bell's Interconnectedness Theorem.)
There are a number of different kinds of feedback that are important to the
understanding of complex adaptive systems.
Positive feedback encourages the system to do more of what it was doing before.
Uninhibited positive feedback can lead to exponential rates of growth in output, and an
"exploding" of the system into chaotic behavior. Positive feedback is an important
source of growth and change in systems.
Negative feedback tells a system to stop doing what it was doing. It "negates" the
previous action. The ultimate in negative feedback leads to equilibrium, system
"death", and no activity at all ("entropy"). Negative feedback is an important source of
stability in complex systems.
Iteration is a term often used to mean two things: 1) one cycle of a feedback loop, as in
"The system went through one iteration"; and 2) a feedback loop where the output of
one cycle is the exclusive input for the next cycle.
Autocatalysis is a special form of positive feedback in which a catalyst produces more
of itself by being in the presence of other elements. All forms of sexual procreation are
forms of autocatalysis. Autocatalysis can lead to exponential rates of growth, as is
observed in natural populations when limiting conditions (such as predators or food
scarcity) are not present.
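The difference between unconstrained autocatalysis and growth under a limiting condition can be sketched numerically (an illustrative model; the rate and capacity values are arbitrary assumptions):

```python
def grow(pop, rate, capacity=None, steps=20):
    """Autocatalytic growth: each unit of population produces more of
    itself. With no limiting condition the growth is exponential; a
    finite carrying capacity (predators, food scarcity) caps it."""
    history = [pop]
    for _ in range(steps):
        if capacity is None:
            pop = pop * (1 + rate)                         # pure positive feedback
        else:
            pop = pop + rate * pop * (1 - pop / capacity)  # feedback limited
        history.append(pop)
    return history

unbounded = grow(10.0, rate=0.5)                  # exponential explosion
bounded = grow(10.0, rate=0.5, capacity=1000.0)   # levels off near capacity
```

The unbounded run multiplies by 1.5 every step; the bounded run starts out nearly exponential but flattens as the population approaches the carrying capacity, which is exactly the role limiting conditions play in natural populations.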
Feedback and Structure
In a literal way, feedback creates structure. The "coupling" or stabilizing of feedback
loops between elements in a system is what creates the stable patterns represented by
structure. Some systems are rigidly structured (such as atoms and atomic elements) and others
are more loosely coupled (such as species in an ecosystem). To say that structure
exists is just another way of saying that a stable pattern of interaction between
elements has been achieved. In system language, stability is often referred to as
homeostasis. This refers to conditions where a balance between positive and negative
feedback has been achieved. (A very simple example of a device designed to create
homeostasis is the thermostat, which alternately sends positive ("turn on") and negative
("turn off") feedback to the furnace, in order to maintain a steady temperature in the
house.)
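The thermostat example can be made concrete with a small simulation (illustrative; the heating and leakage constants are arbitrary assumptions). Alternating "turn on" and "turn off" feedback holds the temperature in a band around the set point, which is homeostasis in miniature:

```python
def thermostat(target, hysteresis=1.0, outside=10.0, hours=48):
    """Homeostasis via alternating feedback: the thermostat sends
    "turn on" feedback to the furnace when the room is too cold and
    "turn off" feedback when it is warm enough; in between, the
    furnace keeps its current state."""
    temp, furnace_on, trace = target, False, []
    for _ in range(hours):
        if temp < target - hysteresis:
            furnace_on = True    # "turn on": do more heating
        elif temp > target + hysteresis:
            furnace_on = False   # "turn off": negate the heating
        heating = 3.0 if furnace_on else 0.0
        leakage = 0.2 * (temp - outside)   # heat lost to the outside
        temp = temp + heating - leakage
        trace.append(temp)
    return trace

trace = thermostat(target=20.0)
# temperature oscillates in a narrow band around 20 degrees
```

The trace never settles to a constant value; like the global stability of a dissipative structure, the steady temperature is maintained by a constant pattern of internal fluctuation.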
Feedback and Decision Rules
Every system has certain "rules" it uses to decide how to respond to external
influences, including feedback loops. These rules determine whether or not feedback
from the environment is interpreted as "negative" or "positive" feedback. The rules
control the nature of the system's relationships with its environment. Scientists use
different words to describe these rules (in machine intelligence, they are called
"algorithms"; in learning theory, "mental models"). These rules vary in how
"tight" and "loose" they are -- in other words, in what level of input a system needs
before it responds in either a negative or positive fashion. This determines how
sensitive a system is to external influences.
Feedback, Delay and Response Time
There are two time-related dimensions of feedback that warrant mention. The first is
the amount of delay between when a system influences its environment and when a
response is fed back to the system (for instance, between the time that we spray CFC's in
the air, and we are affected in the form of ozone depletion.) The more delay, the less
likely it is that the system can respond in a way that maintains its equilibrium. The
second is the response time, or the time elapsed between when the system receives
feedback and when it acts in response to it (for instance, between when I touch a hot object
and I respond by withdrawing my hand.) The slower the response time, the less likely
the system is to be able to maintain equilibrium with its environment.
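The claim that delay undermines equilibrium can be illustrated with a toy regulator (not from the text; the gain and delay values are arbitrary assumptions). The system cancels a fraction of its deviation from equilibrium each step, but it acts on information that is `delay` steps old. With no delay it settles; the very same correction applied late repeatedly overshoots, and the oscillations grow:

```python
def regulate(delay, gain=0.5, steps=80):
    """A system tries to cancel its deviation from equilibrium (zero),
    but only 'sees' feedback about its state `delay` steps in the past."""
    history = [1.0] * (delay + 1)   # start displaced from equilibrium
    for _ in range(steps):
        correction = gain * history[-(delay + 1)]  # stale feedback
        history.append(history[-1] - correction)
    return history

no_delay = regulate(delay=0)   # deviation dies away smoothly
delayed = regulate(delay=3)    # corrections arrive too late: growing swings
```

With zero delay the deviation halves every step; with a three-step delay the regulator keeps correcting for a state that no longer exists, so each swing past equilibrium is larger than the last.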
Feedback and Information
Feedback loops provide a system with information about its environment. As systems
become more sophisticated at information processing (i.e. become more "intelligent")
they build the capacity to anticipate events by linking one form of feedback with future
events. (For instance, we see signs of bad weather and prepare for it before it comes;
animals sense a hard winter and grow extra thick fur; businesses interpret signs of a
changing market and respond in advance.) The development of information
conservation devices allows systems to accumulate experience about the kinds of
feedback that improve their odds of survival. In other words, they learn. A
distinguishing feature of the brain is its ability to build symbolic representations of
feedback loops, and to "learn" by iterating these symbolic feedback loops rather than
from direct experience.
Feedback and the Edge of Chaos
The combination of internal and external feedback loops, and the speed of iteration will
determine the dynamics of the system. If there is an excess of negative feedback, the
system will usually fail to adapt to environmental changes and will eventually die. If
there is an excess of positive feedback, the system will eventually approach chaos and
disintegrate. The "edge of chaos" is a balance between these two extremes where life
occurs, because there is enough negative feedback and structure to process
information, and also enough positive feedback to drive growth, change and adaptation.
Relevance for Thinking About Complex Systems:
The concept of feedback and feedback loops is central to an understanding of complex
systems. Feedback is the core "life process" of complex systems. It is by understanding
the dynamics of feedback loops that we can eventually understand the dynamics of
complex systems.
8. THE NATURE OF INFORMATION
The Basic Concept:
A key characteristic of complex adaptive systems is their ability to process information.
Being "on the edge of chaos" maximizes this capacity. The ability to conserve and
process information, and to use it for adaptation, is characteristic of all "living" systems. To
understand the relationship between complexity and information, it is useful to review
some of the many meanings of the term "information."
Discussion:
Information as Structure and Order
At its most basic level, information is synonymous with order, with the presence of
distinguishable patterns and relationships. "Random" systems contain no information,
in the sense that it is not possible to distinguish patterns in the system. (Erich Jantsch
quotes a definition of information as "...any non-random spatial or temporal structure or
relation." (P. 50)) Structure, patterns and differentiation are synonymous with
information. When we say a system contains information, we mean that it is possible to
make distinctions within the system. Information is structure, and structure is
information. (Ervin Laszlo refers to information as "encoded patterns of energy.")
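One crude way to see structure as information, in the sense used here, is compressibility (an illustration of ours, not from the text): a patterned string can be described far more compactly than random bytes, in which no pattern can be distinguished.

```python
import os
import zlib

# A highly patterned byte string and an equally long random one.
patterned = b"ABAB" * 4096            # 16 KB of pure repetition
random_bytes = os.urandom(16384)      # 16 KB with (almost) no pattern

ratio_patterned = len(zlib.compress(patterned)) / len(patterned)
ratio_random = len(zlib.compress(random_bytes)) / len(random_bytes)

# The repetitive string shrinks to a tiny fraction of its size because
# its structure can be described compactly; the random string cannot.
print(f"patterned compresses to {ratio_patterned:.3f} of original size")
print(f"random    compresses to {ratio_random:.3f} of original size")
```

(Note that in Shannon's technical sense a random string carries maximal information; the sense used in this section, following Jantsch, equates information with distinguishable pattern.)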
The original sources of "information" and structure in the universe were the four basic
forces (the weak nuclear force, the strong nuclear force, gravitation, and the
electromagnetic force.) These forces created the first differentiated structure of matter
after the undifferentiated "singularity" of the big bang. (All four forces are theorized to
have appeared within 10^-12 seconds after the big bang.)
Subsequent sources of information (i.e. the creation of new patterns of matter and
energy) derive from this original structuration.
Information as Representation of Order
The more common meaning of "information" is as a signal or representation of an order
that is different from the information itself. All complex systems, even chemical ones,
display rudimentary forms of "memory". Living systems are characterized by their
ability to conserve representations of order, pattern and process, and therefore make
available in the present the accumulated experience of the past. The information
conserving function of DNA is a primary example, as well as the "social DNA" in human
systems of culture, books, stories, rituals and customs. The memory made possible by
the neural patterning of the human brain allows us to store representations of patterns
and relationships we have observed in the environment, or created through our own
mental processes. The representation of information in symbols and symbol systems
(e.g. speech, writing, mathematics) is a hallmark of the higher computational
capabilities of human consciousness.
Information as Both Novelty and Confirmation
The concepts of "novelty" and "confirmation" can be thought of as two aspects of
information: "Pure novelty, that is to say, uniqueness, does not contain any information;
it stands for chaos. Pure confirmation does not bring anything new; it stands for
stagnation and death." (The Self-Organizing Universe, P. 51.) The process of life (and
the region where information processing occurs) falls between these two extremes --
where there is enough novelty to create new information, but enough confirmation to
carry out the process of computation. This territory is the "edge of chaos" where
maximum capacity for information computation occurs.
Systems as the Self-Organization of Information
If information is seen as the representation of structure, pattern and differentiation, then
it is possible to see dynamic systems and the phenomenon of self-organization as the
self-organization of information itself. Information is not different from the patterns of
matter and energy. Thus Erich Jantsch can say that the brain is "...a communication
mechanism which is used and directed by the self-organization of information."
Information and Communication
Communication is the interaction of one system with another, when the "pattern" of one
system is "recognized" by another through the sending and receiving of "signals." (The
field of Information Theory is the study of this process.) Communication occurs when
there is enough similarity between the two systems for "recognition" to occur -- in other
words when one system can "recognize" the signals sent from another system. The
most basic "signals" are those associated with the four basic forces -- strong and weak
nuclear, electromagnetic and gravitational. The gravitational "signal", for instance, is
"recognized" by all matter in the universe, regardless of its structure.
The nature of communication changes as structures become more complicated, as in
the evolution from inorganic to organic systems. Erich Jantsch defines four different
types of communication that are characteristic of biological systems. These forms of
communication are differentiated by the speed with which the signals travel:
Genetic communication, in which the "signal" is the gene (e.g. DNA.) This is the slowest
of the four forms of biological communication, and occurs over generations of the
species.
Metabolic communication is communication that occurs in chemical processes, as
chemicals react in the presence of each other. This includes all exchanges of energy
and matter in production-consumption cycles (such as the digesting of food;
photosynthesis; etc.) and chemical regulatory processes, such as hormone regulation.
Since the "signaling" consists of the movement and transformation of matter, this is a
relatively slow form of communication and can take anywhere from minutes to days.
Neural communication occurs when information is transmitted by electrical impulse
over the central nervous system and between nervous systems (organism to
organism) through the senses (auditory, touch, smell, cognition, etc.). Neural communication is
measured in tenths or hundredths of a second.
Biomolecular communication occurs in relatively small volumes between individual
atoms, and is the most rapid form of biological communication, occurring in milliseconds.
Information, Consciousness and the "Ecology of Ideas"
Information processing reaches its known apex in the structure of the human brain and
our varying levels of consciousness. (See The Nature of Consciousness.) With the
development of the neocortex layer of the brain, we evolved the capacity to store and
create information in symbol systems such as language, mathematics and art. This
process is made possible by the self-organizing dynamics of the neural networks in our
brain. (See Neural Darwinism.) These symbols now create complex adaptive systems
of their own, in which the elements interacting in the system are the symbols and
patterns of symbols (an "ecology of ideas".) The combining and recombining of these
ideas and concepts creates new information that we conserve in symbolic structures
such as books, mathematical formulae, video, and more recently, ordered electrons on
optical disks. The development of the computer in turn allows us to direct machines to
create new combinations of information that supplement the capacity we have in our
own brains.
Through the combination of our own capacity for memory, thinking and imagination;
the ability to store information in physical artifacts; and now our ability to have
machines augment our capacity for the creation of new information, we have increasingly made our
capacity for learning independent of contact with the external world we are learning
about. We are no longer entirely dependent on experience itself for learning: "...the
processing and organization of information becomes independent...of direct sensory
impact." (Jantsch, P. 164) This capacity is reaching a new level of independence from
sensory reality in the evolution of "virtual reality" technology.
All forms of self-reflective information conservation reduce the role of DNA in our
species evolution. As we evolve more flexible, creative and adaptive forms of
information conservation and creation, we increase our ability to "evolve" without being
dependent on the very slow process of DNA selection. This capacity is reaching a new
level in the current development of knowledge about the structure of DNA itself, and our
ability to change that structure to alter the shape of individuals formed by it.
Learning as the Creation of New Information
In its most rudimentary form, learning is a process of taking in new information
(novelty), selectively combining it with conserved information (confirmation) and
creating more new information. (See Adaptation and Learning.) All "dissipative
structures", "open systems" or "complex adaptive systems" engage in this process.
(Note the difference
between learning as the creation of information, and the more common usage in
education as the "receiving" of information.)
The Subjective Nature of Information
All information is by definition subjective, because it is dependent on both what kinds
of signals the receiving system can recognize, and what kinds of patterns in those
signals it is capable of recognizing. Depending on your structure, you receive
some signals and you don't receive others. For instance, the human eye is structured
to detect a certain range of frequencies of light wave. Anything outside of this range is
"invisible" to us. Assuming that I can receive the signal in the first place, my
interpretation of it may still differ from that of others. For instance, we all may see an object, but
we may differ on the shades of color in the object, depending on our interpretations of
it. (This is the distinction between innate and contextual attributes. See Subjectivity.)
When we use the term "objective" in reference to information, we simply mean that we
have identified a large group of pattern recognition devices (e.g. human beings and
their senses, perhaps supplemented by mechanical devices such as microscopes or
lasers) that have similar capacities for signal detection, and similar rules for
interpretation of the signals. The "information" is contained in the relationship between
the observer and the observed and does not exist independent of this relationship.
(Nick Herbert quotes the physicist Niels Bohr as saying: "Isolated material particles are
abstractions, their properties being definable and observable only through their
interactions with other systems." Quantum Reality, P. 161.)
Signal Detection and the Collapsing of the Probability Wave
There is a parallel between how we collapse the potential information in a collection of
signals into specific "knowledge", and how quantum physics understands the collapse
of a "probability wave" into a specific particle. (See Wave-Particle Duality.) In both
cases, the act of detection and observation "collapses" a broad range of possibilities
into a more discretely bounded "thing." We take a collection of signals from the
environment, and we interpret it in a specific way -- we make it "mean" something. Our
meaning, of course, is only one of a possible range of meanings. In quantum physics,
the "probability wave" of the particle, which represents all possible paths of the particle,
and all possible mix of attributes, is "collapsed" into a specific particle in a specific
location. We need to be aware, therefore, that every time we create information in the
form of meaning and interpretation, we are also destroying the potential for a broad
range of other meanings and interpretations.
Yes/No or Maybe?
Traditional information theory is based on "digitized" order in which all information is
expressed as yes/no decisions or strings of "0's" and "1's". While this is useful for a
wide variety of applications (the evolution of cybernetics from information theory forms
the basis of much of our computer programming) it misrepresents some of the
"fuzziness" of reality as represented by quantum probability and uncertainty (See
Heisenberg's Uncertainty Principle.) The emerging field of "fuzzy logic" is building
programming capacity based on the "shades of gray" represented by the region
between the 0 and the 1.
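A fuzzy "membership function" makes the idea concrete. The sketch below is our own illustration (the category and breakpoints are arbitrary choices): instead of a yes/no answer, each input gets a degree of membership between 0 and 1.

```python
def cold_membership(temp_c):
    """Degree (0..1) to which a temperature counts as 'cold'.

    Below 5 C fully cold, above 20 C not cold at all, and a linear
    ramp of partial membership in between -- the 'shades of gray'
    between the 0 and the 1.  The breakpoints are illustrative.
    """
    if temp_c <= 5:
        return 1.0
    if temp_c >= 20:
        return 0.0
    return (20 - temp_c) / 15

for t in (0, 12.5, 25):
    print(f"{t:5} C -> cold to degree {cold_membership(t):.2f}")
```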
Relevance for Thinking About Complex Systems:
Complex adaptive systems appear to be drawn to the "edge of chaos" because that is
where their capacity for information processing is maximized. As we begin to see
information as the source of structure in complex systems, we become extremely
sensitive to the role of information flow, processing, and creation in human systems.
And as we understand the inherently subjective nature of information, we begin to take
care that the information potential in signals we receive is not too narrowly "collapsed",
thereby destroying our capacity to create new information.
9. LOCAL RULES
The Basic Concept:
The nature of the rules that govern the interaction of parts in a system determines both
the global structure (what its shape is) and the global dynamics (how it changes over
time) of the system. Different "rules" will produce different system "shapes" and
different patterns of change (e.g. ordered, complex, or chaotic.)
Discussion:
"Local rules" are the decision criteria that parts of a system use to decide how to
respond to different kinds of feedback. The term "decide" is
used in a broad sense here. In relatively simple inorganic systems, the "local rules" are
purely physical rules, dictated by the basic physical forces of the universe
(electromagnetic, gravity, strong and weak nuclear.) There is no "choice" involved. As
systems evolve and become more complex, and gain the capacity for information
conservation and processing, local rules become more sophisticated and the systems
are freed from the constraints of response to immediate physical forces. (This shift
between domination by physical forces and capacity to process information is where the
difference between living and non-living things begins to be drawn. And this difference
occurs on the edge of chaos: "So one of the interesting things we can ask about living
things is, Under what conditions do systems whose dynamics are dominated by
information processing arise from things that just respond to physical forces? When
and where does the processing of information and the storage of information become
important?" (Complexity, P. 232.) "The edge of chaos is where information gets its foot in
the door in the physical world, where it gets the upper hand over energy." (Lewin, P. 51.)
See The Nature of Information.)
Simple Rules, Complex Behavior. One of the important insights of complexity science
is that fairly simple rules can generate enormously complex patterns. The almost
infinitely complex shape of the Mandelbrot Set fractal is created by one simple
equation. (See Fractals.) As one complexity scientist framed it: "Hey, I can start with
this amazingly simple system, and look -- it gives rise to these immensely complicated
and unpredictable consequences." (Complexity, P. 329) This is possible because of the
rapid iteration and feedback in the system. (See Feedback.)
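The "one simple equation" behind the Mandelbrot Set is the iteration z -> z^2 + c: a point c belongs to the set if the orbit of z, started at 0, stays bounded. A minimal sketch (the iteration cap and sample range are illustrative choices of ours):

```python
def in_mandelbrot(c, max_iter=100):
    """Iterate z -> z**2 + c from z = 0; a point c belongs to the
    Mandelbrot set if its orbit never escapes the disk |z| <= 2."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

# One slice of the set along the real axis rendered as text: the
# boundary between inside ('#') and outside ('.') is where the
# infinitely complex structure lives.
row = "".join("#" if in_mandelbrot(complex(re, 0.0)) else "."
              for re in [x / 20 for x in range(-45, 11)])
print(row)
```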
Change the Rules, Change the Dynamics. As you change the rules in the system, you
get very different kinds of system behavior. A slight change in the reproduction rate for
a species, for instance, can result in stable population patterns becoming completely
unpredictable. One way of classifying rules is according to what kinds of system
dynamics they produce. In this scheme, rules fall into one of four "universality classes"
that result in fixed, periodic, chaotic, or complex behavior. (See Classes of Systems.) Stuart
Kauffman contends that living systems adjust their rules (in his language, their
"parameters") until they "find" the rules that keep them in the territory of complex
behavior (on the "edge of chaos"), since that is the system state most conducive to
information processing and hence life. Kauffman refers to this process as the process
of systems taking "adaptive walks through parameter space to find 'good' dynamical
behavior." (The Origins of Order, P. 181)
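The reproduction-rate example above can be illustrated (our example, not Kauffman's) with the logistic map, a standard toy model of population growth: the identical rule with a slightly different parameter r flips from stable to chaotic dynamics.

```python
def population(r, steps=200, x0=0.2):
    """Iterate the logistic map x -> r*x*(1-x), where x is the
    population as a fraction of its maximum and r the reproduction
    rate.  The parameter values below are illustrative."""
    history = []
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
        history.append(x)
    return history

# r = 2.8: the population settles onto a single stable value.
# r = 4.0: the same rule, one parameter changed, never settles at all.
print("r=2.8:", ["%.4f" % x for x in population(2.8)[-5:]])
print("r=4.0:", ["%.4f" % x for x in population(4.0)[-5:]])
```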
Human Rules. As the parts that are interacting with each other in the system become
more complex (e.g. from atoms to molecules to proteins, to cells, to organisms, to self-
conscious beings) the rules and feedback loops within the system also become more
complex. The emergence of self-reflective consciousness in human beings introduces
a new phenomenon in "rule making." We have the capacity to reflect on and predict the
consequences of different kinds of "local rules", and deliberately choose the rules we
think will lead to desired future system states. In human systems, our rules are the
mental models we use to decide how to behave. Some of these are driven by biology
(e.g. hormones.) Others are built out of our social and historical experiences. These
"rules" are encoded in our culture in the form of values, beliefs, customs, laws, theories,
assumptions and morals. It is this human capacity that allows us to create "unnatural"
and "designed" systems. Science can be understood as our attempt to understand the
"rules" or "laws" of the universe, partly for the purpose of just knowing, but also for the
purpose of being able to predict and control the results those laws produce.
Rules, Adaptation and Learning. Adaptation is the capacity to change behavior based
on feedback. Learning is the capacity to change our "local rules" based on experience.
In Kauffman's language, learning is the ability to wander through parameter space.
Relevance for Thinking About Human Experience:
When we seek to change human systems, our tendency is often to try to "restructure"
and "design" systems from the top down. A proper understanding of the role of local
rules in determining global shape and dynamics would lead us instead to look for the
simple rules we would need to change that would then drive change in shape and
dynamics. And we would seek to understand in particular those rules that will keep the
system on the "edge of chaos" where it is most able to adapt, learn and grow.
10. SELF-ORGANIZED CRITICALITY
The Basic Concept:
Complex systems naturally bring themselves to a condition which is stable, but fragile
enough that small disturbances can cause large "avalanches" of change. The physicist
Per Bak refers to this state as "self-organized criticality". Self-organized criticality is
another way of understanding systems that are poised on the "edge of chaos." (See
Edge of Chaos.)
Discussion:
Self-organized criticality is best understood by thinking of the example of a pile of sand
that is created by dropping grains of sand one by one on top of another: "...the resulting
sand pile is self-organized in the sense that it reaches the steady state all by itself
without anyone explicitly shaping it. And it's in a state of criticality, in the sense that
sand grains on the surface are just barely stable." (Complexity, P. 304) Like other
systems at the edge of chaos, the sand pile has a balance of stability and fluidity.
As grains of sand are dropped on the pile, unpredictable "avalanches" will occur, where
a single grain of sand will cause a major reshaping of the pile. The size and frequency
of these avalanches conforms to what is called a "power law." (See Transition Laws.)
"Big avalanches are rare, and small ones are frequent. But the steadily drizzling sand
triggers cascades of all sizes -- a fact that manifests itself mathematically as the
avalanches' 'power law' behavior: the average frequency of a given size of avalanche is
inversely proportional to some power of its size." (Complexity, P. 305) The key to the
behavior of self-organized criticality is the way in which the steady input of energy (in
this case the grains of sand) drives the system to be structured in such a way that you
have many small subsystems and elements in a state of delicate interconnection with
each other.
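A minimal version of the sandpile can be simulated directly (our sketch of the standard Bak-Tang-Wiesenfeld model; the grid size and thresholds are illustrative): drop grains one at a time, topple any cell holding four or more, and count the size of each resulting avalanche.

```python
import random

def sandpile_avalanches(size=11, grains=20000, seed=1):
    """Bak-Tang-Wiesenfeld sandpile: drop grains on a grid; any cell
    holding 4+ grains topples, sending one grain to each neighbor
    (grains fall off the edge).  Returns the size, in topplings, of
    the avalanche triggered by each dropped grain."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        r, c = random.randrange(size), random.randrange(size)
        grid[r][c] += 1
        topples = 0
        unstable = [(r, c)] if grid[r][c] >= 4 else []
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < 4:
                continue
            grid[i][j] -= 4
            topples += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    grid[ni][nj] += 1
                    if grid[ni][nj] >= 4:
                        unstable.append((ni, nj))
            if grid[i][j] >= 4:        # may need to topple again
                unstable.append((i, j))
        sizes.append(topples)
    return sizes

sizes = sandpile_avalanches()
small = sum(1 for s in sizes if 1 <= s <= 3)
large = sum(1 for s in sizes if s >= 30)
print(f"small avalanches (1-3 topplings):  {small}")
print(f"large avalanches (30+ topplings):  {large}")
```

Tallying the sizes shows the power-law signature described above: small avalanches vastly outnumber large ones, yet avalanches of every size occur.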
Self-organized criticality is another way of talking about systems at "the edge of chaos."
Power laws are a special case of small inputs causing large changes in non-linear
systems -- in other words, another example of sensitive dependence. (See Sensitive
Dependence on Initial Conditions.)
Relevance for Thinking About Complex Systems:
Complexity scientists suspect that self-organized criticality and power laws are features
of many kinds of complex systems, including earthquakes, traffic jams, stock market
fluctuations, species extinctions, and economies. One distinction is important -- self-
organized criticality does not require complex, adaptive behavior. Complex adaptive
systems are systems that have the capacity for information manipulation and
conservation, which is a feature that cannot be identified with piles of sand and
earthquakes. Thus, while complex adaptive systems might display features of self-
organized criticality, self-organized criticality does not mean the system is a complex
adaptive system.
11. STRATIFIED AUTONOMY
The Basic Concept:
Complex systems tend to arrange themselves in "layers" of integration, such that each
system is part of a larger whole, which in turn is part of an even larger system, which is
part of a larger system, and so on. At each level, each system is simultaneously
autonomous and integrated with the systems at its level, below it, and above it.
Discussion:
There is a distinct hierarchical structure in nature, but it does not operate anything like
the top-down control model that is associated with human hierarchies. Instead, nature's
form of hierarchy (referred to as "stratified autonomy"), is a system of emergent layers
of system integration that is driven from the bottom up, not the top down. (See
Emergence.):
"...stratified order...The tendency of living systems to form multi-leveled structures
whose levels differ in their complexity is all-pervasive throughout nature and has
to be seen as a basic principle of self-organization. At each level of complexity
we encounter systems that are integrated, self-organizing wholes consisting of
smaller parts, and, at the same time, acting as parts of larger wholes." (Capra, P.
280)
Even to speak of "layers" and "top" and "bottom" implies the wrong kind of
relationships. In the stratified arrangements of natural systems, the "levels" refer not to
who controls whom, but to the scale at which system integration occurs. (In this sense of
scale, the stratified autonomy of complex systems forms a fractal structure in which
there is a high degree of self-similarity at different scales of the system. See Fractals.)
At each level of integration, each individual system maintains a high level of autonomy,
but also establishes stable patterns of direct relationships with other systems.
(Predators and prey, for instance, establish stable patterns of relationships with each
other over time.) This set of relationships then comes to constitute a system of its own,
which in turn develops stable patterns of relationships with other systems of similar
scale. (See Cosmogenesis chart showing communication and coordination patterns.)
In their stratified order, complex systems proceed through a dual process of
individuation and integration. Individuation creates the parts that can form and reform
patterns of relationship. These patterns of relationship then can constitute the process
of integration. When the integration is so complete that the parts become dependent
on each other for their very existence, individuation has again occurred, but at a higher
level of complexity. Thus the building of complex stratified order has a temporal
dimension to it, in that each level of integration is built on the level that came before it:
"Systems on a given level are aggregates of systems at the preceding level."
(Cosmogenesis, P. 213)
The dynamics of stratification through differentiation and integration at progressively
higher levels of complexity can be seen in many different kinds of complex systems,
including biological ecosystems; human social systems; the structure and function of
DNA; and even the process of language acquisition (see Cosmogenesis, P. 251).
Different authors have constructed a number of different charts demonstrating the
hierarchic structure of natural systems. Several are attached.
Relevance for Thinking About Complex Systems:
Understanding the stratified autonomy of complex systems gives us a way to think
about the building of layers of complexity and integration without the need to "design"
the system from the top down, and without the need to control the parts from a central
location.
12. PHASE TRANSITIONS
The Basic Concept:
A phase transition is a radical shift of a system from one condition to another condition
with highly different properties. The transformation from steam to water to ice is an
example. The term "phase transition" describes the intermediate transition period
between the two different conditions. Material substances, physical systems, and
organic systems alike exhibit a common pattern when they transition from one state to
another. They move from order to complexity to chaos. The middle condition is called
the point of "phase transition". It is also sometimes referred to as the "edge of chaos".
(See Edge of Chaos)
Discussion:
A phase transition is a form of bifurcation, a change from one state to another. (See
Bifurcation) This particular kind of bifurcation, however, is one where the fundamental
properties of the material or system change. Examples from the material world include:
liquid to vapor; nonmagnetic to magnetic; fluid to superfluid; conductor to
superconductor; and fluid to solid. Complexity scientists noticed that there seemed to
be similarities in the phase transition, regardless of the system involved. The exact
point of transition is one characterized by highly non-linear phenomena.
There are two kinds of phase transition -- first order phase transitions, where the shift is
abrupt, and the system jumps directly from one state to another; and second order (or
"continuous") phase transitions where the shift is more gradual. In first order
transitions, the elements (e.g. molecules) are forced to make an either/or choice
between two states. In second order transitions, right at the transition, there is a fluid
balance between order and chaos: "Order and chaos intertwine in a complex,
ever-changing dance..." (Complexity, P. 230) On one side stands the rigid pattern of
solid order; on the other, the swirling, boiling turbulence of chaos. In the middle, at the point
of transition, stands a region of mixed order and chaos -- enough order to have form,
but enough chaos never to get rigid. This point of "phase transition" is the "edge of
chaos", and is the condition that all living systems evolve towards.
(A field of study that is related to first order phase transitions is "catastrophe theory",
which develops models of the radically abrupt change from one condition to another
(e.g. the popping of a balloon). See P. 84 - 87 of Turbulent Mirror.)
Relevance for Thinking About Complex Systems:
Phase transitions have two implications for how we think about complex systems,
including human ones:
Systems have the potential for radically changing from one state to another (e.g. the
collapse of the Soviet Union); and
Living systems are systems that exist in the constant tension between order and
disorder, in the territory of "second order" phase transitions.
SELECTED BIBLIOGRAPHY
Popular Integrators
Fritjof Capra, The Turning Point
Margaret Wheatley, Leadership and the New Science
Marilyn Ferguson, The Aquarian Conspiracy
Michael Talbot, The Holographic Universe
Ken Wilber, ed., The Holographic Paradigm and Other Paradoxes
Anna Lemkow, The Wholeness Principle
George Land and Beth Jarman, Breakpoint and Beyond
George Land, Grow or Die
John Briggs and David Peat, Looking Glass Universe
Louise Young, The Unfinished Universe
David Layzer, Cosmogenesis
John Casti, Paradigms Lost
Michael Rothschild, Bionomics
Quantum Physics
Nick Herbert, Quantum Reality
Danah Zohar, The Quantum Self
David Bohm and David Peat, Science, Order, and Creativity
Fritjof Capra, The Tao of Physics
Heinz Pagels, The Cosmic Code
Bruce Gregory, Inventing Reality -- Physics as Language
Fred Alan Wolf, Taking the Quantum Leap
Paul Davies, ed, The New Physics
David Bohm, The Undivided Universe
Danah Zohar, The Quantum Society
Ken Wilber, ed., Quantum Questions
Chaos Theory
James Gleick, Chaos
John Briggs and David Peat, Turbulent Mirror
John Briggs, Fractals -- The Patterns of Chaos
Stephen Kellert, In the Wake of Chaos
Robert L. Devaney, A First Course in Chaotic Dynamical Systems
Systems Theory
Ludwig von Bertalanffy, General System Theory
Ervin Laszlo, The Systems View of the World
Erich Jantsch, The Self-Organizing Universe
Peter Senge, The Fifth Discipline
Arthur Young, The Reflexive Universe
Complexity Theory
Mitchell Waldrop, Complexity
Roger Lewin, Complexity -- Life at the Edge of Chaos
Ilya Prigogine, Order Out of Chaos
William Roetzheim, Enter the Complexity Lab
John Casti, Complexification
Murray Gell-Mann, The Quark and the Jaguar
Lawrence Slobodkin, Simplicity and Complexity in Games of the Intellect
Stuart Kauffman, The Origins of Order
Ellen Thro, Artificial Life Explorer's Kit
Kevin Kelly, Out of Control
Neuroscience and Learning
Renate Caine, Making Connections -- Teaching and the Human Brain
Howard Gardner, Frames of Mind
William Glasser, The Quality School
Jacqueline Brooks, The Case for Constructivist Classrooms
Gerald Edelman, Bright Air, Brilliant Fire
Charles C. Bonwell, Active Learning
Louis Perelman, School's Out
Leslie Hart, Human Brain and Human Learning
Bobbi DePorter, Quantum Learning