A mathematical theory of discontinuous change in a system, formulated by the French mathematician René Thom out of his work in topology. A "catastrophe" is an abrupt change in one or more variables during the evolution of a system that can be modeled by structural equations and topological folds. Catastrophes are governed by control parameters whose changing values lead either to smooth transitions at low values or to abrupt changes at higher, critical values. Catastrophes indicate points of bifurcation in dynamical systems. For example, the way a dog can change abruptly from a playful mood to an aggressive stance can be modeled by a simple "catastrophe." In organizations, the presence of sudden change can similarly be modeled using Catastrophe Theory. In recent years, Catastrophe Theory has come to be understood as part of Nonlinear Dynamical Systems Theory in general.
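This kind of discontinuous jump can be illustrated numerically. The Python sketch below (a hypothetical illustration, not drawn from the sources cited) follows the stable state of the standard cusp-catastrophe potential V(x) = x^4/4 - a*x^2/2 - b*x while the control parameter b is varied smoothly; at a critical value of b the state jumps abruptly to the other fold of the surface:

```python
def dV(x, a, b):
    # Gradient of the cusp potential V(x) = x**4/4 - a*x**2/2 - b*x
    return x**3 - a * x - b

def settle(x, a, b, steps=5000, lr=0.01):
    # Let the system relax downhill into a nearby stable equilibrium.
    for _ in range(steps):
        x -= lr * dV(x, a, b)
    return x

a = 1.0        # with a > 0 two stable states can coexist (the "fold")
x = -1.0       # begin on the lower sheet of the catastrophe surface
states = []
for i in range(-100, 101):      # sweep control parameter b from -1 to 1
    b = i / 100
    x = settle(x, a, b)
    states.append(x)

# A smooth sweep of the control parameter produces one abrupt jump
# in the state variable: the "catastrophe."
jumps = [abs(s2 - s1) for s1, s2 in zip(states, states[1:])]
print(max(jumps))
```

The state changes smoothly for most values of b, then leaps by more than the entire preceding drift at the critical value, exactly the low-value/high-value contrast described above.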
Bibliography: Guastello (1995)
Computer programs composed of a grid of "cells" connected to their neighboring cells according to certain "rules" of activity, e.g., a cell might be "on" if its four neighbor cells (east, west, north, and south) are also on. The entire array can self-organize into global patterns that may move around the screen. These emergent patterns can be quite complex even though they arise from very simple rules governing the connections among the cells. Cellular automata were originally conceived by the late, eminent mathematician John von Neumann and were realized more recently by the equally eminent mathematician John Conway in his "Game of Life." Today, the study of cellular automata goes under the name "Artificial Life" (A-Life) because the exploration of cellular automata and their patterns at such places as the Santa Fe Institute has led to insights into the way structure is built up in biological and other complex systems. Businesses and institutions can be modeled by cellular automata to the extent that they are made up of interactions among people, equipment, and supplies. For example, the strength, number, and quality of connectivities among people or groups can be modeled by cells and rules among cells, so that the way changing the rules influences the emergence of patterns can be investigated. Such cellular automata models may yield important insights into the dynamics of human systems.
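A minimal version of Conway's "Game of Life" fits in a few lines of Python. The sketch below (an illustration; the rules and the "glider" pattern are the standard Game of Life ones, not taken from the cited sources) shows an emergent structure: five cells whose pattern reassembles itself, shifted diagonally by one cell, every four generations:

```python
from collections import Counter

def life_step(cells):
    # One generation of Conway's Game of Life on a set of live (x, y)
    # cells: a cell is alive next step if it has exactly 3 live
    # neighbors, or has 2 and is already alive.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in cells)}

# The "glider": a pattern that reassembles itself one cell down and to
# the right every four generations, an emergent moving structure.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
pattern = glider
for _ in range(4):
    pattern = life_step(pattern)
print(pattern == {(x + 1, y + 1) for (x, y) in glider})  # True
```

No cell "knows" about gliders; the moving pattern is purely a global consequence of the local rule.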
See: Complexity; Emergence; N/K Model; Self-organization
Bibliography: Langton (1986); Poundstone (1985); Sulis and Combs (1996)
A type of system behavior that appears random yet is actually deterministic and constituted by a "hidden" order or pattern. Chaos can be found in certain nonlinear dynamical systems when control parameters surpass certain critical levels. The emergence of chaos suggests that simple rules can lead to complex results. Such systems are constituted by nonlinear, interactive, feedback types of relationships among the variables, components, or processes in the system. A chaotic time series of measurements of a system can be reconstructed or graphed in phase or state space as a chaotic or strange attractor with a fractal structure. Chaotic attractors are characterized by sensitive dependence on initial conditions, so that although the behavior is constrained within a range, the future behavior of the system is largely unpredictable. Unlike a random system, however, which is also unpredictable, chaos is brought about by deterministic rules. Moreover, there is some measure of predictability due to the way the attractor of the system is constrained to a particular region of phase space. For example, if the weather is a chaotic system, particular states of the weather are unpredictable yet the range of those states is predictable. Thus, it is impossible to predict exactly what the weather will be on August 10, 2000 in New York, yet it is predictable that the temperature will fall within a range of 65-95 degrees Fahrenheit. That is, the climate acts as a constraint on the unpredictability of the states of the weather. In organizations, chaos may show up under certain circumstances, e.g., in inventory or production processes, admission rates, timing of procedures, and so on. As a result, certain aspects of organizational functioning may be unpredictable. Recent research has pointed to ways of "controlling" chaos by introducing particular perturbations into a system.
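Sensitive dependence on initial conditions is easy to demonstrate with the logistic equation (see Logistic Equation). In the Python sketch below (an illustration, not from the cited sources), two orbits of the chaotic logistic map start one part in a billion apart; both remain confined to the attractor's range between 0 and 1, yet they rapidly diverge to completely different states:

```python
def orbit(x0, r=4.0, n=50):
    # Iterate the logistic map x -> r * x * (1 - x), chaotic at r = 4.
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

a = orbit(0.2)
b = orbit(0.2 + 1e-9)          # a one-part-per-billion difference
gap = [abs(p - q) for p, q in zip(a, b)]

print(all(0.0 <= v <= 1.0 for v in a))   # True: the range is predictable
print(max(gap))                          # yet the tiny difference is amplified
```

This is the weather/climate distinction in miniature: the deterministic rule confines every orbit to a predictable band while making any particular future state unpredictable.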
See: Attractor; The Butterfly Effect; Control Parameter; Logistic Equation; Sensitive Dependence on Initial Conditions; Time Series
Bibliography: Lorenz (1993); Guastello (1995); Peak and Frame (1994)
A term coined by the journalist Kevin Kelly to describe how nature constructs complex systems from the bottom up, with building blocks (systems) that have proven themselves able to work on their own. This concept is widely appreciated by evolutionary biologists and has been highlighted by the complexity pioneer John Holland as a key feature of complex adaptive systems. He used the image of children's building blocks, of different shapes and sizes, combined in a variety of ways to yield new creations like castles and palaces.
See: Emergence; Genetic Algorithms; Self-organization
Bibliography: Holland (1995); Kelly (1994)
A logical/mathematical postulate, arrived at independently by the English mathematician Alan Turing and the American logician Alonzo Church, stating that as long as a procedure is sufficiently clear-cut and mechanical, there is some algorithmic way of carrying it out (such as via computation on a Turing Machine). Thus, some processes or problems are computable according to some set of algorithms, whereas other processes or problems are not computable. A strong form of the Church-Turing Thesis claims that all neural and psychological processes can be simulated as computational processes on a computer.
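The idea of a clear-cut, mechanical procedure can be made concrete in a few lines of code. The Python sketch below (a toy illustration; the rule table and names are invented for the example) simulates a minimal Turing Machine whose rules flip every bit of its input and then halt:

```python
def run_turing(tape, rules, state="start", pos=0, max_steps=1000):
    # A minimal Turing Machine: `rules` maps (state, symbol) to
    # (new_symbol, move, new_state); reaching state "halt" stops it.
    tape = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(pos, "_")            # "_" is the blank symbol
        new_symbol, move, state = rules[(state, symbol)]
        tape[pos] = new_symbol
        pos += {"L": -1, "R": 1}[move]
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# A rule table for a machine that flips every bit, then halts.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "L", "halt"),
}
print(run_turing("1011", flip))  # -> 0100
```

Any procedure expressible as such a finite rule table is computable in the sense of the thesis; problems for which no such table can exist (such as the halting problem) are not.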
See: Complexity (Algorithm); Turing Machine
Bibliography: Goertzel (1993); Penrose (1989); Sulis in Robertson and Combs (1995)
The coordinated and interdependent evolution of two or more systems within a larger ecological system. There is feedback among the systems in terms of competition or cooperation and their different utilization of the same limited resources. For example, Kauffman and Macready cite as an instance of co-evolution the way in which alterations in a predator will alter the adaptive possibilities of its prey. Businesses or institutions can co-evolve in various ways, such as with their suppliers, receivers, and even competitors. For instance, the numerous types of joint ventures that have emerged recently can be considered a kind of co-evolution.
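The predator-prey case can be sketched with a simple coupled model. The Python fragment below (an illustrative toy in the spirit of the classical Lotka-Volterra equations, not a model from Kauffman and Macready) shows two populations whose fates are interdependent: each one's growth rate depends on the other's current size, producing linked oscillations rather than independent evolution:

```python
def step(prey, pred, dt=0.01):
    # Simplified Lotka-Volterra dynamics (all rate constants set to 1):
    # prey grow on their own but are eaten; predators starve without prey.
    d_prey = prey * (1.0 - pred)
    d_pred = pred * (prey - 1.0)
    return prey + dt * d_prey, pred + dt * d_pred

prey, pred = 1.5, 0.8
history = [(prey, pred)]
for _ in range(2000):
    prey, pred = step(prey, pred)
    history.append((prey, pred))

print(max(p for p, _ in history))   # prey overshoot while predators are scarce
print(min(p for p, _ in history))   # then crash as predators multiply
```

Neither population has a trajectory of its own; remove the coupling terms and the oscillations disappear.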
See: Feedback; Fitness Landscapes
Bibliography: Kauffman (1995); Kauffman and Macready (1995)
The cohesiveness, coordination, and correlation characterizing emergent structures in self-organizing systems. For example, laser light is coherent compared to the light emanating from a regular light bulb. That emergent structures show a kind of order not found at the lower level of components suggests that complex systems contain potentials for functioning that have not been recognized before. Businesses and institutions can facilitate and utilize the coherence of emergent structures in place of the imposed kind of order found in the traditional bureaucratic hierarchy.
See: Dissipative Structures; Emergence; Self-organization
Bibliography: Goldstein (1994); Kauffman (1995); Prigogine and Stengers (1984)
A description of the complex phenomena demonstrated in systems characterized by nonlinear interactive components, emergent phenomena, continuous and discontinuous change, and unpredictable outcomes. Although there is at present no single accepted definition of complexity, the term can be applied across a range of different yet related system behaviors such as chaos, self-organized criticality, complex adaptive systems, neural nets, nonlinear dynamics, far-from-equilibrium conditions, and so on. Complexity characterizes complex systems as opposed to simple, linear, and equilibrium-based systems. Measures of complexity include algorithmic complexity, fractal dimensionality, Lyapunov exponents, Gell-Mann's "effective complexity," and Bennett's "logical depth."
See: Anacoluthian Processes; Complex, Adaptive Systems; Nonlinear System; N/K Model; Random Boolean Network; Self-organization; Swarmware
Bibliography: Gell-Mann (1995); Holland (1995); Kauffman (1995); Kelly (1994); Stacey (1996)
A measure of complexity developed by the mathematician Gregory Chaitin, based on Claude Shannon's Information Theory and earlier work by the Russian mathematicians Kolmogorov and Solomonoff. Algorithmic complexity measures the complexity of a system by the length of the shortest computer program (set of algorithms) capable of generating or computing the measurements of the system. In other words, the algorithmic complexity of a system is how small a model of the system would need to be to capture the essential patterns of that system. For example, the algorithmic complexity of a random system is as large as the system itself, since random patterns cannot be shortened into a smaller set of algorithms. As such, algorithmic complexity has to do with the mixture of repetition and innovation in a complex system.
For example, imagine a simple system that could be represented by a bit string composed of the following sequence: 0101010101010101... It would only require a short program or algorithm, e.g., a command to print first a zero, then a one, then a zero, then a one, and so on. Therefore, the complexity of a system represented by the bit string 0101010101010101... would be very low. However, a system represented by a random sequence (such as the tosses of a fair coin with a 1 on one side and a 0 on the other), e.g., 10110000100110001111..., would require a computer program as long as the bit string itself, since the string is randomly produced and no shorter program could predict the future 1s and 0s. As a result, the algorithmic complexity of an infinite random system would have to be as infinitely large as the system itself.
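Because true algorithmic complexity is uncomputable, a practical way to get its flavor is to use a general-purpose compressor: the compressed size of a string is an upper bound on the length of a program that regenerates it. The Python sketch below (an illustration of the idea; zlib is only a rough stand-in for the shortest possible program) compresses a repetitive bit string and a random one of the same length:

```python
import random
import zlib

def approx_complexity(bits: str) -> int:
    # Compressed size as a computable stand-in for algorithmic
    # complexity (an upper bound, since zlib is not the shortest program).
    return len(zlib.compress(bits.encode()))

regular = "01" * 5000                       # 10,000 bits of pure repetition
random.seed(1)
noisy = "".join(random.choice("01") for _ in range(10000))  # 10,000 "coin flips"

print(approx_complexity(regular))  # small: the pattern has a short description
print(approx_complexity(noisy))    # large: randomness admits no shorter description
```

The repetitive string collapses to a few bytes (the "print 01 repeatedly" program), while the random string stays close to its full length, mirroring the two bit-string examples above.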
Bibliography: Chaitin (1987); Goertzel (1993)
A complex, nonlinear, interactive system which has the ability to adapt to a changing environment. Such systems are characterized by the potential for self-organization, and they exist in a nonequilibrium environment. CASs evolve by random mutation, self-organization, the transformation of their internal models of the environment, and natural selection. Examples include living organisms, the nervous system, the immune system, the economy, corporations, societies, and so on. In a CAS, semi-autonomous agents interact according to certain rules of interaction, evolving to maximize some measure such as fitness. The agents are diverse in both form and capability, and they adapt by changing their rules, and hence their behavior, as they gain experience. Complex adaptive systems evolve historically, meaning that their past or history, i.e., their experience, is added onto them and shapes their future trajectory. Their adaptability can be either increased or decreased by the rules shaping their interaction. Moreover, unanticipated emergent structures can play a determining role in the evolution of such systems, which is why they show a great deal of unpredictability. At the same time, a CAS has the potential for a great deal of creativity that was not programmed into it from the beginning. Considering an organization, e.g., a hospital, as a CAS shifts how change is enacted. For example, change can be understood as a kind of self-organization resulting from enhanced interconnectivity as well as connectivity to the environment, the cultivation of diversity of viewpoint among organizational members, and experimentation with alternative "rules" and structures.
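A toy version of such a system can be simulated directly. In the hypothetical Python sketch below (the agents, rules, and fitness measure are invented for illustration), agents carrying simple bit-string "rules" are matched against an environment, the better-matched rules survive, and random mutation supplies variation; average fitness climbs as the population adapts:

```python
import random

random.seed(0)
ENV = [random.randint(0, 1) for _ in range(20)]   # the environment's pattern

def fitness(rule):
    # An agent's fitness: how many bits of its rule match the environment.
    return sum(r == e for r, e in zip(rule, ENV))

def mutate(rule):
    # Random variation: flip one bit of a copied rule.
    i = random.randrange(len(rule))
    return rule[:i] + [1 - rule[i]] + rule[i + 1:]

agents = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
start = sum(map(fitness, agents)) / len(agents)
for _ in range(100):
    agents.sort(key=fitness, reverse=True)
    survivors = agents[:15]                       # selection keeps the best half
    agents = survivors + [mutate(random.choice(survivors)) for _ in range(15)]
end = sum(map(fitness, agents)) / len(agents)
print(start, "->", end)   # average fitness climbs as the agents adapt
```

Nothing in the code tells any agent the answer; the population's improvement emerges from selection, variation, and interaction with the environment, the same ingredients listed above.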
See: Adaptation; Emergence; Genetic Algorithm; Self-organization
Bibliography: Dooley (1997); Gell-Mann (1994); Holland (1995); Kauffman (1995)
The organizational theorist Gareth Morgan's concept for the amount of discretionary influence a manager has over change processes. One of Morgan's points is that this 15% can accomplish a great deal in a nonlinear, complex system. For example, nonlinearity means that a small change can have a huge outcome. Therefore, although one's discretionary efficacy may be only 15%, there can still be a large impact resulting from these discretionary efforts.
See: Butterfly Effect; Instability; Nonlinear Systems; Sensitive Dependence on Initial Conditions
Bibliography: Morgan (1997)
Containment (see Boundaries)
A mathematical method for measuring the complexity of a system by analyzing time series data from that system. The correlation dimension measures the degree of correlation among the elements of a system, which can reveal the existence of some kind of hidden order, such as chaos or nonlinear coupling, in what might otherwise have been taken for a random system. The correlation dimension is related to other measures of complexity such as fractal dimensionality, Kolmogorov entropy, Lyapunov exponents, and structural complexity.
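The underlying computation, in the style of the Grassberger-Procaccia algorithm, counts how many pairs of points in (delay-embedded) state space lie within a radius r; the scaling of that count with r estimates the correlation dimension. The Python sketch below (an illustration; the series length, radii, and embedding are invented for the example) applies it to a chaotic time series from the logistic map:

```python
import math

def correlation_sum(points, r):
    # C(r): the fraction of point pairs closer together than radius r.
    n = len(points)
    close = sum(1 for i in range(n) for j in range(i + 1, n)
                if max(abs(a - b) for a, b in zip(points[i], points[j])) < r)
    return 2.0 * close / (n * (n - 1))

# A time series from the chaotic logistic map, delay-embedded in 2-D.
x, series = 0.4, []
for _ in range(800):
    x = 4.0 * x * (1 - x)
    series.append(x)
embedded = list(zip(series, series[1:]))

# The slope of log C(r) versus log r approximates the correlation
# dimension; a value well below the embedding dimension of 2 reveals
# low-dimensional structure hidden in the apparently random series.
r1, r2 = 0.05, 0.2
slope = (math.log(correlation_sum(embedded, r2)) -
         math.log(correlation_sum(embedded, r1))) / math.log(r2 / r1)
print(round(slope, 2))
```

A genuinely random series embedded the same way would fill the plane and yield a slope near 2; the low slope here is the "hidden order" the entry describes.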
See: Chaos; Coherence; Time Series
Bibliography: Peak and Frame (1994)
Copyright © 2001, Plexus Institute. Permission to copy for educational purposes only.