Complexity catastrophes
Dietrich Dorner argues that complex adaptive systems (CAS) are hard to understand and manage.  He provides examples of how this feature of these systems can have disastrous consequences for their human managers.  Dorner attributes this to the psychological impact of CAS properties on our otherwise successful mental strategic toolkit.  To prepare to manage CAS more effectively, Dorner recommends the use of:
  • Effective iterative planning and
  • Practice with complex scenario simulations; he reviews both tools. 
The Logic of Failure
In his book 'The Logic of Failure', Dietrich Dorner illustrates our typical failure to comprehend both simulations and vivid real examples of complex adaptive systems (CAS).  The catastrophic consequences of our misinterpretations are highlighted by the broad impact of the Chernobyl reactor meltdown.  Dorner shows how this disaster resulted from typical human strategies incorrectly applied by highly experienced operators.  He argues this was due to the operators' overconfidence and their incorrect predictions of the current and immediate future state of the reactor.  Dorner notes that the vast majority of participants in his simulations fail to cope with side effects and long-term repercussions, take corrective measures that are either too aggressive or too timid, and generally ignore the key premises.  The results were dismal.  Participants given immense power used it to correct identified problems, only to watch in denial as their actions resulted in the collapse of the system they controlled. 

Dorner's interest is in psychology.  He used his simulations to study how people react to complex situations, assigning them responsibility and giving them the power to affect key aspects of the system.  But, as is typical in CAS, the interconnections between the various aspects must be intuited from observing changes in the system over time.  Dorner's simulations, like other important real-world CAS, have aspects which expand and contract non-linearly.  He assumes that silver bullets would have been found long ago if they existed.  Instead he argues real improvement can be achieved when we take into account the demands of CAS problem solving and the errors we are prone to make in our typical responses.  He writes 'the sources of these failings are often quite simple and can be eliminated without adopting a revolutionary new mode of thought.  Having identified and understood these tendencies in ourselves, we will be much better problem solvers'. 

Dorner's strategy is based on his observation that we fail to solve a CAS problem because we commit a series of small mistakes which then compound.  This view is based on his analysis of the nature of our thinking and emotional responses when we act to achieve goals and deal with complex problems.  He concludes it is necessary to cope with the demands these systems place on us. 
Dorner advocates using computer simulations to provide practice, in controlled conditions where actions can be captured in real time and then analyzed in context. 

Dorner identifies a number of demands of CAS:
From these he argues that specific steps must be taken in planning and action. 
Dorner's steps in planning and action
He asserts an iterative process:
  1. Formulation of goals,
  2. Formulation of models and gathering of information,
  3. Prediction and extrapolation,
  4. Planning of actions, decision making, and execution of actions,
  5. Review of the effects of actions and revision of strategy. 
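Dorner's five steps form a loop, not a one-shot sequence.  A minimal runnable sketch of that iteration applied to a toy system, a single state variable steered toward a goal; all names and numbers here are illustrative inventions, not Dorner's:

```python
# Toy illustration of Dorner's iterative cycle: goal, model/information,
# prediction, action, review.  Parameters are invented for the sketch.

def manage(state, goal=100.0, iterations=20):
    history = []          # gathered information about past states
    for _ in range(iterations):
        # Model from information: estimate the recent trend
        trend = history[-1] - history[-2] if len(history) > 1 else 0.0
        # Prediction and extrapolation (linear, as people tend to do)
        predicted = state + trend
        # Planning and execution: a damped correction toward the goal
        state += 0.5 * (goal - predicted)
        # Review of effects feeds the next iteration's model
        history.append(state)
    return state, history

final_state, _ = manage(state=20.0)
```

Even this crude loop converges because each review feeds the next round of prediction; skipping the review step is exactly the 'ballistic behavior' Dorner warns about later.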
Formulation of goals
Dorner reviews goal setting. 

He distinguishes between positive and negative goals, and between general and specific goals.  Unless a goal is specific it is hard to tell whether it has been achieved; specificity can also constrain how a goal is pursued.  When results are not immediate, goal degeneration can occur.  For CAS, contradictory goals are the norm.  Dorner gives typical examples: minimizing costs conflicts with maximizing benefits; liberty to benefit from success undermines equality of opportunity (a tension F. A. Hayek explored in comparing collectivism and libertarianism). 

The linked nature of complex systems can create problems for goal setting.  If a positive and a negative goal are linked, then pursuing them will cause them to undermine each other.  Long-term and implicit goals can lose focus to short-term explicit goals.  Dorner observes that intermediate goals can be used to support the achievement of long-term goals, increasing specificity. 
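The mutual undermining of linked goals can be made concrete with a toy causal network; the variables, links, and coefficients below are invented for illustration.  Pursuing the positive goal (jobs) drives a linked variable (pollution) that quietly erodes a second goal (health):

```python
# Toy network of causally linked variables.  Growing 'jobs' (a positive
# goal) feeds 'pollution', which undermines 'health' (a linked goal).
# All coefficients are illustrative.

def step(state, dt=0.1):
    jobs, pollution, health = state
    d_jobs = 0.3 * jobs * (1 - jobs / 100)       # self-limiting growth
    d_pollution = 0.2 * jobs - 0.1 * pollution   # driven by jobs, decays
    d_health = -0.05 * pollution                 # side effect
    return (jobs + dt * d_jobs,
            pollution + dt * d_pollution,
            health + dt * d_health)

state = (10.0, 0.0, 100.0)
for _ in range(500):
    state = step(state)
jobs, pollution, health = state   # jobs grew; health quietly declined
```

A manager watching only the jobs variable sees steady success; the linked decline in health only becomes visible if the network of interconnections is modeled.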
Dorner argues that labeling a set of problems with a conceptual label obscures their multi-faceted nature.  To organize the set Dorner suggests:
  • Finding interdependencies, which can highlight central problems that affect a number of peripheral ones.  Obtaining money or energy are typically central issues. 
  • Ranking problems by importance and urgency, with rational assessment of which must be solved first depending on the current situation. 
  • Delegating relatively independent problems to other agents.  Dorner stresses the difference between delegation and just dumping responsibility on others.  
Dorner describes another problematic strategy using intermediate goals - 'repair service' behavior where complaints drive the priorities.  A series of easy intermediate goals becomes the focus of attention.  This allows the agents to achieve a result, with little uncertainty or risk of failure.  However, the hard goals may be being ignored. 

Dorner notes that contradictory goals are difficult to cope with, generating strategies such as Newspeak (from 1984) and conspiracy theories, which insidiously support self-protection during a period of environmental confusion. 

Formulation of models and gathering of information
Dorner argues that adaptive systems must be understood so as to differentiate symptoms from underlying causes.  Treating problematic symptoms is futile: they will just recur.  Hence the challenge is to identify the underlying causes of the symptom and then remove them.  To do this Dorner suggests developing an effective model of the system which indicates where the problem is.  Otherwise, responding to symptoms typically has unexpected consequences as the system adapts. 

Dorner views a CAS as a network of variables in causal relationship with each other.  He concludes you must recognize the different ways variables can affect one another and themselves.  He identifies categories of interrelationships:

He notes techniques which have been used to help here:
  • Abstractions and metaphors are helpful if they represent key aspects of the system under study and model. 
  • Knowledge of the components and their key linkages can provide insights about how the system works.  Dorner concludes the art is in deciding the level of detail needed to provide understanding of the interrelationships among the 'goal' variables (those that we need to influence). 
To acquire knowledge of the structure of the system Dorner proposes:
  • Analogy,
  • Observation of the changes that variables undergo over time.  
And then it is necessary to know the system's current state.  Traditionally we did not have the power to affect the critical variables of the systems we participate in, so our evolved coping strategies are inconsistent with Dorner's proposals.  He describes three strategies which many of his participants utilized: solving problems serially, holistic aggregation, and overgeneralizing. 

Solving problems one at a time (serially)
Dorner notes that instead of building a model of the system by observing it over time, many of us attempt to solve problems serially.  But when provided with the ability to significantly affect systems, as his participants were, it is easy to impact the carrying capacity of the CAS, resulting in its collapse.  Many participants in his simulations see bundles of subsystems rather than conceiving of the total system being simulated.  They focus on immediate goals, and without an effective model of the system they fail to cope with information overload, undermining their attempts to organize. 

"It's the environment" (holistic aggregation)
Dorner observed participants who developed holistic assessments of the simulations which totally missed the network of feedback loops.  Instead they used reductionist 'suitcase' variables (words with multiple attached meanings that encourage us to think in different ways, as reviewed by Marvin Minsky).  Dorner concludes that this strategy has the benefit of being easy to develop and predictive of how to act.  Unfortunately it is not predictive of how the system will respond.  So contrary evidence is ignored by explaining it away as a property of a 'suitcase' variable such as 'the environment'. 

Overgeneralizing and coping with complexity
Dorner comments on our exceptional ability to detect patterns in the data we observe.  It is a vital part of human cognition, allowing us to cope with huge varieties of input.  It supports us in classifying and grouping objects that differ significantly in detail.  But he points out that we often overgeneralize from specific data.  CAS cannot be understood in this way.  It is necessary to reflect on the specifics of the situation and beware of general rules.  Following von Moltke the elder, Dorner argues this makes strategy more of an art than a science.  There is a constant need to adapt action to context. 

Equally problematic we tend to respond to success by reusing the strategy even when it is inappropriate.  Without immediate feedback this overgeneralization of the appropriateness of the strategy can lead to disaster. 

The pale cast of thought
Dorner reflects on the dilemma that more information makes it harder to decide on a course of action.  The additional information makes conflicts in the models visible.  Paradoxically, it is easier to retreat into a cozy corner where we can feel comfortable. 

Prediction and extrapolation
Often steps in a time series are treated as discrete events.  In part this is because, compared with spatial patterns, we have relatively few mechanisms for understanding series in time.  We typically extrapolate linearly from the current moment.  Emotional aspects are over-represented.  Exponential time series are not intuitive for us.  We need an appropriate model to have been highlighted before we can make sense of the situation.  But typically we will need to identify the appropriate model while observing a stream of confusing details.  Worse, the start and end points of the series may be obscured.  Dorner illustrates the problem with the 1984 AIDS infections in West Germany: together these difficulties produced an incorrectly positioned S-curve of infection, which appeared to suggest prophylactic success when the diminished growth in the infection rate was really due to most at-risk people already being infected. 
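The trap can be reproduced numerically.  Cumulative infections in a closed at-risk pool follow a logistic S-curve, so growth in new cases slows after the midpoint with no intervention at all; the parameters below are illustrative, not the German data:

```python
import math

# Logistic (S-curve) of cumulative infections in a fixed at-risk pool.
# New cases per period peak at the curve's midpoint and then decline,
# purely because the pool saturates.  Parameters are illustrative.
def cumulative_infected(t, pool=10_000, rate=0.5, midpoint=10):
    return pool / (1 + math.exp(-rate * (t - midpoint)))

new_cases = [cumulative_infected(t + 1) - cumulative_infected(t)
             for t in range(20)]
peak = new_cases.index(max(new_cases))

# The decline after the peak looks like prophylactic success but is
# simply saturation of the at-risk pool.
assert new_cases[-1] < new_cases[peak]
```

An observer extrapolating from the declining new-case counts, without a model of the saturating pool, credits a prophylactic campaign that did nothing.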

Dorner shows that experts are as fallible as laymen in predicting growth rates.  They typically investigate the initial series but then select bad models based on extrapolation of that initial data.  The situation is harder still for administrators, who must cope with the incorrect predictions while acting to deal with the situation. 

Since the initial trends of CAS time series tend to reflect negative feedback buffering, as well as delays in response to inputs, early action can add to the confusion. 

Oscillations can become amplified by early action.  Dorner recommends initially leaving the system alone while waiting to observe and understand its initial characteristics.  Dorner warns it's difficult for most people to do this and identify the correct model and strategy to apply because they:
Reversals in direction of series are also typical.  Dorner reviews the interactions of a predator and prey population.  He argues that we have no intuitive feel for how changes to one of the populations will affect the other.  We are typically stuck thinking about linear changes with deltas.  When his participants are provided with graphs of trends in the populations the effect was limited.  Only with experience did participants begin to predict changes effectively. 
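Such reversals can be seen in the classic Lotka-Volterra predator-prey equations, a standard textbook model rather than Dorner's own simulation; the parameters here are illustrative:

```python
# Discrete-time (Euler) integration of the standard Lotka-Volterra
# predator-prey equations.  The populations oscillate out of phase, so
# linear extrapolation from any moment soon fails.
def simulate(prey=10.0, predators=5.0, steps=2000, dt=0.01):
    a, b, c, d = 1.1, 0.4, 0.4, 0.1  # growth, predation, death, conversion
    history = []
    for _ in range(steps):
        prey += dt * prey * (a - b * predators)
        predators += dt * predators * (d * prey - c)
        history.append((prey, predators))
    return history

prey_series = [p for p, _ in simulate()]
# Prey first decline, then recover past their starting level: a reversal
# that linear thinking about deltas does not anticipate.
assert min(prey_series) < 10.0 < max(prey_series)
```

Plotting the two series makes Dorner's point: a manager who culls or boosts one population based on its current delta is always reacting to a phase that is about to reverse.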
Planning, decisions and execution
For Dorner planning begins once goals have been defined, a model of reality has been selected, the system has been observed and we have identified its current state and can predict its behavior in the immediate future. 

Planning should compare the consequences of actions relative to goals. 

A plan can be viewed as a string of conditions, actions and potential results. 

But Dorner highlights a problem: a complete plan must reflect the vast space of possibilities and so is impossible to develop.  Instead it is necessary to focus on certain of the possibilities by:
Dorner adds that the planner must also be able to expand the problem space.  Techniques to do this include:
  • Experimentation,
  • Mutation,
  • Culling unsuccessful approaches,
  • Analogy. 
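Experimentation, mutation, and culling together amount to an evolutionary search of the problem space.  A toy sketch, in which the fitness function and all parameters are invented for illustration:

```python
import random

# Toy evolutionary search over candidate 'plans' (here just numbers).
# Mutation expands the space of candidates; culling removes failures.
def evolve(fitness, population, generations=200, seed=0):
    rng = random.Random(seed)
    for _ in range(generations):
        # Experimentation/mutation: perturb every surviving candidate
        mutants = [x + rng.gauss(0, 1.0) for x in population]
        pool = population + mutants          # parents survive too
        pool.sort(key=fitness, reverse=True)
        population = pool[:len(population)]  # culling: keep fittest half
    return population

# Illustrative fitness: plans closer to 7.0 score higher
best = evolve(lambda x: -(x - 7.0) ** 2, population=[0.0] * 8)[0]
```

Keeping parents in the pool makes the search elitist, so the best plan found never gets worse; culling without that safeguard can discard hard-won progress.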
Dorner reviews how to decide and when to delegate.  Part of decision making is selecting the correct scale for the details of the plan.  But the plan's actions may also take time and eventually fail.  Imagination can miss key details of reality.  Dorner relates that over-simplifications, progressively corrected in subsequent development, are the most potent, or indeed the only, means toward conceptual mastery of nature.  But this is a trap for experts: they see things in differentiated forms and thus may overlook other perspectives.  Carl von Clausewitz notes that plans often fail because planners have failed to take account of all the little conditions, which he called frictions.  These are another trap for experts, who are often correct about the lines of development but tend to underestimate how long it will take to translate the possible into the actual.  When predictions of how long a plan will take are wrong, or its conditions are removed, the plan has no bounds in reality.  Such a plan is attractive since it reinforces habitual actions and is easy to devise and execute.  While good plans contain discrete actions, poor plans often define abstract goals that can't be measured or have no conditions attached. 

Delegation can result in the additional benefit of creating redundancy of command. 

Revision of strategy
Once a plan has been defined, a decision has been reached and implemented there will be consequences of the actions.  Dorner asserts learning is enhanced by making mistakes.  But often people strive to avoid confronting failure.  Dorner describes how:
  • Ballistic behavior - Rather than review the consequences of our previous actions the plan is updated with additional actions which are executed without review.  The actions become proof of success.  With no review a sense of achievement is derived from executing each action. 
  • External attribution - circumstances are blamed for the failure.  Dorner argues that conspiracies are typical. 
  • Goal inversion - by arguing the failed result is associated with the goal that was to be achieved. 
Dorner asks what psychological reasons underlie our errors in approaching complexity. 

He identifies:
  1. The slowness of conscious thinking (Kahneman's slow thinking) results in our taking shortcuts, economizing on thinking, and rushing into action rather than following Dorner's advocated approach.  
  2. Preserving a positive view of one's own competence helps sustain a minimum capacity to act. 
  3. Human memory can absorb new material only relatively slowly. 
  4. We don't think about problems we don't have; but CAS will involve problems that emerge as side effects of our actions.  
So what do we do?
Dorner argues that since these are comprehensible stumbling blocks, we can find ways to avoid them most of the time.  He notes that experienced managers from large industrial and commercial firms left critical variables in a far better state than a set of students did.  Dorner suggests that the managers' greater experience in planning and decision making was significant, attributing it to 'operative intelligence': the knowledge that individuals have about the use of their own intellectual capabilities and skills.  He claims that we are all capable of acting in the varied ways necessary to succeed; it is just necessary to make better use of the possibilities. 

Dorner suggests the effective practitioners applied the right rules at the right times.  Further, he shows that by reviewing how a participant thought about a series of complex tasks, their performance improved significantly.  He stresses that it is the experience of solving the problem, supported by the reviews, which builds 'operative intelligence'.  Hence he argues simulations of complex problems will be helpful, as long as they are assembled into a battery of different scenarios that exposes the participants to a 'symphony of demands'.  Expert observers can pinpoint cognitive errors, identify psychological determinants, and feed them back.  The goal is the development of common sense.  Simulations are proposed as ways to learn about coping with non-linear time series, long-term effects, identifying and coping with side effects, and other CAS properties. 

Humans have always participated in CAS.  But prior to the last 150 years a single individual could not typically affect the outcome significantly.  So our responses are designed to cope with the situation, aiming to limit the impact or even to help us benefit relative to our colleagues and competitors.  Specialization appears to be a typical solution to coping with complexity, but an ineffective one without other supporting strategies; health care is an example. 

The Logic of Failure is an exceptional introduction to the impact of CAS on each of us. 
