Monday, February 6, 2012

System Dynamics


System dynamics modeling is an important technique for analyzing the temporal behavior of complex systems.  It is a method engineers can apply to many situations that require careful analysis of temporality and causality.

Many of the systems engineers design are subject to dynamic complexity (engineers are also faced with combinatorial complexity - - finding the best solution out of an astronomical number of combinations).  Dynamic complexity arises from the interactions among the agents over time.  Even a simple model of the automobile market contains dynamic complexity: new car inventory, late model cars on the road, late model car inventory, inventory coverage, average trade-in time, and the attractiveness of new cars together form a complex "stock and flow" picture of the market over time.
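
To give a feel for the stock-and-flow perspective, here is a minimal Python sketch of a drastically simplified two-stock version of that market.  The structure and every name and parameter value (production_rate, sales_fraction, trade_in_time, the initial stock levels) are illustrative assumptions of mine, not figures from any published model.

    # Minimal stock-and-flow sketch of a drastically simplified automobile market.
    # Two stocks: new-car inventory and late-model cars on the road.
    # Every structural choice and parameter below is an illustrative assumption.

    DT = 0.25                        # integration time step (months)
    MONTHS = 60                      # simulated horizon

    new_car_inventory = 100_000.0    # stock: unsold new cars
    cars_on_road = 1_000_000.0       # stock: late-model cars in use

    production_rate = 50_000.0       # flow in: cars built per month (held constant)
    sales_fraction = 0.25            # fraction of inventory sold per month
    trade_in_time = 48.0             # average months an owner keeps a late-model car

    for step in range(int(MONTHS / DT)):
        sales = sales_fraction * new_car_inventory    # flow: inventory -> road
        trade_ins = cars_on_road / trade_in_time      # flow: road -> traded in / retired
        coverage = new_car_inventory / sales          # months of inventory coverage

        # Euler integration of the two stocks.
        new_car_inventory += DT * (production_rate - sales)
        cars_on_road += DT * (sales - trade_ins)

        if step % int(12 / DT) == 0:                  # print once per simulated year
            print(f"month {step * DT:5.1f}: inventory={new_car_inventory:9.0f}  "
                  f"coverage={coverage:4.1f} mo  on_road={cars_on_road:11.0f}")

Even in this toy version, the flows are coupled through the stocks: sales drain the inventory that production refills, and the same sales feed the stock of cars on the road that trade-ins later drain.  The behavior over time comes from that coupling, not from any single rate in isolation.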

One of the best books on the business side of system dynamics is Business Dynamics: Systems Thinking and Modeling for a Complex World (2000) by John D. Sterman.  Sterman explains that dynamic complexity arises because systems are:
  • Dynamic - - Heraclitus said, "All is change."  What appears to be unchanging is, over a longer time horizon, seen to vary.  Change in systems occurs at many time scales, and these different scales sometimes interact.  A star evolves over billions of years as it burns its hydrogen fuel, then explodes as a supernova in seconds.  Bull markets can go on for years, then crash in a matter of hours.
  • Tightly Coupled - - The actors in the system interact strongly with one another and with the natural world.  Everything is connected to everything else.  As a famous bumper sticker from the 1960s proclaimed, "You can't do just one thing."
  • Governed By Feedback - - Because of the tight couplings among actors, our actions feed back on themselves.  Our decisions alter the state of the world, causing changes in nature and triggering others to act, thus giving rise to a new situation which then influences our next decisions.  Dynamics arise from these feedbacks.
  • Nonlinear - - Effect is rarely proportional to cause, and what happens locally in a system (near the current operating point) often does not apply in distant regions (other states of the system).  Nonlinearity often arises from the basic physics of systems: Insufficient inventory may cause you to boost production, but production can never fall below zero no matter how much excess inventory you have.  Nonlinearity also arises as multiple factors interact in decision making: Pressure from the boss for greater achievement increases your motivation and effort - up to the point where you perceive the goal to be impossible.  Frustration then dominates motivation and you give up or get a new boss.
  • History-Dependent - - Taking one road often precludes taking others and determines where you end up (path dependence).  Many actions are irreversible: You can't unscramble an egg (the second law of thermodynamics).  Stocks and flows (accumulations) and long time delays often mean doing and undoing have fundamentally different time constants: During the 50 years of the Cold War arms race the nuclear nations generated more than 250 tons of weapons-grade plutonium.  Its half-life is about 24,000 years.
  • Self-Organizing - - The dynamics of systems arise spontaneously from their internal structure.  Often, small, random perturbations are amplified and molded by the feedback structure, generating patterns in space and time and creating path dependence.  The pattern of stripes on a zebra, the rhythmic contraction of your heart, the persistent cycles in the real estate market, and structures such as sea shells and markets all emerge spontaneously from the feedbacks among the agents and elements of the system.
  • Adaptive - - The capabilities and decision rules of the agents in complex systems change over time.  Evolution leads to selection and proliferation of some agents while others become extinct.  Adaptation also occurs as people learn from experience, especially as they learn new ways to achieve their goals in the face of obstacles.  Learning is not always beneficial, however.
  • Counterintuitive - - In complex systems cause and effect are distant in time and space while we tend to look for causes near the events we seek to explain.  Our attention is drawn to the symptoms of difficulty rather than the underlying cause.  High leverage policies are often not obvious.
  • Policy Resistant - - The complexity of the systems in which we are embedded overwhelms our ability to understand them.  The result: Many seemingly obvious solutions to problems fail or actually worsen the situation.
  • Characterized By Trade-offs - - Time delays in feedback channels mean the long-run response of a system to an intervention is often different from its short-run response (the sketch after this list shows a simple delay-driven example).  High leverage policies often cause worse-before-better behavior, while low leverage policies often generate transitory improvement before the problem grows worse.
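
Several of these points - - feedback through a decision rule, the nonnegativity of production, and the way delays separate short-run from long-run response - - can be made concrete with a small simulation.  The following Python sketch is an illustrative inventory-control loop of my own construction, not a model taken from Sterman's book, and every parameter is an assumption.  Production starts chase an inventory gap but can never go negative, and because completions lag starts by several weeks, a step increase in demand drives the stock down before the correction overshoots and rings back.

    # Feedback loop with a delay and a nonnegativity constraint (illustrative only).
    DT = 0.25                     # integration time step (weeks)
    WEEKS = 100                   # simulated horizon

    demand = 100.0                # customer demand (units/week)
    desired_inventory = 400.0     # target stock
    adjust_time = 2.0             # weeks over which to close the inventory gap
    production_delay = 6.0        # weeks between a production start and completion

    inventory = 400.0                           # stock of finished goods
    supply_line = demand * production_delay     # stock of work in process (steady state)

    for step in range(int(WEEKS / DT)):
        t = step * DT
        if t >= 10.0:
            demand = 120.0        # unanticipated step increase in demand at week 10

        shipments = demand                               # assume orders are always filled
        completions = supply_line / production_delay     # first-order production delay

        # Decision rule (the feedback): replace expected losses and close the gap,
        # but production starts can never fall below zero (the nonlinearity).
        gap = desired_inventory - inventory
        starts = max(0.0, demand + gap / adjust_time)

        inventory += DT * (completions - shipments)      # Euler integration of the stocks
        supply_line += DT * (starts - completions)

        if step % int(5 / DT) == 0:
            print(f"week {t:5.1f}: inventory={inventory:7.1f}  starts={starts:7.1f}")

Because the decision rule ignores the supply line of work already in process, the inventory falls below target, then overshoots and oscillates for several cycles before settling: the short-run and long-run responses to the same intervention look very different, which is exactly the trade-off the time delays create.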
