Complexity and Robustness

  • Complexity: Networks, cascading failures and implications for prediction of complex systems
  • Robustness and tradeoffs
  • HOT: Models for complex systems
  • Our papers on HOT

Complexity, interconnection, and prediction

From almost every perspective, global interconnectivity is on the rise. Satellite networks and communication systems allow essentially instantaneous access to data and images worldwide, fluctuations in Tokyo's financial markets are felt in New York and London, and overpopulation, pollution, deforestation, and global warming are widely recognized as everybody's problem. Social, economic, technological, and environmental systems are simultaneously becoming increasingly entangled and interconnected with each other. This globalization offers potential benefits including a higher standard of living, increased access to information, more sophisticated health care, and the possibility of providing economic incentives to protect environmentally sensitive areas. Simultaneously, it carries potentially catastrophic risks associated with cascading failures which may propagate rapidly when systems are strongly interdependent and densely connected. Examples include cascading delays in transportation and communications systems, as well as errant policies or technological blunders that initiate a whirlwind of economic, political, and environmental consequences.

Of course, engineers and policymakers work extremely hard to design systems that will not break down, despite a great deal of uncertainty in the environment in which they operate and the components from which they are built. Few of us would get on an airplane if this were not the case. At the same time, we accept the fact that there are no guarantees. Similarly, biological evolution favors organisms that are tolerant to the variations in weather and nutrients that are common during their lifetimes, while selective pressure does little to protect organisms against rare disturbances. In advanced systems, the robustness architectures that stabilize systems and minimize the propagation of damage are so dominant and pervasive that we often take them for granted. At their best, advanced technologies and organisms combine complicated internal networks and feedback loops with sloppy parts to produce systems so robust that they create the illusion of very simple, reliable, and consistent behavior, apparently unperturbed by the environment. Nonetheless, failures occur, and with increased connectivity comes the potential for increased social, economic, and environmental cost. Can anything be done to predict these events and mitigate their damage?

Robustness and Tradeoffs

A key insight comes from the observation that robustness involves tradeoffs across a broad spectrum of environmental influences that may initiate cascading failure events. Indeed, robustness can be viewed as the underlying mechanism leading to complexity. Robustness architectures dominate the genomes of most organisms and the part counts of modern technologies, and they provide opportunities for higher fitness and/or performance because of the stability they create. This is the essence of the Highly Optimized Tolerance (HOT) theoretical framework linking robustness and complexity, which I introduced recently with my collaborator John Doyle. HOT describes systems fine-tuned for high performance despite uncertainties in the environment and components. Its essential features are highly specialized, structured, hierarchical internal configurations, and robust, yet fragile external behavior.

Fragilities arise when the very architectures that lead to robustness under one set of circumstances backfire, leading to extreme sensitivity in other cases. One example is the automobile airbag, which protects passengers in high-speed, head-on collisions but poses a danger to small children riding in the front seat, even under low-speed or stationary deployment. Another is the immune system, which protects people from common colds and viruses but occasionally backfires, attacking an individual's own cells in an autoimmune disease. Such failure modes represent hypersensitivities that are intrinsically coupled to the robustness mechanism itself. While higher performance is achieved overall when the robustness architecture is included, under rare circumstances the outcome is worse than it would be if the architecture were not there at all. In many cases, this eventually leads to additional layers of complexity, as in the case of increasingly sophisticated airbag deployment circuitry involving sensors that estimate the passenger's size. Iteration of this process culminates in a complexity spiral, in which new features are developed to mitigate sensitivities associated with previous layers. Because robustness is created by very specific internal structures, when any of these systems is disassembled there is very little latitude in reassembly if a working system is expected. Even the rare cascading failure that is the fragile side of HOT complexity reveals only a limited glimpse of a system's internal architecture. Nonetheless, studying a system's robustness mechanisms provides a potential pathway to predicting its fragilities.

Models for complex systems

HOT was developed initially using the models of statistical physics, modified to include a primitive form of robust design. Imagine your goal is to design a toy forest on a checkerboard landscape, where occupied sites correspond to trees and vacancies correspond to firebreaks. Occasionally a spark hits the forest, due to lightning or some other source. If it hits a firebreak, nothing happens; if it hits a tree, the fire burns it and propagates through the connected cluster of nearest-neighbor occupied sites. Furthermore, sparks are more common in some parts of the forest than in others. If your goal is to maximize the yield of trees, given the possibility of fire, what is the optimal strategy? While this is not a realistic model of forest management, it serves to illustrate the basic tradeoff between maximizing functionality under ideal circumstances (represented by high densities) and the need to devote resources to robustness architectures that protect the system against a spectrum of disturbances. HOT configurations consist of compact, high-density, cellular patterns of contiguous trees, separated by efficient, linear firebreaks. Optimal patterns are much more robust to fires than random configurations at similar densities, but they are also extremely sensitive to changes in the spatial distribution of sparks or to flaws in the barrier patterns. In many cases, the "robust, yet fragile" nature of HOT systems leads to heavy tails, or power law statistics, in the distribution of failure events. Models based on the HOT mechanism have been very successful in quantitatively describing the statistical distributions of forest fires, World Wide Web traffic, and electrical power outages. Heavy tails reflect tradeoffs in systems characterized by high densities and high throughputs, where many internal variables are tuned to favor small losses in common events at the expense of large losses under rare or unexpected perturbations, even when the perturbations themselves are infinitesimal.
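
The toy forest model is concrete enough to sketch in code. The Python fragment below is an illustrative sketch only, not the published HOT lattice model: the grid size, the exponentially skewed spark distribution, and the greedy one-tree-at-a-time design rule are all assumptions chosen for simplicity. It defines the expected yield (tree density minus the expected loss from a single spark), grows a designed forest by repeatedly planting the tree that most increases that yield, and compares the result with a random forest at the same density.

    import random
    from math import exp

    N = 12  # lattice size; kept small so the brute-force greedy search runs in seconds

    # Assumed spark distribution: sparks are much more likely near the top-left
    # corner, with probability decaying exponentially with distance from it.
    def spark_weight(i, j, scale=N / 4):
        return exp(-(i + j) / scale)

    P = [[spark_weight(i, j) for j in range(N)] for i in range(N)]
    total = sum(sum(row) for row in P)
    P = [[w / total for w in row] for row in P]  # normalize to a probability distribution

    def cluster_sizes(grid):
        """For each occupied site, the size of the nearest-neighbor cluster containing it."""
        size = [[0] * N for _ in range(N)]
        seen = [[False] * N for _ in range(N)]
        for i in range(N):
            for j in range(N):
                if grid[i][j] and not seen[i][j]:
                    stack, cluster = [(i, j)], []
                    seen[i][j] = True
                    while stack:
                        x, y = stack.pop()
                        cluster.append((x, y))
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            nx, ny = x + dx, y + dy
                            if 0 <= nx < N and 0 <= ny < N and grid[nx][ny] and not seen[nx][ny]:
                                seen[nx][ny] = True
                                stack.append((nx, ny))
                    for x, y in cluster:
                        size[x][y] = len(cluster)
        return size

    def expected_yield(grid):
        """Tree density minus the expected fraction of trees burned by one spark drawn from P."""
        size = cluster_sizes(grid)
        density = sum(sum(row) for row in grid) / N**2
        expected_loss = sum(P[i][j] * size[i][j] for i in range(N) for j in range(N)) / N**2
        return density - expected_loss

    # Greedy "design": repeatedly plant the single tree that most increases the
    # expected yield, and stop when no further addition helps.
    grid = [[0] * N for _ in range(N)]
    while True:
        base = expected_yield(grid)
        best_gain, best_site = 0.0, None
        for i in range(N):
            for j in range(N):
                if not grid[i][j]:
                    grid[i][j] = 1
                    gain = expected_yield(grid) - base
                    grid[i][j] = 0
                    if gain > best_gain:
                        best_gain, best_site = gain, (i, j)
        if best_site is None:
            break
        grid[best_site[0]][best_site[1]] = 1

    # Compare the designed forest with a random forest at the same density.
    density = sum(sum(row) for row in grid) / N**2
    rand = [[1 if random.random() < density else 0 for _ in range(N)] for _ in range(N)]
    print("designed yield:", round(expected_yield(grid), 3))
    print("random yield:  ", round(expected_yield(rand), 3))

On a typical run the designed forest reaches a high density, with firebreaks concentrated where sparks are most likely, and its expected yield exceeds that of the random forest at the same density. It is also exactly the kind of configuration that becomes fragile if the spark distribution shifts, which is the "robust, yet fragile" tradeoff in miniature.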

A major component of the research in my group involves development of the HOT framework linking complexity and robustness, and the pursuit of specific applications in ecology, biology, and technological networks. Our objectives are loosely divided into four overlapping focus areas:

HOT is motivated by biology and engineering, and builds on the mathematics of control, communications, and computing. While physics focuses primarily on universal properties of generic ensembles of isolated systems, control theory studies specific, often highly stylized systems in terms of their input-output characteristics. Control theory provides a mathematical framework for describing systems that are coupled to other systems, identifying sensitivities, and systematically determining the important internal variables for a system with particular objectives immersed in a variable environment. Suitably generalized, these techniques promise to be powerful, although their consequences outside of the controls community are currently largely unexplored. HOT provides an appealing base for the development of a broad framework for characterizing complex systems. Questions related to robustness, predictability, verifiability, and evolvability arise in a wide range of disciplines and demand sharper definitions and new tools for analysis. If complex systems are intrinsically composed of extremely heterogeneous collections of objects, combined into intricate, highly structured networks with hierarchies and multiple scales, then HOT provides a means to develop common ground among models, methods, and abstractions developed in different domains.

Developing models of varying resolution, ranging from the tractable models on which the basic HOT framework is built to complex, domain-specific application models for technological, economic, ecological, and biological systems, will connect our new tools with real-world applications and inspire new questions for theoretical consideration. The real test of a general framework for understanding complex systems is the extent to which it can provide new insights that impact future technologies, policies, medicines, and so on. This work has begun on several fronts and involves a spectrum of interdisciplinary collaborations. For the case of forest fires, with collaborators in geography I have developed a sophisticated new fire regime simulation environment to investigate fundamental properties of fire regime dynamics and the long-term effects of evolution and suppression on terrestrial landscapes. This work provides a fundamentally new perspective on the dynamics of disturbance-prone ecosystems and how humans interact with them. The findings will be of significance to science and management, as human disruption of natural disturbance regimes is widespread and increasing.

Click here for publications on complexity & robustness.