Implications of Nonlinearity and Complexity

… a core concept used in Implementation and Delivery and Atlas107

Concept description

Leslie Pal (reference below) reviews the implications of nonlinearity and complexity in public policy problems. He writes (p. 314):

“Crises involve discontinuities and jumps that create challenges. … if normal policymaking may be seen as incremental, with each step more or less predictably or controllably emanating from the last, then what is nonincremental, unpredictable, and not immediately controllable has the potential to create crisis.”

Nonlinearity in policy problems

Pal defines nonlinear policy problems as “problems where small changes in initial conditions can have large consequences, where uncertainty is high, and where there are discontinuities in normal events and shared responsibilities for action” (p. 343). He writes (pp. 315-316):

“Beyond the element of surprise … crises or emergencies as nonlinear policy problems are collective action problems. Comfort (1999), for example, looks at what she calls “shared risk” problems – crises or emergencies that affect large communities – as nonlinear policy problems. Her 1999 classic study reviewed examples of earthquakes and emergency response to them, and concluded that they were characterized by small, unpredictable shifts and ripple effects, where the methods to address them “differ from those used in traditional policy analysis” (Comfort, 1999, p. 4).

“The idea of nonlinearity in policy problems of emergency situations poses more than an incidental challenge to policy analysis. … the conventional discipline is based on a rational model, which, in turn, has deeply embedded assumptions of what constitutes knowledge and the techniques to generate knowledge, as well as how that knowledge is used – essentially as information through communication. “Breakdowns” occur through communication or informational failures. Comfort (2007) argues that in a case like Hurricane Katrina (2005), information about the impending storm and its communication were remarkably accurate; the underlying problem was “cognition,” or developing a shared picture of the likely threat among a very heterogeneous group of actors (multiple jurisdictions at multiple levels, with private sector and nonprofit organizations as well). The rational model of decisionmaking assumes linearity – reasoning from information to decisions. In emergencies, whether floods or fires, that process is too slow. “Emergency managers using cognition do not review the entire set of rules of operation for the system but rather scan the margins for discrepancies or malfunctions. It is the discrepancy between what they view as normal performance and the change in status of key indicators that alerts them to potential danger” (Comfort, 2007, p. 193). Klein, for example, shows that actors involved in providing emergency services (e.g., firefighters, crisis ward nurses) rely on “recognition-primed decisionmaking” that draws rapidly and almost intuitively on analogies of previous experiences (Klein, 1998).

“Another strand in this critique of linear reasoning is based on the challenge of recent scientific theories that break with Newtonian mechanics and deterministic science. These theories include quantum mechanics, complexity theory, chaos theory, and cognitive science (Morçöl, 2002). The difference is between seeing the world as a clock or as a cloud. A clock has deterministic mechanisms, the system works on the basis of clear causality and connections, and the clock, as a whole, is stably configured as a clock and nothing else. Imagine a cloud, either of vapour or of thousands of tiny insects. Its boundary is constantly shifting and changing, its shape elongates and contracts, and yet it is still recognizably a cloud. The mechanisms of interaction are much more challenging to explain – there are no pulleys or gears, only what appear to be random interactions of particles or insects that constantly ripple through the system. And yet the system, with all of its disparate elements, manages to maintain an equilibrium, and is self-correcting and adaptive.”
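
Pal's observation that "small changes in initial conditions can have large consequences" is the signature property of chaotic systems. It can be sketched numerically with the logistic map, a standard textbook example from chaos theory; the model and parameter values here are illustrative assumptions, not anything drawn from Pal or Comfort:

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): a classic minimal model of chaos.
# With r = 4.0 the map is chaotic, so two trajectories that start a
# hair's breadth apart diverge until they are effectively unrelated.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)   # "initial conditions" of system A
b = logistic_trajectory(0.200001)   # system B: differs by one part in a million

for n in (0, 10, 25, 50):
    print(f"step {n:2d}: |A - B| = {abs(a[n] - b[n]):.6f}")
```

Two starting points that differ by one part in a million become effectively unrelated within a few dozen iterations, which is precisely the property that makes long-range prediction, and hence strictly linear planning, unreliable.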

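Comfort's point that emergency managers "do not review the entire set of rules of operation for the system but rather scan the margins for discrepancies" can be caricatured in a few lines of code. The indicators and their normal bands below are invented purely for illustration:

```python
# A toy version of Comfort's "scan the margins" idea: rather than
# re-evaluating the whole rule set, compare a few key indicators
# against their normal operating bands and surface only the anomalies.
# Indicator names and band values are invented for illustration.

NORMAL_BANDS = {
    "river_level_m": (1.0, 4.5),
    "wind_speed_kmh": (0.0, 60.0),
    "call_volume_per_hr": (50, 400),
}

def scan_margins(readings):
    """Return the indicators whose current readings fall outside normal bands."""
    alerts = {}
    for name, value in readings.items():
        lo, hi = NORMAL_BANDS[name]
        if not lo <= value <= hi:
            alerts[name] = value
    return alerts

print(scan_margins({"river_level_m": 5.2, "wind_speed_kmh": 40.0,
                    "call_volume_per_hr": 120}))
# → {'river_level_m': 5.2}
```

Only the discrepancy between normal performance and the changed indicator is reported, which is what alerts the manager to potential danger.
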
Comfort’s application of chaos and complexity theory

Pal writes (p. 316):

“Normal policymaking and normal responses can be organized hierarchically, but in emergencies with high stress, high uncertainty, and multiple actors, hierarchy breaks down quickly. What are needed instead are dynamic, complex, nonlinear systems of adaptive self-organization. Comfort (1999, pp. 8–9) provides an example, drawing eight key concepts from the scientific literature on complexity and chaos that describe elements of these types of systems.

  1. First, the evolution and dynamics of these communities, like all complex systems, depend greatly on initial conditions and characteristics of the system. Even small differences among systems in these initial conditions can have far-reaching consequences.
  2. Second, random events occurring outside the system can have great effects on the system itself and take it into unpredicted directions.
  3. Third, these random events are irreversible within the system in the sense that whatever impacts they have become part of the system itself – Comfort’s example is how an unexpected earthquake led to revisions in building codes that significantly altered construction in seismic zones in California.
  4. Fourth, feedback loops of communication and coordination lead to adaptation by mutual adjustment (our cloud example above, or a school of fish).
  5. Fifth, because multiple actors create constraints for action through the need for coordination, centres of energy and influence crop up in these systems (leaders or “strange attractors”) that move the system forward.
  6. Sixth, this forward motion in a complex system can involve a transition to a new equilibrium of a substantially changed system.
  7. Seventh, the behaviour of the system often yields unpredictable results.
  8. Eighth, and finally, these systems can develop recurring patterns of behaviour in different contexts to achieve similar system-wide goals.”

Gladwell’s tipping point theory

“A final illustration of thinking differently about complex systems and change that casts light on shifts and jumps is the idea of the tipping point. Gladwell (2000) argues that the world is full of instances – from fashion to crime rates to drug use – of often abrupt, dramatic, and inexplicable changes. Certain books and clothing styles appear out of nowhere and become social phenomena. Inner-city crime can suddenly leap up or decline precipitously. These phenomena are equivalent to systems that undergo a sudden transition. Gladwell is focusing on social systems and rapid transitions, and argues that these tipping points are analogous to epidemics and function with similar mechanisms. “Epidemics are a function of the people who transmit infectious agents, the infectious agent itself, and the environment in which the infectious agent is operating” (Gladwell, 2000).”
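
The epidemic analogy has a standard textbook formalization, the SIR (susceptible–infected–recovered) model, in which the tipping point is the reproduction number R0 = 1. The sketch below uses that model with invented parameter values; nothing here comes from Gladwell or Pal:

```python
# A minimal deterministic SIR sketch of the epidemic analogy (illustrative
# assumptions throughout). The tipping point is R0 = beta/gamma = 1:
# below it the same small seed fizzles, above it the outbreak explodes.

def epidemic_size(beta, gamma=0.5, s0=0.999, i0=0.001, steps=2000, dt=0.05):
    """Integrate simple SIR dynamics; return the final fraction ever infected."""
    s, i, r = s0, i0, 0.0
    for _ in range(steps):
        new_inf = beta * s * i * dt   # transmission: agent meets environment
        new_rec = gamma * i * dt      # recovery removes transmitters
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return 1.0 - s                    # fraction of the population ever infected

print(epidemic_size(beta=0.4))   # R0 = 0.8: below the tipping point
print(epidemic_size(beta=1.0))   # R0 = 2.0: above it
```

A modest change in one parameter carries the system across the threshold, and the identical seed produces a qualitatively different outcome: the "abrupt, dramatic" transition Gladwell describes.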

Pal’s five implications of nonlinearity and complexity

Pal concludes his review with five implications of nonlinearity and complexity for practical policymaking (excerpted and reformatted from pages 318-319):

  1. For one thing, initial conditions make a difference in how systems evolve. Coupled with this is an emphasis on a system of interactions, not on a single problem (Perrow, 1984). Conventional policy analysis … begins with problem structuring. Even though we alluded to the fact that problems are complex and that they come in clusters, it is still largely an analysis that focuses on one vector of issues or challenges. And it presumes that one can make an intervention along that vector and change causal relations and thus change outcomes. An example might be an inner-city community-based program to counsel teenagers on sexuality and drugs. A conventional approach would see the “problem” in terms of negative outcomes from unsafe sexual practices and dependency on drugs. Obviously, the analysis would highlight the role of schools and families, but it would target youth as “clients” of the program and seek interventions. A complex systems approach would look at teenagers within the community context and at “systems” embedded in teenage groups (peer pressure, social bonding, what is considered “cool”) and take that as a point of departure. It might also be sensitive to the fact that the same program might have very different results depending on what community and what groups of youths were being supported – the initial conditions of each system would make a big difference in outcomes.
  2. A second implication is the importance and, indeed, the inevitability of small, random events that can shock a system and change its trajectory dramatically. “Planning for the unexpected” is, of course, difficult to do, but this mindset encourages a mentality of monitoring the environment as well as internal processes regularly. It might also encourage deliberate redundancy in systems that are fragile or whose failure will have far-reaching consequences.
  3. A third implication is that large changes can come from small interventions. This insight is embedded in the idea of tipping points, but it is a feature, as well, of other complex systems theories. From a policy point of view, it implies that significant change can be generated from small, focused interventions that take system characteristics seriously.
  4. Fourth, in line with the notion that complex systems have internal mechanisms of modest equilibrium, there is an emphasis on feedback loops of communication and information. When a school of fish is observed in motion, it seems almost like a single organism, even while it is made up of hundreds of actors mutually adjusting to the myriad of each other’s actions. Complex social systems – from loose coalitions to formal organizations – also require inordinate amounts of communication and information exchange to work. This lesson comes from the implementation literature, of course, but it takes on a new angle in the context of complex systems theory. In traditional implementation thinking, the problem is communication from above down the line, to ensure that the original policy idea unfolds as planned and that everyone is more or less operating in the same framework. From a complex systems perspective, the communication has to be 360 degrees in three dimensions. It is about information moving up as well as sideways and from the top down. It is about information and communication as the loose glue that ties the system together, but also makes it possible to adapt and adjust.
  5. The final implication is about adaptation. The conventional approach to policy analysis and implementation sees it as a linear unfolding. But bring in randomness, unpredictability, and the impact of small events, and it is likely that at any given point, the system (an organization, a political party) will need to respond to shocks and either regroup or transform itself in the light of that external shock. This need means adaptation as well as learning that maintains the integrity of the system in some fashion. There needs to be a capacity to build on experiences, incorporate them into practice, and embed them into some sort of collective memory.
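
The mutual adjustment described in implication 4 (a school of fish aligning without any central controller) can be sketched as repeated local averaging, a minimal consensus model. The agents, adjustment rate, and neighbour structure below are illustrative assumptions, not anything from Pal:

```python
# Coordination by mutual adjustment: each agent repeatedly nudges its
# "heading" toward the average of its immediate neighbours. No agent
# sees the whole group, yet the group converges on a common direction.

def align(headings, rate=0.5, rounds=200):
    """Repeated local averaging on a line of agents; returns final headings."""
    h = list(headings)
    for _ in range(rounds):
        nxt = []
        for k, x in enumerate(h):
            neighbours = [h[j] for j in (k - 1, k + 1) if 0 <= j < len(h)]
            target = sum(neighbours) / len(neighbours)  # what nearby agents do
            nxt.append(x + rate * (target - x))         # adjust partway toward it
        h = nxt
    return h

school = align([10.0, 80.0, 35.0, 60.0, 5.0])
print(f"spread after adjustment: {max(school) - min(school):.4f}")
```

Information flows only sideways, between neighbours, yet the system as a whole aligns: communication as the loose glue that ties the system together while letting it adapt.
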

Atlas topic, subject, and course

Managing Risk (core topic) in Implementation and Delivery and Atlas107.

References

Pal, L. A. (2014). Beyond Policy Analysis: Public Issue Management in Turbulent Times (5th ed.). Toronto, ON: Nelson Education. See Beyond Policy Analysis – Book Highlights.

Comfort, L. K. (1999). Shared risk: Complex systems in seismic response. Amsterdam, Netherlands: Pergamon.

Comfort, L. K. (2007). Crisis management in hindsight: Cognition, communication, coordination, and control. Public Administration Review, 67(s1), 189–197.

Gladwell, M. (2000). The tipping point: How little things can make a big difference. Boston, MA: Little, Brown.

Klein, G. A. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.

Morçöl, G. (2002). A new mind for policy analysis: Toward a post-Newtonian and postpositivist epistemology and methodology. Westport, CT: Praeger.

Perrow, C. (1984). Normal accidents: Living with high-risk technologies. New York, NY: Basic Books.

Page created by: Ian Clark, last modified 10 April 2017.

Image: Widewalls, Ludo’s Chaos Theory, accessed 9 April 2017.