A Systems Approach to Crisis Preparedness and Organizational Resilience


Wildavsky: Anticipation Versus Resilience

From Searching for Safety, Aaron Wildavsky (Transaction Publishers, 1988)
Chapter 4. Anticipation Versus Resilience, pages 77-79

Anticipation is a mode of control by a central mind; efforts are made to predict and prevent potential dangers before damage is done. Forbidding the sale of certain medical drugs is an anticipatory measure. Resilience is the capacity to cope with unanticipated dangers after they have become manifest, learning to bounce back. An innovative biomedical industry that creates new drugs for new diseases is a resilient device. Are risks better managed, we may ask, by trying to anticipate and prevent bad outcomes before they occur, or by trying to mitigate such effects after they have shown up? What proportion of anticipation and of resilience (since we need both capacities) is desirable under which conditions?

Anticipation attempts to avoid hypothesized hazards; resilience is concerned with those that have been realized. If it were possible (1) always to predict efficiently with a high degree of accuracy, that is, to guess right often enough to make up for the costs of guessing wrong, and then (2) to react effectively (controlling the expected condition so as to leave life better off), anticipation would seem to be the preferred strategy. Why court unnecessary damage? Usually, however, the uncertainties are so substantial that we cannot tell in advance which, if any, among a multitude of hypothesized dangers will actually turn out to be the real ones. In focusing on a specific hazard that might have been averted, it is easy to lose sight of the many false predictions that were made at the same time. How, then, with the best will and the brightest thinkers in the world, can we know in advance which dangers will really come about? Wrong guesses are not merely a single person's error. Presumably, each "guesstimate" would be followed up with preventive (central) governmental measures. In addition to the cost of using up society's resources on false leads (that is, leaving insufficient resources to counter unexpected dangers), each preventive program contains its own pitfalls. Inoculations against disease or agricultural subsidies or regulation of industry can do injury: inoculations make some people sick; subsidies raise prices, thereby lowering the standard of living, and encourage overproduction, thereby driving farmers out of business; negative regulations deny people access to drugs that might help them. Each course of action also interacts with existing patterns and creates new developments that we do not yet understand but that may turn out to be harmful. All actions, including those that are intended to increase safety, are potential hazards.

I stress the counterintuitive implications of anticipation as a strategy for securing safety because this should guard us (and policy makers as well) against the facile conclusion that the best way to protect people is always to reduce in advance whatever hypothetical risk may be imagined, rather than enabling people to cope in a resilient fashion with dangers when, as, and if they manifest themselves. Are we better off doing nothing unless we are absolutely certain it is safe, or are we better off doing as much as we can, ruling out only high probability dangers that we can effectively prevent and relying otherwise on our ability to deal with harms as they arise?

Resilience

Ecologist C. S. Holling compares anticipation as a means of controlling risk with the capacity to cope resiliently [1]:

Resilience determines the persistence of relationships within a system. Stability, on the other hand, is the ability of a system to return to an equilibrium state after a temporary disturbance. With these definitions in mind a system can be very resilient and still fluctuate greatly, i.e., have low stability. I have touched above on examples like the spruce budworm forest community in which the very fact of low stability seems to introduce high resilience. Nor are such cases isolated ones, as Watt has shown in his analysis of thirty years of data collected for every major forest insect throughout Canada by the Insect Survey Program of the Canada Department of the Environment. This statistical analysis shows that in those areas subjected to extreme climatic conditions populations fluctuate widely but have a high capability of absorbing periodic extremes of fluctuation. In more benign, less variable climatic regions the populations are much less able to absorb chance climatic extremes even though the populations tend to be more constant.

To repeat: "low stability seems to introduce high resilience." Yet the very purpose of anticipatory measures is to maintain a high level of stability. Anticipation seeks to preserve stability: the less fluctuation, the better. Resilience accommodates variability; one may not do so well in good times but learns to persist in the bad. As Holling sums up [2]:

The very approach, therefore, that assures a stable maximum sustained yield of a renewable resource [say, a single variety of wheat or complete elimination of predators or toxic substances] might so change these deterministic conditions that the resilience is lost or reduced so that a chance and rare event that previously could be absorbed can trigger a sudden dramatic change and loss of structural integrity of the system.

Grass bends before wind but usually does not break; it is temporarily unstable. Trees maintain strong stability under high winds but when they break, they have no resilience. The damage is irreversible, even after the wind has died down. There is a Darwinian explanation for this variation in vulnerability. With a uniform environment (i.e., one that can be predicted with high certainty), the best adapted genetic variants in the population will quickly spread and dominate the rest, producing genetic uniformity; then, when the environment does change in an unexpected way, the whole population tends to die out at once. In a changing environment (i.e., one in which any single prediction has a low probability of actually coming to pass), by contrast, first one variant is favored, then another, and another, so that all of them tend to be maintained in the population as a whole. When a change occurs, therefore, several genetic variants are available, and some will like the new environment.

Though the language of Holling's concept of resilience is abstract, it has concrete policy implications: The experience of being able to overcome unexpected danger may increase long-term safety; but maintaining a state of continuous safety may be extremely dangerous in the long run to the survival of living species, since it reduces the capacity to cope with unexpected hazards. Keeping "out of harm's way" (something Don Quixote preached but never practiced, hence his longevity) may be harmful.

"We are not fully free," Friedrich Hayek warns, "to pick and choose whatever combination of features we wish our society to possess, or to...build a desirable social order like a mosaic by selecting whatever particular parts we like best." [3] The good and the bad are inextricably mixed, says the axiom of connectedness; all we can do is try to choose a strategy that over time will leave us better rather than worse off.

There is nothing inherently better in the one strategy over the other in all situations. Rather, each (anticipation and resilience) is well suited to different conditions. An environment with periodic extremes would correspond to a situation where uncertainties are large …

Anticipation is preferable if:

1. we can predict the worst. We can’t.

Misprediction in military affairs is legendary. Failure to predict what has occurred vies for honors with predictions that never came to pass. The belief in the 1930s …

2. the foresight that does exist is used in policy. It isn’t: flood plains continue to be populated.

Anticipation doesn’t work well because we don’t understand [2]:

1. human systems
2. systems in general



[1] Holling, C. S. (1973). "Resilience and Stability of Ecological Systems." Annual Review of Ecology and Systematics 4: 1-23 (quote from p. 17).
[2] Ibid., p. 21.


Date Page Created: Apr 13, 2014 Last Page Update: Apr 13, 2014