The Long Road to Recovery: Community Responses to Industrial Disasters (UNU, 1996, 307 p.)

4 Seveso: A paradoxical classic disaster

The lessons of Seveso
Many students of disaster have concluded that uncertainty and communication are key factors in the management of emergencies. During emergencies, uncertainty increases and formerly dominant consensual views of problems and solutions often break down; different parties tend to evaluate the same evidence differently and, at times, tend to perceive different sorts of evidence. Such divergent interpretations create antagonisms and mistrust, which persist after the acute phase of an emergency has ended and complicate the tasks of recovery (Quarantelli 1988; Otway and Wynne 1989).
Our study of Seveso and other disasters (De Marchi, Funtowicz, and Ravetz 1993) suggests that there are six basic types of uncertainty (table 4.2) and eight distinctive strategies for managing the communication of uncertainty (table 4.3). Together, these two sets of variables provide the basis for a model of uncertainty management that has broad applicability.
Table 4.2 Types of uncertainty

| Type | Description |
| --- | --- |
| Situational | Inadequacy of available information in relation to necessary decisions |
| Legal/moral | Possibility of future liability or guilt for actions or inactions |
| Societal | Absence or scarcity of integration of publics and institutions |
| Institutional | Withholding of information by agencies for bureaucratic reasons |
| Proprietary | Contested rights to know, to warn, or to conceal |
| Scientific | Difficulty of risk assessment or of forecasts of emergencies |
Table 4.3 Strategies for communication of uncertainty

| Interpretations | Policies |
| --- | --- |
| Suppression | Secrecy |
| Discounting | Confidentiality |
| Recognition | Publicity |
| Amplification | Sharing |
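The two tables can be read together as a simple schema: each type of uncertainty in table 4.2 can, in a given situation, be assigned an interpretation and a communication policy from table 4.3, together with a judgement of how salient it is. The short Python sketch below is purely illustrative and encodes the schema this way; the class names, field names, and the partial "assessment" at the end are our own assumptions for the sake of the example, not part of the original model.

```python
# Illustrative encoding of the classification in tables 4.2 and 4.3.
# The type and strategy names come from the tables; the example assessment
# at the bottom is hypothetical and only shows how the schema might be
# applied, not an authoritative scoring of any real case.

from dataclasses import dataclass
from enum import Enum


class Uncertainty(Enum):
    SITUATIONAL = "inadequacy of available information for necessary decisions"
    LEGAL_MORAL = "possibility of future liability or guilt for actions or inactions"
    SOCIETAL = "absence or scarcity of integration of publics and institutions"
    INSTITUTIONAL = "withholding of information by agencies for bureaucratic reasons"
    PROPRIETARY = "contested rights to know, to warn, or to conceal"
    SCIENTIFIC = "difficulty of risk assessment or of forecasts of emergencies"


class Interpretation(Enum):   # how decision makers read the uncertainty
    SUPPRESSION = "refusal or failure to notice the uncertainty"
    DISCOUNTING = "recognized but assigned negligible salience"
    RECOGNITION = "balanced appreciation of the contingency"
    AMPLIFICATION = "emphasis, perhaps overemphasis, of the uncertainty"


class Policy(Enum):           # how the uncertainty is communicated
    SECRECY = "extreme withholding of information"
    CONFIDENTIALITY = "restricted disclosure"
    PUBLICITY = "open disclosure"
    SHARING = "joint deliberation with affected publics"


@dataclass
class UncertaintyAssessment:
    """One cell of an assessment: a type, how it is read, how it is communicated."""
    kind: Uncertainty
    interpretation: Interpretation
    policy: Policy
    salience: str  # e.g. "low", "salient", "severe"


# Hypothetical illustration only: a partial reading of an early emergency phase.
example = [
    UncertaintyAssessment(Uncertainty.SITUATIONAL, Interpretation.RECOGNITION,
                          Policy.PUBLICITY, "severe"),
    UncertaintyAssessment(Uncertainty.PROPRIETARY, Interpretation.DISCOUNTING,
                          Policy.CONFIDENTIALITY, "low"),
]
```

The point of such an encoding is only that the two dimensions are independent: the same type of uncertainty may be recognized yet kept confidential, or amplified yet shared, which is exactly the space of choices the chapter goes on to examine.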
Situational uncertainty involves a poor match between the decisions that must be taken and the information at hand. It is normally the most salient type of uncertainty because information is central to decision-making. It is also a very common type of uncertainty because complete high-quality information about major hazards is usually lacking. Moreover, interagency collaboration in decision-making is usually required and knowledge about the capabilities of such agencies is often incomplete.
In an ideal world, legal/moral uncertainty would not be salient because decisions would always be made in the public interest with due consideration of social justice; decision makers would be held free of liability. But few public decisions about industrial hazards meet these exacting criteria, so decision makers cannot ignore the possibility that they will be subject to legal action or moral censure. Concern about legal/moral uncertainty often leads to indecisiveness and defensiveness about the release of information.
Societal uncertainty occurs when institutions and the publics that they are intended to serve are not well integrated. Decisions that are subject to high degrees of legal/moral uncertainty also tend to be affected by societal uncertainty. Such uncertainty is most marked where every action is scrutinized by lawyers who represent other stakeholders. But societal uncertainty can be manifested in other ways. For example, respect for government agencies may be low, or individualism may be carried to extremes, either among the public or among leaders in major institutions.
Institutional uncertainty is brought about when agencies withhold information for bureaucratic reasons. It is most likely to be high in circumstances where there are difficulties about informal communication, acquaintance, and trust among personnel of agencies with different traditions and missions. This ensures that the necessary channels of understanding and confidence are absent during a crisis. Institutional uncertainty can be high even in relatively consensual societies, if there happens to be a tradition of bureaucratic secrecy.
When the parameters of confidentiality are strained, proprietary uncertainty becomes salient. Thus, in the midst of an emergency there may be a debate about the rights of persons to know, to warn, or to conceal.
Scientific uncertainty is the last (but by no means the least important) type of uncertainty. It is mobilized at various phases of hazard including before, during, and after emergencies. For example, (scientific) risk assessments that are undertaken well in advance of a crisis may employ long-established techniques to evaluate industrial plants and equipment but may have to depend on less-seasoned methodologies to analyse the transport of environmental pollutants (Funtowicz and Ravetz 1990). When a hazard is in the acute (emergency) phase, the possibility of effective forecasting may be either good or poor, depending on the circumstances (which themselves cannot always be predicted). Thus, scientific uncertainty can vary from low to very high.
Two sets of strategies (table 4.3) are available for communication of uncertainty, one of which is an attribute of people or agencies that make decisions; the other refers to the way in which communication is accomplished. Some people may decide to suppress information about uncertainty entirely, even from themselves. This may translate into a refusal to admit that uncertainty exists or a failure to notice it. It is an extreme form of discounting. Ordinary discounting will recognize a possibility but (as with many events in the distant future) will assign such a low value to its salience that it can be neglected for policy purposes. Recognition of an uncertain contingency is a balanced appreciation. By contrast, amplification is an emphasis - perhaps even an overemphasis - of the significance of uncertainty.
Corresponding to the interpretations are the policies concerning communication of uncertainties. At one extreme lies secrecy, the extreme case of confidentiality; then comes publicity, with its own extreme form - sharing. There are many variations and nuances in any practical policy of communication. The utility of these classification schemes can be illustrated with reference to the Seveso disaster, the Seveso Directive, and the Karin B incident.
At the time of the Seveso disaster, the complexity of communication problems under conditions of severe uncertainty was recognized, if not fully managed. Before the gas release, no one outside the plant - neither residents nor political or health authorities - had any idea that there was a hazard of such magnitude. The explosion and release were greeted by incredulity, followed by alarm and dismay. The firm's initial behaviour led to subsequent suspicion about their motives; various instructions for precautionary measures were issued almost immediately, but the firm denied knowledge of the toxic substances involved (Rocca 1980; Rocca 1992, personal communication). Ten days passed before the firm confirmed that dioxin had been released (Pocchiari, Silano, and Zapponi 1987). Only then did the governmental authorities and the public learn that there was a grave risk. Even so, it was impossible to assess the danger with any precision. There was an onset of genuine dread, about illness in general and about malformed babies in particular. The widespread illness and deaths of animals of many species were an ominous sign. The authorities had their own severe problems of decision-making under uncertainty, including the definition of different polluted zones, programmes of evacuation of endangered residents, and disposal of contaminated material.
From the very beginning of the disaster, situational uncertainty was salient; decisions had to be taken, sometimes under conditions of great urgency, in the nearly complete absence of information that might guide actions. Scientific uncertainty was salient, as shown by the fact that local investigating magistrates closed off the site within eight days of the accident. Societal uncertainty was severe because there had been no previous institutional preparation or consultation for the accident. Legal/moral uncertainty was also severe. For example, the (Swiss) Technical Director of ICMESA found himself under arrest when he attended a works meeting 12 days after the accident (the Director of Production was also placed under arrest at that time, and was assassinated by terrorists four years later). One of the few relatively straightforward aspects of the accident was the low level of proprietary uncertainty. Although the provision of relevant information did not proceed as quickly or smoothly as desired by all, at least there was no need for the government authorities to use legal means to force the firm to divulge information. The fact that the ICMESA factory was already sequestered would have made it highly imprudent for its owners to withhold information about the contaminants, and it was noted at the time that the dioxin threat had already been publicized by the media before it was officially confirmed. Later, and off the Seveso site, proprietary uncertainty was not as low, particularly in connection with the disposal of barrels containing toxic materials. From 1982 onwards, stories of concealment and blunders began to circulate and these have not yet ended (see Chronology).
Our model of uncertainty management is also reflected in the regulations of the Seveso Directive. The main concern here is with communication:
Member States shall ensure that information on safety measures and on the correct behaviour to adopt in the case of an accident is supplied in an appropriate manner, and without their having to request it, to persons liable to be affected by the major accident originating in a notified industrial activity within the meaning of Article 5. The information should be repeated and updated at appropriate intervals. It shall also be made publicly available. Such information shall contain that laid down in Annex VII. (Article 8 of Directive 88/610/EEC, amending Directive 82/501/EEC)
This portion of the Directive reflects concerns about several sorts of uncertainty. First, there is an attempt to institute progressive reduction of scientific uncertainty via updating requirements. Second, the various phrases that call for effective implementation of the public's right to know show clear awareness of the need to confront problems of institutional uncertainty and proprietary uncertainty. Moreover, the very existence of the Directive, particularly Article 8, underscores heightened awareness of legal/moral uncertainty, for the Seveso event showed that simple "accidents," or "acts of God," are not the most important problems affecting the safety of industrial installations and surrounding communities.
When we consider the implementation of the hazard communication requirements of Article 8, we find that the model illuminates practice. First, actual EC regulations seem to assume that societal and institutional uncertainties are not salient or severe. Nor do they deal with the possibility of situational uncertainty (i.e. less than complete competence of available official expertise for prediction, prevention, or control). The contrast between European and American practice is noteworthy. In the United States, provision is often made for the inclusion of alternative expertise via environmental legislation that permits the use of public funds for the incorporation of local citizens' knowledge into the policy discourse on the grounds of due process or fairness.
Finally, the model can also be applied to the Karin B incident. Some 12 years after the Seveso gas release, a shipload of Italian industrial toxic wastes was first dumped in Nigeria and then reloaded after protests. In the full glare of publicity and widespread public dread, the regions of Emilia-Romagna and Tuscany undertook the final task of disposal, in the process showing how a large quantity of mixed toxic wastes could be managed, with full satisfaction of technical requirements and local concerns.
Between the time that the Karin B was discovered to be carrying a toxic cargo and the eventual agreement on destruction of the wastes, all uncertainties were effectively out of control. Whoever knew about such shipments had previously kept them secret; when they were discovered, therefore, all the issues of knowledge, uncertainty, and responsibility came into play simultaneously. However, when the regional authorities of Emilia-Romagna and Tuscany - together with several local authorities - finally took physical possession of the wastes, the change was dramatic. Acting in cooperation with each other and with the media, and creating opportunities for the participation of interested communities, they were able to reduce salient uncertainties, starting with the scientific ones and then proceeding to others, such as institutional uncertainties. The societal uncertainties became less severe and less salient, and the clean-up operation proceeded peacefully to a successful conclusion in all respects (Centro Informativo Karin B 1992; Egidi 1993).
Health and safety have recently joined goodness, truth, and justice in the pantheon of Western culture's root ideals. Moreover, better health and safety have become prominent public goals, precisely because there seem to be real possibilities for achieving them. Unfortunately, none of these ideals is unambiguous: all are characterized by internal contradictions that may generate either fruitful or destructive outcomes.
In the debates on risks in the 1970s, it gradually emerged that "safe" does not mean zero-risk. Just as an empirical proposition may be accepted as true and later proven false (e.g. the Ptolemaic system of the world), or an action apparently good later becomes judged to be bad, similarly an installation accepted as safe may later explode. But the reverse does not hold: if there is an explosion, it is not a simple refutation of the judgement "safe." This is an example of the principle that allows people to continue believing that flying in airplanes is "safe," even though there are occasional crashes.
These and similar contradictions associated with the concept of safety are managed pragmatically by a variety of devices. One of these is linguistic interpretation. "Safe" can mean that risk is variously "negligible," "acceptable," "tolerable," "in accordance with best (or even standard) practice," or "unavoidable." Many of these interpretations are equivalent to the legal meaning of "non-culpable" risks. The pragmatic interpretation that is invoked will depend on circumstances.
Although many experts and critics are aware of the dialectical character of safety, most public discussions reflect the belief that an objective condition of safety is obtainable with just a little more application and honest effort. When such expectations are disappointed, critics seek explanations in simplistic theories that usually involve misguided or malevolent parties. Academics are just as prone to this behaviour as others. An important recent example was the use of "cultural theory" by certain social scientists to explain why Americans apparently considered that environmental safety had declined during the 1970s despite considerable progress in pollution control. This explanation was based on a fourfold model of social psychological ideal types of people, in relation to their social groups. For example, environmentalists of all sorts were labelled "sectarians" and were said to possess a romantic cosmology that derived from the psychological contradictions of supposedly closed and egalitarian millenarian groups (Douglas and Wildavsky 1982). In our terms, Douglas and Wildavsky had become partially aware of the contradictions in the ideal of safety, and realized that it is not reducible to numbers. Yet they could not move on to accommodate the contradictions by means of practical measures for realizing safety in the face of real hazards (Funtowicz and Ravetz 1985).
The Seveso Directive provides an important and relevant example of the contradictory character of safety. Article 8 of the Directive is based on the assumption that openness on the part of firms and authorities is good for safety. Clearly, policies of concealment can be very bad for safety. But it is questionable that perfect openness leads to perfect safety. Let us consider what might have happened if the Seveso Directive had been in place in 1976; this is an imaginary, counterfactual case, which cannot be used for the logical proof of a thesis but which can be a useful heuristic device.
The Directive as a whole demands certain sorts of institutional behaviour, in return for which it provides a certification of quality of performance. In simple terms, if an installation meets the Directive's criteria it is deemed "safe." Suppose, now, that the Seveso regulations had been in force in July 1976. Then the ICMESA factory would have previously submitted its safety report and we suppose, further, that there would have been no objections to it. The local population and the authorities would have been provided with some information about the chemical processes and their hazards. Presumably, knowledge of the earlier accidents involving TCP would have been in the public domain. Also, there would have been some emergency procedures in place. Now, supposing that, in spite of all the available information, the explosion had still happened, what would have ensued? First, it is likely that there would not have been a delay of 10 days before dioxin was publicly identified, nor another 10 days lost before there was any clarity about what to do. Would it have helped the community response, for this information to have been known instantly?
There would doubtless have been a more speedy evacuation and, therefore, probably less exposure of the affected human population. But would there have been less trauma (Conti 1977; Edelstein 1988) resulting from the sight of dead and dying animals and from the evacuation, or less dread from the unknown consequences of the invisible poison, or less of a stigma associated with Seveso and its population and products (see Chronology, July 1977)? Probably not.
However, as we have remarked, it was the relatively successful recovery from the accident that enabled Seveso to become an uncomplicated symbol of successful response to industrial disasters. The contrast with Bhopal and Chernobyl is striking. Of course, there was an early period characterized by the recriminations and accusations of incompetence and cover-up that commonly afflict such victim communities. This aggravation reached its height about six months after the Seveso gas release, when little remedial work was under way and the regional government proposed to install an incinerator in the district. Since then there have been periods of lesser and greater tension, mainly associated with the use by others of Seveso as a symbol; but suspicions about the behaviour of the company and the authorities seem never to go away.
In the context of such heightened tensions, Seveso became a microcosm where all the existing conflicts within society (political, institutional, religious, industrial) were reflected. However, within a relatively short time such conflicts abated and the recovery of the community proceeded. For, in Seveso, blame was never at issue: the responsible party was known from the outset and soon offered reparation. Moreover, the eventual disappearance of the offending factory itself and the physical exportation of the toxic substances and polluted soil enabled the community to feel cleansed. The resolution of the emotional after-effects of the trauma, so necessary for the recovery of a community, was facilitated by these favourable circumstances.
All these achievements, which made Seveso a symbolic example of recovery from industrial disaster, depended on the construction of a working relationship between the community, the government agencies, and the firm. This was accomplished through open and sometimes bitter struggle among the various parties, but the common interest in a reasonable outcome was never in question. The victims knew that they would receive assistance. Had there been uncertainty and strife about the source, amount, and timing of compensation, the communities would not have been able to pull themselves together as they did within a year and a half, once the threat of malformed babies receded and evacuees were returned to their homes. Instead, we can imagine a permanent state of mistrust between the different governmental agencies and companies and, indeed, within the communities themselves, where the processes of recovery would have been seriously inhibited. Histories of recovery from other disasters, both natural and man-made, show how important these factors are in the political and moral spheres (Barton 1969; Erikson 1976; Couch and Kroll-Smith 1991).
Now we must ask, if a firm had already been in compliance with safety regulations of the kind later required by the Seveso Directive, would its response have been different? Suppose that a firm's legal advice was that its prior compliance with all regulations decreased its responsibility for the accident and hence its liability for compensation. It is a commonplace of the theory of regulation that the submission of firms to the financial costs of external regulation is compensated by the legal protection they receive for compliance.
All we need to imagine is a case where a firm's top management would have decided against total acquiescence in the picture of the disaster and its aftermath as presented by the local community and authorities. That would have been enough to slow down the reparations. But it was the unprecedented speed of compensation offers, along with acceptance of blame and contribution to rehabilitation, that made all the difference to the recovery of Seveso. Otherwise, there could have been the protracted litigation that occurs in so many such cases and which causes psychological and moral harm, ultimately inhibiting the healing processes of recovery.
Thus, we encounter a moral paradox illuminated by Seveso: more effective prior safety regulation could conceivably have prevented the achievement of the best path to the subsequent recovery of a community. Once an accident has occurred, the cleansing of resentment and guilt, which are experienced by agents and victims each in their own way, could be inhibited by a denial of moral liability. The paradox can be expressed as an ill effect of a good principle: prior regulation, with openness of information, could lead to a confusion concerning responsibility after the event. Such paradoxes are familiar to those managing hazards of various sorts in the insurance field; thus "moral hazard" refers to the tendency of people to take chances once they know that the insurers will pay; and the "no fault" principle for common accidents, while seeming to exculpate the responsible persons, is promoted as being useful in preventing the expenses and injustices of litigation.
Seveso also produced a paradox about the use of scientific knowledge in the policy process. Although there was undoubted physical and psychological illness among people, together with the deaths of many animals, dread consequences for human health have been elusive (Mastroiacovo et al. 1988; Regione Lombardia 1989; Mocarelli et al. 1991). In this respect it could be said that Seveso is a disaster that has not yet produced identifiable disastrous consequences. Even the most recent epidemiological results, while showing an increase in some sorts of rare cancers, do not provide firm evidence for a generally increased cancer risk to the monitored population (Bertazzi et al. 1993).
In the Seveso case, dread was associated with the perceived toxicity of dioxin. Once it was realized that the population had been subjected to dioxin contamination, the accident became, by definition, a disaster with severe psychological, social, and economic effects. However, in this case, scientific certainty about the extreme toxicity of dioxin gradually dissipated. No established scientists have argued that Seveso's population continues to suffer significant health effects.7 So the recent accusations (Chronology 1992, 1993) that dioxin was a component of the factory's production would, paradoxically again, amount to evidence that the substance was less toxic to humans than was initially believed.
A visitor to Seveso now finds a park where the factory once stood; some say that Seveso is now the least polluted place in Italy. Of course, the history of illness, dread, and disruption cannot be undone. But the recovery of the community proceeded smoothly; only the stigma of the town's name survives as a present source of harm. So Seveso has become, simultaneously, a symbol of an industrial disaster and a monument to relevant ignorance in science (Keynes 1921). But such ignorance is not absolute and it need not be paralysing for decision-making. At Seveso, monitoring continues, and the lessons of this relevant ignorance are being assimilated into our understanding of the place of science in the modern world.
Seveso now functions partly as an experiment, along with other monitored disaster sites such as Hiroshima. Data from the affected Seveso population are used as evidence in other, less straightforward, pollution cases and also for the ongoing review of regulations. Every experiment exists in a particular context, and inferences from its data depend on an assumption of similarity between the experimental setup and that of the other case in question (Funtowicz, MacGill, and Ravetz 1989a, 1989b, 1989c). The extent to which Seveso, with its single event of atmospheric contamination (and later contact with contaminated objects), is an appropriate model for situations of long and continuous contamination will be debated among scientists and policy makers.
Toxicology necessarily makes inferential leaps - from animals to humans, from large doses to small, and from acute to chronic doses. In turn, these inferences underlie the dose-response models that are used to define "safe limits." Thus, toxicological models have large inherent uncertainties, and large-scale accidents with good subsequent monitoring can provide less unrealistic sources of data (Funtowicz and Ravetz 1995).8
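To make the point concrete, the sketch below (in Python, purely as an illustration) contrasts two generic textbook dose-response forms - a log-logistic (Hill) curve and a linear no-threshold extrapolation - and inverts each to find the dose corresponding to the same target risk. Both model forms and every parameter value are assumptions chosen for the example, not the models actually used in Seveso-related assessments; the only claim is that the choice of model, which sparse data often cannot settle, can shift the inferred "safe limit" substantially.

```python
# Minimal sketch of why the inferential leaps in toxicology matter for "safe
# limits". Both dose-response forms are generic textbook curves, and all
# parameter values are hypothetical.

import math


def hill_response(dose, ed50=10.0, slope=2.0):
    """Log-logistic (Hill) dose-response: fraction of subjects responding."""
    if dose <= 0:
        return 0.0
    return 1.0 / (1.0 + (ed50 / dose) ** slope)


def linear_no_threshold(dose, potency=0.005):
    """Linear extrapolation from high-dose data down to low doses."""
    return min(1.0, potency * dose)


def dose_for_risk(model, target_risk=1e-3, lo=1e-9, hi=1e3, iters=200):
    """Invert a monotone dose-response model by bisection to get a 'safe limit'."""
    for _ in range(iters):
        mid = math.sqrt(lo * hi)  # bisect on a log scale
        if model(mid) < target_risk:
            lo = mid
        else:
            hi = mid
    return hi


# The same target risk yields different "safe" doses under the two models,
# illustrating the inherent uncertainty the text describes.
for name, model in [("Hill", hill_response), ("linear", linear_no_threshold)]:
    print(f"{name:7s} safe limit for risk 1e-3: {dose_for_risk(model):.4g}")
```

Well-monitored accident populations matter precisely because they constrain which of such competing curves is plausible in the low-dose region where regulation operates.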
The very classic status of Seveso as a dioxin disaster could possibly lead to the use of its data in a paradoxical way. As we have seen, Seveso was an immediately perceived disaster, but one where the long-term health consequences have up to now been accepted as far from disastrous. We may be tempted to make a simple inference: Seveso was a harmless dioxin disaster; therefore, other dioxin releases need not be harmful. Such an argument was recently made in Arkansas, where the evidence of Seveso has been used in arguments supporting the safety of a proposed toxic waste incinerator that would emit dioxin in a similar quantity to that estimated for Seveso (Schneider 1992). Thus, we have the scientific paradox of Seveso: an event at first accepted as a disaster (with great consequences for regulatory policy) is now being used as evidence for safety. The symbol of Seveso may now be becoming increasingly complex: its original connotation of dread is challenged by one of reassurance. Paradoxically, the excellence of the recovery of Seveso could be used for the assertion of limited liability, with possible consequences for litigation and impeded recovery elsewhere.
However, as scientists know, it needs only a single long-delayed pathological condition to appear in the monitoring process for the original negative resonance of Seveso to be restored. And then the recovery of Seveso, apparently so complete at this time, could suddenly be thrown into question. Even the complete absence of conclusive evidence of cancer among chloracne victims and others in the most exposed zone A might be explained in terms of "the small population size, youth of the subjects, and short follow-up period" (Bertazzi et al. 1993).
Since the 1970s, a number of serious industrial accidents have provoked a reappraisal of safety issues. First, it was realized that even apparently unique industrial disasters have regular causes; in one sense they are all "man-made" (Turner 1978) because of the way they occur through failure of systems for prevention. A more radical interpretation, derived from a study of Three Mile Island, is that they are actually "normal accidents" (Perrow 1984). The affected industries, while not planning such accidents, accept them as a normal aspect of operations. We can even consider industrial systems as "accident generating systems" (Haastrup and Funtowicz 1992), routinely producing unwanted outputs along with their intended products; these include continuous pollution and wastes, along with occasional incidents of different intensities. When an incident goes beyond a certain threshold (defined conventionally by the terms of relevant regulations) it is deemed to be an "accident," and some accidents eventually become disasters. But, as the Seveso case shows, even a "disaster" has strongly conventional elements in its definition and response (Susman, O'Keefe, and Wisner 1983; Quarantelli 1987). Thus, our comprehension of industrial risks has moved completely away from the acausal or "acts of God" approach; they are creations of the industrial system as much as its intended products.
This new awareness about industrial risks has coincided with an increasing concern for the perceived loss of environmental quality due to the synergistic effects of technological development and environmental processes, as in the cases of acid rain and global warming. We now appreciate that the technological system is global, complex, and rather tightly coupled. The dividing line between the "goods" and the "bads" produced by the system is sinuous and indistinct. Implementation of this ecological awareness in industrial and regulatory practice is now under way.
The new ecological awareness includes an appreciation not only of the interconnectedness of the effects of the "bads" of the industrial system but also of the conventional character of the traditional distinction between "manmade" and "natural." Industrial accidents, and recovery from them, cannot be seen in isolation from the pathologies of the total industrial system, itself a subsystem of the planet. Contradictions within that subsystem, and between it and other components of the total system, are the key to its comprehension. Thus, famine and floods (for example) may now be no different in kind from the sudden events called industrial accidents and disasters.
To understand the processes of recovery from such unwanted events we must conceive of them as occurring within that total system. In the case of industrial disasters, the recovery of a community takes place not only in the societal sphere but also in its moral dimensions and, equally importantly, in its ecological aspects as well. Thus, community recovery exists as part of a wider process, involving all the elements of the total ecosystem.
Seveso's recovery was dependent on the special character of the incident itself and especially on the response of the firm and the authorities. Seveso was especially fortunate, not merely because the damage occurred over a short time rather than a protracted period but also because the factory at Meda could be dispensed with. Other classic industrial disasters, such as Chernobyl and Bhopal, involved installations which, although themselves taken out of service, belong to a class that is kept in operation - even in the same locality. In such cases the hazard is chronic and there is no escape from the relevant pathologies of the industrial system.