THE OPEN MIND
Host: Richard D. Heffner
Guest: Charles Perrow
Title: “Risky Business and Normal Accidents”
I’m Richard Heffner, your host on THE OPEN MIND.
When I was a youngster, I was taught – as you must have been, too – that “accidents happen”. That was surely the gentler, kinder assumption to make. So your parents didn’t really punish you when they did occur, and you grew up with just a bit more forgiveness and forbearance when others accidentally broke this or messed up that.
“Normal accidents”, so to speak. They go with the territory. The human condition. That’s life. Perfectly understandable. Grin and bear it.
But what about when as a nation as well as individuals we’ve grown up enormously bigger and stronger, have developed vast technological skills, built huge chemical plants and nuclear weapons, have manipulated nature both in inner and outer space, have learned to forge biological, genetic change itself, and work over every day the very stuff of creation: the beginning of life…and, presumably, the end of life, too?
What about the notion of “well, accidents happen” now, of there being “normal accidents” just in the nature of things, when – given what in our times we’ve done with the genius of man and the forces of nature – “normal accidents” can be potentially so absolutely catastrophic?
Most of us don’t like even to think about this fact of life, about the horrendous threats now posed by “normal accidents” in a technological age of man-and-machine interdependence. But my guest today makes us do so. Some years ago, in fact, before many of us were quite so concerned about Chernobyl and Challenger and Bhopal and jet planes self-destructing, I began an OPEN MIND program by saying: “You know how every once in a while you read a book that just knocks you on your ear? Well, I did the other week. Published by Basic Books, its title is ‘Normal Accidents: Living with High-Risk Technologies’, by Yale University sociologist Charles Perrow”. He is back with us again today because in the past half-decade things don’t seem to have gotten any better; indeed, they seem to have gotten worse. And I want to see if Professor Perrow has some cheerier insights for us. How about it?
Perrow: None whatsoever. That…as you noted, that book came before Chernobyl, Challenger, Bhopal, the Basel, Switzerland plant, the poisoning of the Rhine, and many others, and the script was more or less laid out in the book. I just kept nodding and saying “Yes, I told you so. It’s bound to happen. There’s no way you’re going to avoid these kinds of catastrophes”. And today I have no better news. As a matter of fact, they seem to be springing up in new areas that we really didn’t anticipate before.
Heffner: Well, for instance?
Perrow: Well, we always thought that the government’s nuclear weapons program was fairly safe. There were no Three Mile Islands at Savannah, Georgia, or at the other nuclear plants, such as Hanford, Washington. And then, oh, a few months ago, the disclosures came out that at the DuPont plant there was a memo showing that thirty accidents – near-misses, potentially dangerous ones – had been suppressed over the last two decades at that plant. At the Fernald plant in Ohio, they just turned on the vents and let the poisonous radon gases leak out when it got too heavy inside the plant. The gases drifted down the street, and we’re beginning to see the effects upon the people there. Hanford has been covering up, literally, its nuclear waste with about two feet of porous soil, allowing it to seep eventually into the Columbia River. So there’s one that I didn’t really anticipate unfolding in such a spectacular way.
Heffner: Well, of course, knowing you from talking with you on the program last time and reading “Normal Accidents”, I know that you’re not taking very great pleasure out of saying “I told you so”. What are the prospects then? That we’re going to sit back and let it keep happening until it’s all over, over here?
Perrow: No. No, there is some interesting news along these lines.
Heffner: You mean good news?
Perrow: Yes. Somewhat good; difficult to live with, and difficult to do much about. But let’s take the airlines. Deregulation started with President Carter, along about 1978 or so. It proceeded very fast. Deregulation, in terms of the economics of the industry and the price of tickets, was really quite good. We got a lot more flights and they got cheaper. But we didn’t heed the other side of deregulation, which is that you have to keep the safety aspects intact. You have to keep Federal safety inspectors, and you have to keep the planes inspected on a regular basis and so forth. So, quite predictably, for a while we had a rash of near-misses and some crashes, and a lot of turmoil about this. Now the system has more or less corrected itself, it’s righted itself, and the safety record is still very good. So here is an area where we can expect that there will be occasional disasters, but not that many, and the number will get vanishingly small.
Heffner: Yet your thesis has always been, if I understand it correctly, that the more complicated life becomes, the more opportunities there are for several things to go wrong at the same time, and then there’s a major catastrophe.
Perrow: Yes, but the airline problem is basically solvable. It is fairly complex, but it can be made more linear and straightforward. One of the things they did to make air traffic control safe was to put a lot more routes into the sky, and to give the ground controllers control over the pilots. They centralized a system – which normally we don’t like to do, centralize a system – but there it worked. And we’ve reduced the number of accidents. Then accidents came on because of the lack of inspection. That’s easy to repair: you just hire more inspectors. Congressmen fly on those planes. The Airline Pilots Association is there, ready to raise a fuss if there are too many accidents, and to investigate them on their own, in independent investigations. There are six organizations positioned around that industry that are ready to correct it, to keep inspection at a fairly high level, and to introduce new technologies. Now, if you take a system like the nuclear power plants, you have very few organizations, and if you take the nuclear weapons plants, the Savannah plant or the Fernald plant or the Hanford plant, there are almost no organizations. DuPont was there and it did just what it did. Westinghouse came in, took over the mess, and turned out to behave very much as DuPont did, firing people who blew whistles and made a fuss. There’s not an organizational network to promote safety in that industry. In other industries there is. How you would create one for that industry, I don’t know. That’s the bad news. I don’t know what you could do about it. But the good news is that not everything has to be catastrophically dangerous.
Heffner: Tell me what you mean by “not everything has to be catastrophically dangerous”.
Perrow: If you have a technology that can be made more linear and less complexly interactive, then you get more safety. And if you have a regulatory system or an organizational environment that makes an investment in investigating that technology continually, then you get more safety. Let me, let me just make one point.
Perrow: It’s not hi-tech itself. The jet engine is hi-tech compared to the piston engine, but it’s much safer, much less complex, more linear, and less tightly coupled than the piston engine. Some of our systems are moving towards more safety in that way, and as a matter of fact, industry tries to do that all the time because accidents cost money.
Heffner: Explain what you mean by “less coupled”.
Perrow: Tightly coupled. If you have…let me back up just a moment. Everything, as you pointed out in the beginning, is subject to failure. We know that. Therefore we design into our systems safety devices, back-ups, buzzers, alarms, fire walls, fire breaks, flare towers and so forth, because we know that something’s going to go wrong. Those are designed in. Fine. But if you get two or three things going wrong – they may be quite trivial, and they often are, small things going wrong at the same time – and they interact in an unexpected way, that can get around these safety devices, and now you have a problem. That still doesn’t mean a disaster. My university does that continuously. We just live with that. It’s always happening that things are going wrong, and there are all kinds of messes out there. But my university’s not tightly coupled. It’s very loosely coupled, because we can substitute an Assistant Professor for a Professor. We can substitute a decision on Monday for one that should have been made on Friday. That’s pulling a system apart so that the coupling is not tight. When you wire things together – like putting a garden hose onto a faucet – you get tight coupling: anything that happens here is going to be transmitted there, and there’s not much you can do about it. So when you get an accident in one of these systems and it’s tightly coupled, the operators don’t know what to do, or don’t have time to do anything about it. And the system cannot be stopped in mid-stream, the way a space shuttle cannot just be stopped up there and held until everything’s corrected; it goes its full term.
Heffner: How do you achieve that…excuse me…where it needs to be achieved?
Perrow: The loose coupling?
Perrow: Very expensive. You move things apart. For example, our nuclear power plants have something called a “spent fuel storage pool”, and the one I saw – one of the ones I saw – is about the size of an Olympic swimming pool and a half. An enormous swimming pool, but nobody swims in it, because it’s radioactive water into which they put the spent fuel rods. And if that pool is not cooled continuously, the water will boil away and those rods will start sparkling like sparklers on the 4th of July, spewing out radioactivity. Okay, so this is a safety device. How are we going to keep the spent rods until they cool down enough so we can ship them off to some place where we hope we can bury them, though we haven’t been able to yet? Fine, that’s nice. But why put it on top of a nuclear power plant? Why is it right in the main building of most power plants?
Heffner: What’s the answer?
Perrow: That’s tight coupling…move it away.
Heffner: But what’s the answer as to why?
Perrow: Oh. Cost considerations.
Heffner: Okay, and you know that. I wanted to get to the point where I could quote you…this article in The Nation in ’86. You say, “Even if there are reasonably thorough investigations of accidents, we should not expect sweeping reforms or even promising ones. Reforms cost money, slow down production, and can even prompt curtailment of the system. No wonder they are unwelcome”.
Heffner: So where’s the…
Perrow: And that’s not just true in the U.S., which is very concerned right now about fiscal matters. It’s true in Russia. Two years after Chernobyl there was a story in The New York Times about the gross error and neglect and indifference to safety matters, sleeping on the job, at even a Russian nuclear power plant. Not the one that blew up – that hasn’t restarted – but one of the adjacent ones. Westinghouse took over from DuPont, which had made a mess out of some of our weapons plants…did the same thing. I just don’t think we’re going to get away from this unless you have a system like the air transport system, where you have the FAA, the Congress, the Airline Pilots Union, the aircraft manufacturers that can be sued, and the airlines themselves, which can be sued. That’s a constellation of interested parties, and all of those people are going to be looking at the cause of accidents.
Heffner: But let’s not kid ourselves. Murphy’s Law…if anything can go wrong it will…still prevails, doesn’t it?
Perrow: No, it doesn’t, no. Murphy…
Heffner: Tell me how not.
Perrow: Murphy was really quite wrong, and one of the things I’ve learned since the book pertains to that. It’s very hard to have an accident – a catastrophic accident. We have the small ones all the time. They go on constantly. They go on in my life, I suspect they go on in your life, and they go on in every organization. But we have all these safety systems involved, and all these back-ups and fire alarms and so forth. So we have some kind of stability from this. To have a serious accident, you need to have not only the interaction of a number of failures in a tightly coupled system that you can’t recover from, but you have to have people nearby, and often there aren’t any. For example, we had some vapor cloud explosions in the Midwest from railroad car accidents, and they would wipe out a large area, but there were only six people there. We had the same kind of hexachlorophene vapor cloud in a Florida suburb, which would have wiped out that whole suburb, but another condition was missing: there was no spark there. So you need a lot of people there, and you need something to set it off. Not only do you need that, but you need no warning. Now, we had a warning of roughly two hours when the Teton Dam went down, and there were very few people killed. There’s a dam in Italy – three thousand people…no warning. So there are six or seven conditions that have to come together to make a catastrophe. Bhopal is a case where it did occur, but that’s very rare.
Heffner: But you’re making very rare sound as though it doesn’t happen. And it does happen.
Perrow: Well, if you have more Bhopal plants, it’s going to happen more often. But at any one of them, Murphy is wrong. Bhopal had…there’s the Institute, West Virginia…Union Carbide plant. A modern plant, hi-tech. They spent six million dollars after Bhopal to make sure it wouldn’t happen there. And a few months later they had almost a replay of Bhopal.
Heffner: But of course I’ve always been amused by the elaboration on Murphy’s Law…if anything can go wrong, it will…and then a list of twelve others. Just a few of them: “Left to themselves, all things go from bad to worse”, “If everything seems to be going well, you have obviously overlooked something”, “If you play with something long enough…”, and that brings us to the airlines and their problems now…
Heffner: …”it will surely break”, “Nature always sides with the hidden flaw”, and then amendments to Murphy’s Law such as “Things get worse under pressure”. And what we live in now is a very highly, highly pressurized society.
Perrow: Yes, but if you look at the imminent dangers around…why am I so optimistic?
Heffner: I…that’s what puzzles the hell out of me.
Perrow: (Laughter) If you look at the imminent dangers around us in all our systems – in our chemicals, nuclear, flying and so forth – you do not get up every morning and read about a few thousand people dying. We lose maybe, as a very rough guess, I would say fifty to one hundred thousand people a year from natural calamities…earthquakes and floods and droughts and things like that. The chemical industry in the U.S. kills, through toxic chemicals, only about 17 people a year. That’s very small, considering the size of that industry and the amount of danger there.
Heffner: Except for Bhopal and things like that.
Perrow: Yes. The problem is that you’re going to get one ringer in there. And in the Institute, West Virginia accident, at the Union Carbide plant, we were just very lucky that it was aldicarb oxime instead of MIC, that the tank was a third full instead of full like it was at Bhopal, that the weather conditions were such as they were, and so on. Now we…continue to be lucky with all those accidents. We have, I don’t know, maybe a thousand of them a year, and we only kill 17 people.
Heffner: Well, you know I’m amused in terms of what you just said because when Linda Murray, the Associate Producer of THE OPEN MIND prepared research material for me, her first line was “food for the phobic”…
Heffner: …that’s what you’ll find in this folder. “It’s no wonder that some people are afraid to get out of bed in the morning”. Largely stemming from what you had written, and from what others deriving from your concerns had written. Now, is there a Pollyanna, a new phase to Charles Perrow?
Perrow: Oh, no. I have to live in this world, too.
Heffner: What do you mean by that?
Perrow: And I like…I like to ski, and if I were younger I’d be hang gliding, and I like to scuba dive, and I want to take my risks and enjoy them, too. And I do a lot of flying and a lot of traveling, so there is one side of me. It’s a side – to be serious about this – it’s a side that says “don’t listen to Murphy’s Law and do not get hysterical about these things”. There are ways of handling some of them, and where we don’t have ways, we can shut them down. There is not a risky system on this earth that we need, essentially, for life.
Heffner: Now wait a minute.
Perrow: Not a single one.
Heffner: Wait a minute, “We can shut them down”. Tell me about all the potential dangerous situations that have been shut down.
Perrow: (Laughter) Well…
Heffner: I’m ready to tick them off.
Perrow: (Laughter) That’s very good.
Heffner: Because doesn’t everybody speak the way you just did? There are certain things that I want to do.
Heffner: I want to take risks. I want to ski and this and that and the other thing. But we’re talking now as you have written now about the modern world becoming more and more risky for larger and larger numbers of people.
Heffner: So where does Pollyanna come into this?
Perrow: Because we can get control of it, and we won’t get control of it if we just wring our hands and talk about Murphy’s Law. We are likely to shut down the Shoreham nuclear power plant, running at five percent now, for absolutely excellent reasons. It’s likely to happen. I didn’t think it would happen a few years ago. I don’t think the Seabrook plant is going to make it. I just said that they’re behaving badly at the other plants at Chernobyl, but the Soviet Union has cut back on its emphasis upon nuclear power. That’s an encouraging sign. Star Wars…
Heffner: But we’re not going to, are we?
Perrow: Oh, I think we will, yes. Star Wars has been cut back. That was extremely dangerous because they were going to be sending tons of plutonium and nuclear reactors up into space in order to have Star Wars, and that could create all kinds of catastrophes.
Heffner: Cut back, why? Because of the concerns that we’ve been talking about?
Perrow: Good point. Only marginally because of safety concerns. Mainly because it wouldn’t work.
Heffner: And dollars.
Perrow: And dollars. Yes, yes. But it wouldn’t work because, in a sense, of the complexity of that system. We realized…we had eminent scientists writing a gloss on normal accidents, saying that there are ten million lines of computer code that have to be written for Star Wars. Nobody has been able to write one million without having “x” number of interacting failures within them. So there was a lot of concern about complexity. We learned a lesson there. Some of our systems are getting safer. There are nuclear reactors – the PIUS reactors and others – that might be really safe, and I’m prepared to believe the engineers, and I often don’t believe engineers. But I’ve looked at some of the designs, and as far as I can tell, they have reduced the tight coupling and they have reduced the complex interactions in those systems. Now, if those kinds of things can be designed…weapons, that’s something else.
Heffner: But I go back to your quotation about things that cost money, and when you talk about weapons, too, you – at the end of the piece – say, “But as long as national goals are served by risky systems, we will continue to have them and their catastrophes”. Now, this is Charles Perrow…
Heffner: When my friend Neil Postman is here, he sometimes talks about what he writes or says on Mondays, Wednesdays and Fridays, and then the opposite on Tuesdays, Thursdays and Saturdays. Do you feel we have to keep going ahead, talking about linear projections and saying “We want this. We want that. We want to enjoy all these things; therefore we’ll take these risks”?
Perrow: In part that’s true, and that’s always been true, but there are risks and there are other risks. Let me try once again to state where I come out on this. What I’m concerned about is the hysterical reaction to all technologies. The back-to-the-earth, “small is beautiful” reaction – the extreme case of “small is beautiful”. We see that coming upon us just as we see fundamentalist religions coming upon us. People are running from the realities out there, and they’re asking for completely safe systems and completely safe worlds. That bothers me; I think that’s a hysterical reaction, and it’s very easy for the technologist to knock that down. We can’t go that far. We can say: parcel it out – this can be made safer, this can’t; let’s work on abandoning this. You say it hasn’t been abandoned, but I think nuclear power is being abandoned as a risky system. I’m worried about recombinant DNA. I think we’re going to have some real problems there. When we get them, then we may move toward abandonment of that system. We won’t do it until we get some major catastrophes from that. So we have to separate out what we can live with, make safer what can be made safer, have many more safety engineers in all walks of life, and then try to shut down these others. Now, that’s not an optimistic conclusion, but it’s not the pessimism that I hear from so many people.
Heffner: The only trouble is, Professor Perrow, and I’m sorry our time is up, is when I go back to Perrow, he says “Even if there are reasonably thorough investigations of accidents, we should not expect sweeping reforms or even promising ones…
Perrow: That’s right.
Heffner: …”reforms cost money, slow down production”, etc. And then, when they get in the way of national goals, risky systems are going to be maintained. Anyway, the dialogue is the important thing, I guess, in saving our lives, and I appreciate your coming here today.
Perrow: And just remember it’s very hard to have a catastrophe.
Heffner: That’s probably the best way to end a program.
Heffner: Thank you very much, Professor Perrow.
Perrow: Thank you.
Heffner: And thanks, too, to you in the audience. I hope you’ll join us again next time. And if you care to share your thoughts about today’s program, please write to THE OPEN MIND, P.O. Box 7977, FDR Station, New York, NY 10150. For transcripts send $3.00 in check or money order. Meanwhile, as an old friend who didn’t fall over his words used to say, “Good night and good luck”.
Continuing production of this series has generously been made possible by grants from: The Rosalind P. Walter Foundation; The M. Weiner Foundation of New Jersey; The Mediators and Richard and Gloria Manney; The Edythe and Dean Dowling Foundation; Lawrence A. Wien; The New York Times Company Foundation; and, from the corporate community, Mutual of America.