Charles Perrow

Normal Accidents

VTR Date: January 19, 1985

Charles Perrow discusses the world of high-risk technologies.

GUEST: Charles Perrow
AIR DATE: 1/19/1985

HEFFNER: I’m Richard Heffner, your host on THE OPEN MIND. You know how, every once in a while you read a book that just knocks you on your ear? Well, I did the other week. Published by Basic Books, its title is “Normal Accidents: Living with High-Risk Technologies”, by Yale University sociologist Charles Perrow, my guest today. Now, if Professor Perrow and the copyright laws permit, I’d like to begin our program by reading to you the first paragraph of “Normal Accidents”. “Welcome to the world of high-risk technologies”. And that sets us off. We know what we’re going to be doing in this book. “You may have noticed that they seem to be multiplying; and it’s true. As our technology expands, as our wars multiply, and as we invade more and more of nature, we create systems, organizations, and the organization of organizations that increase the risks for the operators, passengers, innocent bystanders, and for future generations. In this book, we’ll review some of these systems: nuclear power plants, chemical plants, aircraft and air traffic control, ships, dams, nuclear weapons, space missions, and genetic engineering. Most of these risky enterprises have catastrophic potential, the ability to take the lives of hundreds of people in one blow, or to shorten or cripple the lives of thousands or millions more. Every year, there are more such systems. That is the bad news”. And Professor Perrow, as we begin, I wonder if that’s the bad news, what in the world could possibly be the good news?

PERROW: I’m afraid the good news is overwhelmed by the bad news, which is why I’m so concerned about this book. But the good news is a beginning. It’s a realization that we are creating systems that inevitably will have catastrophic failure. If we can realize that, and get beyond some of the engineering logic that’s dominated our world up to this point, we can then begin to redesign some systems and eliminate others. We will have sounder reasons for saying, “No, we shouldn’t go any further with this kind of development, because embedded in that system is an inevitable accident”.

HEFFNER: Well, you say, “Embedded in that system is an inevitable accident”, but you seem to be saying too that embedded in each of these systems is an inevitable accident. Then why go forward in the first place?

PERROW: In each of these systems there are inevitable accidents. But in some cases the alternatives to these systems, because of the social structure we have created, the world we have created, are impossible to find. We’re not going to give up flying. There’s just no way that our society could survive without this rapid transport. So while we have system accidents in flying, we virtually cannot do without it. And we just have to try to decouple these systems a little bit and minimize the complexity of them; we can’t do away with it. Other systems, I think, we can do away with. There are alternatives. They are expensive, and they will inconvenience a lot of people, but they are alternatives. And they’re far preferable to having catastrophic accidents that, as I said, can cripple or shorten the lives of millions.

HEFFNER: So that we can pick and choose?

PERROW: Absolutely.

HEFFNER: I mean, I thought for a second that you were going to offer the argument, since you took air travel first and said, “We’re not likely to just shut down our air traffic system”.

PERROW: Or our chemical plants.

HEFFNER: Or our chemical plants. But I thought you were going to go on then, that you might go on – though I know better from the book – that you might go on and say that therefore we can’t shut down any of them, we can’t avoid any of them. And you don’t really mean to say that.

PERROW: Oh, no. I think there’s a lot we could do. As a matter of fact, we could make marginal improvements on some of them. Again, with effort, and at some cost. But we could make the nuclear plants marginally safer, for example. But, I think, in two or three of these systems, there is nothing we can do but shut them down. The others we will have to live with.

HEFFNER: Well, Jane Moreway, who works on this program, gave me before a list of Murphy’s Laws, and there are a great many of them. But they boil down to the fact that if anything can go wrong, it will. Now, are you saying that this is a primary principle…

PERROW: No.

HEFFNER: …of every kind of organization?

PERROW: Murphy was very smart to enunciate that law. It’s very useful, and it accords with our kind of common-sense interpretation when something goes wrong. But most of the time, nothing goes wrong. We live in a world that is incredibly complex. And we get to work most mornings. And the people we’re working with are there. The machines are there, and so on. So most things do not go wrong. What happens to make it parallel to Murphy’s Law is that what we didn’t think about is that two or three things might go wrong at the same time, and nobody ever contemplated that simultaneous occurrence of three or four failures. Therefore, they can’t figure out what’s happening. No designer ever thought, “Oh my gosh, that would happen!” The operator sits there and says, “I don’t know what’s going on”. Because one side of the plant is having a valve failure (that’s commonplace; we can deal with that), but another side has had another failure or a pipe rupture. If both of those happen in very particular circumstances – these accidents are rare, these system accidents, as I call them – if both of those happen in particular circumstances, the operator is going to sit there and say, “This is unbelievable. One dial is going this way and this one’s going that way, and they should move together. And I don’t know what to do”.

HEFFNER: Of course you’re talking about Three Mile Island.

PERROW: Yes, that’s what started me on this. That was the principal motivator of doing this book. Because Three Mile Island was not caused by operator errors. It was caused by the system. It was embedded in the system itself. It was bound to happen sometime to some plant.

HEFFNER: You mean as a normal accident?

PERROW: As a normal accident. It’s normal for these systems to have accidents. They don’t do it often.

HEFFNER: But then let me ask whether that would lead you to say that nuclear plants – they don’t need to do it often; they need to do it really only once as far as we’re concerned, if you’re anywhere in the area, or if we are the fourth person, we’re in that fourth generation of those affected by an accident, meaning the rest of the world – doesn’t that lead you to take a Luddite position: Let’s stop the world, in a sense? Not that you want to get off, but stop permitting the nuclear plants and the other steps forward, the giant steps forward, from occurring? Perhaps genetic manipulation, genetic engineering?

PERROW: I’m a very selective Luddite, if I’m a Luddite at all. And they are talking now about nuclear plants that might be designed in quite a different way than the way ours were. Ours grew out of the military plants that were designed, power plants that were designed for submarines. Very complex, very tightly packed. And you could easily reload them every time the ship came into port, the submarine. And we expanded that many times over until we got these enormous plants. But there are other, possibly there are other ways of providing nuclear power that I haven’t seen yet, and I don’t think we’re there yet. But it might be possible to do it. But that system would be one that is loosely coupled so that if there is a failure you can recover from it. The time dependent processes are very important. You could have hours instead of minutes. And they also would not be complexly interactive. So things that are just merely adjacent to each other, just because they happen to pass close together, couldn’t interact in a way that would create a failure. So it’s possible.

HEFFNER: May I ask whether, if you were asked to make a bet, whether your bet would be that in this technological age we are likely to move into a period of uncoupled or decoupled components? That we are likely to move into a period of less rather than more complexity?

PERROW: Yes.

HEFFNER: You do think?

PERROW: Yes. There is an advantage to linear systems rather than highly interactive ones.

HEFFNER: The highly interactive ones, I gather, you feel are the most dangerous ones?

PERROW: Yes. Yes. And some things, I think, as far as we know, that’s the only way you can do it. But let’s take air traffic control. There used to be, not many, but a number of accidents in air traffic control. Midair collisions, problems of landing and take-off. Through a combination of technological fixes and organizational changes, we have reduced those accidents now to, they’re just a vanishing number. It’s incredible that that system, with so many moving parts, literally, going on continuously almost 24 hours a day, doesn’t have more accidents. And that’s because they have been able to decouple parts and reduce the complexity.

HEFFNER: Yes, but you know, Professor Perrow, I was reading your book flying back from California…

PERROW: Oh, I’m sorry. (Laughter)

HEFFNER: …the other night. And I was thinking to myself how wonderful it is that such progress is being made, and that there are fewer and fewer of these accidents that are caused by whatever, these normal accidents. But how many do we need, and how many are we going to accept, and what risk are we willing to take? What risk are you willing to take? You say you are somewhat optimistic about air travel. You talked just a moment ago about changes that are taking place in the construction of nuclear plants. But how far, in your limited Luddite approach…

PERROW: Okay.

HEFFNER: …to technology are you willing to go? Where do you draw the line?

PERROW: Yes. I draw the line with nuclear plants. I think right now that they are immensely dangerous. That the chance of an accident is not one in a billion, but more like one in the next decade. I draw the line with nuclear weapons. Not that we would fire in anger, but that we would fire by mistake, in error. I think that probability is really rather significant. It’s higher than most people think. And in the case of marine transport, I would forbid a lot of shipping that goes on, and restrict a good deal of others. I would scale these things down. When we get down to airplanes, mining, dams, things like that, then I think we just have to live with the risk. So you’re not going to be able to pin me down to any absolute criteria on this. It depends on the system.

HEFFNER: But when you say, “Live with the risk”, I was interested in some of the comments you make in this extraordinary book about “Live with the risk”, a question that industry frequently asks itself, and answers.

PERROW: Not frequently enough.

HEFFNER: All right. Not frequently enough. But I gather from your book that even when the question is asked, the questions of whose risk, and who will pay, are not, in your estimation, asked or at least answered satisfactorily.

PERROW: Absolutely.

HEFFNER: But you’re, in a sense, doing the same thing.

PERROW: Well, if I could back up just a little bit about that. You started asking, “What’s the good news?” I think we’re just beginning – not myself or the book, “Normal Accidents” – but a number of people are just beginning to investigate that problem in a systematic way, by looking at a number of empirical cases, actual accidents, the rate of accidents, and also investigating the degree to which the people who benefit from these systems do not share the risk. That the risks are spread over people who derive very little benefit. We’re beginning to put together a science of risk analysis that is not dominated by the technologists, the engineers, and the economists, but one which takes into account human values and takes into account a sense of the complexity of systems that might have normal accidents embedded in them. And I refuse then to rise to the bait to say that I’m in the same position as industry on this, because I think I’m trying to move industry off their own center into another one that takes into account larger values.

HEFFNER: Professor Perrow! Was I so frightened by “Normal Accidents” that I misread? I thought in your comments on these risk analysts that you were rather negative. I didn’t see anything here to be hopeful about. What did I miss?

PERROW: I mistook…I didn’t state it properly. What has happened is the risk analysts are mainly body counters. How many bodies can we count today from this technology?

HEFFNER: They never add themselves in, I presume.

PERROW: No, they don’t. But the problem is that, if the technology is new, relatively untried – and nuclear power is quite new as technologies go – there are not many bodies around to count, so they say it’s safe. The public, on the other hand, has a better understanding in these matters, I think, than the new shamans, the risk analysts. The public says, “Is it new? Is it uncontrollable? Does it have catastrophic potential? Do I have a sense of dread with this technology?” They put that together and they say, according to the polls that have been conducted by psychologists and social psychologists, that nuclear power, nuclear weapons, and DNA engineering are enormously threatening to the public. They are not counting bodies. They are talking about a social system that might be totally disrupted by catastrophic accident. They’re talking about the destruction not only of people, but of culture, ties, social interactions that they live with. Just to give an example: for the body counters, almost all risk analysts, 100 people killed in one community living closely together, a small community, is equivalent to 100 people killed nationwide that don’t know one another. For the public, they’re not equivalent. You are wiping out much more with that first case. You are wiping out a culture. And we have to take that into account. So that’s my quarrel with the risk analysts.

HEFFNER: Tell me, as a sociologist who has looked into these problems, what kind of thinking, what kind of person, what kind of philosophy of life goes into the notion that there we have a certain cost in terms of dollars, and here we have a certain cost or risk in terms of lives and the capacity to balance them. What in the world, what kind of calculus, or what kind of person makes that calculus?

PERROW: A person trained in economics or engineering is likely to come to that, because these are tools. And the tools come to dominate the total picture, or the total analysis. If you have the tools of multivariate analysis and other kinds of statistical tools, then you just plug the numbers in, and you think in a limited framework. You’re trained that way. You’re trained to be limited to do what the mathematics, the statistics will allow you. And you don’t look at the larger picture. Though as a citizen, most of those people might. But those disciplines are just embedded in a way in their own methodologies such that they do not stand outside of them. A sociologist or a political scientist or many social psychologists, however, are dealing with a much larger system. And so he or she is likely to say, “There’s a cultural value. You haven’t counted that in. Or a social value and you haven’t counted that in.” And they say, “Well, we can’t count that. There’s no way to count that”. “Well, does it not exist then because you can’t account for it?” we say. The public knows it exists. And the opinion polls are very clear on that.

HEFFNER: Well, you know, I want to go back now. You say, “The public knows that it’s there, and counts it, and that the public opinion polls indicate that”. But I want to go back to these three C’s of yours: Complexity, the complexity of an organization; Coupling; and Catastrophe. And there seems to be a – if you want to talk about linear relationships – there seems to be a line right down between those three, with the last one being catastrophe. You say that there are certain fields, potentially dangerous fields, talk about air transport, etcetera, that can be organized and are now being organized in a way to minimize the catastrophe. Of course, one question I want to ask is what’s acceptable as minimal catastrophe? But let’s set that aside.

PERROW: Good.

HEFFNER: Because I wouldn’t do that to you. It seemed to me so much of your book, “Normal Accidents”, was geared toward the notion that there is no way, as we become more technologically oriented in our society, that we can move away from those steps toward catastrophe. That in the very organization, in the complex organization, in the closely knit organizations, that you can get away with it. Do you believe that or not?

PERROW: We can’t in the direction in which we’re going. But there are other ways to get away with it. First, let me just make one more point so I’m not seen as a technological Luddite. Some of our inventions are terrific in the sense that they reduce complexity and coupling. For example, the jet engine is much simpler than the old piston engine. Much more reliable, much more linear, much more efficient. So we may have technological inventions at that scale that come on. But in terms of the organizations, if we start to decentralize organizations, spread them out, decouple them, keep them small, build in a kind of natural redundancy so that if a failure takes place in one organization it doesn’t spread widely to other organizations that are tightly coupled to it, then we are getting control over our technology and our society. There are technologies that will allow us to do that. The computer monitoring of machine tools and so on allows small organizations to grow up which can act as sellers to the big ones and remain independent. And we should foster that kind of growth. Other things are happening in our society though that tighten it and bring it together. One possible catastrophe I didn’t deal with in the book because I don’t know that much about it is the problem of genetic strains for seeds for wheat and corn and things like that. We have centralized those in our country trying to produce the perfect plant. As a matter of fact, it’s centralized in the Midwest, just a few miles from a nuclear power plant. If we wipe out that, or if a microbe or some disease wipes out one of those strains, we have a very tightly coupled system which produces a catastrophe. We don’t have independent, small-scale systems that are developing new strains. We’ve brought it all together.

HEFFNER: But is there any indication at all, on a large scale, if you forgive me, that devolution is the process of the future?

PERROW: No, no. We have both tendencies going on at once. And when I look for a second tendency, devolution, I search hard, and I’m immensely gratified when I find it. But I shouldn’t kid myself, because most things are going toward centralization. And Star Wars is another example of it.

HEFFNER: Then, as a prophet (and every man’s a prophet), what do you see? What do you see happening? The opposite of devolution and then, Boom! The big explosion?

PERROW: Yes, I think that we are getting into an increasingly risky civilization. I think recombinant DNA, genetic engineering, is just the next step, the step we’ve just taken now, immensely dangerous. I think we’re in trouble. I remain optimistic. That’s why I wrote the book. I think we can think about these things. We might get some control. After all, we have things like the Office of Technology Assessment now. We didn’t have that 15 or 20 years ago. We’re beginning to be concerned about the external costs, what’s called the externalities, the social costs of organized activity. Just beginning. We may be too late, but we could make a bigger…

HEFFNER: Why do you want to be able to wear the mantle of an optimist?

PERROW: I have children.

HEFFNER: I rather felt that that must be the answer.

PERROW: Yes. And I’m very concerned about them. And they’re going to have children. And I’m very concerned about that.

HEFFNER: But why an optimist then?

PERROW: I think you misread me, saying I’m an optimist. I really think it’s going in the wrong direction.

HEFFNER: That we’re going in the wrong direction?

PERROW: We’re going in the wrong direction. So in that sense I’m pessimistic. But if some of us can detach ourselves from that drift and write more and more books like this that deal with this, then we have some chance of making an impression. After all, the world changes now very quickly. Once I did a large project on the Sixties, the radicalism and social change in the Sixties. I was immensely impressed by the place we were at in the Forties and the Fifties. We came a long way by the time I made that study in 1970.

HEFFNER: Which leads me, as we are drawing closer to a close in our program, which leads me to come back to the point you made before. You said, “Public opinion polls indicate that the public is increasingly concerned”.

PERROW: Yes.

HEFFNER: “Not about air travel, but about nuclear plants, but about nuclear activity in space, etcetera”. Does that lead you to assume that there will be action that will be responsive to those public concerns?

PERROW: There already is. It’s not all that strong. But there is, and there’s more of it. There’s a whole environmental movement. We have EPA. We have OTA. We have those government agencies. We have congressmen who can run on that kind of platform and make those kinds of noises and sounds. And that’s helpful.

HEFFNER: Let me ask briefly, do you think that has in turn brought about a reaction against the kind of concern you’re talking about? A little too much emphasis? A little too irrational an emphasis?

PERROW: Yes, yes, yes. The risk assessors are the real alarmists, by saying that the public is alarmist. The risk assessors, I think, are saying that risk made our country great. It’s not really true. Certain kinds of risk, but not other kinds of risk. And that we have to just plow ahead into that future.

HEFFNER: Why do you say that risk, “That’s not true?” But risk did, didn’t it? Didn’t the buccaneers and the enterprisers…

PERROW: Certain kinds of risks, but not the risk that would lead to what could be called catastrophic accidents at that time. What made our country great was an empty land with enormous resources and a vigorous people who developed it very fast. That wasn’t much risk.

HEFFNER: Do you think we’re going to have to accept less greatness for safety?

PERROW: I don’t see anything ungreat about safety. So you would have to define greatness quite differently.

HEFFNER: Then we’ll have to wait and do that on another program, because I have to thank you now for joining me. Our time is up.

PERROW: Thank you.

HEFFNER: Thanks so much, Professor Perrow. And thanks, too, to you in the audience. I hope that you will join us here again next time on THE OPEN MIND. Meanwhile, as an old friend used to say, “Good night, and good luck”.