Citation: Sharon Beder, 'The Fallible Engineer', New Scientist, 2nd November 1991, pp32-36.

This is a final version submitted for publication. Minor editorial changes may have subsequently been made.


At 4 am on April 30 1988, a railway embankment near the coastal town of Coledale in New South Wales collapsed, sending tonnes of mud and water down a hill, crushing a house and killing a woman and child who were inside. The area was prone to subsidence, and evidence given at the inquest suggested that the embankment had not been properly designed to take account of this. Following the inquest four people, two of them engineers, were charged with endangering passengers on a railway. One of them, a principal geotechnical engineer with the State Rail Authority, was also charged with two counts of manslaughter. (None of them were convicted.)

The engineering profession was horrified that engineers should be charged in this way and rallied to their support. Dr Peter Miller, chairman of the Australian Institution of Engineers' Standing Committee on Legal Liability, argued that criminal prosecutions against engineers set a precedent that could have important ramifications for the way engineering was practised. He said it was likely to result in engineers becoming more conservative in their assessments and decisions, and although this was not in itself a bad thing, it would mean higher costs for engineering work.

The Institution was also concerned about apportioning individual blame to engineers who work as part of a team in organisations constrained by the financial resources available to them. The Institution's chief executive, Bill Rourke, pointed out in the Institution's magazine Engineers Australia that safety margins are closely related to the availability of funds, and that the provider of those funds, in this case the community, should carry a significant responsibility for safety levels.

The issue of who should take responsibility when things go wrong is becoming a central concern for the engineering profession. At the end of last year the Institution sent out a discussion paper entitled "Are You At Risk? Managing Expectations" to all its members. More than 3000 engineers replied, the largest response the Institution has ever had on any issue. In the preface to the paper, the Institution's President, Mike Sargent, said that the new trend towards criminal prosecutions based on negligence and the escalation of civil law claims "constitute a significant threat to the ability of our profession to serve the community and might even threaten its continued existence."

Miller also argues that the profession is at risk. "Engineers are being put in untenable positions," he says. "They are being asked to make decisions over matters they have no control over and being forced to take responsibility for those decisions." Nevertheless Miller, one of the authors of the discussion paper, was hesitant about sending it out. He was worried that engineers tend not to be interested in questions that don't have answers, and he was concerned that engineers would not want to discuss philosophy, even if it was engineering philosophy. He has been gratified to find an untapped hunger for it.

The discussion paper broke new ground in suggesting that engineers change the way they communicate about their work. Engineers, it said, had presented a falsely optimistic and idealistic view of their work and were now paying the price for having raised public expectations of what engineers can deliver too high. "We know (or should know) that our models are limited in their ability to represent real systems, and we use (or should use) them accordingly. The trouble is that we are so inordinately proud of them that we do not present their limitations to the community, and leave the community with the impression that the models are precise and comprehensive."

The philosophy set out in the paper is that engineering is an art rather than a science, and that it depends heavily on judgement. The heuristics, or "rules of thumb", that are widely used in engineering require judgement to apply properly. The writer B V Koen defines a heuristic as "anything that provides a plausible aid or direction in the solution of a problem but is in the final analysis unjustified, incapable of justification and fallible." Heuristics are used in the absence of better knowledge, or as a short-cut method of working out something that would be too expensive or time-consuming to work out more scientifically.

An example of a heuristic used to approximate the behaviour of engineering materials is: "the yield strength of a material is equal to a 0.2% offset on the stress-strain curve" (see figure 1). Yield strength is a characteristic of a material that indicates when it will break or fail. The rule was developed from the study of materials which exhibit a definite yield point (that is, which fail when a particular stress or strain is exceeded), yet it is applied to materials that do not exhibit a definite yield point. It therefore does not guarantee a correct answer, but it gives the engineer an approximation to work with.
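
To make the heuristic concrete, here is a minimal sketch in Python of how the 0.2% offset rule might be applied to measured data. The function name, the number of points treated as "elastic" and the data handling are assumptions made for illustration, not part of any standard: the elastic slope is estimated from the initial part of the curve, a parallel line is drawn through a strain offset of 0.002, and the yield strength is read off where that line crosses the measured curve.

    import numpy as np

    def offset_yield_strength(strain, stress, offset=0.002, elastic_points=10):
        """Estimate yield strength by the 0.2% offset heuristic.

        strain, stress : arrays describing a measured stress-strain curve
        offset         : strain offset of the parallel line (0.002 = 0.2%)
        elastic_points : how many initial points are treated as linear-elastic
                         (itself a judgement call)
        """
        strain = np.asarray(strain, dtype=float)
        stress = np.asarray(stress, dtype=float)

        # Slope of the initial, linear region approximates the elastic modulus.
        modulus = np.polyfit(strain[:elastic_points], stress[:elastic_points], 1)[0]

        # A line with the same slope, shifted 0.2% along the strain axis.
        offset_line = modulus * (strain - offset)

        # The yield strength is taken as the stress where the measured curve
        # first falls below the offset line (assuming the curve does cross it).
        crossing = np.argmax(stress - offset_line < 0)
        return stress[crossing]

Even in this form the judgement has not disappeared: deciding how many points count as the elastic region, and whether the data are clean enough to trust, is still up to the engineer.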

Engineers also make use of engineering handbooks in their designs. A structural engineer, for example, will use handbooks to choose the size and shape of a beam able to withstand the stresses he or she has calculated it will have to carry. Handbooks enable engineers to choose materials to suit their purposes without themselves conducting the experiments needed to verify that choice. Yet handbooks cover only a certain range of conditions and situations, and the person using one needs to know what limitations are implied. Even then, handbooks disguise a wide variation in materials of which the engineer needs to be aware. Figure 2 shows the huge variation in measurements of the thermal conductivity of copper at different temperatures found by a number of researchers. Designers have to exercise judgement in choosing values from a graph like this, and even where there is a recommended curve (as shown in the figure) the engineer still needs to decide whether it is appropriate for his or her purposes.
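
The kind of judgement involved in reading a value off such a curve can be sketched in the same way. In the snippet below the "recommended curve" points and the scatter band are invented for illustration, not taken from any handbook; the point is simply that the engineer gets back a range to weigh up, not a single authoritative number.

    import numpy as np

    # Invented 'recommended curve' for the thermal conductivity of copper:
    # (temperature in kelvin, conductivity in W/(m*K)). Illustrative only.
    recommended_T = np.array([100.0, 200.0, 300.0, 400.0, 500.0])
    recommended_k = np.array([480.0, 413.0, 401.0, 393.0, 386.0])

    # Assumed fractional scatter between published measurements.
    SCATTER = 0.10  # plus or minus 10 per cent

    def conductivity(temperature_k):
        """Interpolate the recommended curve and report a plausible spread."""
        k = np.interp(temperature_k, recommended_T, recommended_k)
        return k, k * (1 - SCATTER), k * (1 + SCATTER)

    k, low, high = conductivity(350.0)
    print(f"recommended {k:.0f} W/(m K), plausible range {low:.0f}-{high:.0f}")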

Engineers usually cope with the variability of materials, and with their inability to predict exactly the stresses and strains their technologies will have to withstand in use, by applying a "factor of safety". Henry Petroski, an American engineer who has written extensively on engineering accidents, illustrates the role of factors of safety: "Factors of safety are intended to allow for the bridge built of the weakest imaginable batch of steel to stand up under the heaviest imaginable truck going over the largest imaginable pothole and bouncing across the roadway in a storm."
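
A bare-bones sketch of how a factor of safety is applied in practice follows; the numbers are invented for illustration and do not come from any design code. The nominal strength of the material is divided by the factor of safety to give an allowable working stress, and the worst loading the designer can foresee must stay below it.

    def allowable_stress(yield_strength_mpa, factor_of_safety):
        """Working stress permitted once the factor of safety is applied."""
        return yield_strength_mpa / factor_of_safety

    # Invented numbers for illustration only.
    steel_yield = 250.0        # MPa, nominal yield strength of a mild steel
    safety_factor = 1.7        # chosen from codes, precedent and judgement
    worst_case_stress = 120.0  # MPa, the heaviest loading the designer foresees

    limit = allowable_stress(steel_yield, safety_factor)
    print(f"allowable stress: {limit:.0f} MPa")
    print("acceptable" if worst_case_stress <= limit else "redesign needed")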

Engineer Barry McMahon has found that his clients believe a factor of safety implies "certainty" plus a bit more, and that they are "far more concerned with the risk of conservative design than they are with other sources of risk." Conservative design tends to be more expensive, so there is always pressure to reduce factors of safety.

The factor of safety is itself a heuristic which changes with time and circumstance. For a factor of safety to be effective, the means of failure must be known and the cause of the failure determinable by experiment. All engineering structures incorporate factors of safety, and yet some still fail. When this happens the factor of safety might be increased. However, when a particular type of structure has been used often and without failure, there is a tendency for engineers to suspect that these structures are overdesigned and that the factor of safety can be reduced. Petroski comments: "The dynamics of raising the factor of safety in the wake of accidents and lowering it in the absence of accidents can clearly lead to cyclic occurrences of structural failures." He points out that this cyclic behaviour occurred with suspension bridges following the failure of the Tacoma Narrows Bridge.

This process of fine-tuning, or cutting safety margins to reduce costs in the face of success, is not confined to structural engineering; it is present across all engineering disciplines. William Starbuck and Frances Milliken, researchers at New York University, have studied the Challenger space shuttle disaster and concluded that the same phenomenon was present there. They argue that as successful launches accumulated, the engineering managers at NASA and Thiokol grew more confident of future success. In 1986, the year of the accident, NASA had relaxed its safety procedures, treating the shuttle as an "operational" technology rather than a risky experiment. As an operational system, it did not have to be tested or inspected as thoroughly as it had been for the early launches.

The O-rings sealing the joints in the shuttle's solid rocket boosters, which were eventually found to have played a major role in the accident (New Scientist, 11 September 1986), had shown signs of failure in several earlier flights. Inspectors found heat damage to O-rings after three of the five flights during 1984 and after eight of nine flights during 1985. But since this damage had not caused a launch to fail, engineering managers at NASA and Thiokol came to accept it as "allowable erosion" presenting an "acceptable risk". Lawrence Mulloy, manager of the solid rocket booster project, is reported as saying: "Since the risk of O-ring erosion was accepted and indeed expected, it was no longer considered an anomaly to be resolved before the next flight."

Brian Wynne, a researcher at the University of Lancaster, has also studied the Challenger disaster and other accidents. He says that O-ring damage and leakage had come to be accepted as "the new normality". Wynne argues that "implementing design commitments and operating technological systems involve the continual invention and negotiation of new rules and relationships", and that if this did not happen most technological systems would come to a halt. Starbuck and Milliken agree with respect to the space shuttle system. They point out that NASA had identified about 277 special "hazards" associated with the launch of Challenger. "But if NASA's managers had viewed these hazards so seriously that any one of them could readily block a launch, NASA might never have launched any shuttles."

For Starbuck, Milliken, Wynne, Petroski and many others, engineering involves experimentation. According to Petroski, "Each novel structural concept - be it a skywalk over a hotel lobby, a suspension bridge over a river, or a jumbo jet capable of flying across the oceans - is an hypothesis to be tested first on paper and possibly in the laboratory, but ultimately to be justified by its performance of its function without failure." Failures will occasionally occur, argues Petroski, and they are unavoidable unless innovation is completely abandoned.

This experimentation occurs at the implementation or operation stage as well as the design stage. Wynne says that "if technology involves making up rules and relationships as its practitioners go along, it is a form of social experiment on the grand scale." Similarly Starbuck and Milliken say that "fine-tuning is real-life experimentation in the face of uncertainty."

If engineering is based on incomplete models, judgement and experimentation, who should be held responsible when engineering projects fail, causing loss of life and property and damage to the environment? For many engineers this is not a useful question. Mark Tweeddale, professor of risk engineering at the University of Sydney, argues that finding who is to blame for an accident is like peeling an onion: in the end you are left with nothing. He feels that legal proceedings to establish blame are particularly unhelpful in sorting out the lessons to be learnt from an accident, because sub judice laws interfere with a free and open discussion of what happened.

Tweeddale believes the people charged over the Coledale rail embankment accident tended to be those slowest on their feet in getting out of the way and covering themselves. He does not think that blame can lie with individual engineers. "You've got to ask: why did that engineer not act in that situation? Did the climate of the organisation he or she worked for make it too difficult?" Miller agrees: "Engineers usually work in teams in which an individual engineer cannot totally exercise his or her judgement. Particularly in large organisations, engineers are part of complex networks of experience, knowledge and interests which all interact."

Wynne says there is a tendency to refer to "human error" when accidents occur, as if there had been some "drastic departures from normal rule-bound operating practices, and as if we were exonerating a supposedly separate mechanical, non-social part of the system." He suggests that part of the problem may be that technological systems are designed as if organisations operate with perfect communication and as if people are not prone to distraction, illogic or complacency. Jean Cross, professor of safety science at the University of New South Wales, agrees that engineers tend to neglect the "human/technology interface" in their designs. For example, they do not take account of how long it takes people to process information and how people behave under stress.

The Institution's paper gives some recognition to this. It says that the notional probability of failure implicit in engineering codes does not give sufficient weight to human factors: "It deals mainly with those issues for which we can rationally compute factors of safety." Yet while engineers are grappling with their neglect of the social dimension of technological systems, they believe the public is not, for its part, coming to terms with what they see as the engineering realities. Engineers are frustrated at what seems to be the public's demand for complete safety.

Simon Schubach, a consulting engineer who does risk assessments for the NSW Department of Planning, is often asked at public meetings, "But will it be safe?" The audience seldom accepts his answer, which tends to be along the lines of: "On the basis of the assumptions we made, and the limited applicability of the models we used, our assessment is that the project will meet acceptable risk criteria." Schubach finds the public's demand for certainty naive, unreasonable and ill-founded. "Engineering is just not like that."

Until now, the Institution's discussion paper admits, engineers have not made much effort to disabuse the public of its tendency to think that engineering is in fact like that. The paper quotes the 1946 chairman of the Scottish branch of the Institution of Structural Engineers (UK) as saying: "Structural engineering is the art of modelling materials we do not wholly understand into shapes we cannot precisely analyse so as to withstand forces we cannot properly assess in such a way that the public at large has no reason to suspect the extent of our ignorance."

The dilemma for engineers today is how to tell the public the extent of their ignorance without losing the community's confidence. Portraying new or controversial technologies as perfectly predictable and controllable greatly assists in winning public acceptance of them. "Concern for public reassurance produces artificially purified public accounts of scientific and technological methods and processes," says Wynne. "When something goes wrong, this background is an ever more difficult framework against which to explain that even when people act competently and responsibly, unexpected things can happen and things go wrong."

The way out of this dilemma, according to Tweeddale, is to educate the public and the clients of engineers that there is a lack of certainty in all aspects of life, and that death is a normal part of life. Miller offers no answers. He believes the profession needs to discuss the issue more fully internally before it considers ways of managing public expectations. "Engineers need to understand about the social construction of risk and how it differs from the engineering construction of risk," he says. But do engineers really perceive risks very differently from everyone else? How is an engineer who is unwilling to accept the financial and legal penalties that may arise from the failure of his or her project different from the local resident who is unwilling to accept the illness or injury that may arise from the failure of that same project?

Given the economic pressure to reduce safety margins, removing professional responsibility from the shoulders of engineers could escalate the experimentation that already occurs. The Society for Social Responsibility in Engineering believes that engineers should accept responsibility for the safety of their work, even if this means being held criminally liable. Philip Thornton, president of the Society at the time the two engineers were charged over the Coledale accident, said: "If an engineer makes a structure stronger because the risk of being charged if that structure collapses is too high, then the risk of someone being killed or seriously injured is also too high." Thornton argued that if engineers were concerned about being personally liable for accidents and failures, they would be less inclined to follow instructions from clients or employers who were primarily concerned with profits and who might not understand the implications of cost-cutting measures.

In contrast, Miller believes that those who require engineers to design technologies for them should be willing to take responsibility for the risks associated with those technologies. This will require not only a change in the way engineers communicate with their clients and the community, but also a change in how engineers see their own role. Traditionally engineers have seen themselves as problem-solvers. The Institution of Engineers, Australia is now suggesting that engineers present the community with options, honestly communicating the limitations and uncertainties associated with each, and allowing the community to choose. If engineers take on this new approach it will certainly mark a radical change in the way technologies are chosen and shaped.

REFERENCES

Are You At Risk? Managing Expectations, Institution of Engineers, Australia, 1990.

Henry Petroski, To Engineer Is Human: The Role of Failure in Successful Design, Macmillan, 1985.

Brian Wynne, 'Unruly Technology: Practical Rules, Impractical Discourses and Public Understanding', Social Studies of Science 18, 1988.

William Starbuck and Frances Milliken, 'Challenger: Fine-Tuning the Odds Until Something Breaks', Journal of Management Studies 25(4), July 1988.