Bridges Too Far?
What can bridge collapses teach us about assigning cause?
Posted Sep 11, 2013
In his 1974 book “A Bridge Too Far,” Cornelius Ryan detailed the failed World War II Allied attempt to invade Germany by capturing several bridges crossing the Rhine River, particularly the bridge at Arnhem in the Netherlands. Ryan makes a strong case that the causes of the doomed operation included poor planning, interpersonal conflict among senior leadership, and hubris, and that a need for the Allies to appear united inhibited those who opposed it from expressing their reservations directly.
In The Why of Things, published in August 2013 by Columbia University Press, I present a model for the often challenging task of identifying the cause or causes of events. Two relatively recent bridge collapses illustrate the challenges of assigning cause to the failure of a human construction endeavor.
The first, the collapse of a section of the Interstate 5 bridge over the Skagit River about an hour north of Seattle, occurred after a truck carrying an oversized load struck a girder, sending a section of the bridge into the river. Fortunately, no one was killed.
As required by regulation, the truck had been preceded by a forward car with blinking lights, a sign warning of an "over-sized load," and a mechanism to gauge whether the load was likely to strike something on the highway; no following car was required. The driver of the truck reported that he had moved to the right because a semi-trailer truck was passing him.
Undoubtedly, the bridge collapse was caused by the truck’s hitting the bridge structure. In reviewing the collapse, the National Transportation Safety Board (whose final report has not been released as of September 2013) made public comments that cited the design of the bridge as a contributing factor. In what is referred to as a “fracture critical” design, the failure of any single crucial supporting element would lead to collapse. In striking one of the supporting steel gusset plates, the truck did just that.
Causality would seem easy to assign here. The precipitating event was clearly the truck's hitting the bridge. A predisposing element was the bridge's design. However, other factors also need to be considered, including the passing semi-trailer truck, the experience of the driver, and the rule that required a preceding warning car but not a trailing one. There are also issues that relate to the societal context of public expenditure. The use of economical designs that omit redundant failure-prevention features (jet planes with multiple engines, for example, are designed to fly on only one engine should another fail while aloft) is not a failure but a choice that society makes in weighing costs and benefits. I refer to it as a programmatic cause because it is not only a contributor to the event but also part of the larger context of society that extends beyond construction design. These three Levels of cause are similar to the analytic model proposed by Aristotle some 2,500 years ago. His final level, the Purposive, isn't relevant here.
The primary Method guiding this analysis is empirical or scientific. The vulnerabilities of this bridge design can be expressed mathematically, and the failure was inevitable once the truck struck the girder. The narrative Method ties it all together: the truck driver’s perception that he needed to pull over because of the passing truck, the regulation requiring a forward but not rear guide car, and the design features.
The identification of the role of the truck in hitting the bridge superstructure is an example of a categorical causal attribution. Certainly, without that event the collapse would not have happened.
The second relatively recent bridge collapse involved the Interstate 35W bridge in Minneapolis in 2007. Thirteen people were killed and 145 injured, according to AP writer Frederick J. Frommer. The precipitating event was the failure of one of the large "gusset" or supporting plates. Because this bridge, too, had a "fracture critical" design, the failure of that one element made collapse inevitable. The cause of the gusset plate failure was both the plate's manufacture (it was produced at only half the designed thickness) and the heavy load placed on the bridge by concrete resurfacing equipment that had been parked on it in preparation for repairs. Why was the plate made at the incorrect thickness? I was not able to find an explanation. Possibilities include purposeful fraud to increase profit, an error in transcribing the plans, or a recalculation of what was necessary. Evidence for any of these would be empirical and categorical.
Perhaps the best-known and best-studied bridge failure in the U.S. was the Tacoma Narrows Bridge collapse in 1940. In his readable and illuminating book To Engineer Is Human, Henry Petroski makes the general point that engineers learn from failure, and that advances in knowledge that would otherwise not have occurred have repeatedly resulted from such failures. He identifies two design elements that contributed to the Tacoma Narrows failure. First, the bridge had an inherent vibrational frequency induced by wind, such that one half of the bridge was "up" while the other was "down" in a self-reinforcing fashion when the wind blew. A second aerodynamic force led the bridge to flex or vibrate from side to side, a possibility that bridge designers had never before considered. These aerodynamic vibrations were self-reinforcing, much as a rider on a swing can time leg kicks to increase the height of each swing. They were known to the designers of airplane wings, and within weeks of the bridge failure Caltech engineering professor Theodore von Karman identified them as the cause of the collapse.
Von Karman calculated that a 40-mile-per-hour wind would generate the forces needed to induce the vibration that caused the bridge's failure; that was the speed at which the wind was blowing that day. Thus the wind was the precipitating cause and the inherent aerodynamic instability a predisposing cause (had the wind never blown at that speed, the collapse would not have occurred), but the inevitability of unknown elements of design, described by Petroski as "unplanned experiments that can teach one how to make the next design better," is better conceptualized as a programmatic cause, inherent in the limited nature of human knowledge.
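For readers curious about the physics, the vortex-shedding mechanism von Karman studied can be sketched with the standard Strouhal relation, f = St × U / D. The snippet below is purely illustrative: the Strouhal number of 0.2 (typical of bluff bodies) and the deck depth of 2.4 meters are assumed round numbers, not figures from the actual investigation, and the result should not be read as a reconstruction of von Karman's calculation.

```python
# Illustrative sketch of vortex-shedding frequency using the Strouhal
# relation f = St * U / D. All numeric inputs are assumptions chosen
# for illustration, not data from the Tacoma Narrows investigation.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def shedding_frequency_hz(wind_mph: float, deck_depth_m: float,
                          strouhal: float = 0.2) -> float:
    """Vortex-shedding frequency f = St * U / D for a bluff body."""
    wind_ms = wind_mph * MPH_TO_MS
    return strouhal * wind_ms / deck_depth_m

# Assumed inputs: the 40 mph wind reported that day, and a deck depth
# of roughly 2.4 m (about 8 ft) -- an assumption for this sketch.
f = shedding_frequency_hz(40.0, 2.4)
print(f"Approximate shedding frequency: {f:.2f} Hz")
```

The point of the sketch is only that shedding frequency rises with wind speed and falls with deck depth, which is why a particular wind speed on a particular structure can excite a matching structural frequency.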
There were also unique features of the Tacoma Narrows design. The bridge was only two lanes wide, and it used a new design in which solid plates replaced the lattice beam trusses of older suspension bridges. These two elements made the bridge lighter and less costly, but they also increased the ease with which the 40-mile-per-hour wind could induce the aerodynamic instability; a heavier, wider, and thicker bridge would have had different aerodynamic properties. Again, these steps were taken to lower the cost of the bridge's construction but in retrospect contributed to its failure, and they are part of the programmatic context of designing costly structures.
What is so powerful and emblematic about the Tacoma Narrows failure is the coherent, comprehensive, and convincing narrative that pulls together principles of engineering, inherent properties of materials, the need for cost efficiency, and the riveting drama of a collapsing bridge.
Each of these four stories, the failed Rhine crossing and the I-5, I-35W, and Tacoma Narrows collapses, illustrates the complexity of causal analysis as well as the success that a multifaceted analysis can bring to the task of determining causality. There are instances in which a straightforward, single cause can be identified, and others in which different levels of analysis, the use of specific analytic Methods, and a particular conceptual model best identify the cause. In The Why of Things I have tried to balance the complexity of this approach with the belief that identifying specific causes is usually possible, and I look forward to your responses to this blog.
As Petroski emphasizes, a correct causal analysis is much more than assigning blame: it provides guidance for preventing future failures and can drive the accumulation of new knowledge.
copyright Peter Rabins 2013