

Transparent Engineering

Henry Petroski’s insights are always helpful in my work analyzing industrial and engineering failures. The bridge collapse Petroski described in his column “Silver Bridge” (September–October) is a poster child for poor transparency. Lack of transparency is a factor in most, if not all, severe harmful outcomes: The subprime mortgage crisis, the disasters at Chernobyl and Bhopal, and the Deepwater Horizon oil spill are just a few examples. At the heart of the Silver Bridge problem was the existence of a single component whose failure would result in collapse and whose condition was impossible to determine. Bridge inspections merely provided a false sense of security.

In contrast, a transparent system performs in such a way that it is easy to tell what’s wrong. It encourages safety, accountability, and early identification of weaknesses. Modern engineering practice includes preemptive analyses that identify essential, “mission-critical” components such as the one that failed on the Silver Bridge. Modern procedures would also identify problems like the inability to inspect key components, and would correct such design flaws.

Despite these precautions, failures still happen. One of the questions investigators often ask is, “What were the better, cheaper, safer ways that the harmful factors and their causes could have been found before the tragedy?” The answer often includes providing transparency systematically—built into the design and operation of the system in question. This built-in transparency can—and should—be back-fitted into important structures, components, programs, and processes.

William R. Corcoran
Nuclear Safety Review Concepts Corporation, Windsor, CT

