Our review of a book on the safety – or lack of it – of nuclear weapons got me thinking. One of the book’s points was that complex, interconnected systems are inherently difficult to predict and control. Many of our nuclear weapons were designed without thought to how they would eventually be decommissioned and destroyed. It reminds me of a saying I had posted over my desk at my first “real” job out of college: if you design only for steady state, you’ll have a system that cannot be started up, shut down, or maintained.
Our modern society has many complex systems where failures are serious – for example, nuclear weapons, nuclear power, the electric grid, and commercial airlines – and where people make up a large part of the system – for example, hospitals, pharmacies, and emergency response. We seem determined to insist that the people in these systems be infallible, and many highly skilled people strive to achieve that level of perfection. When something goes wrong, “blame was placed on human error, but ‘the real problem lay deeply embedded within the technological systems… what appeared to be… a one-in-a-million accident was actually to be expected. It was normal.’”