Our review of a book on the safety – or lack of it – of nuclear weapons got me thinking. One of the book’s points was that complex, interconnected systems are inherently difficult to predict and control. Many of our nuclear weapons were designed without thought to how they would eventually be decommissioned and destroyed. It reminds me of a saying I had posted over my desk at my first “real” job out of college: if you design only for steady state, you’ll have a system that cannot be started up, shut down, or maintained.
Our modern society has many complex systems where failures are serious – for example, nuclear weapons, nuclear power, the electric grid, and commercial airlines – and where people make up a large part of the system – for example, hospitals, pharmacies, and emergency response. We seem determined to insist that people in these systems be infallible, and many highly skilled people strive to achieve this level of perfection. When something goes wrong, “blame was placed on human error, but ‘the real problem lay deeply embedded within the technological systems… what appeared to be… a one-in-a-million accident was actually to be expected. It was normal.'” Whether it’s a soldier being reprimanded or a doctor being sued, we punish human beings who display human fallibility. And humans don’t like punishment, so this encourages them to hide mistakes and withhold information. Every accident and every near-miss is a chance to learn and get better. How often do we miss these chances?
History tells us we need to plan for the unexpected, the one-in-a-million. The media mocked the concept of “known unknowns and unknown unknowns,” but those are real concerns. Systems need to offer escape routes – places where the assembly line can be stopped while everyone figures out what’s happening. Take hospital errors as an example: society would be better off with insurance to care for those injured, so the human beings involved could honestly assess what went wrong and how to prevent it in the future. Our current Ebola situation is a case in point. If every time the experts learn a better way they are eviscerated for not having already known it, will they still implement changes to make things better?
There will be times when the people involved in an incident should be removed from their jobs. But often, they are human beings doing as good a job as anyone could. We seem to have a strong need to blame, to chastise, to scream that we’ve been the victim of someone else. To claim we had no idea anything could go wrong is naive and disingenuous. I hope when the adrenaline filters out of our systems, we can focus on improvement and not just punishment.