The low rate of overt accidents in reliable systems may encourage changes, especially the use of new technology, to decrease the number of low consequence but high frequency failures. These changes may actually create opportunities for new, low frequency but high consequence failures. When new technologies are used to eliminate well understood system failures or to gain high precision performance they often introduce new pathways to large scale, catastrophic failures. Not uncommonly, these new, rare catastrophes have even greater impact than those eliminated by the new technology. These new forms of failure are difficult to see before the fact; attention is paid mostly to the putative beneficial characteristics of the changes. Because these new, high consequence accidents occur at a low rate, multiple system changes may occur before an accident, making it hard to see the contribution of technology to the failure.
From:
How Complex Systems Fail (pdf)
(Being a Short Treatise on the Nature of Failure; How Failure is Evaluated; How Failure is Attributed to Proximate Cause; and the Resulting New Understanding of Patient Safety)
Richard I. Cook, MD
Cognitive Technologies Laboratory, University of Chicago
When revolving doors replaced hinged doors (1800s) there was a great deal of concern over safety. Someone could get stuck in a revolving door and perhaps suffocate before he was discovered. For many years there were laws requiring that every revolving door be attended by a doorman trained in rescue. In some places they were simply outlawed.
Elevators made the upper floors of many buildings more desirable than the ground floor. However, elevator crashes always got front page headlines. Eventually an elevator manufacturer invented cars that could not crash. Until that invention many cities banned elevators. For example, buildings in Paris were limited to five storeys by law until elevators became legal (which is why Paris is mostly five storeys tall).
Ships used to be rowed on the ocean hugging the shoreline, because navigation was an unknown science requiring tools not yet invented. When these tools were invented and ships were rowed beyond sight of land, many disappeared because of sea monsters. Navigators developed maps with the monsters marked and rowed safely across the sea.
All these solutions were invented by private enterprise in countries that enforced the property rights of the lower classes, or in places where laws did not exist (high seas).
When change is controlled by rulers, change occurs only in an emergency. More societies suffer from over-caution than from adventurous members. Risk-taking and adventure are normally banned.
I was in Boston the night of the northeast blackout in 1965. The whole region had been placed on a new grid and the hospitals had disconnected their emergency generators since they would never need them again. I was, appropriately, in the “morbidity and mortality” conference (“M&M”) at the Mass General. They did have battery powered emergency lights but the sterilizers, for example, were not powered by any backup. Of course, somebody with appendicitis came in and they had to boil instruments.
My wife was at a friend’s house and was just screwing in a new light bulb when everything went dark. She was convinced she had done something terrible but they couldn’t figure out what it was. She finally left to go back to our apartment and, on the car radio, heard about people stranded in elevators in New York City. I finally got home and we waited as the apartment got colder until the lights finally came on about eight hours after they went off. The only lights in Boston were a huge Edison Company sign across the Charles River.
Hubris is punished by the gods.
So many people think that opposition to an over-sized, over-fed, and over-intrusive government is all about this particular issue, or that esoteric philosophical position, or some unfathomable attachment to big business/businessmen/mean rich people.
But that’s not it at all, at least for many of us who desire a smaller, less powerful, less all-encompassing state.
It is just what this article is about—the fragile complexity of systems which try to do everything, especially when they’re constructed by a detached, top-down, and terribly flawed elite which generally has little or no experience in the areas it is trying to design.
There was just a bit in our local fishwrap about how good it was that there were no members of the X industry on the board regulating that industry.
The same people who say this kind of stuff with a straight face, and oh so concerned head tilt, would be outraged if a bunch of parents were put in charge of their children’s education, or a group of doctors were allowed to make rules for the legal profession. But they think it’s great when the people doing the work have nothing to say about the way their work rules are made, as long as they’re those evil, greedy capitalists.
They don’t know what they’re doing, and never have.
And that is the true heart of the matter.
As a nuclear engineer, I continually push back against “safety enhancements” in nuclear power plants ordered by the regulators. Some of them are worthwhile, of course, but too often they only add complexity and new failure modes. A very healthy skepticism of the new should be maintained—and this even though I often make my living improving operating reactors.
The worst case in my personal experience was the discovery of a minor design defect in a safety-related piping system. Through some clever sleuthing, I discovered that my plant had this problem, and I was able to verify it through a simple test, which justified cutting into the piping and adding four new check (one-directional) valves. These check valves had an internal disk that pivoted on a hinge so that flow could go one way, but backflow in the opposite direction seated the disk and blocked the pipe.
When it came time to install the valves, the welds had to be inspected by a process called radiography that used high energy gamma rays to “x-ray” the metal. Unfortunately, the technician doing the radiography goofed up and dosed himself with 150 rem of radiation – enough to make one sick with acute radiation syndrome, but not likely to be lethal.
When the time came to test the system, it turned out the designers had bought valves that had to be mounted horizontally to work, but they had been installed in the vertical position. With the first flow, the valve disks hung up in the open position, making the installation useless. I wasn’t asked to review the design change package, and might not have spotted the error anyway.
So the expensive safety fix I argued for almost killed a guy and still didn’t work.
I’ll never forget the lessons from this fiasco.
}}} The low rate of overt accidents in reliable systems may encourage changes, especially the use of new technology, to decrease the number of low consequence but high frequency failures. These changes may actually create opportunities for new, low frequency but high consequence failures.
Hrm. Consider the reliability of the American system of socialization. Now consider how we’re messing with guns to deal with a low frequency, low consequence phenomenon (low consequence for the nation; it’s clearly devastating for the individuals and their families).
}}} There was just a bit in our local fishwrap about how good it was that there were no members of the X industry on the board regulating that industry.
The downside to having members of “x industry” on the board regulating that industry is obvious: Unless they are scrupulously honest, they tend to encourage corporate cronyism in one form or another. And once such a board exists, the Ones In Charge in that industry always wrangle a way to get one of Their Own into a position of power on such a regulatory board, and once that happens, the board’s initial purposes get ignored all too often in favor of cronyism… This is usually considered a form of “Regulatory Capture”.
The best solution appears to be to keep the regulation to a minimum, just to limit such chicanery and charlatanism.
I’ve long been partial to a throw-away suggestion by Heinlein — TWO legislative bodies. The job of one is to pass laws, but by a 2/3rds supermajority. The job of the second is to REPEAL laws, by a 1/3rd minority. His reasoning was that if 1/3rd of the populace didn’t like a law, it probably ought not to be law. It would be interesting to try to expand that to the regulatory board concept… :-/