This great post by Richard Fernandez reminded me of a quote from George Eliot:
The sense of security more frequently springs from habit than from conviction, and for this reason it often subsists after such a change in the conditions as might have been expected to suggest alarm. The lapse of time during which a given event has not happened is, in this logic of habit, constantly alleged as a reason why the event should never happen, even when the lapse of time is precisely the added condition which makes the event imminent.
(from Silas Marner)
20 thoughts on “How Systems Get Tired”
I adopted part of it over at my blog. I try not to be too negative.
A useful case study is California, whose economy accounts for about 13 percent of U.S. gross domestic product and whose 2.6 million undocumented workers include almost a tenth of the state’s workforce.
For starters, the state’s farms and orchards, where a third to a half of agricultural workers are undocumented, would be crippled.
So, you could never deport the illegals, who are 30% of Los Angeles.
Plus, of course, the left has already destroyed agriculture in the state.
In the 1970s, coastal elites squelched California’s near-century-long commitment to building dams, reservoirs, and canals, even as the Golden State’s population ballooned. Court-ordered drainage of man-made lakes, meant to restore fish to the 1,100-square-mile Sacramento–San Joaquin River Delta, partly caused central California’s reservoir water to dry up. Not content with preventing construction of new water infrastructure, environmentalists reverse-engineered existing projects to divert precious water away from agriculture, privileging the needs of fish over the needs of people. Then they alleged that global warming, not their own foolish policies, had caused the current crisis.
About five years ago, I took friends from England on a tour of the Central Valley. They are the same friends we are going to Waterloo with. I’m glad I could show it to them before it was destroyed.
“Then they alleged that global warming, not their own foolish policies, had caused the current crisis.”
If only there was a political party that would challenge this claim.
This is known as the induction problem: believing you can know what will happen by observing what’s happened in the past. Karl Popper held that, in order to avoid it, any empirical evidence can never be used to prove a premise but only to disprove it.
Funny, I was thinking about Caliphornia, but relative to earthquakes.
any empirical evidence can never be used to prove a premise but only to disprove it.
“The trouble is that any statistical study is just that, it’s based on experience and the experience may not cover all the possible situations…” Dr. Ralph Peck, discussing dam failure modes
Most statistical studies, at least in my experience, begin with a “null hypothesis,” which posits that there is no effect.
many researchers neglect the null hypothesis when testing hypotheses, which is poor practice and can have adverse effects.
As in global warming, etc.
There is a difference between researchers and ideologues.
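The null-hypothesis idea can be made concrete with a minimal sketch. The sample data below is invented for illustration; a simple permutation test asks how often a difference as large as the observed one would arise by chance if there were truly no effect:

```python
import random
import statistics

def permutation_test(a, b, n_permutations=10000, seed=0):
    """Estimate a p-value for the null hypothesis that samples
    a and b come from the same distribution (i.e., no effect)."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = list(a) + list(b)
    count = 0
    for _ in range(n_permutations):
        # Reshuffle the pooled data into two groups of the original sizes.
        rng.shuffle(pooled)
        perm_a = pooled[:len(a)]
        perm_b = pooled[len(a):]
        if abs(statistics.mean(perm_a) - statistics.mean(perm_b)) >= observed:
            count += 1
    return count / n_permutations

# Made-up data with a clear difference between groups.
control = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0]
treated = [5.9, 6.1, 6.0, 5.8, 6.2, 6.0]
p = permutation_test(control, treated)
# A small p means the observed difference would be surprising
# if the null hypothesis (no effect) were true.
```

A small p-value is evidence against the null hypothesis, not proof of any particular alternative, which is exactly the Popperian point above: the data can disprove “no effect,” not prove a favored explanation.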
Another example, paraphrasing Dr David Rogers discussing out of equilibrium slopes…
Rogers: I was near Santa Monica on a job and this guy’s house was 30 yards from the edge of a cliff. Now the only time you get a sheer face like a cliff is when things are out of equilibrium and things are eroding away in large blocks, all at once. I told the guy, you should consider moving your house back from the cliff wall.
Client: What? I’ve been here 30 years! I’ve never seen any erosion on that cliff.
Rogers: Now to a person, 30 years is like 30 million years. It’s forever. A huge chunk of time. In the history of the earth, or even of that slope, it’s an instant. People make decisions based on what they’ve seen and experienced in their lifetime. He’s never seen it, therefore it can’t happen.
Michael Hiteshew – I don’t feel too outraged by the homeowner, who may not have been educated very well, but what is his insurance company thinking?
“I’ve been here 30 years! I’ve never seen any erosion on that cliff.”
I bought a house on a bluff in Dana Point in 1979. It had been there since the 1930s. In February 1980, the cliff (200 feet high) sloughed off enough to leave my house 7 feet from the edge. It happened in the night and we got up in the morning to see it gone. I called the geologist who had told me, before I bought it, it was fine and asked him to come up with a fix. I had also had a contractor friend look at it before I bought it and I asked him to fix it. We had a soil engineer do some tests and did a $100,000 fix with caissons 35 feet down into the “Capistrano Formation” which was the “bedrock.” The house is still there and solid. Unfortunately, I sold it in a divorce and, even though my wife and I are now back together, I could never afford to buy it now.
Even asking experts to check may not be enough.
It’s a flaw of human nature and goes both ways. In some situations people tend to underestimate the risks of improbable events. This is the “it’s never happened before” phenomenon already discussed here.
But people also tend to overestimate the risks of improbable events in some cases, such as shortly after an improbable event has happened. Around 1995, about three years after the very destructive 1992 hurricane, the City of Miami told coastal residents to evacuate several days before the projected arrival of a subsequent hurricane that ultimately didn’t arrive. People also tend to overestimate risks of violent crime, global warming and other phenomena that are hyped in the media.
There may be no way to mitigate such decision-making biases other than to improve our education system. Everyone should learn, not only history, but also the fundamental principles of statistical and scientific reasoning.
“Everyone should learn, not only history, but also the fundamental principles of statistical and scientific reasoning.” Some people never really mastered tying their shoelaces or reading an analogue clock.
RE: “Everyone should learn, not only history, but also the fundamental principles of statistical and scientific reasoning.” Some people never really mastered tying their shoelaces or reading an analogue clock.
All too true. OTOH, if we could only make sure that journalists met the first set of criteria and didn’t fall into the second category, then maybe we wouldn’t be inundated with as much nonsense as we are.
Let me rephrase that: We should try to teach everyone the rudiments of science and stats.
Of course not everyone will pick it up. Not everyone understands basic grammar or arithmetic either. But if the schools start to teach it, many more people will learn something about it, and that will make a significant difference in societal outcomes in the long run.
It would also help if people, like those on the barrier island north of Charleston where my daughter now works, would build their houses on stilts about five feet above maximum tidal surge in a category three storm. The people on Guam, which gets hit regularly by terrible typhoons, build houses like the proverbial brick sh**houses.
There is probably no defense against a cat 5 storm like Katrina, which hit Ocean Springs, MS, and washed away everything south of I-10. Fortunately, they are rare. New Orleans could have used the money for dikes instead of casino parking lots.
Another way to look at it is the Rumsfeld Dilemma.
A 2x2 matrix has the first quadrant as the known knowns, where we can manage and utilize our knowledge base.
The second quadrant contains known unknowns, where we use conventional risk management to assess the uncertainty and take steps to avoid the negative consequences.
The third quadrant contains unknown knowns, where we don’t realize that if we could just ask the right questions we would find the right answers. This is the domain of false assumptions and inexperience. Intensive research and plain experience unlock the knowledge we need.
The fourth quadrant contains the unknown unknowns and those “unprecedented” events. Here we are walking around blind without a cane. Risks are hidden and can’t be planned for. The only strategy is either outright avoidance or reacting quickly to get out of Dodge.
In Jonathan’s example, Hurricane Andrew was in the fourth quadrant, and the subsequent storms were in the second. As so often happens, the proper responses were reversed.
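The four quadrants above can be sketched as a small lookup table. The strategy labels here are paraphrased from the comment, not from any standard taxonomy:

```python
# Rumsfeld-style 2x2 matrix, keyed by
# (are we aware of the gap?, does the knowledge exist?).
RUMSFELD_MATRIX = {
    ("known", "known"):     "manage and utilize the existing knowledge base",
    ("known", "unknown"):   "conventional risk management of the uncertainty",
    ("unknown", "known"):   "research and experience unlock the right questions",
    ("unknown", "unknown"): "avoid outright, or react quickly and get out of Dodge",
}

def strategy(awareness, knowledge):
    """Return the suggested response for a given quadrant."""
    return RUMSFELD_MATRIX[(awareness, knowledge)]
```

The point of the comment is that the hard part is not the table itself but recognizing which quadrant you are actually in; Hurricane Andrew looked like a known unknown in hindsight only.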
Reading these comments reminds how people over-estimate rare events like shark attacks, plane crashes, nuclear plant meltdown…and under-estimate familiar risks such as cigarette smoking, driving at 2 am, investing without diversification, etc
Often new and expensive regulations are targeted at the rare risks while the familiar are ignored.
It would also help if people, like those on the barrier island north of Charleston where my daughter now works, would build their houses on stilts about five feet above maximum tidal surge in a category three storm.
Actually, building codes on a lot of those barrier islands have changed. Newly built homes have to be constructed just like that. Older homes are grandfathered under the old codes.
“people over-estimate rare events like shark attacks”
Boolean algebra can be used for such decisions.
One common area in medicine is true and false positives. The result is a four-part square: the top row holds true positives and false positives,
the bottom row true negatives and false negatives.
The result can be plotted as what is called an ROC curve.
The ROC curve is another way of describing true and false positives. Many general surgeons, for example, overestimate the predictive value of suspicious mammograms. An ROC curve that is a straight 45-degree line is useless.
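The four-part square and the 45-degree-line point can be made concrete. In this minimal sketch the scores and labels are invented for illustration: each threshold over the classifier’s scores yields one (false-positive rate, true-positive rate) point on the ROC curve, and a useful test bows above the diagonal, with TPR exceeding FPR:

```python
def confusion_counts(scores, labels, threshold):
    """Fill the four-part square: true/false positives on top,
    true/false negatives below. Labels: 1 = condition present."""
    tp = fp = tn = fn = 0
    for score, label in zip(scores, labels):
        predicted_positive = score >= threshold
        if predicted_positive and label == 1:
            tp += 1
        elif predicted_positive and label == 0:
            fp += 1
        elif not predicted_positive and label == 0:
            tn += 1
        else:
            fn += 1
    return tp, fp, tn, fn

def roc_point(scores, labels, threshold):
    """One point on the ROC curve for a given threshold."""
    tp, fp, tn, fn = confusion_counts(scores, labels, threshold)
    tpr = tp / (tp + fn)  # sensitivity
    fpr = fp / (fp + tn)  # 1 - specificity
    return fpr, tpr

# Invented example: 1 = disease present, 0 = absent.
labels = [1, 1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1]
fpr, tpr = roc_point(scores, labels, 0.5)
# Sweeping the threshold from high to low traces the full curve;
# a test no better than chance stays on the 45-degree line (tpr == fpr).
```

A straight 45-degree line means every threshold produces as many false alarms as true detections, proportionally, which is why such a curve is useless.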
Anyway, few people understand George Boole’s rules.
Seems to be another one of those ‘rare’ events playing out right now in the stock market. The Chinese and Japanese markets are currently dropping precipitously. A lot can happen between now and tomorrow morning, but it’s not looking good.