The sense of security more frequently springs from habit than from conviction, and for this reason it often subsists after such a change in the conditions as might have been expected to suggest alarm. The lapse of time during which a given event has not happened is, in this logic of habit, constantly alleged as a reason why the event should never happen, even when the lapse of time is precisely the added condition which makes the event imminent.
–George Eliot in Silas Marner
I was reminded of the above passage by a couple of recent posts:
Claire Berlinski excerpts some thoughts by Hernando de Soto, asking “Is the knowledge system broken?” There is good discussion in the comment thread at Claire’s post; see especially the concept of a “knowledge bubble” in the comment by Late Boomer. Although I’d say it’s more a matter of an assumed-knowledge bubble.
Richard Fernandez suggests that “too big to fail” really means “wait for it,” where “it” is a failure on a very large scale. He cites Nassim Taleb:
Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks. In fact, they tend to be too calm and exhibit minimal variability as silent risks accumulate beneath the surface. Although the stated intention of political leaders and economic policymakers is to stabilize the system by inhibiting fluctuations, the result tends to be the opposite.
Both of the above are well worth reading. See also my related post, penny in the fusebox.