Any system susceptible to a Black Swan will eventually blow up.
–Nassim Taleb [Link is a pdf.]
I imagine any system is susceptible to a black swan, and given enough time they’ll all experience one.
Some systems aren’t susceptible. It depends on the returns distribution. A trading system based on buying options has bounded risks and unbounded returns. It may bleed to death slowly but won’t blow up. A trading system based on selling naked short options has bounded returns and unbounded risk and will probably blow up eventually.
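A minimal sketch of the asymmetry Jonathan describes, using hypothetical contract terms and an assumed lognormal move in the underlying (the numbers are illustrative, not from the comment): the long call's loss is floored at the premium paid, while the naked short call's loss grows without bound as the underlying rallies.

```python
import numpy as np

rng = np.random.default_rng(0)
spot, strike, premium = 100.0, 105.0, 2.0                   # hypothetical contract terms
terminal = spot * np.exp(rng.normal(0.0, 0.25, 100_000))    # assumed lognormal price moves

long_call = np.maximum(terminal - strike, 0.0) - premium    # loss floored at the premium paid
short_call = premium - np.maximum(terminal - strike, 0.0)   # gain capped at the premium received

print(f"long call  worst/best: {long_call.min():9.2f} / {long_call.max():9.2f}")
print(f"short call worst/best: {short_call.min():9.2f} / {short_call.max():9.2f}")
```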
Jon, I’m thinking more generally. Ultimately, no system is closed unto itself. In the larger, more universal system of human behavior a black swan could occur and disrupt many things, many systems, at the same time. In analyzing black swans in options trading, one must also consider the more universal black swan that hits options trading itself, since that whole system could blow up (although your loss would be limited) along with everything else (or many things).
Having read BS itself, I’m still left to wonder: how is this any more profound than saying, no human system can ever be perfect? I mean, that’s not exactly news…
Tyouth, that was my first thought as well. People seem to be fond of convincing themselves that this time everything will be different and we have entered a new age/way of doing business.
Not all failures are alike. Some systems fail relatively safely, others have divergent behavior. The latter are subject to BS. See Taleb’s discussion in the linked interview of the two different types of noise.
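For readers who haven't seen the interview, here is a rough sketch of that distinction (my own illustration, assuming a Gaussian versus Pareto comparison, not Taleb's example): under "mild" noise no single observation matters much to the total, while under "wild" noise one draw can dominate everything that came before it, and that is the kind of variability that produces divergent, Black Swan-prone behavior.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

mild = np.abs(rng.normal(size=n))        # thin-tailed ("mild") noise
wild = rng.pareto(1.1, size=n) + 1.0     # heavy-tailed ("wild") noise, Pareto with tail index 1.1

for name, sample in (("mild", mild), ("wild", wild)):
    share = sample.max() / sample.sum()  # share of the total contributed by the single largest draw
    print(f"{name}: largest observation is {share:.1%} of the total")
```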
Jonathan, I used to ride submarines, and we had a phrase for hull valves (valves that went directly to the ocean, therefore very important): “fail safe,” which led to another phrase that referred to people: “fail stupid.”
When I read BS, my initial reaction was that people naturally are desperate to make sense of their lives/professions. In particular, folks in the finance business who place too much faith in a system riddled with vagaries (particularly a system where “value” is a moving target subject to emotion, trends, etc.) are setting themselves up when they ascribe too much to data that reveals less than assumed. Taleb’s warning is that over-reliance on imperfect data will eventually cause adherents to fail stupid.
I enjoyed the article, particularly the “remedy” Hill offers against confirmation bias: “Use the method of conjecture and refutation introduced by Karl Popper: formulate a conjecture and search for observations that would prove it wrong. This is the opposite of our search for confirmation.” This is an exceptionally difficult proposition, requiring the practitioner to think deeply and attempt to deconstruct their own assumptions.
Thanks for sharing.
There are many systems that can degrade gracefully, but even they can be overwhelmed by Black Swan events. It could be the weather (Katrina), terrorism (9/11), or government action (GM bondholders). Offering predictable, stable returns is what made Madoff so popular.