Judgment Day

I’m re-posting this here as I know the links may be of interest to certain parties.

An interesting confluence of information has crossed my computer screen in the last 24 hours.

Fabius Maximus was kind enough to send me a PDF, “Cognitive biases potentially affecting judgment of global risks” by Eliezer Yudkowsky, Singularity Institute for Artificial Intelligence. It’s a very interesting paper on analytical thinking, even though a number of the points Yudkowsky makes I have seen made elsewhere previously (the blogosphere revels in hyperactive disconfirmation biases). The paper’s central cognitive philosophy – “…the one whom you must watch above all is yourself” – is spot on.

Secondly, over at Kent’s Imperative, one of the Kentians – let’s call them “Most Formal Prose Kent” – had a post highly congruent with the Yudkowsky paper, “The sins of analytic methodologists”:

There is an increasingly common conceit that reliance on the analyst – subject to cognitive bias, information overload, and human fallibility – can be engineered out of the process of doing intelligence. Instead, certain methodologists would substitute organizational structures, workflow re-organization, and the introduction of supposedly superior quantitative metrics in order to create a new standard for “answers”. The underlying thrust of these efforts is to reform intelligence activities towards a more “repeatable” process, often described by industrial or scientific metaphors such as “foundry” or “lab”. These typically originate from the engineering and technical intelligence disciplines, and are usually directed as criticism of typical all-source efforts, particularly those grounded in social science fields or qualitative methodology.

…The fundamental flaw in many of these methodologists’ efforts is that they are essentially reductionist attempts to force the difficult and oft-times messy art of intelligence entirely into the narrow box of its scientific side. While there is a place for scientific approaches, particularly in the grounding and validation of assessment, the inherently creative, non-linear, and even non-rational elements of the profession can never be completely discarded. Most recent intelligence failures have occurred, not due to a lack of precision in judgment, but from a lack of imagination in identifying, describing, and forecasting the uncertain dynamics and emerging complexities of fast-changing accounts.

Sagely described.

Clear thinking is difficult. Few of us begin by adequately checking our premises or, sadly, our facts. Even in the domain of concrete and verifiable factual information, so much rides on our implicit opinion of what exactly, in terms of data points, constitutes a “fact” that we are usually off-base before we begin. Even if we are cognizant of these variables from the inception of forming a question, we might be horrified to discover, with some dogged investigation of the finer details, how fuzzy at the margins even our peer-reviewed, “valid and reliable” facts can be – much less the breezy assertions delivered by the MSM.

Then, more to the point of the KI post, there is the hasty selection of particular, reductionist analytical tools that a priori blind us to the nature of the emergent unknown that we are trying to understand. We become prisoners of our chosen perspective. One problem with human perception is that there is no guarantee, having recognized the existence of a novel dynamic phenomenon, that our perception represents the most significant aspect of it. Much like conceptualizing an elephant in motion from blind contact with its eyelashes. Or its feces.

Human nature is a perpetual rush to judgment. We must rise above that.

6 thoughts on “Judgment Day”

  1. “an increasingly common conceit that reliance on the analyst – subject to cognitive bias, information overload, and human fallibility – can be engineered out of the process of doing intelligence”…of course, there is still reliance on an analyst – it’s just a different analyst: the one who designed the system, process, “foundry,” or whatever it is called.

    There are, of course, advantages that the system-designing analyst has that the real-time analyst does not: more time for reflection, the opportunity to get more people involved and gather diverse perspectives, access to more data and computational resources. Offsetting these advantages is the fact that the system-designing analyst can never be aware of the current context in the same way that the real-time analyst will be. Often, too, the real-time analyst has a personal stake in the outcome in a way that the system-designing analyst does not.

  2. In any field of endeavor, analysts usually go wrong when they attempt to use quantitative tools on data that inherently lack quantitative properties. As the old computer adage goes, “garbage in, garbage out,” meaning that the accuracy and precision of any final analysis ultimately depend on the quality of the initial data.

    My chief complaint about economics is that although we create complex theories describing the relationships between various economic factors, we usually lack the tools to make real-world, real-time measurements of those factors. We can have exact models of economic systems and yet find ourselves unable to predict the future of those systems because we lack the good data necessary to make the models work.

    Yet we keep investing a great deal of resources into creating these models, and we keep believing in their predictions even though history shows them to be no more accurate than flipping a coin. We do so, I think, because culturally we trust a hunch expressed in numbers more than one expressed in mere words.

    I suspect that intelligence work suffers from the same problem. Framing intelligence judgments in numbers sounds better than framing them with mere words.

    We forget that ultimately science produces better information only because of our superior ability to measure some phenomena. If we can’t accurately measure a phenomenon, then science doesn’t really deal with it. Yet we so desire to concretely understand what is going on in the world that we essentially try to validate unmeasurable phenomena by mimicking the procedures that scientists use when dealing with measurable phenomena.

    This is true cargo-cult science. We ritualistically mimic the superficial forms of science in the vain hope that the forms alone will make the output accurate.

  3. “If we can’t accurately measure a phenomenon then science doesn’t really deal with it.”

    Human beings in most of their actions and activities do not produce reliable data. We are endlessly inconsistent. Only at macro levels, where the data sought is narrow enough to be largely depersonalized, or at micro levels, where (for example) specific cellular interactions are observed, and where the samples are large enough to correct for all anomalies, can technology accurately measure our behavior, but even there it cannot reliably predict it.

    Personally, I’m delighted not to be a compendium of statistics, and take much pleasure from interaction with others who feel the same.

    Vera

  4. This respect for “expert analysts” is the 21st-century variation of a very old tendency in people: paying attention to people who claim some form of arcane knowledge denied to the ordinary person.

    Somewhere, many thousands of years ago, a guy who normally wasn’t much listened to otherwise announced to the other members of the clan that he had been visited by the spirit of old Urg, the great hunter, who told him to do thus and such. And so it was born, this idea that there is some hidden mystery, some gnostic revelation, that only the insiders are privileged to know and understand.

    For centuries it was prophets and shamans, oracles and priests. Then there came the “scientists” who could cure all ailments with the newly discovered power of magnets or electricity or some other obscure knowledge that only they had discovered.

    Then, for one of the most gruesomely murderous centuries ever known, we were told that there were special people who understood the developments of history, and were privy to the complex workings of people, and their places in society, in a way that no one had ever been aware of before.

    The result, of course, was the emergence of a group of “perfect” leaders, whose understanding surpasseth all others, and who could not be questioned or denied, and certainly not limited by such trivia as laws or constitutions. Thus was the doctrine of infallibility grafted onto the politics of the 20th century, and continues, lamentably, in a few hellholes around the world to this day.

    But, in the “enlightened”, technologically advanced countries of the West, we no longer believe in infallible leaders. Instead, we have been convinced, somehow, that any number of fallible, error-prone “experts” can program their computers so wonderfully that the future is revealed, and major decisions can be made based on this “special” knowledge, and, in certain cases at least, the debate is over, and any skepticism is a form of criminal denial of an unassailable truth.

    I think I like the Urg story better. It’s more exciting, and spookier, around the campfire.
