“Scientific Cascades” and Other Decision Traps*

Rand Simberg discusses an NYT column by John Tierney that deals with common biases in group decisionmaking.

Tierney writes:

We like to think that people improve their judgment by putting their minds together, and sometimes they do. The studio audience at “Who Wants to Be a Millionaire” usually votes for the right answer. But suppose, instead of the audience members voting silently in unison, they voted out loud one after another. And suppose the first person gets it wrong.
 
If the second person isn’t sure of the answer, he’s liable to go along with the first person’s guess. By then, even if the third person suspects another answer is right, she’s more liable to go along just because she assumes the first two together know more than she does. Thus begins an “informational cascade” as one person after another assumes that the rest can’t all be wrong.
 
Because of this effect, groups are surprisingly prone to reach mistaken conclusions even when most of the people started out knowing better, according to the economists Sushil Bikhchandani, David Hirshleifer and Ivo Welch. If, say, 60 percent of a group’s members have been given information pointing them to the right answer (while the rest have information pointing to the wrong answer), there is still about a one-in-three chance that the group will cascade to a mistaken consensus.
 
Cascades are especially common in medicine as doctors take their cues from others, leading them to overdiagnose some faddish ailments (called bandwagon diseases) and overprescribe certain treatments (like the tonsillectomies once popular for children). Unable to keep up with the volume of research, doctors look for guidance from an expert — or at least someone who sounds confident.
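
Tierney’s one-in-three figure is easy to sanity-check with a short simulation. Here is a minimal sketch in Python (the function name and parameters are mine, for illustration) of the simplified decision rule often used to present the Bikhchandani-Hirshleifer-Welch model, a common textbook simplification rather than the paper’s exact Bayesian setup: each member follows her own private signal until the public tally leans two or more votes to one side, at which point it becomes rational to ignore the private signal and join the majority.

    import random

    def run_cascade(n_agents=100, signal_accuracy=0.6):
        # Simulate one round of sequential public voting. Each agent
        # receives a private signal that points to the right answer with
        # probability signal_accuracy. An agent follows her own signal
        # unless the visible tally already leans two or more votes one
        # way, in which case she joins the majority (a cascade).
        lead = 0  # (correct votes so far) minus (wrong votes so far)
        for _ in range(n_agents):
            if lead >= 2:
                vote_correct = True    # cascade onto the right answer
            elif lead <= -2:
                vote_correct = False   # cascade onto the wrong answer
            else:
                vote_correct = random.random() < signal_accuracy
            lead += 1 if vote_correct else -1
        return lead < 0  # True if the group settled on the wrong answer

    trials = 100_000
    wrong = sum(run_cascade() for _ in range(trials))
    print(f"Wrong consensus in {wrong / trials:.1%} of trials")

Under this rule the tally is a biased random walk absorbed at plus or minus two votes, so with 60 percent accurate signals the chance of a wrong cascade works out to 4/13, or about 31 percent, and the simulation prints roughly that: close to the one-in-three figure Tierney cites.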

Physicians and attorneys stand out in our society as professionals who routinely make high-stakes decisions yet are not systematically educated about decisionmaking biases and how to avoid them. For this reason alone, anyone who consults a physician or attorney on a serious matter should be accompanied by relatives or friends of known good judgment. It is always better to have additional educated heads around when you evaluate expert advice, because you must tease the expert’s assessment of the facts (what he is expert about) apart from his evaluation of probabilities and of the patient’s or client’s preferences (matters where he is probably not expert), so that you get the best assessment of each. In practice, this means asking the expert enough probing questions to determine exactly how he evaluates the odds of the various outcomes, which factors he considers in making his recommendations, and how much weight he assigns to each factor.

If you find yourself in such situations more than a couple of times, you will probably notice that some experts, if you let them, will make high-stakes recommendations based on wildly mistaken assumptions about probabilities, about the preferences of the patient or client, or both. (Examples: A physician recommends against colonoscopy based on a statistical risk of bowel perforation that turns out, on investigation, to be overstated by an order of magnitude. Another physician recommends major surgery instead of a more limited procedure that would have fewer and less severe side effects; he also, as we learn by asking around, understates the severity of those side effects. When questioned, he explains that the major procedure will obviate the need for routine tests that would otherwise be necessary on a regular but infrequent basis after surgery. The patient, however, does not mind the prospect of regular tests at all but is adamant about avoiding the side effects of major surgery.)

If you are on a jury, or in a business meeting or other group that must reach a collective decision, you can guard against decision cascades by stipulating that members will not state their opinions until there has been substantial discussion of the facts and of alternative courses of action, and that, when the time comes to decide, members will vote by simultaneous secret ballot. However, to implement such measures you must control the agenda, which is not possible in many of these situations, and especially not in the context of public controversies such as global warming, health and diet, etc.

----
*This is a pretty good popular treatment of decisionmaking pitfalls and how to avoid them. There are numerous similar books. I’ve sent copies of this book to a couple of people I know, one of them an excellent physician who treated someone I know. As far as I know, neither recipient of the book read it, which is probably what you’d expect given that the people who can most benefit from the ideas in the book tend to be 1) busy, 2) unaware of the pervasiveness of decisionmaking biases and 3) overconfident of their own decisionmaking ability.

UPDATE: See this column by Arnold Kling, which deals with some of the same issues as this post.

11 thoughts on ““Scientific Cascades” and Other Decision Traps*”

  1. Peter Drucker quoted Alfred Sloan, who ran GM for many years, somewhat as follows:

    “I take it we are all in agreement on the decision here?”

    (nods all around)

    “Then we had better postpone the decision and give ourselves time to generate some disagreement.”

  2. In his book Bluewater Sailor, Don Sheppard offers a very interesting example of the social and organizational factors that affect decision making in the real world…in this case, aboard a US Navy destroyer. Excerpt here.

  3. The Sheppard post is excellent.

    There is a book about the Cuban Missile Crisis as a study in decisionmaking. I don’t remember the book’s title, and what I read of it I read long ago. However, I remember that it reviewed very favorably the decisionmaking process within Kennedy’s cabinet. It showed how the participants went out of their way to keep gathering information, and to avoid forming opinions or making decisions, for as long as possible in their (successful) efforts to frame the critical issues. They handled the decisionmaking process very differently than did Sheppard’s captain.

  4. Jonathan,

    There is a book about the Cuban Missile Crisis as a study in decisionmaking.

    That’s funny because the Bay of Pigs decision-making process is considered a textbook example of groupthink.

    Live and learn, I suppose.

  5. I think many experts latch onto some claim, and then others fall in line behind it, because of the intense social pressure to give an answer.

    In some sense, scientists are the priests of our society. They interpret the signs of nature and tell the rest of us what to do. We don’t like to hear them say, “I don’t know.” If someone does say, “I don’t know, and no one else knows either,” we tend to pass them over for someone else. We definitely don’t want to hear things like “we don’t know what the optimal diet is” or “we don’t know how much CO2 the biosphere can absorb.” Scientists who don’t give answers about major problems get marginalized.

  6. The following quote is from the Tierney article. As a thought experiment, substitute global warming for diet, Al Gore for McGovern, etc.

    After the fat-is-bad theory became popular wisdom, the cascade accelerated in the 1970s when a committee led by Senator George McGovern issued a report advising Americans to lower their risk of heart disease by eating less fat. “McGovern’s staff were virtually unaware of the existence of any scientific controversy,” Mr. Taubes writes, and the committee’s report was written by a nonscientist “relying almost exclusively on a single Harvard nutritionist, Mark Hegsted.”

    That report impressed another nonscientist, Carol Tucker Foreman, an assistant agriculture secretary, who hired Dr. Hegsted to draw up a set of national dietary guidelines. The Department of Agriculture’s advice against eating too much fat was issued in 1980 and would later be incorporated in its “food pyramid.”

    Meanwhile, there still wasn’t good evidence to warrant recommending a low-fat diet for all Americans, as the National Academy of Sciences noted in a report shortly after the U.S.D.A. guidelines were issued. But the report’s authors were promptly excoriated on Capitol Hill and in the news media for denying a danger that had already been proclaimed by the American Heart Association, the McGovern committee and the U.S.D.A.

    The scientists, despite their impressive credentials, were accused of bias because some of them had done research financed by the food industry. And so the informational cascade morphed into what the economist Timur Kuran calls a reputational cascade, in which it becomes a career risk for dissidents to question the popular wisdom.

    With skeptical scientists ostracized, the public debate and research agenda became dominated by the fat-is-bad school. Later the National Institutes of Health would hold a “consensus conference” that concluded there was “no doubt” that low-fat diets “will afford significant protection against coronary heart disease” for every American over the age of 2. …

    But when the theories were tested in clinical trials, the evidence kept turning up negative. As Mr. Taubes notes, the most rigorous meta-analysis of the clinical trials of low-fat diets, published in 2001 by the Cochrane Collaboration, concluded that they had no significant effect on mortality. …

    Mr. Taubes told me he especially admired the iconoclasm of Dr. Edward H. Ahrens Jr., a lipids researcher who spoke out against the McGovern committee’s report. Mr. McGovern subsequently asked him at a hearing to reconcile his skepticism with a survey showing that the low-fat recommendations were endorsed by 92 percent of “the world’s leading doctors.”

    “Senator McGovern, I recognize the disadvantage of being in the minority,” Dr. Ahrens replied. Then he pointed out that most of the doctors in the survey were relying on secondhand knowledge because they didn’t work in this field themselves.

    “This is a matter,” he continued, “of such enormous social, economic and medical importance that it must be evaluated with our eyes completely open. Thus I would hate to see this issue settled by anything that smacks of a Gallup poll.” …

  7. Jonathan – You may be thinking of Graham T. Allison’s “Essence of Decision”, which was quite famous among those who study international relations. It has been decades since I read the book, but I still recall some of his arguments about the effect of the bureaucracies.

    I notice that it is available in a second edition at Amazon.

  8. I think the greatest cascade of all time must have been the “energy crisis” of ’73-’83, wherein a temporary, politically caused interruption in oil supplies rapidly transmuted into a supposedly permanent shortage of all types of energy. By 1980 only a small minority of scientists in many different fields professed to question the entire concept. People took it very seriously, and planners in every part of public and private life began making serious plans based on the concept. It even triggered wars and famines.

    Then in ’83-’84 oil prices collapsed and the entire hysteria just evaporated.
