Rand Simberg discusses an NYT column by John Tierney that deals with common biases in group decisionmaking.
Tierney writes:
We like to think that people improve their judgment by putting their minds together, and sometimes they do. The studio audience at “Who Wants to Be a Millionaire” usually votes for the right answer. But suppose, instead of the audience members voting silently in unison, they voted out loud one after another. And suppose the first person gets it wrong.
If the second person isn’t sure of the answer, he’s liable to go along with the first person’s guess. By then, even if the third person suspects another answer is right, she’s more liable to go along just because she assumes the first two together know more than she does. Thus begins an “informational cascade” as one person after another assumes that the rest can’t all be wrong.
Because of this effect, groups are surprisingly prone to reach mistaken conclusions even when most of the people started out knowing better, according to the economists Sushil Bikhchandani, David Hirshleifer and Ivo Welch. If, say, 60 percent of a group’s members have been given information pointing them to the right answer (while the rest have information pointing to the wrong answer), there is still about a one-in-three chance that the group will cascade to a mistaken consensus.
Cascades are especially common in medicine as doctors take their cues from others, leading them to overdiagnose some faddish ailments (called bandwagon diseases) and overprescribe certain treatments (like the tonsillectomies once popular for children). Unable to keep up with the volume of research, doctors look for guidance from an expert — or at least someone who sounds confident.
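To make the cascade mechanism concrete, here is a minimal Monte Carlo sketch of the sequential-voting setup Tierney describes. The follow-the-leader rule (copy the running majority once it leads by two or more votes), the 60 percent signal accuracy, and the function names are my own simplifying assumptions for illustration, not the actual Bikhchandani-Hirshleifer-Welch model or anything taken from the column.

```python
import random

def run_cascade(n_voters=100, signal_accuracy=0.6, seed=None):
    """Simulate one sequential vote under a simplified informational-cascade rule.

    Each voter gets a private signal that points to the correct answer with
    probability `signal_accuracy`. Voters announce choices one after another;
    a voter ignores their own signal and copies the running majority once it
    leads by two or more votes (a cascade), otherwise they follow their signal.
    Returns True if the group's final majority is correct.
    """
    rng = random.Random(seed)
    correct_votes, wrong_votes = 0, 0
    for _ in range(n_voters):
        signal_correct = rng.random() < signal_accuracy
        lead = correct_votes - wrong_votes
        if lead >= 2:            # cascade toward the correct answer
            vote_correct = True
        elif lead <= -2:         # cascade toward the wrong answer
            vote_correct = False
        else:                    # no cascade yet: follow own signal
            vote_correct = signal_correct
        if vote_correct:
            correct_votes += 1
        else:
            wrong_votes += 1
    return correct_votes > wrong_votes

def estimate_wrong_consensus_rate(trials=10000, **kwargs):
    """Monte Carlo estimate of how often the group settles on the wrong answer."""
    wrong = sum(not run_cascade(seed=i, **kwargs) for i in range(trials))
    return wrong / trials

if __name__ == "__main__":
    rate = estimate_wrong_consensus_rate(signal_accuracy=0.6)
    print(f"Estimated chance of a mistaken consensus: {rate:.2%}")
```

Playing with the signal accuracy shows the point of the passage above: even a fairly well-informed group can lock onto the wrong answer once an early run of bad guesses gets copied.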
Physicians and attorneys stand out in our society as professionals who routinely make high-stakes decisions yet are not systematically educated about decisionmaking biases and how to avoid them. For this reason alone, anyone who consults a physician or attorney on a serious matter should be accompanied by relatives or friends of known good judgment. Additional educated heads are always valuable when you are evaluating expert advice, because you have to tease the expert’s assessment of the facts (what he is expert about) apart from his evaluation of probabilities and of the patient’s or client’s preferences (what he is probably not expert about), so that you can get the best assessment of each. In practice, this means asking the expert enough probing questions to determine exactly how he is evaluating the odds of various outcomes, which factors he is considering in making his recommendations, and how much weight he is assigning to each factor.
If you are in such situations more than a couple of times, you will probably notice that some experts, if you let them, will make high-stakes recommendations based on wildly mistaken assumptions about probabilities and/or about the preferences of the patient or client. (Examples: A physician recommends against colonoscopy based on a statistical risk of bowel perforation that turns out, on investigation, to be overstated by an order of magnitude. Another physician recommends major surgery instead of a more limited procedure that will have fewer and less-severe side effects. He also, as we learn by asking around, understates the severity of the side effects. When questioned, he explains that the major procedure will obviate the need for routine tests that would otherwise be necessary on a regular but infrequent basis after surgery. The patient, however, doesn’t mind at all the prospect of regular tests but is adamant that he wants to avoid the side effects of major surgery.)
If you are on a jury, or in a business meeting or other group where you have to reach a group decision, you can guard against decision cascades by stipulating that group members will not state their opinions until after substantial discussion has occurred about facts and alternative courses of action, and that when the time comes to decide group members will vote by simultaneous secret ballot. However, to implement such measures you must control the agenda, which is not possible in many of these situations, and especially not in the context of public controversies such as global warming, health and diet, etc.
---
*This is a pretty good popular treatment of decisionmaking pitfalls and how to avoid them. There are numerous similar books. I’ve sent copies of this book to a couple of people, one of them an excellent physician who treated someone I know. As far as I know, neither recipient read it, which is probably what you’d expect, given that the people who can most benefit from the ideas in the book tend to be 1) busy, 2) unaware of the pervasiveness of decisionmaking biases and 3) overconfident of their own decisionmaking ability.
UPDATE: See this column by Arnold Kling, which deals with some of the same issues as this post.
Peter Drucker quoted Alfred Sloan, who ran GM for many years, roughly as follows:
“I take it we are all in agreement on the decision here?”
(nods all around)
“Then we had better postpone the decision and give ourselves time to generate some disagreement.”
In his book Bluewater Sailor, Don Sheppard offers a very interesting example of the social and organizational factors that affect decisionmaking in the real world, in this case aboard a US Navy destroyer. Excerpt here.
The Sheppard post is excellent.
There is a book about the Cuban Missile Crisis as a study in decisionmaking. I don’t remember the title, and what I read of it I read long ago. However, I remember that it reviewed very favorably the decisionmaking process within Kennedy’s cabinet: it showed how the participants went out of their way to keep gathering information, and to avoid forming opinions or making decisions, for as long as possible in their (successful) efforts to frame the critical issues. They handled the decisionmaking process very differently than Sheppard’s captain did.
The Sheppard post is very good. There are no good answers to the situation he faced.
Jonathan,
There is a book about the Cuban Missile Crisis as a study in decisionmaking.
That’s funny because the Bay of Pigs decision-making process is considered a textbook example of groupthink.
Live and learn, I suppose.
I think many experts find themselves latching onto some claim, with others then falling into line because of the intense social pressure to give an answer.
In some sense, scientists are the priests of our society. They interpret the signs of nature and tell the rest of us what to do. We don’t like to hear them say, “I don’t know.” If someone does say, “I don’t know, and no one else knows either,” we tend to pass them over for someone else. We definitely don’t want to hear things like “we don’t know what the optimal diet is” or “we don’t know how much CO2 the biosphere can absorb.” Scientists who don’t give answers about major problems get marginalized.
The following quote is from the Tierney article. As a thought experiment, substitute global warming for diet, Algore for McGovern, etc.
Jonathan – You may be thinking of Graham T. Allison’s “Essence of Decision”, which was quite famous among those who study international relations. It has been decades since I read the book, but I still recall some of his arguments about the effect of the bureaucracies.
I notice that it is available in a second edition at Amazon.
Thanks, Jim. That may be the book. I have a copy somewhere and should dig it up. It is well worth reading.
I think the greatest cascade of all time must have been the “energy crisis” of ’73-’83, wherein a temporary interruption in oil supplies, caused by politics, rapidly transmuted into a perceived permanent shortage of all types of energy. By 1980 only a small minority of scientists in many different fields professed to question the entire concept. People took it very seriously, and planners in every part of public and private life began making serious plans based on the concept. It even triggered wars and famines.
Then in ’83-’84 oil prices collapsed and the entire hysteria just evaporated.