Via Isegoria, here is an interview with James Sterrett, who is deputy chief of simulation/wargaming for the Command & General Staff College at Fort Leavenworth.
The issue of knowledge transfer between simulations and the real world is important not only in the military, but also in business and aviation…and surely many other areas as well.
Sterrett notes that in simulations:
First, we usually have far better knowledge of the situation than is possible for real armies; consider that one of the key pieces of information from ULTRA decrypts was the Axis order of battle in various theaters. Simply knowing what units the Axis had was a major intelligence coup, yet such information is routinely handed to players. Moreover, the scenario usually tells us what the friendly and enemy win conditions are, while those are often less clear in real life.
Second, in nearly every game, our forces do exactly what we tell them to do, exactly when we tell them to do it. In the real world, subordinate forces need time to conduct their own planning so they can carry out our orders, and they may not go about the task exactly as we envisioned…
Third, gamers are usually planning by themselves, which means they have to explain everything only to themselves and to the game. Military staffs deal with more information than one person can process; even a battalion staff is likely to be several dozen people. Getting this many people to pass information among themselves efficiently, let alone coming up with a coherent plan that everybody understands, requires practice.
The interview reminds me of a passage in Don Sheppard’s book Bluewater Sailor, which I wrote about several years ago…
When a decision is made in an organizational context (as opposed to a decision by an entirely autonomous individual), additional layers of complexity and emotion come into play. The person who must make the decision is often not the person who has the information/expertise on which the decision must be based. Indeed, the information and expertise are often distributed across multiple individuals. These individuals may have their own objectives and motivations, which may differ from the objectives and motivations of the formal decision-maker, and which may conflict with each other. And the making of the decision may alter power relationships within the organization, as well as influencing the phenomena about which the decision is ostensibly being made.
The above factors are illustrated with crystalline clarity in the story of a seemingly very simple decision, which had to be made onboard a U.S. Navy destroyer sometime during the 1950s.
Don Sheppard was the newly-appointed Engineering Officer of the USS Henshaw, with responsibility for its 60,000-horsepower turbine plant. But his knowledge of propulsion equipment came entirely from study at the navy’s Engineering Officer School. Reporting to Sheppard was the “Chief,” an enlisted man with no theoretical training but with twenty years of experience in the practical operation of naval power plants. When Sheppard assumed his new duties, the Chief’s greeting “bordered on rudeness.” The man clearly believed that engineering officers might come and go, but that he, the Chief, was the one who really ran things, who was the “Prince of the Plant.”
During maneuvers off the Pacific coast, a bizarre accident resulted in the Henshaw dropping a depth charge which exploded very close to its own stern. The shockwave was enough to knock down men who were standing on deck. Sheppard asked the Chief if he thought the plant might have suffered any damage:
He furrowed his brow, glaring at me. “Damage, sir? We’d know about any major damage by now if the plant suffered. I don’t think we got any problems, sir,” he answered–patronizingly–in a civil enough tone, but barely so. Who was I, an interloper, to dare question the Prince of the Plant?
But Sheppard remembered a movie he had seen in Engineering Officer School: it suggested that a shock like the one Henshaw had just experienced might have damaged the stern tube packing and the bearings through which the drive shafts ran. He mentioned this concern to the Chief, who discounted it with considerable sarcasm. “Maybe in some of them fancy movies it happened that way, sir, but nothin’s wrong here.”
Sheppard went to see the captain, and reported his concern about the possible damage. The spring bearings could not be easily checked with the ship underway. The decision that had to be made was this: to check and possibly replace the bearings while at anchor, or to sail with the flotilla. The flotilla comprised eight destroyers, and the commodore was looking forward to having them all sail into Tokyo Bay together. Furthermore, if Henshaw didn’t sail with the group, they would miss the rendezvous with the refueling tanker, and would have to refuel at an unpleasant place called Dutch Harbor. But if they did sail and the bearings failed, they would have to be replaced while underway–a difficult and possibly dangerous task.
Legally and formally, the decision was the captain’s. But he knew little about the propulsion plant: it is doubtful that he really understood what the spring bearings actually were. He had to depend on the opinions of his subordinates.
He asked the advice of those assembled for the conference. The Executive Officer said “sail.” The Chief recommended, “sail.” Now the captain turned to his Engineering Officer and asked very formally: “Your opinion, Mr Sheppard?”
What a dilemma the captain was in. Here, a junior officer with six days’ experience as a chief engineer is obviously wanting to pull out of the squadron sail and check all the spring bearings, in direct contradiction to a professional, well-experienced engineering chief who’d been doing the job for twenty years.
If the captain said yes to the inspection and we missed the squadron sail, he’d look bad. He’d look even worse if he suspected they might be bad and they were, and they failed at sea in rough weather. He’d still be left behind and another ship would have to be used as an escort. The commodore had his dream set on his full squadron of eight destroyers steaming proudly into Tokyo Bay. It hadn’t happened in a long time.
If I said we should inspect the spring bearings and the captain agreed with me, and the bearings were bad, it would injure the chief’s pride and his position in the engineering department. A wise-ass ensign would have shown him up, thereby throwing into question his professional ability.
If I said don’t sail and the bearings checked out okay, it would reinforce the opinion that officers stick together no matter how stupid the officers’ actions might be.
If I said don’t sail before a bearings check and we sailed anyway and the bearings failed, the captain’s competence would be called into question by the crew. He would have been wrong, and the word gets around the fleet mighty fast.
On the other hand, if I said we should sail, thereby taking a chance of a failure, and the bearings were okay, it would just show my inexperience and that I didn’t really know what was going on. After all, I had been a chief engineer for only six days. There would be little harm done.
Who is the real decision-maker in this scenario? The captain has the formal authority, but little relevant knowledge, either practical or theoretical. The Chief has the practical experience, but no theoretical training, and lacks the authority of officer rank. Sheppard has formal authority for the plant, together with theoretical training, but almost no practical experience.
Most likely, the true decision-maker is Sheppard. From the dynamics of the situation, I suspect that the captain would have done whatever he advised.
“Sail, Captain, I think they’ll be okay,” I answered, as the ship whispered to me that I was wrong.
As the ship whispered to him that he was wrong.
Henshaw sailed with the flotilla, and almost immediately came the report that Number 3 spring bearing was running hot. The starboard engine was stopped, and sailors began the arduous task of replacing the bearing. This involved sliding jacks under the shaft and lifting it up a few centimeters, then sliding out the 80-pound bearing and sliding a new one in. This had to be done as the ship pitched and rolled, while standing in icy bilge water. The task wasn’t complete when the report came that another bearing had failed–this time, the Number 2 bearing on the port engine. That engine had to be stopped also, and Henshaw was taken in tow by another ship of the flotilla. Sheppard pitched in with the work, and had his hand badly cut by protruding metal slivers. Others were hurt more seriously; one man had his right hand badly injured when Number 2 bearing broke loose, smashing his hand against the bearing foundation.
Glassy eyed from the painkillers…Smallwood held onto the throttle board, trying to keep his attention on the gauges. His head nodded. Chief Maclin sent him to his bunk. “I’m sorry, Smallwood,” he said, helping him up the ladder. “Goddamn, I’m really sorry.”
Chief Maclin turned to me, wiping a tear from his eyes, and without word or expression offered his greasy, bloody hand.
After everything was under control, the captain called Sheppard to his cabin for a debriefing on what had happened. First, he criticized himself for the mishap that had led to the initial problem, the accident with the depth charge. Second, he criticized himself for not listening more seriously to Sheppard’s initial concerns about the bearings. But he also had something else to say:
“And third, Don, you, you’re a direct contributor.” My face dropped. I thought I was a hero. “If you thought you were right–and you did think you were right–you should have put up more opposition, not roll over dead because of the obvious resistance of the three of us. I think, Don, that’s the greatest lesson for you to learn in this whole thing.”
The kind of political analysis that Sheppard conducted before making his recommendation–what will be the effect of this alternative on my relationship with the Chief?…what will be the effect on the Chief’s image with his own subordinates?–is made every day by people in organizations, and must be made, given the realities of organizational life.
But while considering the political dynamics–don’t forget to listen to the ship.
Simulation-based training, like classroom-based training, can be of great value, but there are important aspects of decision-making that it cannot readily encapsulate.
The Iowa disaster involved a senior master chief who was a favorite of the captain. Wiki actually does have the story:
In January 1989 Iowa’s Master Chief Fire Controlman, Stephen Skelley, and Gunnery Officer, Lieutenant Commander Kenneth Michael Costigan, persuaded Moosally to allow them to experiment with increasing the range of the main guns using “supercharged” powder bags and specially designed shells. Moosally was led to believe, falsely, that top officials from Naval Sea Systems Command (NAVSEA) had authorized the experiments. In fact, John McEachren, a mid-level bureaucrat with NAVSEA, had given the go-ahead to conduct the experiments even though he had no authority to do so. McEachren concealed his approval of the gunnery experiments from his superiors.[12]
Several of the officers and non-commissioned officers in charge of the main gun turret crews believed that Skelley’s and Costigan’s proposed experiments were dangerous, especially because of the age of, and numerous maintenance problems with, the main guns and gun turrets. Meyer complained to Commander Robert John Kissinger, Iowa’s chief weapons officer, about the proposed experiments, but Kissinger refused to convey the concerns to Moosally or halt the experiments.[13]
On 20 January 1989, off Vieques Island, Iowa’s Turret One fired six of the experimental shells using the supercharged powder bags. Skelley claimed that one of the 16 inch shells traveled 23.4 nautical miles (40 km), setting a record for the longest conventional 16 inch (406.4mm) shell ever fired. Although the shells had been fired without serious incident, Meyer and Petty Officer First Class Dale Eugene Mortensen, gun chief for Turret One, told Skelley that they would no longer participate in his experiments. Skelley asked Turret Two’s gun chief, Senior Chief Reggie Ziegler, if he could use Turret Two for his experiments; Ziegler refused. Skelley then asked Lieutenant Phil Buch, Turret Two’s officer in charge, and Buch acquiesced.
The Iowa is now in Los Angeles harbor in San Pedro, and I toured it with my son and grandson. Turret #2 is not open to the public.
“Simulation-based training, like classroom-based training, can be of great value, but there are important aspects of decision-making that it cannot readily encapsulate.”
Proper simulations can encapsulate the decision-making process. They can’t really encapsulate the full gravitas of error, but they can certainly put one in the position described, and apply pressure for a decision. More critically, the decision’s worst-case scenario can become “true”, laying the sense of responsibility at the feet of the decision-maker, which is a large, if not critically important, part of the whole decision tree.
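To put a number on that: below is a minimal sketch of the decision tree at work in the Henshaw story. The probabilities and costs are entirely hypothetical assumptions chosen for illustration, not figures from the book; the point is that a simulation which actually realizes the bad branch, rather than merely averaging over it, is what lays the responsibility at the player’s feet.

```python
# Minimal decision-tree sketch of the Henshaw "inspect vs. sail" choice.
# All numbers are hypothetical, chosen only to illustrate the structure.

P_DAMAGED = 0.3        # assumed chance the depth charge damaged the bearings
COST_INSPECT = 1.0     # miss the squadron sail; refuel at Dutch Harbor
COST_SAIL_OK = 0.0     # sail and the bearings hold
COST_SAIL_FAIL = 10.0  # sail and the bearings fail: dangerous repairs underway

def expected_cost(decision: str) -> float:
    """Expected cost of one branch of the decision tree."""
    if decision == "inspect":
        # Inspecting costs the same whether the bearings turn out good or bad.
        return COST_INSPECT
    if decision == "sail":
        # Weight each outcome by its assumed probability.
        return P_DAMAGED * COST_SAIL_FAIL + (1 - P_DAMAGED) * COST_SAIL_OK
    raise ValueError(f"unknown decision: {decision!r}")

for choice in ("inspect", "sail"):
    print(f"{choice}: expected cost = {expected_cost(choice):.1f}")
```

With these assumed numbers, inspecting has the lower expected cost (1.0 against 3.0), yet every organizational pressure Sheppard describes pushed toward “sail.” A good simulation can confront the player with exactly that gap.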
If you want to understand how decisions are made in modern war, read Bing West’s “The Strongest Tribe”.
Pay particular attention to the role of the lawyers, the ROE, and the catch-and-release policy towards insurgents.
That’s the reality of modern war.
I meant to also link my post On Trusting Experts–And which Experts to Trust:
http://photoncourier.blogspot.com/2006_06_01_photoncourier_archive.html#115004341046075924
…which is also a story about decision-making in an organizational context.
Having spent a couple of years perfecting a nuclear power plant simulator for Asia and a decade on the emergency response team at a domestic reactor, AND having worked full time since Fukushima on improvements to nukes based on lessons learned there, I agree.
We can model most hardware phenomena pretty well but the exercise of command and control – and diagnosis – is more difficult.
During the Fukushima event, the prime minister called the control room and started to give orders to the plant manager. He was ignored, thankfully.
Imagine Joe Biden doing the same in the US! He would be hung up on, and our people at the plants know it. The professionals at the US Nuclear Regulatory Commission know that their statutory authority does NOT extend to real-time management of events at a nuclear site. They’ll hang you if you blow it, but that’s after the fact; they won’t give you a direct order.