Monday morning quarterbacks are already criticizing the way the American government has exposed thousands of Afghan interpreters and other allies to Taliban reprisals. I join this chorus by looking at the decision-making processes that led to the current chaos in Afghanistan.
To choose well, we must attend to judgments in three categories of thought:
- Facts
- Goals, Values, and Interests
- Future events
First, we must have an accurate understanding of the facts based on a proper assessment of the available evidence, which depends, in part, on logic and, in part, on having correct theories about how the world works (a/k/a science). Second, we must spell out (i) our goals, (ii) our value priorities, and (iii) our key interests. Stated differently, we must know what we want to accomplish and why achieving those goals is important to us. Third, we must accurately predict the likely consequences of choosing and acting upon each available option.[1]
Failure to attend to these three components of good decision-making leads to poor decisions. A satisfactory outcome will be more the result of chance than of conscious choice and careful execution.
Those having the responsibility to make a significant decision should first make sure they consider the relevant facts, articulate the goals and interests, and predict the future. Second, decision makers should guard against cognitive biases, sloppy intelligence, logical errors, and other mistakes that can infect and distort judgment. Because our judgment can easily be defective, we should bring a healthy humility to major decisions, keeping in mind that we could be wrong.
The decision-making process I have sketched here and the problems that can arise are illustrated by the recent decisions made within the Biden administration related to the withdrawal from Afghanistan, as reported by Michael D. Shear, David E. Sanger, Helene Cooper, Eric Schmitt, Julian E. Barnes, and Lara Jakes in the New York Times.[2]
“First, Pentagon officials said they could pull out the remaining 3,500 American troops, almost all deployed at Bagram Air Base, by July 4 — two months earlier than the Sept. 11 deadline Mr. Biden had set. The plan would mean closing the airfield that was the American military hub in Afghanistan, but Defense Department officials did not want a dwindling, vulnerable force and the risks of service members dying in a war declared lost.”
The final clause of the second sentence states a goal and an interest (the second element of decision making): avoid injury and death to American military personnel. The Times story does not report whether other objectives and values were considered. Nor does it say what facts (intelligence) and probability assessments about future events informed this decision.
“Second, State Department officials said they would keep the American Embassy open, with more than 1,400 remaining Americans protected by 650 Marines and soldiers. An intelligence assessment presented at the meeting estimated that Afghan forces could hold off the Taliban for one to two years.”
The objective here is to keep the embassy open. The estimate of future events turned out to be wildly mistaken. The second aspect of decision making, soundness of judgment, comes into view. We don’t know what evidence and cause-and-effect relationships the intelligence experts considered. Was their judgment affected by wishful thinking, overconfidence bias, confirmation bias, or one of the many other cognitive impediments that can distort judgment? Did they make logical errors? We won’t know until a full study is released. But already armchair critics say that the intelligence experts should have known their judgment could be wrong. Their worst case should have included the possibility of what, in fact, came to pass.
“[N]o one raised, let alone imagined, what the United States would do if the Taliban gained control of access to [the civilian airport in Kabul], the only safe way in and out of the country once Bagram closed.”
Why not? Why did no one raise the possibility of a total collapse of the Afghan government with Taliban control of access to the airport as one consequence? Was Groupthink at work here? Did some in the room stay mum about misgivings or uncertainty because higher-ranking people clearly had a preference for the decision they made? Did political considerations affect judgment about facts and probabilities? Critics are now saying (and the Times article reports) that various groups had been lobbying for months to extract interpreters and other allies before withdrawing troops.[3] Did these groups predict something that intelligence experts did not allow themselves to imagine?
Was anyone red teaming this decision?
We don’t yet know. Preliminary information suggests not.
A red team is charged with coming up with everything the main team doesn’t want to hear.
A red team not only plays devil’s advocate but actively tries to penetrate the main team’s defenses or to defeat its plan. A similar team established by President Kennedy at the outset of the Cuban Missile Crisis is widely thought to have helped us avoid a nuclear war.
The Times article lists a series of misjudgments that affected the major decisions made in connection with ending an American military presence in Afghanistan. Those in command believed they had “the luxury of time.” They overestimated the ability of Afghan forces to resist the Taliban. They underestimated the impact American withdrawal would have on the Afghan forces’ will to fight. The latter two misjudgments indicate that the group’s thinking may have been affected by the overconfidence and confirmation biases.
But, despite General Milley’s conclusion in April that the Afghan forces were “reasonably well equipped, reasonably well trained, reasonably well led,” the general would not say whether they could stand on their own without support from the US. “We frankly don’t know yet,” he said. “We have to wait and see how things develop over the summer.”
Whoa! Red flag. Red flag. Back in the room, people reportedly were not considering the possibility of a total collapse. But General Milley openly raised that possibility in response to the press. And “[t]he president’s top intelligence officers echoed that uncertainty, privately offering concerns about the Afghan abilities. But they still predicted that a complete Taliban takeover was not likely for at least 18 months.”
Something in the decision-making process is not computing here. Yes, the best judgment of the intelligence people — their conclusion — said the Taliban would not take over for 18 months. But what else were intelligence experts saying? Were lower-level analysts talking about a possible rapid collapse? Did that analysis get excluded from the final report that went to General Milley and the President?
Filtering contrary thinking out of reports happens in government agencies, political organizations, and corporations on a daily basis. It is analogous to Groupthink but, in some ways, more pernicious. What we might call the Censorship Effect deprives the ultimate decision maker (the President, in this case) of information and perspectives he or she would likely want and should have.
The Censorship Effect may have been at work here. “One senior administration official, discussing classified intelligence information that had been presented to Mr. Biden, said there was no sense that the Taliban were on the march.” If the intelligence agencies truly did not know that the Taliban were increasingly gaining control, as we now know they were, then this would be a major failure of our intelligence capacities in Afghanistan, something that would warrant a Congressional investigation on its own.
The Times article tells us that the pleas of Congressional Representatives, Senators, and groups representing the interests of Afghan allies went unheeded because doing what they wanted “would have meant launching a dangerous new military mission that would probably require a surge of troops just at the moment that Mr. Biden had announced the opposite.” This conclusion suggests that Groupthink writ large was, in fact, affecting the thinking of those charged with developing a plan for evacuating Americans and allies. Possible consequences seem to have been excluded because considering them was not politically palatable.
Did they consider alternatives to a troop surge? Is physical force the only option? Did anyone suggest negotiating with the Taliban leaders in Doha for an agreement that would have assured the US ability to evacuate Americans and allies in the event of a Taliban takeover? It was and is in the Taliban’s interest to resolve this issue in favor of a safe evacuation. Were such negotiations attempted? Are they going on now?
Some, like Lawrence O’Donnell of MSNBC, argue that the chaos we have witnessed over the past 10 days is an unavoidable consequence of losing a war in a foreign country, something the US has repeatedly experienced since Vietnam. However true that may be generally, it does not account for or excuse mistakes in decision making that leaders could have avoided with good procedures.
Is the Afghan withdrawal President Biden’s Bay of Pigs? The analogy may be inapposite in some respects. But from the perspective of good decision making, a few parallels come into view. Both Kennedy and Biden inherited a flawed plan from the previous President. Both failed to change course when reasons arose for doubting the wisdom of the previous decision. Neither put a robust decision-making process (with bias mitigation strategies and red teaming) in place before making and executing the decision. Kennedy did not make that mistake twice.
One could argue — I could argue — that there is no excuse for people at the highest levels of government, business, and finance continuing to make decision errors over 70 years after we learned how to avoid them. The lessons of decision-making science are abundant and widely available. I am not covering new ground here. These lessons should be included in basic courses in college, business schools, law schools, and medical schools — perhaps even high school.
But each generation of government agencies, corporations, nonprofits, schools, churches, and families must learn the basics as if for the first time. First, attend deliberately to the three elements of good decisions: facts, objectives/interests, and future events. Second, do everything you can to avoid distorted judgments. Third, be humble. As Daniel Robinson has said, “We could be wrong. We could be grievously wrong. We could be hopelessly and irretrievably wrong.”
[1] A previous version of this model was presented in Herbert Simon, Administrative Behavior: A Study of Decision-Making Processes in Administrative Organizations (4th ed., Free Press, 1997; orig. 1947). The title of this story, “Mistakes Were Made,” has become part of Washington Speak, going back to the Nixon era and possibly before that. No one made the mistakes; somehow they just happened to occur. It’s a mystery.
[2] “Miscue After Miscue, U.S. Exit Plan Unravels,” New York Times (Aug. 21, 2021). Unless otherwise stated, quotations are from this article.
[3] “On May 6, representatives from several of the United States’ largest refugee groups, including Human Rights First, the International Refugee Assistance Project, No One Left Behind, and the Lutheran Immigration and Refugee Service logged onto Zoom for a call with National Security Council staff members. The groups pleaded with the White House officials for a mass evacuation of Afghans and urged them not to rely on a backlogged special visa program that could keep Afghans waiting for months or years.”