***

MENTAL ERRORS AND FOREIGN POLICY FAILURES

***

By Steve Yetiv

***

The Montréal Review, May 2015

***

"National Security through a Cockeyed Lens: How Cognitive Bias Impacts U.S. Foreign Policy" by Steve Yetiv (Johns Hopkins University Press, 2013)

***

"Anyone who has been involved with policy making and execution...has experienced many of the cognitive biases and the personal and organizational dynamics discussed in Steve Yetiv's book... There are many times I wish I had had a book like this to wave around and point to, or quietly place in front of people to suggest they pause and think about their thinking, and I am sure that in the future I will use it as a reference."

--Frédéric Ruiz-Ramón, Perscitus International LLC; former U.S. Department of Defense official

***

Why did the United States underestimate the challenges it would face after its invasion of Iraq in 2003, which helped spawn groups like the “Islamic State”? Why was the 1961 Bay of Pigs invasion in Cuba such a fiasco? Shouldn’t Israel have picked up on the many signs of an impending Arab attack in 1973?

We usually explain foreign policy failures by pointing to things like bad information, black swan events that no one could predict, political ideology, ineffective allies, poor bureaucratic coordination, special interests, and inexperienced leadership. By contrast, we overlook or downplay cognitive biases, or mental errors in how we see ourselves and the world, such as overconfidence; short-term thinking; seeing what we expect to see; demonizing our adversaries; or focusing excessively on one factor to the exclusion of others in making our decisions.

Yet over 50 years of psychological research underscores that such cognitive biases strongly shape decisionmaking, which raises a crucial question for foreign policy: If cognitive biases are pervasive and can be so damaging, how can they be diminished? Doing so is hard, but research suggests several techniques that may help, and that can improve decisionmaking in general.

For starters, it’s a good idea for leaders and their assistants to list options for dealing with problems and systematically weigh their costs and benefits in light of the country’s national goals. This may or may not be obvious, but it is often neglected. Consider a dramatic example: the U.S.-led invasion of Iraq in 2003. There appears to have been no group meeting at which the final decision to go to war was considered, much less weighed carefully, though President George W. Bush may have done so on his own. By contrast, President John F. Kennedy learned from his Bay of Pigs fiasco to use careful decisionmaking procedures during the Cuban Missile Crisis of 1962. Doing so probably helped diminish the type of overconfidence that appeared to bedevil the Bush team regarding Iraq in 2003, and helped the world avoid a nuclear war.

Believe it or not, studies also suggest that simply encouraging people to “consider the opposite” of whatever decision they are about to make can lessen biases such as overconfidence or confirmation bias. This may take the form of playing devil’s advocate or of openly examining each argument so that no one view goes unquestioned. Powerful leaders should even skip some decisionmaking meetings to allow a freer flow of ideas. Kennedy employed such an approach in the Cuban Missile Crisis to good effect.

Studies also underscore what many of us may well surmise: it’s useful to step into the shoes of others, so as to avoid assuming that they understand us and vice versa. One cognitive bias is to assume that if our intentions are benign, others will recognize them as such, but that is often false. For example, America seeks to protect Persian Gulf oil and assumes that others tend to understand this motivation, yet polls reveal that many mistakenly see America as seeking to steal Middle East oil and undermine Muslims.

Stepping into the shoes of others can also help avoid what is called the fundamental attribution error, one of the most common cognitive failures in conflict situations. Even when the behavior of others is clearly driven by their circumstances (no one is to blame) and not by their personal characteristics, we sometimes still blame them personally, a hate-generating tendency that is amplified when we are unsure of what caused their behavior. Forcing ourselves to step into their shoes may help weaken this dynamic.

Beyond such approaches, it is important to widen the circle of input. Leaders should consult outside experts and the reports of underlings, which they sometimes fail to do. Studies show that insulated decisionmaking groups are less informed and more prone to cognitive biases like groupthink, with ruinous consequences.

Cognitive biases are ingrained and hard to eliminate, but we need to understand them better—not only to explain foreign policy fiascos, but also to try to diminish the role they play. That's certainly a worthwhile effort because human welfare depends on good decisions.

***

Steve Yetiv is a political science professor at Old Dominion University and the author of National Security through a Cockeyed Lens: How Cognitive Bias Impacts U.S. Foreign Policy (Johns Hopkins University Press, 2013), The Petroleum Triangle (Cornell University Press, 2011), Crude Awakenings (Cornell University Press, 2011), and The Absence of Grand Strategy (Johns Hopkins University Press, 2008). His most recent book is Myths of the Oil Boom: American National Security and the Global Energy Market (Oxford University Press, 2015).
Steve Yetiv can be contacted at: http://www.odu.edu/~syetiv/

***
