Tuesday, January 30, 2007

Review: "Discourse, Dissent, and Strategic Surprise"

The Institute for the Study of Diplomacy at Georgetown University recently published a report titled "Discourse, Dissent, and Strategic Surprise."

Although the authors' focus is the federal government - i.e., the communication difficulties experienced by diplomats (especially those overseas) and policymakers in Washington, D.C. - I thought the report had some relevant findings for homeland security professionals at all levels of government.

Generally, the report argues in favor of casting a wider net when seeking intelligence on a given threat - and it harshly condemns the practice of filtering all information through a pre-existing mindset.

We are all given to interpreting new information through the filter of our past experiences, but this report shows how prior failures of the U.S. government teach a clear lesson: keep an open mind, listen, and avoid pretending that we already know what the threat is.

The report focuses on a number of case studies, each involving an unpleasant surprise for the United States. Specifically, the failures to:

  1. Anticipate the 1979 revolution in Iran
  2. Recognize the threat against U.S. embassies prior to the 1998 embassy bombings
  3. Recognize the scale of the Soviet invasion of Afghanistan in 1979
  4. Understand the nature of the conflict in the U.S.-U.S.S.R. "proxy war" in Afghanistan from 1989-1992; specifically, the danger in arming and training the mujahedin
  5. Anticipate the Asian financial crisis of 1997-98

Here are a few highlights of the report, with comments:
Long-standing and systemic tensions in U.S. democracy exist between the need for open discourse and the requirements of a disciplined decision-making process, both of which are needed to govern effectively. Protection of the consensus, however, has the potential to hinder sound policy formulation when professionals are discouraged from presenting informed views simply because they challenge the status quo. When such information—and the people providing it—are excluded from policy discourse, the “marketplace of ideas” ceases to work as an essential corrective to mistaken or flawed assumptions.
Leaders must be especially sensitive to the danger of silencing those at lower levels. "Speaking truth to power" becomes extraordinarily difficult when your career is on the line. Leaders must be open to dissenting views:
It is inherently difficult for subordinates to challenge the prevailing views of their leaders at any time. This is particularly so when leaders are intent on pursuing a course of action, driven by firmly held assumptions about the nature and urgency of a threat.

But nonmonetary—or even monetary—incentives aimed at encouraging independent thought may not be enough to persuade individuals to express candid disagreements with the consensus if by doing so they also are bargaining with their professional survival. The interactions among senior and mid- to low-level professionals in both the intelligence and policy arenas are therefore central factors considered in our study, part of the analysis of how constraints on discourse can emerge and inhibit alternative interpretations of events, sometimes leading to a collective failure to anticipate or understand new threats.
The case of the Iranian revolution shows how the silencing effect can work:
Efforts by low- and mid-ranking analysts to discuss the regime’s failings were treated by most senior officials not just as irrelevant but suspect and soft headed. Just raising such issues could be incendiary, given that some officials believed that discussions of deteriorating conditions in Iran could add to the prospects that the shah would not survive. As such, reporting of bad news was actively discouraged, contrarian analysts found they were not being invited to meetings, and eventually the voices faded away.
In the case of the African embassy bombings, the problem was different. Information about the threat was available, but it was unwelcome:
Failure to heed—or to ask for—reports about conditions on the ground can lead to ignorance or misunderstanding of important political and economic trends that portend new security challenges.
Most ominously, the warning bell was repeatedly sounded by the U.S. ambassador to Kenya. But the warnings were received with hostility:
After it was disclosed that she had arranged to have her letter voicing security concerns hand-delivered to the secretary, [U.S. Ambassador to Kenya, Prudence] Bushnell, for the first time in her long and distinguished Foreign Service career, received a mediocre performance review. Just weeks before the bombings, Bushnell was chided for her excessive preoccupation with security and her “tendency to overload bureaucratic circuits.”
In the case of the 1989-92 "proxy war" in Afghanistan, which was almost entirely a covert operation, the problem was different again: senior policymakers were the ones kept in the dark. They stayed the course because they lacked the information needed to question whether it was the right one:
This case provides a textbook example of how those who have access to information can become the drivers of policy, granting individuals authority that would normally exceed their jurisdiction or level of seniority. Without routine access to information, policymakers will always feel constrained from questioning the prevailing policy because, as one participant put it, “they feel they are missing the information needed to make judgment calls . . . and so they tend to back off.”
In short, the information-sharing system virtually shut down:
Information about covert operations is always highly restricted, but in Afghanistan it extended only to a small number of individuals from Congress, the intelligence community, and a few executive branch officials. One former congressional aide whose senator was not included in the inner circle remembered how difficult it was for him to gain access to information about the situation in Afghanistan, notwithstanding the authority granted to the senator by virtue of his committee assignments. An intelligence official who was part of the operations agreed, adding the observation: “At any given time, in the peak of our involvement in Afghanistan, there were never a hundred Americans at work on the problem. . . . We [U.S. intelligence operatives] provided wide open door access [about events in Afghanistan], but to a very limited number of people. And they were very good about keeping it from everybody else.”
After examining all of the case studies, the report comes to some predictable conclusions:
The instances of surprise we examined in this study are not often the result of missing or faulty intelligence information; they are far more about the way information is interpreted, distributed, and prioritized by senior officials.

Policymakers dismissed warnings when the indicators failed to conform to common conceptions of what constitutes a genuine threat to U.S. “vital” interests.

When a healthy consensus evolves into a “mindset,” the assumptions and beliefs underlying that consensus can become impervious to new information, sometimes blinding leaders to the implications of global trends.
And, in summary, information sharing is vital to recognizing threats:
When there is no routine discourse among top officials and professionals with detailed expertise, the ability even to consider realigning policies in response to breaking events, let alone to understand complex events, is virtually impossible. This factor is critical to understanding the phenomenon of surprise. Undue restrictions on the number and kind of individuals or agencies allowed to contribute to intelligence or policy debates by definition interfere with the government’s ability to assess events reliably.
