By Marguerite Rigoglioso
STANFORD GRADUATE SCHOOL OF BUSINESS—In the months prior to the Sept. 11, 2001, attacks, an FBI agent fired off an angry email to an FBI analyst who was blocking him from following up on an important terrorist lead. “Someday someone will die,” the frustrated agent wrote, “and … the public will not understand why we were not more effective and throwing every resource we had at certain ‘problems.’”
Thousands of deaths, five years, and an uncomprehending public later, the U.S. government and intelligence community are still trying to figure out just what went so horribly wrong as to result in one of the most tragic breaches of national security of all time. In 2004, the 9/11 Commission Report shone the first harsh light on the ugly realities and endemic problems that led to the disaster—its findings reaching from the Oval Office and Congress to FBI headquarters and field offices.
“Prior to 9/11, intelligence information was often stalled, stovepiped, withheld, distorted, or simply ignored,” explains Roderick M. Kramer in a paper published in the International Public Management Journal. “The summer of 2001 became a summer of missed opportunities.”
But it was not a matter of bad and lazy people simply not doing their jobs, says Kramer, the William R. Kimball Professor of Organizational Behavior at the Stanford Graduate School of Business. Drawing on organizational insights, he argues that the failures of interagency cooperation described in the commission report are the same kinds of breakdowns observed in other large and complex bureaucracies.
Typical pitfalls in any large organization, explains Kramer, include compartmentalizing or hoarding information, avoiding political hot potato issues, engaging in turf wars, and passing off responsibility to others. Not only did the CIA and FBI fall right into all of these pits and more in the years, months, and days leading up to the terrorist attacks in New York and Washington, but so did related agencies—and even the nation’s legislature. Cooperative regimes are simply difficult to sustain, says Kramer, in large part because of the competition among groups and individuals for status and other highly coveted but scarce resources.
Often in bureaucracies a particular system is set up with good intentions but then devolves into its own evil twin. Take the trajectory of what came to be known as “the Wall” in the intelligence community. Following the 1994 prosecution of CIA veteran Aldrich Ames for espionage, the informal sharing between the Department of Justice and the FBI of information on foreign powers and their agents was disrupted over concerns about the legality of such arrangements. As a result, new guidelines were established that were intended to improve information sharing—but in reality were misunderstood and led to the institutionalized subversion of communication.
These procedures came to be known as “the Wall” and led to, as Kramer puts it, “the left hand not knowing what the right hand was doing, or more seriously, what the right hand wasn’t doing” within and among intelligence agencies. Power dynamics—also typical of large organizations—emerged, resulting in a tendency among the various agencies to hoard critical information in order to position themselves as “informational gatekeepers.” Even agents within the FBI were impeded in their attempts to cooperate with other FBI agents. In the CIA, what started out as “healthy paranoia” about counter-espionage degenerated into an almost obsessive tendency to mark information as classified—and therefore off limits to those who sometimes might need it to do their job.
While discovering and averting terrorists’ plans requires the ability to “connect the dots,” as the commission report observed, intelligence organizations had evolved mechanisms to keep the dots isolated. Foreign intelligence agencies were busy looking for threats abroad, while domestic agencies were looking at local targets. No one was looking for a foreign threat to a domestic target. And that was just the opening international terrorists needed.
Kramer notes that in bureaucracies, where problems are huge and intractable, most people find little value in taking on responsibility for difficult issues, and in Washington in 2001, terrorism was one such hot potato. Congress passed off work on this question to others, and even an organization like the National Security Agency didn’t think it should investigate certain individuals who had been identified as part of a possible terrorist cadre. In organizational parlance, the hero interested in taking on terrorism was missing.
“What have we learned as a result of all of this analysis?” asks Kramer. “Are we safer than we were before?” Yes and no, he concludes ambivalently. On the one hand, lessons of the past tend to be lost or distorted over time; on the other, the autopsy of 9/11 will be going on well into the future. “As was the case with the Cuban missile crisis,” says Kramer, “the revelations will no doubt lead to a much deeper and richer understanding of how a confluence of institutional mechanisms and behavioral tendencies can leave the door open to disaster. I do think that decision makers can learn from such analysis.”
“A Failure to Communicate: 9/11 and the Tragedy of the Informational Commons,” R.M. Kramer, International Public Management Journal, Vol. 8, No. 3, 2005.
Predictable Surprises, M.H. Bazerman and M.D. Watkins, Harvard Business School Press, 2004.
Unleashing Change: A Study of Organizational Renewal in Government, S. Kelman, Brookings Institution Press, 2005.
“When Paranoia Makes Sense,” R.M. Kramer, Harvard Business Review, Vol. 80, No. 7, 2002.
The Limits of Safety: Organizations, Accidents, and Nuclear Weapons, S.D. Sagan, Princeton University Press, 1993.