Disaster by Management: Managerialism and Normal Accident Theory in the Columbia Accident and FBI Headquarters' Response to Field Office Concerns before 9/11
Presentation to the 29th Annual Natural Hazards Research and Applications Workshop, Boulder, CO, July 2004
Christine M. Rodrigue
Department of Geography
California State University, Long Beach, CA 90840-1101
1 (562) 985-4895
rodrigue@csulb.edu
https://home.csulb.edu/~rodrigue/
ABSTRACT
A key element in contemporary technological accidents and in the September
11th terrorist incidents is the interaction of risk assessment and risk
management in complex public organizations. Risk assessment communication
moves along the spokes of an organizational hierarchy toward socially and
spatially more concentrated hubs of decision-making, each of which makes risk
management decisions in politicized contexts peculiar to its own scale. These
contexts affect the outcome of a given risk assessment communication: Is the
risk managed by an active and effective decision-maker at that level? Is the
communication passed along yet another spoke to still another hub in search of
an effective decision-maker at a more influential level? Or is the risk
assessment suppressed with either no decision taken to alter risk or with
sanctions applied to the messengers of risk?
This paper compares the structure of human errors in two disasters with sociogenic causes: the Columbia Shuttle accident and the FBI failure to act on intelligence presaging the 9/11 terrorist attack. In each case, technical information suggesting disaster was weakly transmitted within an elaborate bureaucracy, and high-level decision-makers failed to authorize action and, in at least one case, actively overrode actions taken by lower-level decision-makers that may have prevented tragedy. The result was truly "disaster by management."
To analyze risk assessment communication flows along the NASA and FBI hierarchies, respectively, this paper integrates several theoretical frameworks: managerialism, organizational theory, functions of government theory, accident theory, risk perception, the relationship between risk assessment and risk management, and the spatiality of NASA and the FBI. Data consist of public documents concerning these two disasters.
The paper concludes that both the Columbia accident and the FBI handling of field office concerns before 9/11 seem to validate normal accident theory. Communication about risks appears to have been hog-tied in complex bureaucracies. Unpredictable external constraints acted on both agencies and led to a shift in risk managers' perception of the relative importance of the precautionary principle and the opportunity costs its application can impose. In NASA's case, the failure in communication can be traced to its external political environment and funding base, its geographically ornate and hierarchical structure, and the lower status and timidity of risk assessors compared with managers. In the FBI's case, the most consequential failure of communication was between the most senior levels of the Bureau and the lower-ranked personnel at Headquarters, which affected their decision-making concerning the distant field offices.
In both agencies, there were, additionally, parallel chains of command and communication. At NASA, individuals may find themselves wearing hats as engineers, as technical staff within the Shuttle Program, and as employees within the line structure of a NASA center, and it may not be clear to them which chain they should jerk to call attention to a safety-of-flight issue. At the FBI, intelligence and criminal investigation functions have been kept strictly separated and compartmentalized. The consequence of these barriers to communication along hierarchies, between chains of command, and across space was an imbalanced focus on the managerialist concerns of efficiency, budget, scheduling, and rules and regulations, instead of on the risk to human life. Managers had normalized anomaly and resisted data that contradicted their biases in perception, leading to what one NASA engineer called "worlds of pain."
INTRODUCTION
This paper compares the structure of human errors in two disasters with
sociogenic causes: the Columbia Shuttle accident and the FBI failure to act on
intelligence presaging the 9/11 terrorist attack. In each case, technical
information suggesting disaster was weakly transmitted within an elaborate
bureaucracy and high-level decision-makers failed to authorize action that may
have prevented tragedy. The result was truly "disaster by management."
Themes from prior literature relevant to these case studies include managerialism, organizational theory, functions of government theory, accident theory, risk perception, the relationship between risk assessment and risk management, and the spatiality of NASA and the FBI.
Information on the Shuttle disaster comes from the Columbia Accident Investigation Board report, while information on the FBI treatment of field office communications comes from testimony given to the Congressional Joint Investigation into September 11th and from the Boston Globe.
THE COLUMBIA ACCIDENT
On January 16th, 2003, the Columbia Shuttle was struck during launch by a large
piece of insulation foam, which hit the leading edge of the left wing at a
relative velocity of several hundred km/hr. The damage led to the breakup of the
shuttle during descent on February 1st, 2003, killing all seven astronauts.
The Columbia Accident Investigation Board (CAIB) attributed the accident more to NASA's internal organization and history and Congressional and White House pressures on the agency than to the specific mechanisms that led to failure.
A History of Budget and Schedule Pressure and Managerialist Response
Nixon ended most of NASA's post-Apollo space exploration plans. NASA salvaged the Shuttle by promising it would be a self-supporting launch vehicle and scientific platform. The first shuttle, Columbia, was launched in 1981, and Reagan declared the still-experimental vehicle "fully operational" in 1982, so Reagan and Congress found subsequent Shuttle cost overruns and schedule delays inexplicable and unacceptable. Such managerialist pressures led to the lethal decision to launch Challenger in January 1986 over the concerns of engineers. The Rogers Commission indicted NASA's managerialism for normalizing anomaly to meet tighter schedules and budgets.
NASA implemented many of the reforms in the Rogers Report and returned to flight in 1988, but the Shuttle Program was now part of the Human Space Flight Initiative (HSFI), along with the International Space Station (ISS). Managerialist pressures resumed in 1994 with an OMB directive that ISS cost overruns be confined to the HSFI, meaning the money would be taken from the Shuttle. From 1993 to 2003, the Shuttle budget was cut 40% and its labor force 42%: "faster, better, cheaper"?
Faced with a $4 billion ISS cost overrun in 2001 and under White House pressure, NASA limited the US contribution to completing a node allowing other nations' modules to dock with the ISS, and an arbitrary date was set for node completion: February 2004. The deadline imposed intense schedule pressure on the Shuttle, which led to managerial concerns about delays in a tight sequence of launches. The consequences of missing deadlines caused managers to require virtually ironclad proof of impending mission failure before approving delays or mission aborts.
Structural and Geographical Impediments to Risk Communication
Routine examination of launch videos revealed the strike by a suitcase-sized
piece of foam insulation.
The Intercenter Photo Working Group asked Kennedy Shuttle Management to obtain Defense Department imagery of the orbiter, which the Kennedy manager tried to do. NASA and Boeing engineers formed a Debris Assessment Team (DAT). Boeing engineers asked their Houston office to run "Crater," a program for simulating popcorn-sized debris impacts on the Shuttle. Crater had never been used by the Houston staff. The model predicted the foam had ruptured the wing. The Houston staff did not trust the model's results but did not consult the California developers of Crater about them.
The Debris Assessment Team, however, was worried enough to request DoD imagery separately from the Kennedy request, via Johnson Space Center engineering management. The Chair of the Mission Management Team, thinking the DAT had gone around her to Johnson and the DoD, contacted DoD to cancel the imagery request, and DoD canceled both requests. The Chair knew a particularly bad debris strike had happened on a recent launch without a mission abort (or accident); normalization of anomaly made her decision comfortable. The DAT accepted her decision as a final order, given the mechanistic and hierarchical nature of NASA management.
With no DoD imagery and only the odd results of Crater to go on, the DAT could make only a weak presentation justifying emergency rescue, which might have saved the crew's lives. Management found nothing in the DAT's presentation to compel concern about safety-of-flight over their natural inclination to worry more about mission costs and schedule. And the rest is history.
FBI HEADQUARTERS AND FIELD OFFICE CONCERNS BEFORE 9/11
On July 5th, 2001, White House counterterrorism coordinator Richard Clarke briefed domestic agencies, including the FBI, on the threat of an imminent al-Qaeda attack, but the content of that briefing was not passed down through the Bureau. In light of that failure, the behavior of Headquarters analysts and operations specialists before the attacks makes sense.
On July 10th, 2001, a Special Agent in the Phoenix field office requested that
HQ open investigations of several Middle Easterners taking aviation lessons.
The Intelligence Operations Specialists at HQ who handled the request did not
know about the Clarke meeting. Without that context to offset their worries
about past FBI abuses and racial profiling, their decision on August 7th to
close the case with no further action is not unreasonable.
On August 15th, 2001, a Minneapolis flight school called the local
FBI field office to report Zacarias Moussaoui's odd behavior.
Minneapolis contacted French intelligence and learned of his ties with
Islamist groups and then contacted HQ to request a FISA search warrant. This was
refused because of John Ashcroft's earlier investigation of FBI FISA search
warrant abuses and because HQ agents felt the Minneapolis supervisor had a
habit of excessive resort to FISA searches. These concerns were decisive
because the content of the July 5th meeting had not been communicated to all
levels of HQ.
On August 29th, 2001, an agent in the New York field office working on the USS
Cole bombing asked FBI Headquarters to let New York use criminal investigative
resources to find one Khalid al-Midhar, who had recently met with a
Cole suspect and had, furthermore, entered the United States in July 2001. Al-
Midhar would coordinate the 9/11 operation. HQ personnel, clearly not aware
of the July 5th meeting, refused because of "the Wall" separating criminal
from intelligence investigations: Prosecution can expose intelligence assets.
Immediately after September 11th, a newly hired FBI Turkish and Farsi
translator uncovered possible infiltration of the FBI by a Middle Eastern
group that was itself under FBI investigation, a group that invited her to
join it! She and an agent later found pre-9/11 documents in Turkish that a
member of this group had marked "not pertinent" for translation but that
contained very specific premonitory information. She and the agent filed
security complaints at progressively higher levels. Instead of triggering an
investigation of the infiltration and of poor operating procedures before
9/11, the complaints resulted in her termination.
CONCLUSIONS
In NASA's case, the failure in communication can be traced to the agency's external political environment and funding base, its geographically ornate and hierarchical structure, and the lower status and timidity of risk assessors compared with managers. In the FBI's case, the most consequential failure of communication was between the most senior levels of the Bureau and the lower-ranked personnel at Headquarters, which affected Headquarters' decision-making concerning the distant field offices.
Both agencies, additionally, have parallel chains of command and communication. At NASA, individuals may find themselves wearing hats as engineers, as technical staff within the Shuttle Program, and as employees within the line structure of a NASA center, and it may not be clear to them which chain they should jerk to call attention to a safety-of-flight issue. At the FBI, intelligence and criminal investigation functions have been kept strictly separated and compartmentalized.
The result was an imbalanced focus on the managerialist concerns of efficiency, budget, scheduling, and rules and regulations, instead of on the risk to human life.
Managers had normalized anomaly and resisted data that contradicted their biases in perception, leading to what one NASA engineer called "worlds of pain."
SOURCES
The Columbia Accident Investigation Board report is available at:
http://www.spaceflight.nasa.gov/shuttle/investigation/index.html
Congressional testimony about 9/11 is collected at:
http://www.fas.org/irp/congress/2002_hr/
The post-9/11 issue of the translator's problems in reporting the suspicious behavior of another translator comes from:
Kornblut, Anne E. 2004. Translator in eye of storm on retroactive classification. The Boston Globe (5 July). Temporarily available from: http://www.boston.com/news/nation/articles/2004/07/05/translator_in_eye_of_storm_on_retroactive_classification/
This paper is available at:
https://home.csulb.edu/~rodrigue/disbymgt/boulder04.html