
Planning for the Worst, Bringing Out the Best?
Lessons from Y2K

Kevin Quigley
Dalhousie University, Canada

Abstract

Organization theorist Lee Clarke (2005) argues that when policy makers plan for disasters, they too often think
in terms of past experiences and “probabilities.” Rather, policy makers, when planning to protect the
infrastructure, should open their minds to worst-case scenarios: catastrophes that are possible but highly
unlikely. Underpinned by a precautionary principle, such an approach to the infrastructure would be more
likely to produce “out of the box” thinking and in so doing, reduce the impact of disasters that occur more
frequently than people think. The purpose of this chapter is to consider the utility of Clarke’s worst-case
planning by examining Y2K preparations at two US government agencies, the Bureau of Labor Statistics
(BLS) and the Federal Aviation Administration (FAA). The data concerning Y2K come mostly from official
US government sources, interviews, and media analysis. The chapter concludes that the thoroughness of
worst-case planning can bring much needed light to the subtlety of critical, complex, and interdependent
systems. But such an approach can also be narrow in its own way, revealing some of the limitations of such
a precautionary approach. It potentially rejects reasonable efforts to moderate risk management responses
and ignores the opportunity costs of such exhaustive planning.

Copyright © 2008, IGI Global, distributing in print or electronic forms without written permission of IGI Global is prohibited.
Introduction

Critical Infrastructure Protection (CIP)—activities that enhance the physical and cyber-security of key public and private assets—is the focus of urgent attention among Western governments in light of recent power failures, computer viruses, natural disasters, epidemics and terrorist attacks, both threatened and realised (GAO, 2005). Government studies and popular analyses note the complex, interdependent, and fragile make-up of these infrastructures and the technologies that underpin them; one small failure can have a massive and unpredictable cascading effect. Consider the 2003 North American power outage: overgrown trees in Ohio helped trigger a power failure that affected 50 million people and cost the US economy anywhere from $4 billion to $10 billion (US-Canada Power System Outage Task Force, 2004). One critical question for policy makers is how do governments protect these fragile and interdependent critical infrastructures?

Organization theorist Lee Clarke (2005) argues that when policy makers plan for disasters, they too often think in terms of past experiences and “probabilities,” that which is calculable. Problems are often structured by existing bureaucratic interests and routines rather than by the problems themselves. To Clarke, the approach is too limiting since it is the unimagined and the events considered “low probability” that often wreak havoc on society. Policy makers, when planning to protect the infrastructure, should therefore open their minds to worst-case scenarios, catastrophes that are possible but highly unlikely. Clarke argues such an approach, ideally underpinned by a precautionary principle,1 is more likely to “stretch the imagination,” produce “out of the box” thinking, and in so doing, reduce the likelihood of these all-too-frequent catastrophes.

Clarke champions the US reaction to Y2K, the date-related computer bug that threatened to bring systems down around the world at the turn of the century, as an instance of successful “worst-case planning” (Clarke, 2005, p. 73). He recalls the collective response from across all sectors that anticipated numerous scenarios and then guarded against them. To Clarke, this exhaustive effort averted a potential socioeconomic meltdown.
The purpose of this chapter is to consider the utility
of Clarke’s worst-case planning by examining
Y2K preparations at two US government agencies,
the Bureau of Labor Statistics (BLS) and the
Federal Aviation Administration (FAA). First, the
chapter situates Clarke’s book in the relevant
Organization Studies literature and summarizes
Clarke’s main argument for worst-case planning.
Second, it recalls the Y2K case. Third, the chapter
describes the Y2K directives that the Executive
Office issued for government departments and
agencies, and then employs the Hood, Rothstein,
and Baldwin (2001) framework to compare the size,
structure, and style of the BLS’s and the FAA’s
approaches to Y2K. The chapter then considers the
extent to which Y2K operations embodied the
qualities that Clarke predicts for worst-case
planning. The chapter concludes by asking, first, did
the US government’s response to Y2K constitute
worst-case planning and, second, is worst-case
planning an effective and desirable risk-
management strategy? The data concerning Y2K
come mostly from official US government sources,
34 in-depth, semistructured interviews, including 15
interview subjects from the specific agency case
studies or their parent departments, and an analysis
of media coverage from three leading US
newspapers.2
Worst-case planning helps to bring to light the
fragile, complex, and interdependent nature of
critical infrastructure. While it can be expansive
in one sense, however, worst-case planning can
also be very narrow. Endless counterfactuals risk
absorbing considerable resources without
acknowledging the opportunity costs of such
actions. Heady enthusiasm over eliminating all
risk scenarios can also marginalize those who
offer alternative, more-sceptical views. In such
a context, worst-case planning can work against the very goals it seeks to achieve.

Background: Worst Cases and Organization Studies

In Worst Cases: Terror and Catastrophe in the Popular Imagination (2005) Lee Clarke draws from, and contributes to, the debate in Organization Studies that is concerned with the safety and reliability of complex technological systems. Broadly speaking, two schools define the field. Whereas normal accidents theory holds that accidents are inevitable in organizations that have social and technical interactive complexity and little slack (tight coupling), high reliability organizations theory states that hazardous technologies can be safely controlled by complex organizations if the correct design and management techniques are followed. (For a comparative discussion of the two theories, see Sagan, 1993. See Key Terms section for further references.) Clarke tends towards the former. Clarke sees disaster, or at least the potential for disaster, everywhere. He highlights our narrow understanding of complex systems, and laments the hubris that allows us to think we can control them. He challenges our present understanding of the critical infrastructure: a list of critical assets often reads like an engineer’s inventory, he notes. It neglects equally important social systems, for instance.

Clarke argues that what we consider accidents, disasters—worst cases—are, in fact, quite normal. While he acknowledges that 9/11 has recently brought considerable attention to worst-case scenarios, he draws on examples from across countries, civilizations, and eras to demonstrate that they are more common than we think. Arguably, the only thing that is different in Western nations today is that our expectations for safety and security are higher.

His survey of disasters serves two purposes. First, he uses the illustrations to challenge our understanding of what we somewhat arbitrarily describe as a “worst case.” Sometimes a “worst case” can be measured by the number of dead (“body count”); but other times it can be described by reference to one’s proximity to the event, the amount of media coverage it receives, the age of the victims, or the manner in which they suffered, for instance. For Clarke, “worst cases” are, in fact, context sensitive; our understanding of a “worst case” is often motivated by our capacity to identify with the victims (Clarke, 2005, p. 19). Clarke also challenges the notion that such worst cases are random occurrences. Rather, they tend to mirror how people organize themselves (Clarke, 2005, p. 129). Typically, though not always, “worst cases” affect the people with the fewest resources, who are therefore least able to cope with massive disruption (Clarke, 2005, p. 10). And, of course, what constitutes a worst case for some people creates opportunities for others. After all, careers can be built, or revitalised, on “disasters.”

Second, and more critically for our purposes, Clarke argues that thinking carefully about worst cases can lead to greater understanding and control (Clarke, 2005, p. 1). Clarke argues that when policy makers plan for disasters, they too often think in terms of past experiences and probabilities (probabilism). This approach leads to collective imagination breakdown. Rather, policy makers should think in terms of that which is possible (possibilism).

Clarke offers only two rules with which to regulate possibilism. First, participants cannot “re-write history.” That is, participants cannot create a scenario that starts, “what if the Germans had won the war,” for instance. Second, participants must restrict themselves to possibilities that are technically feasible. One cannot include a scenario that involves a “Star Wars” defence system that does not exist. Otherwise, Clarke encourages policy makers to open their minds and plan for all kinds of disasters.


Anticipating countless disasters—worst-case thinking—is, Clarke notes, a precautionary approach to CIP. For Clarke, a precautionary approach means more than simply a slow and careful plan. He writes, “the precautionary principle is a tool that can sometimes be used to make some of those interests consider worst-case possibilities. It can push policy makers to be explicit about the values they are pursuing, to specify the uncertainties in their decision processes, and to imagine alternatives that they might otherwise ignore” (Clarke, 2005, p. 180).

Specifically, the possibilism and counterfactuals (“what if” scenarios) of worst-case planning offer the promise of four outcomes: creativity, empowerment, effectiveness, and equity. Clarke argues the process empowers individuals to think “outside of the box.” It disrupts routine thought patterns, stretches the imagination, and potentially produces creative solutions that reveal the true interdependence of systems and, in so doing, allows us to make them more resilient. Moreover, the large scope and concern over protecting all systems rather than those that serve an economic elite make it a more socially just approach (Clarke, 2005, pp. 143-144). For Clarke, the Y2K operations embodied just such a strategy. We will consider the extent to which the FAA and BLS achieved the outcomes Clarke anticipates. First, we will recall the case.

Y2K Recalled

Y2K, or the “millennium bug,” referred to the fact that in computer programmes created in the 1950s and onwards, most year entries had been identified in two-digit shorthand; 1965, for instance, was entered as “65.” Initially this shorthand was adopted to save on expensive computer memory. Through time, it simply became a commonly used practice. As the year 2000 approached, however, anxiety grew that systems would be unable to distinguish between twentieth century entries and twenty-first century entries (e.g., would “01” be treated as “1901” or “2001”?). Such ambiguity, it was feared, would result in systems failing or producing inaccurate information. At the very least, it made information unreliable.

From the early nineteen nineties (when the first popular article about Y2K appeared) to the late nineteen nineties, the date-related computer glitch rose from relative obscurity to regular headline news. Between 1997 and 2000, for instance, three major American newspapers collectively ran over 500 articles on Y2K. In 1997, and prior to any significant government intervention in the economy, the tone of the media coverage was overwhelmingly alarming (Quigley, 2005). 1997 estimates of the worldwide cost of fixing Y2K-related problems ranged from $300bn to $1 trillion.3

The Congress and the General Accounting Office (GAO; now the Government Accountability Office) were particularly aggressive in seeking out information about Y2K. IT consultants and Y2K specialists appeared frequently before congressional committees to testify about the potential consequences of the bug. Between 1996 and 1999, congressional committees and subcommittees held over 100 hearings on the subject and the GAO issued 160 Y2K reports and testimonials. Bruce Hall of Gartner, a leading Y2K/IT consulting firm, testified to a congressional committee: “we must accept that risk exists in any technology that was ever programmed by a human, examine such technology for possible failures, and form remediation strategies” (Hall, 1997; original emphasis). Gartner advised “plan not to finish” and advocated “active prioritisation and technology triage.” Gartner predicted that 50% of organisations would not finish their Y2K plans. Ann Couffou, Managing Director at Giga Year 2000 Relevance Service, testified to Congress in 1997: “anything with an electronic component should be suspect. The rule should be guilty until proven innocent” (Couffou, 1997; original emphasis). She noted the following systems were susceptible
to Y2K-related failures:4 manufacturing control systems; elevators; telephone systems; medical equipment; stock markets; military messaging systems; radioactive material waste systems; fax machines; electronic time clocks; landscaping systems; vending machines; thermostats; and microwave ovens.

Clarke singles out Y2K as an example of successful “worst-case planning.” Certainly it was a Herculean effort to manage a multifaceted threat that emerged from the present context: our perceived dependence on, and interdependence between, fragile and complex social and technological systems. After considerable negative and alarming publicity, most organizations, and all sizeable ones, implemented extensive Y2K plans. Planning usually entailed conducting a systems inventory, checking and renovating (or replacing) systems, and then testing them. There was also considerable effort vested in contingency planning and disaster recovery scenarios. Often the process ended with a third-party audit confirming organizations were Y2K compliant.

Budgeting for Y2K was done inconsistently and, therefore, the actual cost of Y2K-related remediation remains elusive. Nevertheless, large companies claimed to spend anywhere from millions to tens of millions to hundreds of millions of dollars to manage the risks associated with the bug. The nine Wall Street firms that claimed to spend the most on Y2K, for example, reported that they spent $2.8bn collectively (Smith & Buckman, 1999).

Ultimately, Y2K was a damp squib: contrary to the build-up in the early media coverage, nothing apocalyptic happened. Depending on who one asks, one receives very different explanations as to why this was so. Was the problem oversold? Or was the effort to eradicate the threat so successful? One view suggests that the uncertainty surrounding the problem combined with the sizeable IT budgets and the considerable dependence on technology created incentives for Y2K specialists to oversell the problem. There is certainly evidence to support such a claim. Y2K budgets were significantly greater in Anglophone countries, such as the US, the UK, Canada, Australia, and New Zealand.5 Many countries started much later and did considerably less than their Anglo counterparts, yet these supposed “laggards” also experienced no significant systems failures.6

Manion and Evan (2000) argue that most countries were less dependent on technology than countries like the US. They also argue that many countries made considerable progress in the last 6 months of 1999, learning as they went from the countries that had preceded them. Both points are no doubt true. Most countries are not as dependent on IT and some countries did make advances in those last 6 months. Nevertheless, if the bug had the bite that government publications, and particularly the earlier publications in 1997/98, proclaimed, one would still expect to see a disproportionately high number of failures in countries that prepared less. Instead, there were very few failures and almost no significant ones.

Nevertheless, there were Y2K-related failures, and Y2K specialists are quick to bring this fact to people’s attention. Cap Gemini, a leading IT consultancy, in its ongoing survey of information technology executives at 161 companies and government agencies, reported in August 1999 that 75% had reported some Y2K-related failure (Hoffman, 1999). However, only 2% of the companies polled suffered business disruption because of the problem; most worked around the glitches. In the US Senate report, Y2K Aftermath: Crisis Averted (Final Committee Report, 2000), the committee appended 13 pages (283 examples) of Y2K-related glitches that actually occurred. The Senate report underscored that the extent of the problem would never be known because only a small fraction of the occurrences would ever be reported. As with most internal problems, organisations were likely to fix them and continue their operations unbeknownst to the general public (US Senate Special Committee on the Year 2000 Technology Problem, 2000, p. 37).
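The two-digit shorthand at the heart of the bug can be illustrated in a few lines. The sketch below is purely illustrative (the function name and the pivot value of 50 are assumptions, not drawn from the chapter); it shows the rollover arithmetic failure and the “windowing” fix that many remediation teams applied instead of widening every date field:

```python
def expand_two_digit_year(yy: int, pivot: int = 50) -> int:
    """'Windowing' remediation: interpret a two-digit year relative to a
    pivot. Values below the pivot map to 20xx, the rest to 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

# Naive two-digit arithmetic breaks at the century rollover: a person
# born in "65" appears to be -65 years old in "00".
age_naive = 0 - 65                                # -65: nonsense
age_windowed = (expand_two_digit_year(0)
                - expand_two_digit_year(65))      # 2000 - 1965 = 35
```

Windowing postponed rather than eliminated the ambiguity: any record whose genuine year falls on the wrong side of the pivot is still misread, which is one reason auditors insisted on system-by-system checks rather than a single global fix.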



Executive Intervention: Size, Structure and Style of the FAA and BLS Y2K Operations

In January 1998, President Clinton signed Executive Order (EO) 13073, and in so doing, established a high standard by which all departments and agencies would be judged: department and agency heads were responsible to ensure that “no critical Federal program would experience disruption because of the Y2K problem.” Prior to EO 13073, many members of Congress had argued that the Executive had been too slow to act. If it were so, EO 13073 represented a seismic shift in government efforts. All departments and agencies were expected to go through the standard five steps, as articulated by the GAO:7 awareness; assessment; renovation; validation; and implementation. This process also included a mandatory third-party audit and the development of contingency plans. The Office of Management and Budget (OMB) monitored departments’ progress through their quarterly submissions, which were made public. Moreover, a special appropriation was approved by Congress to provide additional funding for department and agency Y2K operations. The following sections summarise the size, structure, and style of the two agency case studies.

Size

Size can be conceived of in two separate ways: (1) Aggression, the extent of risk toleration in standards and behaviour modification, and how far regulators go in collecting information about the risk; and (2) Investment, how much goes into the risk regulation regime from all sources (Hood, Rothstein, & Baldwin, 2001, p. 31).

IT staff within the BLS had been aware of the Y2K problem for years, but failed to advance any systematic manner of dealing with it. One director noted that the first time the BLS had to deal with a Y2K problem was in 1985, to fix the Employment Projection System. He recalls issuing a memo on the potential consequences of the date-related problem in the early nineties, but it seemed to have had little effect. The lack of impact was not lost on Congress. In July 1996, in its first Year 2000 Progress Report Card, the Subcommittee on Government Management, Information and Technology (GMIT) issued an “F” to BLS’ parent department, the Department of Labor (DOL), one of only four departments (out of 24) to receive the lowest grade. This grade was largely due to the lack of progress at BLS, an agency almost entirely dependent upon technology to perform its service. Systems had developed over years and at times in an ad hoc manner: there were no reliable inventories of systems; little prioritisation of systems or supply chains; and varying degrees of awareness about date-functionality of systems.

After continued criticism from Congress and the OMB, the Bureau of Labor Statistics slowly started to show progress. Following direct orders from the Department of Labor, in May 1997, BLS appointed a Y2K Project Manager who met monthly with the Department of Labor’s CIO (GAO, 1998b, p. 4). Despite these initial steps and the Secretary of Labor declaring in a December 1997 memo that Y2K was a departmental priority, reports from GAO and OMB continued to show slow progress at the DOL and its agencies. By August 1998, for instance, only 11 out of 23 (48%) of the BLS mission-critical systems were considered compliant. (See, for example, OMB, 1997; OMB, 1998a; OMB, 1998b; GAO, 1998b.) Like all other departments, BLS eventually adopted an exhaustive system-by-system check that followed the GAO guidelines. No systems were exempt. In total, the DOL estimated it spent $60.4m on Y2K (OMB, 1999c, p. 53). Approximately 50% of this total was spent in 1999.

The FAA was even slower off the mark than the BLS. In January 1998, in response to a request made by Congress to audit the FAA’s Y2K plan, the GAO reported: the FAA had no central Y2K programme management; an incomplete inventory of mission critical systems; no overall strategy for renovating, validating, and implementing mission critical systems; and no milestone dates or schedules (GAO, 1998a). In addition, until the aforementioned had been accomplished, the FAA’s cost estimate of $246m was unreliable. More importantly, the report concluded that “should the pace at which the FAA addresses its Year 2000 issues not quicken . . . several essential areas – including monitoring and controlling air traffic – could be severely compromised” (GAO, 1998a, p. 15). Following the GAO report and the FAA’s appearance before the congressional Subcommittee on Government Management, Information, and Technology, the FAA started its Y2K project de rigueur. It implemented almost immediately a vast and probing Y2K operation that satisfied GAO requirements. The FAA completed work on 424 mission-critical systems and 204 non-mission-critical systems and was declared compliant by June 1999, less than 18 months after the GAO’s highly critical report. Financial estimates are rough and broad. The US Department of Transportation, of which the FAA’s Y2K costs were a part, spent $345.8m on Y2K between 1996 and 2000 (OMB, 1999c, p. 51).

In sum, there was considerable uncertainty in both agencies about the scope of the problem. Systems had developed over time, and often in an ad hoc manner; they were not well documented. Y2K efforts in both agencies revealed critical systems, for instance, that the IT units did not even know existed. After slow starts, OMB and GAO prodded both agencies into detailed and exhaustive responses to Y2K, which included the following mandatory steps for all systems: awareness; assessment; renovation; validation; and implementation. The project foregrounded, in particular, the importance of IT, supply chains, and contingency plans to the agency’s overall mission. Although a final sum remains elusive, Y2K efforts absorbed millions.

Structure

Structure overlaps with size to some extent. It refers to the way that regulation is organised; what institutional arrangements are adopted; and the way resources invested in regulation are distributed. Structure can be conceived of in at least two separate ways: (1) the extent to which regulation involves a mix of internal and external actors, including the use of intermediaries; and (2) how densely populated the regulatory space is by separate institutions, and how far the risk involves multiple, overlapping systems of regulation (Hood et al., 2001, p. 31).

The BLS Y2K operation was highly decentralised. The BLS had a Y2K committee but it met infrequently. Most direction was issued by e-mail directly from the DOL and with limited scope for input from the BLS. Therefore, the BLS Y2K staff (one director and one analyst) simply forwarded directions to individual programme areas. When Y2K IT staff received the results from the programme areas, they collated the reports, summarised them, and submitted them to the DOL.

There was considerable redundancy, institutionalised “double-checking,” within Y2K projects in the form of audit. The BLS established and operated its own Internal Verification and Validation process (IV and V). In addition to the IV and V, the Department of Labor’s Office of the Inspector General (OIG) contracted external auditors to carry out three audits at BLS.

From the time of the GAO report on the FAA, Administrator Garvey established a Y2K office and directed it to provide leadership, guidance and oversight to FAA’s seven Lines of Business (LOBs) and aviation industry partners. In the main, the FAA used contractors to get the work done and auditors to provide assurances. The Department of Transportation conducted its own audit of FAA Y2K readiness.
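As a purely illustrative aside, the exhaustive system-by-system bookkeeping that the GAO guidelines imposed (every system walked through awareness, assessment, renovation, validation, and implementation) can be sketched as a small inventory tracker. All names and structures below are hypothetical, not taken from either agency's actual tooling:

```python
from dataclasses import dataclass, field

# The five mandatory GAO phases, in order.
PHASES = ["awareness", "assessment", "renovation", "validation", "implementation"]

@dataclass
class SystemRecord:
    name: str
    mission_critical: bool
    completed: list = field(default_factory=list)  # phases finished so far

    @property
    def compliant(self) -> bool:
        # A system counts as compliant only once all five phases are done.
        return self.completed == PHASES

def compliance_rate(inventory: list) -> float:
    """Share of mission-critical systems that have cleared every phase,
    the headline figure that quarterly progress reports tracked."""
    critical = [s for s in inventory if s.mission_critical]
    return sum(s.compliant for s in critical) / len(critical)

inventory = [
    SystemRecord("payroll", True, PHASES.copy()),   # fully compliant
    SystemRecord("projections", True, PHASES[:3]),  # renovated but not validated
    SystemRecord("intranet", False),                # non-mission-critical
]
rate = compliance_rate(inventory)  # 1 of 2 mission-critical systems
```

A report such as "only 11 out of 23 (48%) of mission-critical systems compliant" is exactly this ratio computed over a larger inventory.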
The FAA started to open itself up. Agency staff held meetings with key players and industry representatives as part of the US government’s Y2K working group on transportation. The intention of the meetings was not to dictate plans to industry, but rather to allow people to share information about what their organisations were doing. The FAA also started having more press conferences and inviting its critics to visit the FAA and discuss its plans for compliance.

In sum, what started in both agencies as a largely internal operation transformed into something much more fluid, both horizontally and vertically. Guidelines were generated externally and were issued as top-down commands. The importance of supply chains meant that both agencies had to consult extensively with external parties to ensure their operations would be fully compliant. Moreover, the need to convince others, such as the lead department, OMB, GAO, and Congress, that the agencies were indeed compliant meant that these normally independent agencies had to open themselves up to allow external parties to test and validate the agencies’ Y2K operations. Under considerable pressure from external parties, the FAA concentrated its Y2K authority into its Y2K team. The BLS, on the other hand, followed the directives of the DOL, but maintained a highly decentralised approach within the agency.

Style

Style overlaps with the other two descriptions of management. It can be conceived of in two ways: (1) how far regulation is rule bound or discretionary; and (2) the degree of zeal the actors show in pursuit of policy objectives. With style, culture and attitudes are important (Hood et al., 2001, p. 32).

There was a considerable amount of conflict between BLS staff and staff from outside the agency. The BLS staff noted repeatedly that the systems were too complex for outside auditors to understand and audit in a matter of weeks. On one occasion, the strife between the external auditors and the BLS staff was such that senior staff from the DOL had to come in to mediate the dispute. In another example, one director at the BLS recalled being pressured by the DOL into developing a work schedule that “showed progress.” That is, that each quarterly report would show marked progress over the previous quarter. The staff member at DOL suggested that such a report would satisfy Congress’s expectations. The director noted that he advanced the anticipated completion date in order to meet the DOL’s (and OMB’s) request. When the system was not ready in time, the BLS found itself criticised in the quarterly OMB report. Relations between the BLS and the DOL became strained, and eventually the head of the Y2K project at the DOL was replaced.

To a certain extent, the professionalism of the IT staff at the BLS was undermined not only by external auditors and staff at oversight agencies, but even by executives within their own agency. The staff felt that their advice frequently fell on deaf ears. One interim strategy IT staff proposed was that they would focus Y2K efforts exclusively on those systems that were required to produce reports in January 2000. All other systems would be considered non-critical. (Many BLS reports are issued quarterly or semiannually.) This approach would have allowed BLS staff to delay any Y2K work on non-critical systems until after January 1, 2000, and then only if the system failed. BLS IT staff felt this was a balanced approach: fix those systems that were required for January 2000 outputs; and “wait and see” with the rest. In September 1998, the DOL informed the BLS it would be judged by the GAO guidelines, which made no such distinction, and therefore it was back to “audit, fix and test everything” by the deadline. In another example, one of the interview subjects for the International Price System noted that the system had been upgraded in 1995 and made Y2K compliant at that time. Despite making this point known within his chain of command, he was still directed to perform the Y2K audits. He estimates
that a good part of 1 full year for him and his team was spent exclusively on Y2K, with little ever being found that was not compliant.

The intervention of Congress at the FAA marked a significant shift in style at the agency. Administrator Garvey started having weekly meetings specifically on Y2K with the Executive Committee at the agency. Garvey told the Y2K office that they should report to her directly, and insisted that they name individuals responsible if progress slipped. One interview subject noted it got some people angry and created some friction, but the message was clear: people were going to be held to account. The entire approach to the problem changed. One interview subject noted, “it was removed from the realm of the IT geeks and moved to every corner of top management.” It was also characterised as a political problem. The Deputy Secretary at the DOT informed his staff that if there were shortcomings it would make the President look bad at the start of an election year in which the “Technology” Vice President would be standing for election.

There was a sense of urgency that empowered area specialists to make decisions. According to every interview subject, money was not an object. The person that coordinated the FAA financial requests simply had to call the OMB and request the money. The FAA was never refused. One middle manager noted how, on one occasion, he signed personally for a $15m upgrade during the 1998/1999 period, something that would have been unthinkable and organisationally unjustifiable a few months earlier.

When the FAA started to open itself up it was not simply a question of “structure” but also one of “style.” The FAA wanted to show Congress, the media, and any other detractors that it was taking their criticisms seriously. Throughout 1999, it ran multiple “end-to-end” tests, “war game scenarios,” and on one occasion simulated the “December 31-January 1” date change in real time. The FAA invited the GAO, the Office of the Inspector General (OIG), the FAA’s Administrator, and the press to all the events. The FAA also invited its most severe critics, Y2K management gurus such as Ed Yardeni and Peter De Jager, to its offices to explain what the FAA was doing to mitigate the risk. (De Jager eventually agreed to fly on a plane during the changeover, as did Administrator Garvey.) With respect to the aviation industry, one interview subject noted that organizations had been working in silos. By the end of 1998, the FAA was sponsoring industry days and having leading players in the industry present their Y2K plans to others; it helped to foster a cooperative spirit in the industry.

In sum, neither agency at the operational level, initially, took the problem as seriously as the oversight offices eventually did. Many were loath to break from their regular work routines. This resulted in false starts, inconsistent reporting, and delayed progress. As “outsiders” (i.e., OMB, GAO, OIG) began to infringe on their space—prescriptive orders, timelines, templates, audits—IT agency staff became aggravated. Y2K work was often boring, detailed, and tedious, supplemented by (sometimes) weekly detailed reporting requirements. In some cases, staff were confident either there were no Y2K-related problems or they could fix them as part of their normal processes, yet these observations went unheeded, particularly at the BLS. Like it or not, staff were corralled into large Y2K operations where risk management meant risk elimination. Eventually, the FAA IT staff thrived in such a context; the BLS staff did not.

Worst-Case Planning at the FAA and BLS

Clarke argues that worst-case planning encourages creativity, empowers participants, renders effective solutions, and achieves equity in those solutions.8 The definition for each of these terms is subject to considerable negotiation. We offer a provisional, context-sensitive understanding
of each term, and consider the extent to which worst-case planning at these agencies embedded these qualities.

Where it existed, creativity (unconventional, "out of the box" ideas) tended to include either clever "work-around" solutions (e.g., how operations would recover in the event of failures) or lucid explanations as to why certain systems were not vulnerable to Y2K-related failures despite claims that all systems had to be checked. On balance, however, the executives in neither agency encouraged creativity in operations. In fact, the opposite is true. Straying from GAO guidelines almost inevitably led to negative reports from auditors and unfavourable exposure within Congress. Staff at the BLS in particular had alternative solutions dismissed by executives within the agency because of anxieties over relations with DOL, OMB, and Congress.

If we understand empowerment to mean giving someone influence and authority, then BLS IT staff lost power over the Y2K issue by early 1998. What seemed like their reluctance to cooperate with central Y2K directives led to power and influence seeping away to external parties. BLS staff note that despite having an excellent track record for maintaining their systems (a point corroborated by many external interviews), their advice was frequently ignored. Conflicts with external auditors sent in by DOL were commonplace. Ultimately, detailed and regular reporting requirements ensured IT staff never strayed from OMB/GAO central directives. Staff were not empowered; rather, they became at times combative, at times demoralised.

The FAA Y2K office was working under the same guidelines as the BLS, yet the FAA IT staff felt empowered by them. The Y2K team had considerable support from an active Administrator who put Y2K at the top of the list of agency priorities. This support empowered the team to employ tactics in achieving Y2K compliance that would, under other circumstances, be considered inappropriate and perhaps career limiting. For instance, Y2K staff confronted agency executives at Y2K planning meetings when they failed to meet Y2K deadlines; Y2K staff authorised considerable spending on Y2K projects; they briefed members of Congress and the media. Many from the team describe Y2K as the highlight of their careers.

Ostensibly, both departments were equally effective. All systems were identified, fixed, tested, and audited. No systems failed as a result of the Y2K bug. Insofar as this was the goal, both agencies accomplished their missions. From a project management standpoint, however, there is again a distinction between the BLS and the FAA. The BLS started earlier than the FAA but became bogged down with conflicts between themselves and external agents. The FAA, on the other hand, started late but accomplished its mission in a relatively quick 18 months. The profile of Y2K, support from the Administrator, and a strong sense of mission within the team had considerable impact at the agency. The Y2K staff worked long hours; they developed stretch targets; they celebrated together when they met them; and they felt a sense of collective defeat when they missed them. Because they were never refused funding, they never got bogged down in tedious budget debates; they got what they wanted, when they wanted it. Indeed, they felt they were doing important work. The BLS staff did not.

If we broaden our understanding of effectiveness to include value for money, however, evaluating Y2K efforts becomes much more problematic. Anything that might moderate the response (cost-benefit analyses, probability risk assessments, and arguably even professional judgement) was rarely incorporated systematically into Y2K projects in either agency. Indeed, the only limitation on efforts was the fixed deadline. As one FAA Y2K team member estimated, only about one in five systems at the FAA required Y2K-related work, but given the climate, it was simply not possible to tell Congress and the media that the FAA would not be looking at four fifths of its systems. Clearly, this approach increased
the cost considerably with little or no discernible benefit from a systems standpoint.

Viewed through one lens, the Y2K plan seems to have achieved a sense of equity: all systems were checked and, if necessary, fixed, and therefore all systems were treated equally. If one assumes that different people depend on different services and would therefore prioritise systems differently, then, indeed, the Y2K plan did not serve some people's interests over others.

Again, however, if we assume a broader understanding of equity that views the Y2K project as only one initiative in agencies with limited resources and numerous competing projects, then the equity achieved within the agencies can be challenged. For over a year, Y2K projects were given a privileged status within government: reporting requirements were exhaustive and inflexible, and there were virtually no limits on Y2K funding. The effort absorbed considerable resources and, in so doing, neglected other priorities. As one IT manager at the BLS noted, the opportunity cost of the project was rarely considered. One BLS executive went further: he felt Y2K diverted so many resources that it should be described as a "failure of leadership, a dereliction of duty."

Conclusion

To finish, we will recall two central questions established at the outset of the chapter: First, did the US government's response to Y2K, as demonstrated at the BLS and the FAA, constitute worst-case planning? Second, is worst-case planning an effective and desirable risk management strategy?

The answer to the first question is almost certainly yes. The processes that supported the government's Y2K objective of "no disruption to service" were largely the same in both agencies: slow, detailed, and exhaustive. There were also significant overlaps in the form of mandatory third-party audits and contingency plans.

However, the agencies did not adopt, convincingly, the style of behaviour that Clarke attributes to worst-case planning. One agency was led through the process by strong and persuasive leadership, whereas the other was pitch-forked into doing its job, insulted and demoralised. Clarke's anticipated qualities of worst-case planning (creativity, empowerment, effectiveness, and equity) were only achieved occasionally, and largely at the FAA. Somewhat ironically, this difference had little impact on either agency's capacity to achieve its goal of becoming fully compliant. Hence, while one might conclude that worst-case planning is possible, one cannot conclude that such planning would necessarily result in the organizational dynamics that Clarke predicts.

The second question is more contentious: is worst-case planning desirable? The absence of any moderating measures risks driving up costs and marginalizing the role of professional judgement. To start, Clarke skirts the issue of cost-benefit analysis and refutes the notion of probability risk assessments, even as heuristic devices. In the case of Y2K, there seemed to be few constraints acting on Y2K project teams other than organizational inertia and the fixed timeline; therefore, with all the available funding and the pressure to become compliant, it was easier at the operational level simply to check and fix everything. As a result, relatively unimportant systems, as well as those that showed no discernible date-related risk, were subject to expensive, resource-intensive Y2K treatment. There were alternatives. Rather than simply measuring the extent to which systems were going through the standard process, more feedback loops could have helped detect the extent to which problems/failures were actually being identified, for instance. A more proactive effort at prioritisation of systems could also have helped focus efforts and limit resource expenditure.

Clarke also contends that probabilism protects the interests of the powerful. Again, he is not entirely persuasive. One might argue, for instance, that probabilism protects the majority, which is not necessarily impractical in a world of limited resources and unlimited potential for crises. In any event, it is unlikely that a precautionary approach would lead to addressing economic and social injustice. Powerful interests can just as easily manipulate precautionary approaches to ensure existing power structures are perpetuated, if not reinforced. Indeed, while there may have been inertia among many organizations not wanting to act on Y2K-related problems, the Y2K initiative was hardly a challenge to the status quo. In many respects, it was the opposite. EO 13073 directed agencies to ensure there would be no disruption to federal government programs. It also directed government departments to work with industry to help ensure the same standard. Y2K was a threat to "business as usual"; the government worked to ensure stability.

Clarke's precautionary approach is in fact selective in the reality it portrays. While in some ways he encourages policy makers to open their minds, in fact he simultaneously offers a very narrow view of the problem. Cass Sunstein (2005) astutely argues in his criticism of the precautionary principle that if one chooses to invest in mitigating the risks associated with one problem, one almost certainly neglects other risks. How much spending on Y2K was too much spending? What projects went unfunded or under-resourced because of the insatiable appetite for Y2K work or the unwillingness to live with inconsequential failures? There are no answers to these questions, but that they were hardly considered is disconcerting. Clarke is silent on the drawbacks of such overreactions and the lack of transparency potentially at work in the process he advocates.

These problems do not lead to a complete indictment of worst-case planning. Whereas Sunstein alerts us to the trade-offs inherent within precautionary approaches, Clarke alerts us to the landmine of potential low probability/high consequence disasters that precautionary approaches alone are likely to detect. As always, the art is in finding the appropriate balance, which is always the subject of power and negotiation.

Future Research Directions

The prevalence of disasters caused by the failures of complex systems keeps the normal accidents/high reliability organizations debate fresh and meaningful. This chapter has attempted to contribute to an aspect of this literature by examining the case of Y2K. Three broad and related sets of questions could benefit from serious scholarly research.

Question One: Learning from Disasters

First, how do we learn from disasters? What institutional arrangements are in place to gather information, set standards, and modify behaviour in light of past failures? Who interprets the failures? Ironically, government Y2K postmortems tend to reinforce starting assumptions rather than question them. After Y2K passed without incident, the UK government concluded, for instance, first, that its reaction to Y2K was an unqualified success, and second, that Y2K brought to light the significant extent to which economies depend on IT (Cm 4703, 2000). Few, if any, Y2K postmortems note the variation in dependence on IT across sectors or countries; nor do they note the ability of organisations and people to respond with flexibility and creativity to system failures. Rarely (if ever) does one see a serious investigation of the considerable resources allocated to minimizing the impact of the Y2K bug.

Y2K is one of the only preventative, pan-economic CIP initiatives on record. Many aspects of the initiative were successful. If we are going to learn from events such as Y2K, we have to look seriously at the assumptions that underpinned the planning and think more carefully about the
extent to which it was a success and a failure and integrate those lessons into future planning. By this measure, Clarke's book is a valuable contribution.

Question Two: Cooperation

Second, given the perceived interdependence across sectors and political jurisdictions, how do governments and industries work together on these issues? 9/11, the 2003 power failure, and the 2004 tsunami, for instance, had significant global impacts. They necessitated cooperation across borders. While these crises require global collaboration, there are several technical, economic, cultural, and political barriers that prevent cooperation. In times of uncertainty, governments and industry can be reluctant or unable to disclose potentially sensitive information. Moreover, when governments do wish to cooperate, as many have since the advent of the "war on terror," how readily transferable is information? To what extent is information shaped by the institutional arrangements within each country? So, for instance, when the US government, representing a largely pluralist, federalist system, exchanges information with the UK government, representing a largely unitary, corporatist system, are they really exchanging the same types of information? Can these differences lead to incompatibility? How is this addressed? What are the implications?

Question Three: Implications of Differences

Third, if one anticipates different types of failures or one defines the critical infrastructure differently, how does this affect planning for and reaction to failure? What constitutes the critical infrastructure may seem universal at first blush, but it is almost certainly context sensitive: it is embedded in a specific geographic, cultural, economic, and political setting. Many countries are more concerned about natural disasters than terrorism, for instance. Does this make a difference in their strategising about CIP? Many developing countries experience systems failures regularly. Does this inform how they prepare for and react to disasters? By the same token, if the levees in New Orleans had been sabotaged by an act of terrorism rather than breached by a hurricane, would the US government's reaction have been different (better)? Would the American public have been more forgiving? We could benefit from serious scholarly investigation into these matters.

References

Blitz, J., Hope, K., & White, D. (1999). Mediterranean trio see millennium bug as problem for manana. Financial Times, November 12.

Bowen et al. (1999). Low tech culture may prove region's Y2K saviour. Financial Times, November 30.

Clarke, L. (2005). Worst cases: Terror and catastrophe in the popular imagination. Chicago: University of Chicago Press.

Cm 4703. (2000). Modernising government in action: Realising the benefits of Y2K. London: HMSO.

Couffou, A. (1997). Year 2000 risks: What are the consequences of technology failure? Statement of hearing testimony before the Subcommittee on Technology and the Subcommittee on Government Management, Information and Technology. March 20. Washington, DC: GPO.

General Accounting Office. (1998a). FAA computer systems: Limited progress on year 2000 issue increases risk dramatically. GAO/AIMD-98-45. January 30. Washington, DC.

General Accounting Office. (1998b). Year 2000 computing crisis: Progress made at labor, but key systems at risk. GAO/T-AIMD-98-303. September 17. Washington, DC.

Government Accountability Office. (2005). Critical infrastructure protection: Department of Homeland Security faces challenges in fulfilling cybersecurity responsibilities. GAO-05-434. May. Washington, DC.

Hall, B. (1997). Testimony to Government Management, Information and Technology Subcommittee. March 20.

Hoffman, T. (1999). Y2K failures have hit 75% of US firms. Computerworld, August 16, 33:33.

Hood, C., Rothstein, H., & Baldwin, R. (2001). The government of risk: Understanding risk regulation regimes. Oxford: Oxford University Press.

Jack, A. (1999). Level of Russian IT lessens year 2000 fears: Moscow has been late in putting together some moderate Y2K defences. Financial Times, November 26.

Kheifets, L., Hester, G., & Banerjee, G. (2001). The precautionary principle and EMF: Implementation and evaluation. The Journal of Risk Research, 4(2), 113-125.

La Porte, T. R., & Consolini, P. (1991). Working in practice but not in theory: Theoretical challenges of high reliability organizations. Journal of Public Administration Research and Theory, 1, 19-47.

Manion, M., & Evan, W. (2000). The Y2K problem and professional responsibility: A retrospective analysis. Technology in Society, 22(3), 361-387.

Office of Management and Budget. (1997). Progress on Year 2000 conversion. Washington, DC: GPO. November 15.

Office of Management and Budget. (1998a). Progress on Year 2000 conversion. Washington, DC: GPO. May 15.

Office of Management and Budget. (1998b). Progress on Year 2000 conversion. Washington, DC: GPO. August 15.

Office of Management and Budget. (1999c). Progress on Year 2000 conversion. Washington, DC: GPO. December 14.

Perrow, C. (1999). Normal accidents: Living with high-risk technologies. Princeton: Princeton University Press.

President's Council on Year 2000 Conversion. (2000). The journey to Y2K: Final report. Retrieved October 2004, from http://www.y2k.gov/docs/lastrep3.htm

Quigley, K. F. (2005). Bug reactions: Considering US government and UK government Y2K operations in light of media coverage and public opinion. Health, Risk & Society, 7(3), 267-291.

Sagan, S. (1993). The limits of safety: Organizations, accidents and nuclear weapons. Princeton: Princeton University Press.

Smith, R., & Buckman, R. (1999). Wall Street deploys troops to battle Y2K: Command centre, jets, extra toilets are at the ready. Wall Street Journal, December 22.

Sunstein, C. (2005). Laws of fear: Beyond the precautionary principle. Cambridge: Cambridge University Press.

Suzman, M. (1999). World is mostly ready for Y2K. Financial Times, December 30.

US-Canada Power System Outage Task Force. (2004). Final report on the August 14 2003 blackout in the United States and Canada: Causes and recommendations. Published jointly by the US Government and the Government of Canada. Retrieved October 2005, from http://reports.energy.gov

United States Senate Special Committee on the Year 2000 Technology Problem. (2000). Y2K aftermath – A crisis averted (Final Committee Report). Retrieved October 2004, from http://www.senate.gov
Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. Chicago: University of Chicago Press.

Weik, K., & Roberts, K. (1993). Collective mind in organizations: Heedful interrelating on flight decks. Administrative Science Quarterly, 38(3), 357-381.

Additional Reading

Beck, U. (1992). Risk society. London: Sage.

Bellamy, C., & Taylor, J. (1998). Governing in the Information Age. Buckingham: Open University Press.

Clarke, L. (2005). Worst cases: Terror and catastrophe in the popular imagination. Chicago: University of Chicago Press.

Cm 4703. (2000). Modernising government in action: Realising the benefits of Y2K. London: HMSO.

Finkelstein, A. (2000). Y2K: A retrospective view. Retrieved from http://www.cs.ucl.ac.uk/Staff/A.Finkelstein; originally published in Computing and Control Engineering Journal, 11(4), 156-159.

Heeks, R. (1999). Reinventing government in the Information Age. London: Routledge.

Hood, C. (1996). Where extremes meet: SPRAT vs. SHARK in public risk management. In C. Hood and D. Jones (Eds.), Accident and design. London: UCL.

Hood, C., Rothstein, H., & Baldwin, R. (2001). The government of risk: Understanding risk regulation regimes. Oxford: Oxford University Press.

House of Commons Library. (1998). The millennium bug. Research Paper: 98/72. London.

Jaeger, C., Webler, T., Rosa, E., & Renn, O. (2001). Risk, uncertainty and rational action. London: Earthscan.

Kheifets, L., Hester, G., & Banerjee, G. (2001). The precautionary principle and EMF: Implementation and evaluation. The Journal of Risk Research, 4(2), 113-125.

La Porte, T. R., & Consolini, P. (1991). Working in practice but not in theory: Theoretical challenges of high reliability organizations. Journal of Public Administration Research and Theory, 1, 19-47.

Margetts, H. (1999). Information technology in government: Britain and America. London: Routledge.

National Institute of Standards and Technology. (2001). Risk management guide for information technology systems. Special Publication 800-30. Washington, DC: GPO.

Neustadt, R. E., & Fineberg, H. (1983). The epidemic that never was: Policy-making and the swine flu scare. New York: Vintage Books.

Perrow, C. (1999). Normal accidents: Living with high-risk technologies. Princeton: Princeton University Press.

Power, M. (2004). The risk management of everything. London: Demos.

President's Council on Year 2000 Conversion. (2000). The journey to Y2K: Final report. Retrieved October 2004, from http://www.y2k.gov/docs/lastrep3.htm

Quiggin, J. (2005). The Y2K scare: Causes, costs and cures. Australian Journal of Public Administration, 64(3), 46-55.

Quigley, K. (2004). The Emperor's new computers: Y2K (re)visited. Public Administration, 82(4).

Quigley, K. F. (2005). Bug reactions: Considering US government and UK government Y2K

operations in light of media coverage and public opinion. Health, Risk & Society, 7(3), 267-291.

Rochlin, G. (1997). Trapped inside the net: The unanticipated consequences of computerization. Princeton: Princeton University Press.

Sagan, S. (1993). The limits of safety: Organizations, accidents and nuclear weapons. Princeton: Princeton University Press.

Sunstein, C. (2005). Laws of fear: Beyond the precautionary principle. Cambridge: Cambridge University Press.

Taylor-Gooby, P. (2004). Psychology, social psychology and risk. Working paper. Social Contexts and Responses to Risk network, University of Kent at Canterbury. Retrieved from http://www.kent.ac.uk/scarr/papers/papers.htm

Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture, and deviance at NASA. Chicago: University of Chicago Press.

Wahlberg, A., & Sjoberg, L. (2000). Risk perception and the media. Journal of Risk Research, 3(1), 31-50.

Weik, K., & Roberts, K. (1993). Collective mind in organizations: Heedful interrelating on flight decks. Administrative Science Quarterly, 38(3), 357-381.

Key Terms

Bureau of Labor Statistics (BLS): An independent national statistical agency that collects, processes, analyzes, and disseminates statistical data to the American public, the US Congress, other Federal agencies, State and local governments, business, and labor. The BLS also serves as a statistical resource to its parent department, the Department of Labor.

Critical Infrastructure Protection: Activities that enhance the physical and cyber-security of key public and private assets.

Federal Aviation Administration (FAA): Responsible for the safety of US civil aviation. The FAA regulates civil aviation, develops civil aeronautics, develops and operates air traffic control, researches and develops the National Airspace System and civil aeronautics, carries out programs to control aircraft noise and other environmental effects of civil aviation, and regulates US commercial space transportation. The FAA's parent department is the Department of Transportation.

Government Accountability Office: The Government Accountability Office (GAO) (formerly the General Accounting Office) is an agency that works for Congress. Congress directs GAO to study the programs and expenditures of the federal government. GAO, commonly called the investigative arm of Congress or the congressional watchdog, is independent and non-partisan. It studies how the federal government spends its money. GAO advises Congress and the heads of executive agencies about ways to make government more effective and responsive. GAO evaluates federal programs, audits federal expenditures, and issues legal opinions.

High Reliability Organizations Theory (HRO): Hazardous technologies can be safely controlled by complex organizations if the correct design and management techniques are followed, such as top-down commitment to safety, built-in organizational redundancies, and the existence of a "safety culture" in the organization, including a commitment to learn from accidents. HRO is a response to Normal Accidents Theory (see below). Key contributors include La Porte and Consolini, 1991, and Weik and Roberts, 1993.

Inspector General (Office of the Inspector General): The President appoints and Congress approves Inspectors General for each government department. The Inspector General Act (1978) gives the Office of the Inspector General (OIG) autonomy to audit the department (and related agencies) for which the Inspector General is responsible without interference.


Normal Accidents Theory (NAT): Accidents are inevitable in organizations that have social and technical interactive complexity and little slack (tight coupling). No management design can overcome this inevitability. NAT builds on Chaos Theory. Key contributors include Perrow, 1984, Sagan, 1993, and Vaughan, 1996.

Office of Management and Budget (OMB): Assists the President in overseeing the preparation of the federal budget and supervises its administration in Executive Branch agencies. OMB evaluates the effectiveness of agency programs, policies, and procedures, assesses competing funding demands among agencies, and sets funding priorities. OMB ensures that agency reports, rules, testimony, and proposed legislation are consistent with the President's Budget and with Administration policies.

Possibilism: Counterfactuals ("what if" scenarios). Thinking in terms of that which is possible rather than that which is probable.

Probabilism: Thinking in probabilities, drawing mainly from past experiences.

Size, Structure, and Style of Risk Regulation Regimes: Hood, Rothstein and Baldwin (2001) deploy the concept of "regimes" to explore and contrast variety in risk regulation across different policy areas. Size considers aggression and investment in risk regulation. Structure considers the mix of internal and external actors and how densely populated the regulatory space is. Style considers how far regulation is rule bound or discretionary, and the degree of zeal the actors show in pursuit of policy objectives.

Worst Case: According to Lee Clarke, a worst case is a context-sensitive event, interpreted largely through our capacity to identify with the victims.

Worst-Case Planning: Devising plans based on counterfactuals ("what if" scenarios). The counterfactuals need only be possibilities; they need not be "likely" (in any calculable way) to occur.

Year 2000 (Y2K): Y2K referred to the fact that in computer programmes created from the 1950s onwards, most year entries had been recorded in two-digit shorthand; 1965, for instance, was entered as "65." Initially, this shorthand was adopted to save on expensive computer memory. Through time, it simply became standard practice. As the year 2000 approached, however, anxiety grew that systems would be unable to distinguish between twentieth-century and twenty-first-century entries (e.g., would "01" be treated as "1901" or "2001"?). Such ambiguity, it was feared, would result in systems failing or producing inaccurate information. At the very least it made information unreliable. The US government eventually spent $8.5 billion to ensure government systems were compliant. The US government estimated that Y2K-related repairs cost the US economy $120 billion (President's Council on Year 2000 Conversion, 2000).

Endnotes

1. Kheifets, Hester, and Banerjee (2001) note there are several definitions for the precautionary principle, which can be grouped loosely into three trends. The three trends vary in emphasis and severity with respect to the evidence required to justify action as well as the action taken. Clarke's specific approach will be elaborated in the section "Background: Worst cases and organization studies."
2. Wall Street Journal, New York Times, and USA Today.
3. By the year 2000, the US Treasury estimated the cost to be $300bn worldwide.
4. Couffou was discussing the vulnerability of embedded systems. Embedded systems contain programmed instructions running via processor chips. The processor chips are similar to standalone computers buried inside various kinds of equipment.
5. The Dutch government also had a sizeable Y2K operation in place.
6. For examples, see Blitz, Hope, and White (1999) on Italy, Spain, and Greece, Jack (1999) on Russia, Bowen (1999) on Latin America, and Suzman (1999) on Eastern Europe and Indonesia.
7. Based on preliminary work conducted by the CIO Council based at the Office of Management and Budget.
8. As noted earlier, Clarke is supportive of how organizations reacted to Y2K. At the same time, Clarke notes his skepticism in Worst Cases about bureaucracies' capacity to respond in a crisis. He argues they are too inflexible. Certainly, this research would support such a claim. While on the one hand I share Clarke's skepticism about bureaucracies' capacity to respond in a crisis, I am less convinced by organizational responses to Y2K. This chapter attempts to make clear these concerns.
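The two-digit shorthand described under Year 2000 (Y2K) in the key terms can be sketched in a few lines of code. This is an illustrative sketch only: the function names and the pivot value of 30 are assumptions for the example, not drawn from the chapter, and real remediation projects used a variety of "windowing" pivots or expanded stored dates to four digits outright.

```python
# Illustrative sketch of the Y2K ambiguity: a two-digit year such as
# "01" cannot, by itself, distinguish 1901 from 2001.

def expand_naive(two_digit_year: int) -> int:
    """The pre-2000 convention: every two-digit year is read as 19xx."""
    return 1900 + two_digit_year

def expand_windowed(two_digit_year: int, pivot: int = 30) -> int:
    """One common remediation ("windowing"): values below the pivot
    are read as 20xx, values at or above it as 19xx. The pivot of 30
    here is hypothetical; projects chose different windows."""
    century = 2000 if two_digit_year < pivot else 1900
    return century + two_digit_year

print(expand_naive(65))      # 1965: the shorthand works until 2000
print(expand_naive(1))       # 1901: "01" misread once 2001 arrives
print(expand_windowed(1))    # 2001: the window resolves the ambiguity
print(expand_windowed(65))   # 1965: older entries still read correctly
```

Note that windowing only postpones the ambiguity rather than eliminating it; the more expensive alternative, expanding every stored date field to four digits, is the kind of exhaustive, system-by-system work the chapter describes.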
