Background:
The Y2K bug is also called the Year 2000 bug or the Millennium Bug. The Year
2000 problem is rooted in the way dates are recorded and computed in many
computer systems. For the past several decades, systems have typically used two
digits to represent the year, such as "97" for 1997, in order to conserve
electronic data storage and reduce operating costs. With this two-digit format,
the year 2000 is indistinguishable from 1900, 2001 from 1901, and so on. The
decision was made in the 1960s and early 1970s, when software companies found it
cheaper to represent a date as six digits (dd/mm/yy) rather than eight digits
(dd/mm/yyyy). This works fine as long as every year carries an implied "19"
prefix, but when the year 2000 arrives, systems that hard-code that "19" will
fail. Representing the year with two digits thus leads to logical errors.
The two-digit convention creates problems in many computer programs. For example:
1. System or application programs that use dates to perform calculations,
comparisons, or sorting may generate incorrect results when working with
years after 1999.
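The failure above can be shown with a minimal sketch (the function name and data are illustrative, not from any real system): with two-digit years, arithmetic and sorting both go wrong once 2000 is in play.

```python
# Two-digit years break arithmetic and ordering across the century boundary.
def years_between(yy_start, yy_end):
    """Naive elapsed-years calculation on two-digit year strings."""
    return int(yy_end) - int(yy_start)

# A record from 1997 compared against the year 2000 ("00"):
print(years_between("97", "00"))   # -97, instead of the correct 3

# Sorting records by two-digit year puts 2000 before 1997 and 1999:
records = ["99", "97", "00"]
print(sorted(records))             # ['00', '97', '99']
```

The same comparison logic is harmless for any pair of years inside the 1900s, which is why the bug stayed hidden for decades.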
Reasons:
Reasons the Y2K problem is hard to fix.
Effects:
Ways the Y2K Problem Affects Us
Programming solutions:
Date expansion:
Two-digit years were expanded into four-digit years.
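A minimal sketch of expansion, assuming (as the source describes) that all stored records are known to fall in the 1900s, so each dd/mm/yy value can be rewritten once with a "19" prefix:

```python
# One-time conversion of a legacy dd/mm/yy record to dd/mm/yyyy.
# The hard-coded "19" is only safe for data known to predate 2000.
def expand(date_yy):
    dd, mm, yy = date_yy.split("/")
    return f"{dd}/{mm}/19{yy}"

print(expand("31/12/97"))  # 31/12/1997
```

After conversion, all date logic works on unambiguous four-digit years; the cost is touching every stored record and every field definition.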
Date re-partitioning:
In legacy databases whose size could not be economically changed, six-digit
year/month/day codes were converted to three-digit years (counted from 1900),
keeping the field the same width; this only delays the problem to the end of
the year 2899.
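As a sketch of re-partitioning, assuming the common YYYDDD layout (three digits of year-since-1900 plus three digits of day-of-year, still six digits total; the exact layout varied by system):

```python
import datetime

def repartition(yymmdd):
    """Convert a legacy YYMMDD code (1900s assumed) to YYYDDD."""
    yy, mm, dd = int(yymmdd[:2]), int(yymmdd[2:4]), int(yymmdd[4:6])
    d = datetime.date(1900 + yy, mm, dd)
    # Year offset from 1900 (000-999) plus day of year (001-366).
    return f"{d.year - 1900:03d}{d.timetuple().tm_yday:03d}"

def unpack(yyyddd):
    """Recover the full date; valid for years 1900 through 2899."""
    year = 1900 + int(yyyddd[:3])
    return datetime.date(year, 1, 1) + datetime.timedelta(days=int(yyyddd[3:]) - 1)

print(repartition("971231"))  # 097365
print(unpack("100001"))       # 2000-01-01
```

Because the three-digit year offset tops out at 999, the scheme runs out at 1900 + 999 = 2899, which is exactly the deferral the source describes.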
Windowing:
Two-digit years were retained, and programs inferred the century value only
when needed for particular functions; this is not a permanent solution.
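Windowing can be sketched as follows; the pivot value 70 is an illustrative assumption (real systems chose different pivots). Two-digit years below the pivot are read as 20xx, the rest as 19xx:

```python
# Sliding/fixed-window century inference with an assumed pivot of 70.
PIVOT = 70

def full_year(yy):
    """Map a two-digit year to a four-digit year using the window."""
    yy = int(yy)
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(full_year("00"))  # 2000
print(full_year("97"))  # 1997
print(full_year("69"))  # 2069
```

The weakness is visible in the last line: the window only covers a 100-year span (here 1970 through 2069), so the ambiguity returns once real dates approach the window's edge, which is why the source calls it a temporary fix.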