The Y2K problem affects almost every computer system in some way. In the age we live in, it is
safe to assume that society relies on computers to a large degree.
The problem started out innocently enough: well-intentioned programmers used two digits to represent
the year instead of four. Using the full four digits would have expanded the date field by one third at a time
when memory was very expensive. The practice continued in order to maintain compatibility with older systems.
The Year 2000 problem, also known as Y2K or the “millennium bug,” centers on
the inability of most computer systems to understand a four-digit year. Most systems are designed to accept
dates in a month/day/year (mm/dd/yy) format, where the year is represented by two digits and the century
is assumed. For example, 1997 is represented as 97. 1901 is represented as 01. A two-digit year is fine until
a new century is entered. Most computer systems are designed to assume the year 01 is 1901, not 2001.
Many other computer systems will become confused by their clocks turning back 100 years and will
actually stop functioning. Others will automatically turn their clocks back to factory default dates.
The century assumption is buried in millions of lines of programming code, some of which may be
decades old. The result affects any application or system that uses arithmetic computations based on a
two-digit year because two digits do not distinguish between centuries. The assumption that the Year 2000
is 1900 will lead to incorrect data: incorrect accounting, erroneous reports, operational mistakes, system
access failures, forecasting errors, and application failures. The failures and errors will occur whenever a
system uses a date after 12/31/99. To fix the problem, an estimated 2% of all programming code needs to be
located and repaired.
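The arithmetic failure described above can be shown in a minimal Python sketch (illustrative only, not code from any actual legacy system): subtracting two-digit years works within a century but breaks as soon as one date crosses into 2000.

```python
def years_elapsed(start_yy, end_yy):
    """Compute elapsed years from two-digit years, as many legacy
    programs did. Both years are implicitly assumed to be 19xx,
    so the subtraction breaks across the 1999/2000 boundary."""
    return end_yy - start_yy

# An account opened in 1997, evaluated in 1999: correct.
print(years_elapsed(97, 99))   # 2

# The same account evaluated in 2000 (stored as year 00):
print(years_elapsed(97, 0))    # -97, not 3
```

The same two-digit subtraction appears in interest calculations, aging reports, and scheduling logic, which is why the errors spread across so many applications at once.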
By reducing the year to two digits, it is estimated that companies saved
30% on storage costs. The practice of using a two-digit year became an industry standard that in many cases
is still followed today.
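The "one third" expansion mentioned earlier follows from simple arithmetic on the field widths, sketched here for illustration:

```python
# A mm/dd/yy field stores 6 digits; mm/dd/yyyy stores 8.
two_digit_field = len("mmddyy")     # 6 characters
four_digit_field = len("mmddyyyy")  # 8 characters

# Adding the century grows every stored date by one third.
growth = (four_digit_field - two_digit_field) / two_digit_field
print(f"{growth:.0%}")  # 33%
```

Two extra characters per date sounds trivial, but multiplied across millions of records on expensive storage, the savings were real.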
However, these two digits run up against rules that every system must follow: the rules of the calendar.
The Year 2000 is a leap year. The rule for leap years is confusing: the year must
be divisible by 4, except when divisible by 100, unless it is also divisible by 400. Because of this
confusion, many systems will not expect another leap year until 2004. A survey of executives found the
majority were not aware that the Year 2000 was a leap year and were incorrect in explaining the formula to
determine a leap year. Another survey found consultants fared poorly in their knowledge of the Year 2000
as a leap year. Companies believing they are Y2K compliant might breathe a heavy sigh of relief on January
1, 2000, only to be shocked on February 29, 2000!
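The leap-year rule above is short enough to express directly; this small Python sketch encodes it and shows why 2000 is the surprising case:

```python
def is_leap_year(year):
    """Gregorian leap-year rule: divisible by 4, except century
    years, which are leap years only if divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 1900 was NOT a leap year (divisible by 100 but not by 400).
# 2000 IS a leap year (divisible by 400), which is exactly the
# exception many programmers and systems get wrong.
print(is_leap_year(1900))  # False
print(is_leap_year(2000))  # True
print(is_leap_year(2004))  # True
```

A system that implements only the "divisible by 4, but not by 100" part of the rule will skip February 29, 2000 entirely.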
Because data is the lifeblood of most organizations, this
transition to century-aware dates requires careful consideration for every system that handles it.
Another problem involves default dates. A common default value was 9/9/99 for
keypunched data, particularly expiration dates. The date was also commonly used by programmers when
testing and debugging their systems. It was chosen because it was an invalid future date, but it could
cause problems once September 9, 1999 arrived and the date became valid.
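The sentinel-date collision can be sketched in a few lines of Python (a hypothetical example of the legacy pattern, not real production code): a record whose expiration genuinely falls on September 9, 1999 is indistinguishable from one flagged "never expires."

```python
from datetime import date

NEVER_EXPIRES = "9/9/99"  # sentinel overloaded to mean "no expiration"

def is_expired(expiration_mmddyy, today):
    """Legacy-style check that treats 9/9/99 as 'never expires'.
    Once 9/9/99 becomes a real calendar date, the sentinel and a
    genuine expiration date collide."""
    if expiration_mmddyy == NEVER_EXPIRES:
        return False  # treated as permanent
    m, d, y = (int(part) for part in expiration_mmddyy.split("/"))
    return date(1900 + y, m, d) < today  # century assumed to be 19xx

# A record that genuinely expires on September 9, 1999 is
# silently treated as never expiring:
print(is_expired("9/9/99", date(1999, 9, 10)))  # False, should be True
```

Unlike the century rollover itself, this failure arrives almost four months early, on September 9, 1999.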
Older microcomputers will be among the hardest hit. Most PCs track their dates in the BIOS. Pre-1997 PCs will
need their BIOS updated or replaced, or risk faulty dates and lock-ups. Even with the BIOS fixed, many DOS
systems will fall back to a default date of 1/1/80 on January 1, 2000.
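The chain of failures described above can be sketched in Python (a simplified model of the behavior, with hypothetical function names, not actual BIOS or DOS code): an older BIOS combines a hard-wired century with the clock chip's two-digit year, and DOS, which cannot represent dates before 1980, falls back to its default.

```python
from datetime import date

def interpret_rtc_year(rtc_yy, century=19):
    """Model of an older BIOS combining a fixed century digit pair
    with the real-time clock's two-digit year. With the century
    hard-wired to 19, an RTC year of 00 becomes 1900."""
    return century * 100 + rtc_yy

def dos_startup_date(rtc_yy):
    """DOS cannot represent dates before 1980, so an out-of-range
    BIOS date falls back to the 1/1/80 default (a sketch of the
    behavior described above)."""
    year = interpret_rtc_year(rtc_yy)
    if year < 1980:
        return date(1980, 1, 1)
    return date(year, 1, 1)

print(dos_startup_date(99))  # 1999-01-01, as expected
print(dos_startup_date(0))   # 1980-01-01: the 1/1/80 default date
```

This is why a BIOS fix alone is not always sufficient: the operating system and the applications above it each make their own century assumptions.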