There seems to be some confusion about what the Y2K problem actually is. Why is this?

The problem itself is very simple, but occurs only in a specific time subset.

We do not normally think of Time as a series of subsets. We have no words for it. This makes it difficult for the layman (or even the IT professional) to think about "where" it is and what it is. So we make up little examples about what we think it is. And that REALLY confuses the issue.

You have to differentiate very carefully when talking about Time and Dates, because they are in such common usage that people have their own prejudices and opinions, and misinterpretation is rife.

The Y2K Problem occurs in a specific subset of Time and Dates:

Let's set up a Time-Line Model:

Computer Date Data    (Whatever you care to define)
Universal Time        (Billions of Years?)
Galactic Time         (Quite a long Time?)
Solar Time            (To do with a Star called Sol)
Earth Time            (Millions of Years)
Recorded Time         (4000 years?)            ^
People's Birthdays    (?-today)                |
Gregorian Calendar    (500+ years)           DATA
Real Time             (this instant .... now)
----------------------------------------------------
Current Computer Age  (1900-2099)                   TOD
Future Computer Age   (2000-?)                       |
TOD Clock             (1980-2099, never backwards,   |
                        advances day by day)         v
Intel-Type BIOS       (1980-2099)
IBM Age               (1950-2042?)
UNIX Age              (1970-2038)
Some BIOS             (1980-1999)

There are two clearly defined and totally independent Types:

  • People (or Data) Time. The shape, form and content of this are limited only by the imagination of the User. This type of information is primarily for the use of People.

  • Computer Time. Dates based on Time of Day Information. Binary-based information, primarily for the use of Computing Devices. Conforms to the limitations of storage, format and the logic of the designer.

    These two types are Incompatible by their very nature.

    Conversion, Translation and Interpretation are involved in moving between them (a small sketch of such a conversion appears below). Computer Time is a simulation of Real Time.

    And most importantly,

    THEY OPERATE UNDER DIFFERENT RULES.
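
    To make that concrete, here is a minimal sketch in C, assuming nothing beyond the standard <time.h> library: the binary count that the machine keeps means nothing to a person until it is translated, and the translation back is a separate, explicit step with its own limits.

        #include <stdio.h>
        #include <time.h>

        int main(void)
        {
            time_t now = time(NULL);              /* Computer Time: an opaque binary count */
            struct tm parts = *localtime(&now);   /* interpretation into calendar fields   */

            char for_people[32];
            strftime(for_people, sizeof for_people,
                     "%d %B %Y", &parts);         /* translation into a People Date        */
            printf("Computer Time: %ld\n", (long)now);
            printf("People Time:   %s\n", for_people);

            /* Going back the other way is another conversion (mktime), and it
               only works within the limits of this particular machine's clock. */
            parts.tm_mday += 1;
            printf("Tomorrow (Computer Time): %ld\n", (long)mktime(&parts));
            return 0;
        }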

    In technical discussions on Y2K we are concerned mainly with the system's interpretation of the Time of Day Clock.

    With very specific emphasis on problems in the subset range 1980 to 2099.
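
    As a purely hypothetical illustration of what "interpretation by the system" means (the pivot value of 80 below is an assumption for the sake of the example, not a description of any particular BIOS):

        #include <stdio.h>

        /* Hypothetical sketch only: one way a system might turn a two-digit year
           from the clock hardware into a full year. The pivot of 80 is an
           assumption for illustration; this window reaches 1980-2079, and a BIOS
           that also keeps a century byte can cover the full 1980-2099 range. */
        static int interpret_two_digit_year(int yy)   /* yy is 0..99 as read from the clock */
        {
            return (yy >= 80) ? 1900 + yy             /* 1980 .. 1999 */
                              : 2000 + yy;            /* 2000 .. 2079 */
        }

        int main(void)
        {
            /* A system that instead assumes 1900 + yy unconditionally turns the
               roll-over from 99 to 00 into the year 1900 - that is the Y2K problem. */
            printf("99 -> %d, 00 -> %d\n",
                   interpret_two_digit_year(99), interpret_two_digit_year(0));
            return 0;
        }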

    Unfortunately our discussions overlap with the more general form of People Dates, which encompass calendars and which have a very wide and "emotional" range.

    Even the Techies get into a crosstalk situation when they confuse the very specific subsets WITHIN the generic Computer Time bucket.

    The situation is further confused depending on the type of machine and even Operating System in use.

    Perceptions differ.

    A Mainframe person understands the problem in a totally different sense to a person who operates in a midrange or Personal Computing environment.

    The only real commonality between these groups is that their problem crops up on the same day.

    The problems have similarities in origin, but differ substantially in form, scope and range.

    "BIOS" is meaningless to a Mainframe person.

    "COMREG" or "SVC" are meaningless to a PC person.

    The problem and solution sets are totally different between AS/400 and VSE/ESA, or Unix, etcetera, etcetera.

    The perceptions of Media and the General Public also differ.


    So I can talk about a leap-year algorithm that is based purely on mod 4 arithmetic. But I am speaking within a specific subset, that of the Intel BIOS time range (1980 to 2099). By the lucky once-in-400-years coincidence that a century year divisible by 400 is also a leap year, 2000 is the only century year within the Intel subset, and it IS a leap year.

    So in this specific subset (and only this subset), the Simplex algorithm and the Full Gregorian algorithm (which is in essence a People Date procedure) coincide.
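
    For the doubtful, here is a small sketch in C of both rules, checked over exactly that subset; the point is not the code but the fact that the count of divergences comes out at zero.

        #include <stdio.h>

        static int leap_simplex(int y)     /* the mod 4 rule, nothing more         */
        {
            return (y % 4) == 0;
        }

        static int leap_gregorian(int y)   /* the full People Date (calendar) rule */
        {
            return (y % 4 == 0 && y % 100 != 0) || (y % 400 == 0);
        }

        int main(void)
        {
            int divergences = 0;
            for (int y = 1980; y <= 2099; y++)
                if (leap_simplex(y) != leap_gregorian(y))
                    divergences++;
            printf("Divergences within 1980-2099: %d\n", divergences);  /* prints 0 */
            return 0;
        }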

    People say to me "Oh but we must code it correctly for the future". (I wish they had had this attitude 20 years ago and not coded the Y2K bug).

    The point everybody misses is that we are going to have to implement an entirely different architecture of machines before the Full Gregorian algorithm will EVER work.

    On the existing range of machines the Simplex and Full Gregorian algorithms will never and cannot ever diverge. The Full algorithm will NEVER give a DIFFERENT ANSWER than the Simplex, within the subset range.

    The cost of implementing the Full Algorithm can never be justified or recovered. But the savings that we make by NOT changing away from Simplex code would be real.

    This is a very subtle point. But it has a real cash basis, today.

    I have great difficulty explaining this concept to Technical people. Imagine the distress of the unfortunate layperson.


    Another example.

    I am trying to promote the use of the ten-year-old ANSI/ISO standard, YYYY-MM-DD. I am talking about a common interface for communication between digital machinery, and again, within Subset.
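
    A minimal sketch of emitting that interface form, assuming only the standard C library: four-digit year first, so the century is never ambiguous and plain text sorts chronologically.

        #include <stdio.h>
        #include <time.h>

        int main(void)
        {
            time_t now = time(NULL);
            char iso[11];                               /* "YYYY-MM-DD" plus the NUL */
            strftime(iso, sizeof iso, "%Y-%m-%d", localtime(&now));
            printf("%s\n", iso);                        /* e.g. 1999-02-14           */
            return 0;
        }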

    Some nut butts in and says "I disagree with you. Computers must do what people want. I want to write my dates as mm/dd/yy."

    This wretch may write his date upside down, in Swahili or Egyptian Hieroglyphics for all I care. Not only is he talking out of Subset, he is talking People Date. He has no clue.

    But I am now sworn to uphold netiquette and to avoid foul language. So I must pat him on the head, and send him tottering on his way.

    These are a few of the reasons for the confusion.


    
