While everybody is rushing around defining the "correct" algorithm for leap years and going into the fine print of the original Gregorian legislation, has it ever occurred to anybody that the idiot who coded "if the year is evenly divisible by 4 then it is a leap year", and didn't worry about the century and 400-year adjustments, actually got it right (taking the lifespan of machines into account)?
The "exact" code does not give us a better result within the life range of current systems; in fact it gives the same result. The system will die before the difference ever shows up.
For example, a PC's clock works from 1980 to 2099. In that range there is only one year where the century and 400-year adjustments even come into play, and that is 2000. It is a leap year either way.
So the simple test "leapyear = (year mod 4 = 0)" will work.
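A quick sketch makes the claim checkable (function names are mine, just for illustration): the naive mod-4 rule and the full Gregorian rule agree on every year a PC clock can represent, and first diverge in 2100.

```python
def leap_simple(year):
    # The "idiot" rule: evenly divisible by 4.
    return year % 4 == 0

def leap_gregorian(year):
    # The full rule, with the century and 400-year adjustments.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Over the PC clock range (1980-2099) the two rules agree everywhere,
# because the only century year in range, 2000, is divisible by 400.
assert all(leap_simple(y) == leap_gregorian(y) for y in range(1980, 2100))

# They first disagree in 2100: the simple rule calls it a leap year,
# the Gregorian calendar does not.
assert leap_simple(2100) and not leap_gregorian(2100)
```

In other words, the simple code is not wrong within the machine's lifetime; it is wrong starting in a year the machine cannot even reach.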
Maybe we can save some of the $1.8T by NOT recoding for leap years? Could we possibly be chasing another paper tiger?