Vid's Space

Passing Thoughts

Leap Day Reform for the Digital Age

Many of the date-handling tasks computers routinely face, such as translating dates between human-readable and internal formats, or just determining what day of the week a given date falls on, require code that converts a year/month/day combination to a simple number of days since some reference date, or vice versa. Anyone who has had to write such code knows that our Gregorian Calendar isn't very convenient for computers.

For example, one fairly obvious method would be to start with the day of the month, then add a number to that (using the month as an index into a lookup table) to obtain how many days since the start of the year. But then, if it's a leap year and the month is greater than February, the program has to add one. Finally, to this number, the program adds 365 times the year (relative to the reference year) plus the number of leap years after the reference year and before the year of the date being evaluated. Now consider the Gregorian Calendar's rules on leap years: every year that's a multiple of four is a leap year, except years that are a multiple of one hundred are not leap years, unless the year is a multiple of four hundred, in which case it is indeed a leap year.
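
To make that concrete, here's roughly what the Gregorian version of such a conversion looks like. This isn't taken from any particular library; the names GregOffset, IsGregorianLeap, LeapsBefore, and GregorianSequenceDay are just mine, and it uses the same December 31, 1919 reference date as the Digital Age code further down.

const int GregOffset[12] = {0, 31, 59, 90, 120, 151, 181, 212, 243, 273, 304, 334};

bool IsGregorianLeap(int Year) {
	return (Year % 4 == 0) && (Year % 100 != 0 || Year % 400 == 0);
}

//Gregorian leap years strictly before the given year, counted from year 1.
int LeapsBefore(int Year) {
	int Y = Year - 1;
	return Y / 4 - Y / 100 + Y / 400;
}

int GregorianSequenceDay(int Year, int Month, int Day) {
	int Days = (Year - 1920) * 365 + LeapsBefore(Year) - LeapsBefore(1920)
	         + GregOffset[Month - 1] + Day;
	if (Month > 2 && IsGregorianLeap(Year)) {
		Days += 1; //this year's own leap day has already gone by
	}
	return Days;
}

Even tidied up like that, there's a leap-year test sitting in the middle of the per-date arithmetic, and a handful of divisions just to count leap years.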

The first thing about all that I'd like to simplify is how, for any month March and later, the number of days since the start of the year for a given date depends on whether or not the current year is a leap year. This consideration can be made unnecessary by moving leap days to the end of the year. In the Digital Age Calendar, February's length is fixed at 29 days, and December's is 30 days, or 31 in leap years.
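
Spelled out month by month (these lengths are nothing official; they're just the differences of the MOffset table used in the code below), that gives:

//Month lengths in the Digital Age Calendar; December gains a 31st day in leap years.
const int MLength[12] = {31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 30};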

The next thing I'd like to simplify is the pattern of leap years. I'll keep the general pattern of one leap year in every four, because it's already convenient for computers. If we simply scrapped the one-hundred-year and four-hundred-year parts of the Gregorian pattern, we'd wind up having leap years slightly too often. So I calculated the ideal interval between skipped leap years, and it turns out to be almost exactly 128 years. That's surprisingly convenient for a computer! In the Digital Age Calendar, leap years are those preceding multiples of four, but not those preceding multiples of 128. (The choice to have leap years just before multiples of four, rather than on them, further simplifies computer calculation: the number of leap days that have already gone by before a given year is then simply the year, counted from the reference year, shifted right by two bits, with no off-by-one correction. The alternative would be to somehow insert a leap day at the beginning of the year, and I don't think people would adapt so well to having a 0th of January.) Anyway, the resulting pattern of 31 leap days in a 128-year cycle matches the Earth's orbit considerably better than the Gregorian Calendar's 97 leap days in a 400-year cycle: the average year comes out to 365 + 31/128 = 365.2421875 days, versus the Gregorian 365.2425, against a mean tropical year of roughly 365.2422 days.
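
Written as code, the leap-year rule is a one-liner. This little helper isn't used by the conversion functions below; it's just the rule written out:

//A Digital Age leap year immediately precedes a multiple of four,
//unless it also immediately precedes a multiple of 128.
bool IsDigitalAgeLeap(int Year) {
	return ((Year + 1) % 4 == 0) && ((Year + 1) % 128 != 0);
}

So 1923 and 2051 are leap years, but 2047 is not, because 2048 is a multiple of 128.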

So here's some simple computer code to convert from a date to a number of days, and back again. In this example, the reference date is December 31, 1919 in both the Digital Age and Gregorian calendars.

//Days elapsed before the start of each month; the table is the same every year
//because the leap day, when there is one, sits at the very end of December.
const int MOffset[12] = {0, 31, 60, 91, 121, 152, 182, 213, 244, 274, 305, 335};

int SequenceDay(int Year, int Month, int Day) {
	Year -= 1920; //years since the reference year
	//365 days per year, plus a leap day for every fourth year gone by (Year >> 2),
	//minus the leap day skipped once every 128 years (Year >> 7).
	return (Year * 365) + (Year >> 2) - (Year >> 7) + MOffset[Month - 1] + Day;
}

void SplitYMD(int Seq, int &Year, int &Month, int &Day) {
	int Cycles, Quads;
	Seq -= 1;
	Cycles = Seq / 46751; //days per 128-year cycle; integer division, rounded towards −∞
	Seq += Cycles;        //each full cycle skipped one leap day; padding those back makes every quad 1461 days long
	Quads = Seq / 1461;   //days per four-year quad
	Seq -= Quads * 1461;
	Year = Seq / 365;
	if (Year > 3) {
		Year = 3; //the quad's 1461st day is the leap day ending its fourth year
	}
	Seq -= Year * 365;
	Year += Quads * 4;
	Month = 11;
	while (MOffset[Month] > Seq) {
		Month -= 1;
	}
	Day = Seq - MOffset[Month] + 1;
	Month += 1;
	Year += 1920;
}

While this may look a lot like C++, consider it to be pseudocode — especially if you find syntax errors. For the Gregorian Calendar, there would certainly have to be more ifs and modulos. Also, feel free to substitute your favorite search algorithm for my while loop. (I'd personally implement a binary search, but the linear search makes clearer pseudocode.)
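
As a quick sanity check, here's the kind of round-trip test I'd run against those two functions; the expected numbers are just the ones that fall out of the rules above.

#include <cstdio>

int main() {
	int Year, Month, Day;
	//The leap day ending 1923 and the first day of 1924 should land on
	//consecutive sequence numbers (1461 and 1462 with this reference date).
	printf("%d %d\n", SequenceDay(1923, 12, 31), SequenceDay(1924, 1, 1));
	//Splitting a sequence number should give back the original date:
	//December 30, 2047, the last day of the first 128-year cycle.
	SplitYMD(SequenceDay(2047, 12, 30), Year, Month, Day);
	printf("%d-%d-%d\n", Year, Month, Day);
	return 0;
}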

Of course, any time a new calendar is adopted, there will be some conversion issues, particularly if the adoption isn't universal. Still, the difference would be only one day, and only for 60 days each year, three years out of four. Seven eighths of the time, the calendars match. That is, until 2048, at which point the calendars will begin to disagree every day until 2100…
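
For a couple of spot checks, compare the Digital Age SequenceDay with the hypothetical GregorianSequenceDay sketched earlier: July 4, 2025 lands on the same physical day in both calendars, while January 15, 2025 sits inside the disagreement window (December 31, 2024 through February 28, 2025), so its Gregorian count comes out one day higher. Dropped into the little test program above:

	printf("%d %d\n", SequenceDay(2025, 7, 4), GregorianSequenceDay(2025, 7, 4));   //equal
	printf("%d %d\n", SequenceDay(2025, 1, 15), GregorianSequenceDay(2025, 1, 15)); //off by one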

Besides the simple fact that people don't like change, I don't see why this calendar couldn't be adopted. With the slightest care in choosing the transition date, the switch could be made without the noticeable jump in dates that marked the move away from the Julian Calendar. They managed that change back then, and this one would be even less of a bother. Except, perhaps, for the millions of embedded systems that can't easily be updated. How's that for irony?

Okay, I realize that the only practical problem solved by this new calendar is in writing code, and once the code is written, it hardly needs to be written again. Furthermore, any performance gains resulting from simpler code are irrelevant considering modern computing power. Still, I find this Digital Age Calendar to be simply more elegant and logical. Can't that be reason enough to switch?
