I’m having my kitchen redone.
Yes, I remember that in my very first blog entry I said that I would be talking about only technical matters, not trivia about my life. Bear with me, people.
When I say “redone” I mean that I took a crowbar, a sledgehammer and a couple of helpful housemates, pulled down all the walls, pulled up the floor, trucked the asbestos-laden vinyl flooring to the dump and then invited over a bunch of big burly guys to build a new kitchen from scratch.
Unfortunately, we found a number of “structural issues” once the walls were apart which necessitated some pretty major infrastructure improvements. I was hoping to have all the work in the kitchen done in early October, but these various non-scheduled improvements ate up a lot of time. I got an email from the general contractor the other day saying that the revised schedule now has us finishing up in October — October 45th to be precise. Ha ha ha. Very funny guys.
Update: We finished on January 16th, taking exactly twice as long as initially forecast, and confirming the old saying that a poorly planned project takes three times as long as you think; a well planned project, only twice as long.
var d = new Date(2003, 9, 45); print(d);
Fri Nov 14 00:00:00 PST 2003
What the heck? “2003, 9, 45” is the 14th of November?
Yep. First of all, for reasons which were never adequately explained to me by my cohorts at Netscape, the date constructor numbers dates starting from zero. So “9”, not “10”, is October. If you want to know why that is, ask Brendan next time you see him, ’cause I sure don’t know. (UPDATE: See below.)
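To make the zero-based numbering concrete, here is a quick sketch (the variable names are mine, not anything from the original engine):

```javascript
// Month numbers are zero-based: 0 is January, 9 is October, 11 is December.
var october = new Date(2003, 9, 1);   // October 1st, 2003, local time
var december = new Date(2003, 11, 1); // December 1st, 2003, local time

// getMonth() hands the zero-based number straight back.  Days, by
// contrast, are one-based: getDate() on the 1st of the month returns 1.
var monthNumber = october.getMonth(); // 9
var dayNumber = october.getDate();    // 1
```

The asymmetry between zero-based months and one-based days is a perennial source of off-by-one bugs.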
But what’s more interesting is that impossible dates like the ones on my contractor’s revised schedule resolve to real dates. November 14th is, in some sense, the best candidate for October 45th, as it is 44 days after October 1st. You can get weirder than that, of course.
var d = new Date(2003, -1, 0);
Wed Nov 30 00:00:00 PST 2002
Which is only logical — if January was month zero of 2003 then December 2002 was the “negative first” month of 2003. And if December 1st was the 1st day of December, then November 30th must have been the 0th day. Similarly, November 29th was the -1st day, and so on.
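Both rules, overflow past the end of a month and underflow past the beginning, can be checked directly. Doing the day arithmetic via Date.UTC sidesteps any daylight-saving transition between the two dates (a sketch of the observable behaviour, not of the engine's internals):

```javascript
// Overflow: "October 45th" normalizes to November 14th.
var oct45 = new Date(2003, 9, 45);
// oct45.getMonth() === 10 (November), oct45.getDate() === 14

// The 44-day claim, computed in UTC so a daylight-saving shift
// cannot sneak an extra hour into the subtraction.
var msPerDay = 24 * 60 * 60 * 1000;
var daysApart = (Date.UTC(2003, 9, 45) - Date.UTC(2003, 9, 1)) / msPerDay; // 44

// Underflow: day 0 is the last day of the previous month, and
// month -1 is December of the previous year.  The two rules compose:
var d = new Date(2003, -1, 0); // November 30th, 2002
```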
The underlying implementation details of JScript dates are quite a bit more sensible than VBScript’s implementation. JScript also stores dates as double-precision 64 bit floats, but the JScript implementation stores all dates as the number of milliseconds since midnight, 1 January 1970 UTC. Coordinated Universal Time is the proper name for what most people call “Greenwich Mean Time” — the time at the prime meridian, with no adjustment for British Summer Time.
For all practical purposes, we treat the 64 bit float as a 53 bit integer plus a sign bit. This means that JScript has about 285,000 years of millisecond accuracy on either side of the epoch (2^53 milliseconds is roughly 285,600 years), as opposed to VBScript, where the precision varies over the range. Also, in this system no two distinct dates have the same numeric representation, which makes comparing, subtracting, adding and rounding times straightforward.
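Since a date is just a millisecond count from the epoch, comparison and subtraction fall out of ordinary arithmetic. A small sketch, using the standard getTime() accessor:

```javascript
// Millisecond zero is midnight, January 1st 1970 UTC.
var epoch = new Date(0);
var epochMs = epoch.getTime(); // 0

// Subtracting two dates yields the interval between them in milliseconds.
var jan1 = new Date(Date.UTC(1970, 0, 1));
var jan2 = new Date(Date.UTC(1970, 0, 2));
var intervalMs = jan2.getTime() - jan1.getTime(); // 86400000, one day

// Ordering dates is just ordering their millisecond values.
var jan1IsEarlier = jan1.getTime() < jan2.getTime(); // true
```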
Also, JScript, unlike VBScript, adjusts for daylight savings time and time zone differences according to the local information reported by the operating system.
Microsoft didn’t make the problem any better when we screwed up our implementation of getYear — the original IE implementation returns the year minus 1900, so 2000 was year 100. Fortunately we corrected that incompatibility in the second version. You should never use getYear — use getFullYear, which always returns all the necessary digits.
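The difference is easy to demonstrate. getYear still exists for backwards compatibility (modern engines keep it as a deprecated legacy feature), but getFullYear is the one to use:

```javascript
// getYear returns the year minus 1900, so the year 2000 comes back as 100.
var y2k = new Date(2000, 0, 1);
var legacyYear = y2k.getYear();   // 100 -- deprecated, do not use
var fullYear = y2k.getFullYear(); // 2000

// For 1900-1999 the result happens to look like a two-digit year
// (1999 gives 99), which is what made the bug so easy to miss.
var legacy99 = new Date(1999, 0, 1).getYear(); // 99
```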
More generally, you might wonder if there are design principles underlying the decision to make “October 45th” a legal date. Indeed there are (though to be perfectly frank, when it comes to the date handling, I suspect that any “justification” might be somewhat post hoc.)
Here are some thoughts that go through the minds of language designers:
- The sooner you produce an error for bogus behaviour, the sooner the developer will catch the bug. Therefore, at the first sign of anything out of the ordinary, crash and die immediately.
- But wait a minute — the more errors we produce, the more error handling code the developer will have to write, because some of those bogosities will be produced by end user data.
- Worse, if the developer misses an error handling case then the error will be reported to the end user.
- The end user of a web page script is someone who has absolutely no ability to understand the error. If they do understand the error, they have no ability to go to the web server and fix it. Therefore, never produce error messages unless you absolutely have to.
The ability to start with a premise and deduce its opposite is a neat trick, you must admit! But seriously, the existence of conflicting goals explains why there is more than one programming language. Hard-core application development languages like C# are designed to trap as many bugs as possible at compile time and run time so that you can find them, write the handling code, and ship. C# demands that you type your variables, that you call reflection explicitly, that you mark unchecked arithmetic, and so on.
But C# is a language designed for building hundred-thousand-line application frameworks and object models, not hundred line web page scripts. C# is designed for professional developers, not hobbyists and departmental web developers who just want to get “glue code” written as rapidly as possible.
And this is why when people come to me and say “my team of twenty developers has written a hundred-thousand-line database program in JScript and now I can’t make head nor tail of it, help!” I tell them to use the right tool for the job! C# was designed for those tasks. Heck, JScript .NET was designed for those tasks! But a weakly-typed, interpreted, fail-silently, late-bound, non-generational-GC language like JScript was designed to add scripting to simple web pages. If I give you a really great hammer then you can build a decent house but not a decent skyscraper.
Updates from 2019:
This article produced some responses from noted individuals. Brendan Eich himself weighed in:
My answer: because that’s how java.util.Date did it.
From mid-1995, when I created the language with the code-name “Mocha”, we really were trying not to reinvent wheels to be any more or less round than Java’s equivalent, where we could justify mimicking Java.
Unfortunately, java.util.Date had a pile of y2k and usability bugs. Better luck next marketing scam! Or, better luck next “little brother to a big, not actually related programming language” curse. Next time, I won’t be anywhere near the victim of such a curse.