People across North America are scrambling to update their servers to deal with the upcoming timezone changes.
I’ll tell you a little secret that has served me well… ever since the C libraries were a pest in the mid-eighties (the ones on DOS defaulted to GMT-5, US EST, or something in that realm), and since with layers of libraries you easily risk missing a timezone conversion or getting double or triple conversions, I decided never to use timezones on servers again.
In the old DOS days, the library issues were resolved by making an app do setenv("TZ=GMT0"); tzset(); on startup. No more conversion problems, regardless of the system time.
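Something along these lines still works today; here is a minimal sketch of the same trick on a POSIX-ish system (the old DOS compilers would have used putenv() instead of setenv(), so this is the spirit of the fix rather than the original code):

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    /* Pin the C runtime to GMT, whatever the system's own TZ says.
       On old DOS compilers this would have been putenv("TZ=GMT0"). */
    setenv("TZ", "GMT0", 1);
    tzset();

    time_t now = time(NULL);
    struct tm tm_utc;
    localtime_r(&now, &tm_utc);   /* now equivalent to gmtime_r() */

    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", &tm_utc);
    printf("%s\n", buf);
    return 0;
}
```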
But in essence I do the same on the Internet servers that I manage. They all simply run in GMT and therefore never need to change timezones – and they’re immune to DST funnies invented by George W Bush, too ;-)
On display, an application simply converts from GMT to whatever local timezone is appropriate for the user.
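In C, that per-user conversion can be sketched with the same TZ trick; the helper name and the "America/New_York" zone below are just illustrative, and a real web app would more likely use a proper timezone library:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Illustrative helper: format a stored GMT/UTC timestamp in the user's zone
   by temporarily switching TZ, then restoring the server's GMT setting. */
static void format_for_user(time_t utc_stamp, const char *user_tz,
                            char *out, size_t outlen)
{
    const char *old_tz = getenv("TZ");

    setenv("TZ", user_tz, 1);
    tzset();

    struct tm local;
    localtime_r(&utc_stamp, &local);
    strftime(out, outlen, "%Y-%m-%d %H:%M %Z", &local);

    /* restore the server's own setting (GMT in this setup) */
    if (old_tz) setenv("TZ", old_tz, 1); else unsetenv("TZ");
    tzset();
}

int main(void)
{
    char buf[64];
    format_for_user(time(NULL), "America/New_York", buf, sizeof buf);
    printf("%s\n", buf);
    return 0;
}
```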
Think about it… countries like the US and Australia have multiple timezones, so which zone do you use for your web site? Wouldn’t it be nicest for the users if they got all times in their own timezone, rather than some arbitrary zone that you decide for them, based on the (for them) completely irrelevant location of the server, or whatever other zone you happened to pick?
Easy as – a better user experience, with less work and fewer maintenance issues.
Of course… but you’re still vulnerable.
I always store UTC in the DB and have the reference clock set to UTC.
The problem comes when formatting the time for the user. If your system’s timezone rules aren’t patched, you’ll show them the wrong time.
At least the data isn’t corrupt though.
Of course, if they ENTER a time and the conversion is off, they could end up with a start date that’s an hour out… that sucks too.
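For what it’s worth, the store-UTC-in-the-DB part is easy to keep clean if timestamps go in as plain Unix epoch values; here is a minimal sketch using SQLite (the database, table, and column names are made up for illustration):

```c
#include <stdio.h>
#include <time.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    if (sqlite3_open("events.db", &db) != SQLITE_OK) return 1;

    /* Epoch seconds are always UTC; no timezone ever touches the stored data. */
    sqlite3_exec(db,
        "CREATE TABLE IF NOT EXISTS events (id INTEGER PRIMARY KEY,"
        " created_utc INTEGER NOT NULL)", NULL, NULL, NULL);

    sqlite3_stmt *st;
    sqlite3_prepare_v2(db,
        "INSERT INTO events (created_utc) VALUES (?)", -1, &st, NULL);
    sqlite3_bind_int64(st, 1, (sqlite3_int64)time(NULL));
    sqlite3_step(st);
    sqlite3_finalize(st);

    sqlite3_close(db);
    return 0;
}
```

Conversion to a user’s local zone then happens only at the presentation layer, never on the way into storage.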
a) We agree, then, that it’s purely a presentation issue. Easy to fix.
b) Incorrect user entries. Why/when would the user need to enter a time? Dates, perhaps. Times, rarely if ever…