What's that they say about hindsight?
If you build web applications like I do, please don't learn this the hard way, as I'm about to.
I have a form where users can view things that have occurred within a particular timespan. The UI is pretty simple - it's two calendar datepickers. Just dates, no times.
Because I'm a frugal programmer, I haven't been saving occurrences individually as separate rows. I didn't care to keep such granular details; instead I've been keeping stats using a statement like this:
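Something along these lines, assuming a MySQL-style table keyed by date (the daily_stats table and its columns are placeholder names, not my actual schema):

    -- bump the counter for today's date (assumes a unique key on stat_date)
    INSERT INTO daily_stats (stat_date, occurrences)
    VALUES ('2009-11-08', 1)
    ON DUPLICATE KEY UPDATE occurrences = occurrences + 1;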
Thus I keep a running total of occurrences on November 8. If anyone asks how many occurrences happened that day, I can say there were X. The next day, I start a new counter for November 9.
But here's the problem. What if the user is in Japan? Their idea of what constitutes "November 8" is different from mine. Something that, for my Japanese user, was on November 8 might still be November 7 for me. Mess that up, and a whole bunch of other things go wrong.
A user in Austin might request stats with the same criteria as our user in Tokyo, but they should see different data. Because some things that happened on Wednesday in Japan were still Tuesday in Texas.
Consequently, that counter I'm using to tally occurrences on November 8 isn't sufficient. It doesn't matter that no one can request data more granular than a range of days; I need to keep that granular data, because the definition of "a day" differs from user to user.
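To make that concrete, here's a rough sketch (assuming Python 3.9+ and its zoneinfo module; the function is mine, just for illustration) of how the same calendar date covers two different slices of actual time depending on where the user is:

    from datetime import datetime, timedelta
    from zoneinfo import ZoneInfo

    def utc_bounds(date_str, tz_name):
        """The UTC instants where a calendar date begins and ends in a given timezone."""
        start = datetime.fromisoformat(date_str).replace(tzinfo=ZoneInfo(tz_name))
        end = start + timedelta(days=1)
        return start.astimezone(ZoneInfo("UTC")), end.astimezone(ZoneInfo("UTC"))

    # "November 8" is not the same stretch of time for these two users:
    print(utc_bounds("2009-11-08", "Asia/Tokyo"))       # -> Nov 7 15:00 UTC to Nov 8 15:00 UTC
    print(utc_bounds("2009-11-08", "America/Chicago"))  # -> Nov 8 06:00 UTC to Nov 9 06:00 UTC

With per-occurrence timestamps stored, each user's request can be answered by filtering on their own bounds; with only a per-day counter, neither can.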
I did it wrong. The problem cascades all over my app, where my record of when things happened won't match up with when they actually happened.
So, note to self:
1) I should store all my date and time data as UNIX timestamps (e.g. '1238373794') instead of formatted datetime strings (e.g. '2009-11-08 13:32:09').
2) I should build timezone conversion into the presentation layer, right from the start (see the sketch after this list).
3) Before hard-coding shortcuts into data collection, I should pause to consider whether dates and times affect how granular the data needs to be.
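A minimal sketch of the first two notes (again assuming Python with the standard zoneinfo module; the function name is mine):

    import time
    from datetime import datetime
    from zoneinfo import ZoneInfo

    # Note 1: store the instant itself, not a formatted local-time string.
    occurred_at = int(time.time())  # e.g. 1257687129

    # Note 2: convert to the viewer's timezone only at display time.
    def display(ts, tz_name):
        return datetime.fromtimestamp(ts, tz=ZoneInfo(tz_name)).strftime("%Y-%m-%d %H:%M")

    print(display(occurred_at, "Asia/Tokyo"))       # what the Tokyo user sees
    print(display(occurred_at, "America/Chicago"))  # what the Austin user sees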
Think about it now, before you start.