All,
I'm fairly sure this comes down to me misunderstanding something, but I have a scenario where, if a user runs some code with their PC's time zone set to London (UTC+0), everything works fine.
However, if they change the system time zone to, say, Melbourne (UTC+10), the same code crashes.
I have a small project that illustrates this; a minimal sketch of the same kind of code is included below.
It seems like DateTime.MinValue returns something different in these time zones, but my understanding was that MinValue is a constant.
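Here is the sketch. It isn't the project itself, just a minimal reconstruction; it assumes the failing call is the implicit conversion of DateTime.MinValue to a DateTimeOffset, which is one common way this crash appears, but the real code may hit it through a different conversion.

using System;

class MinValueRepro
{
    static void Main()
    {
        // DateTime.MinValue really is a constant: 0001-01-01T00:00:00
        // with Kind = Unspecified. This line prints the same value in
        // every time zone.
        DateTime min = DateTime.MinValue;
        Console.WriteLine($"{min:o} Kind={min.Kind}");

        // The implicit DateTime -> DateTimeOffset conversion treats an
        // Unspecified value as local time. In London (UTC+0) the implied
        // UTC instant is still within range, so this line works. In
        // Melbourne (UTC+10) the implied UTC instant would fall about
        // ten hours before MinValue, which is out of range, so this
        // line throws ArgumentOutOfRangeException.
        DateTimeOffset asOffset = min;
        Console.WriteLine(asOffset);
    }
}

In London this prints both lines; in Melbourne it dies on the conversion.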
For my purposes I just needed a 'reasonable' default, so I have switched to DateTime.Now as a workaround, but I would be very interested to understand why this happens if someone can explain it to me.
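For completeness, the workaround amounts to something like this (the helper is hypothetical; the real project stores the default differently):

using System;

class Defaults
{
    // Hypothetical helper illustrating the change.
    // Was: return DateTime.MinValue;  // out of range once converted in a UTC+10 zone
    static DateTime ReasonableDefault() => DateTime.Now;

    static void Main() => Console.WriteLine(ReasonableDefault());
}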
Thanks for reading,
Chris