Theory vs. practice in software development

by Michael S. Kaplan, published on 2007/07/21 14:53 -04:00, original URI: http://blogs.msdn.com/b/michkap/archive/2007/07/21/3991927.aspx


Sometimes it is easy to build implementations in software that spend so much time striving for consistency and cleanliness of design that they stop really modeling the world they are nominally attempting to represent.

I was thinking about this the other day after one of those MS internal mail threads splintered into many different branches (as often seems to happen). The thread was about DateTime in .NET, specifically some edge cases surrounding DateTime.MinValue and DateTime.MaxValue that are especially exposed when using DateTime.ToUniversalTime().

In the end, it finally inspired Josh Free to write a nice summary post on the whole matter entitled BCL Refresher: DateTime.ToUniversalTime returns MaxValue/MinValue on overflow.
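
To make that behavior concrete, here is a minimal sketch of my own (not taken from Josh's post): on a machine whose local time zone lies west of UTC, converting DateTime.MaxValue to universal time would push past the end of the representable range, and instead of throwing, the method quietly returns the sentinel value.

using System;

class ToUniversalTimeOverflow
{
    static void Main()
    {
        // West of UTC, MaxValue plus the offset would overflow, so the call
        // clamps to DateTime.MaxValue rather than throwing.
        DateTime maxAsUtc = DateTime.MaxValue.ToUniversalTime();
        Console.WriteLine(maxAsUtc == DateTime.MaxValue);   // True on such machines

        // The symmetric case: east of UTC, MinValue minus the offset would
        // underflow and the call returns DateTime.MinValue instead.
        DateTime minAsUtc = DateTime.MinValue.ToUniversalTime();
        Console.WriteLine(minAsUtc);

        // Callers who care have to check for the sentinel values themselves.
        if (maxAsUtc == DateTime.MaxValue || minAsUtc == DateTime.MinValue)
        {
            Console.WriteLine("Conversion hit the edge of the representable range.");
        }
    }
}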

Now that is a truly excellent post that goes to great lengths to really explore and explain the issue. And it is not my intent to pick on either Josh in general or on that post in particular.

But I am going to talk for a bit about one of the flaws in the implementation of DateTime in .NET....

The problem is that it is a software model of the Gregorian calendar, which itself was not created until the 1500s. So what is the meaning of dates that literally represent values in that calendar from before it even existed?

Not that there is a clean cutoff date one could use, given the huge variability in when countries throughout the world actually moved to the Gregorian calendar. You can look on this page for a good, comprehensive treatment of that issue, but the summary version is that any claim of a pre-adoption date in the Gregorian calendar in .NET is, at best, an approximation projected back onto a calendar nobody was using at the time.
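
As a concrete illustration (my example, not anything from the page above): DateTime and GregorianCalendar in .NET are proleptic, so they will happily construct dates that never actually occurred, such as 10 October 1582, one of the days removed by the reform in the countries that adopted it immediately.

using System;
using System.Globalization;

class ProlepticGregorian
{
    static void Main()
    {
        // This "date" never happened anywhere that switched calendars in 1582:
        // Thursday 4 October was followed directly by Friday 15 October.
        DateTime skipped = new DateTime(1582, 10, 10);
        Console.WriteLine(skipped.ToString("yyyy-MM-dd"));                // 1582-10-10

        // The supported range starts at year 1, long before the calendar existed.
        Console.WriteLine(new GregorianCalendar().MinSupportedDateTime);
        Console.WriteLine(DateTime.MinValue);

        // The same instant expressed in the Julian calendar actually in use
        // at the time lands on a different nominal day.
        JulianCalendar julian = new JulianCalendar();
        Console.WriteLine("{0}-{1:00}-{2:00} (Julian)",
            julian.GetYear(skipped), julian.GetMonth(skipped), julian.GetDayOfMonth(skipped));
    }
}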

Now the problem would be solvable in a much more conventional way if DateTime were based on something independent of any calendar, and thus separated the problem of representing individual points in time from the problem of displaying them according to the conventions of a particular calendar.
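
Here is a hedged sketch of what that separation might look like, using a hypothetical CalendarNeutralInstant type of my own invention (nothing like it exists in the BCL): the stored value knows nothing about calendars, and a specific System.Globalization.Calendar only enters the picture when the value is rendered.

using System;
using System.Globalization;

// Hypothetical type for illustration only: an instant stored as a raw UTC
// tick count, with calendar conventions applied purely at display time.
struct CalendarNeutralInstant
{
    private readonly long ticksUtc;

    public CalendarNeutralInstant(long ticksUtc)
    {
        this.ticksUtc = ticksUtc;
    }

    public string Render(Calendar calendar)
    {
        DateTime utc = new DateTime(ticksUtc, DateTimeKind.Utc);
        return string.Format("{0}-{1:00}-{2:00}",
            calendar.GetYear(utc), calendar.GetMonth(utc), calendar.GetDayOfMonth(utc));
    }
}

class Demo
{
    static void Main()
    {
        // One underlying value, three different calendar renderings.
        var instant = new CalendarNeutralInstant(DateTime.UtcNow.Ticks);
        Console.WriteLine(instant.Render(new GregorianCalendar()));
        Console.WriteLine(instant.Render(new HebrewCalendar()));
        Console.WriteLine(instant.Render(new HijriCalendar()));
    }
}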

But that is not how the type was designed.

It kind of puts posts like Long Live the Emperor and Out of [implied] range in perspective, as software engineers grapple with how to represent date values that are either undefined or under-defined to such a large degree that in some cases they are little more than approximations.

(It also gives some perspective on the potential reasons behind the value of System.Data.SqlTypes.SqlDateTime.MinValue, possibly the single greatest expression of Microsoft being a US-centric software company, spelled out in clear and unambiguous technical terms!)
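
For anyone curious, the value in question is easy to see (my snippet, not from the original post): SqlDateTime.MinValue comes out as 1 January 1753, a floor usually explained by the British Empire, American colonies included, having switched to the Gregorian calendar in 1752.

using System;
using System.Data.SqlTypes;

class SqlMinValue
{
    static void Main()
    {
        // SQL Server's legacy datetime range starts in 1753;
        // .NET's DateTime starts at year 1.
        Console.WriteLine(SqlDateTime.MinValue.Value);   // 1/1/1753 12:00:00 AM
        Console.WriteLine(DateTime.MinValue);            // 1/1/0001 12:00:00 AM
    }
}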

In the end it is a further reflection of the problems I hinted at in this post and others, though perhaps it also hints toward the actual shape of the solution.

A big part of me would really rather see the kind of engineering perspective that will (for lack of a better term) implement the crap out of a solution to a technical problem, given all of the parameters. Because the only real flaw in the design is that the previous efforts to implement the crap out of things were done without having all of the facts in hand.

Think of how much better they could do if they really attempted to take these additional data points into account!

Is there a .NET Architect or Distinguished Engineer in the house? :-)

 

This post brought to you by ᢁ (U+1881, a.k.a. MONGOLIAN LETTER ALI GALI VISARGA ONE)




referenced by

2010/09/14 Using a culture's format, without using that culture to format?

2008/12/16 Grody to the Max[Date]!
