There is a lot about algorithmic patents that seems wrong, and I'm not going to sit here and debate the obviousness of the work in patent application 10/631611 by Bryan Kendall Beatty via Microsoft. I do, however, think there is an interesting lesson here in how innovation helps us defeat patent lockout.
I'm making a series of assumptions here about how the patent rulings will actually come down, but we'll see how this goes. There are really three algorithms and two patents at play in this story (assuming, for the moment, that none of the patents cited in Method of identifying geographical location using hierarchical grid address (S. Lee Hancock, Peter H. Dana, Scott D. Morrison, 2000; referred to below as HDM) are brought into play).
At issue is encoding geographic coordinates in a form that is easier for people to use.
The initial patent (HDM) was granted in 2001 and basically deals with using a grid as an initial identifier and then encoding the addressing information in successive letters and numbers. It is, more or less, a way to give computers and users a highly structured address for locating places, identifying duplicate information, and processing data in a consistent way.
In 2003, Microsoft filed the Beatty patent (Compact Text Encoding of Latitude/Longitude Coordinates), which was designed to provide coordinates that could be easily entered and checked by a human or a computer, and which could yield much shorter URLs than the current method of spelling out latitude and longitude. There's nothing earth-shattering in this patent application, but it is a useful way to shorten the current URL scheme. As such, it might have been widely adopted, except that Microsoft has a well-earned reputation for being stingy with its "intellectual property". To my knowledge, there is no widespread use of this mechanism, which basically converts the latitude and longitude to strings and then concatenates them, producing a string that is longer or shorter depending on the number of significant digits.
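To make "convert to strings and concatenate" concrete: here is a rough illustration of that general idea only, not the patented encoding. The offsets, field widths, and function names are my own assumptions; the real claims are in the application itself.

```python
def encode_latlon(lat, lon, digits=5):
    """Illustrative sketch only, not the Beatty encoding.

    Offset both coordinates to be non-negative, scale them to integers,
    zero-pad to a fixed width, and concatenate. More significant digits
    means a longer string."""
    width = 3 + digits  # up to 3 integer digits plus the fraction
    lat_s = str(round((lat + 90) * 10**digits)).zfill(width)
    lon_s = str(round((lon + 180) * 10**digits)).zfill(width)
    return lat_s + lon_s

def decode_latlon(code, digits=5):
    """Undo the sketch encoding above."""
    width = 3 + digits
    lat = int(code[:width]) / 10**digits - 90
    lon = int(code[width:]) / 10**digits - 180
    return lat, lon
```

Even this naive version round-trips cleanly and is shorter than "lat=47.6062&lon=-122.3321" in a URL, which is really all the application is after.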
Fast forward to February 2008, when Gustavo Niemeyer creates geohash.org, a web site and a method (in the public domain, no less!) for creating arbitrary-precision hashes of geographic locations. The algorithm is described in the Geohash Wikipedia entry, and basically involves interleaving bits between the latitude and longitude and encoding the interleaved bits as a base-32 string. That string then becomes the description of the location.
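A sketch of the algorithm as described there (my own straightforward implementation of the published encoding, not Niemeyer's code): each bit is one step of a binary search, longitude and latitude taking turns, and every group of five bits becomes one base-32 character.

```python
# Geohash's base-32 alphabet (note: no a, i, l, or o).
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_encode(lat, lon, precision=11):
    lat_range = [-90.0, 90.0]
    lon_range = [-180.0, 180.0]
    bits = []
    is_lon = True  # a geohash starts with a longitude bit
    while len(bits) < precision * 5:
        rng = lon_range if is_lon else lat_range
        val = lon if is_lon else lat
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            bits.append(1)
            rng[0] = mid  # keep the upper half of the interval
        else:
            bits.append(0)
            rng[1] = mid  # keep the lower half of the interval
        is_lon = not is_lon
    # Pack each group of 5 bits into one base-32 character.
    return "".join(
        BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, len(bits), 5)
    )
```

Running `geohash_encode(57.64911, 10.40744)` reproduces `"u4pruydqqvj"`, the worked example from the Wikipedia entry, and truncating the string just reduces the precision.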
This algorithm is explicitly not patented (and should now be un-patentable), and unless it is found to infringe either the HDM patent above or the applied-for Microsoft patent (both unlikely, since the only claims it could conflict with are the over-broad ones about creating a shortened mechanism for geographic coordinates), we will all be able to use it for free going forward.
Here's the interesting moral, though: if Microsoft had released their algorithm into the public domain, it might have seen some uptake, which would have made a new entrant's penetration of the "marketplace" difficult. This is good, though, because geohash is actually quite a bit better! As well as providing an arbitrary-precision mechanism for specifying geographic locations with short strings, its unique method of interleaving the bits of the latitude and longitude makes it possible to do some neat tricks, such as this method of performing Geographic Queries on Google App Engine. It isn't perfect (certain precisions can't be represented without moving from base-32 to a binary representation, and it doesn't handle proximity across the poles or the International Date Line), but it's certainly something that couldn't have been done without the work of Gustavo Niemeyer, which might not have happened at the right time if a commonly-used mechanism had already been in place.
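The trick those queries rely on is that geohashes of nearby points usually share a prefix, and strings sharing a prefix form a contiguous range under lexicographic order, exactly the kind of inequality range scan a datastore index supports. A minimal sketch of the idea (the place names and hash strings here are illustrative values I made up, not computed):

```python
from bisect import bisect_left

# Hypothetical store of (geohash, name) pairs, kept sorted by geohash.
points = sorted([
    ("u4pruydqqvj", "Raabjerg Mile"),
    ("u4pruydqqvm", "nearby dune"),
    ("u4pruybnnnn", "same area"),
    ("gbsuv7ztqzp", "far away"),
])

def query_prefix(points, prefix):
    """Return all points whose geohash starts with `prefix`.

    Because geohashes sort lexicographically, a prefix query is a single
    contiguous range scan over [prefix, prefix + '\xff')."""
    keys = [k for k, _ in points]
    lo = bisect_left(keys, prefix)
    hi = bisect_left(keys, prefix + "\xff")
    return points[lo:hi]
```

Asking for everything under the prefix `"u4pru"` pulls back the three clustered points in one scan and skips the distant one, with no great-circle math at query time. (The caveats above still apply: two points straddling a cell boundary can be close yet share no prefix.)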