Data licenses for the geoweb

Andrew summarises with clarity the current state of licensing for “open” geodata on his blog. This is going to be an emerging theme over the next year, as more data becomes available and awareness grows of the immature state of data licensing compared to software licensing.

When I touched upon the subject over the summer it was within the context of DRM, a scary umbrella term that has too much baggage, but one which at least in an abstract sense describes the problem.

I have no doubt that the open geodata community will go through the difficult and potentially painful process the software industry experienced to reach its current broad range of licenses. This is a necessary step because, as Andrew points out, for many potential data publishers there is no standard license that comes close enough to matching their needs.

As this process takes place, a good resource is Kevin Pomfret’s excellent blog on spatial law, where he is tracking licensing developments.

Written and submitted from the Google Offices, London (51.495N, 0.146W)

OSM Business models

Interesting post by Stefan at the United Maps blog, which continues the “now that OpenStreetMap has matured and is taken seriously, what next?” meme.

In Amsterdam I had a few chats with people about how OSM contributions might find their way into commercial products, and whether we would see different distributions of OSM, or even a forking of the project, as different organisations have varying perspectives on what they see as important.

Without question the current licensing of OSM does, as Stefan points out, restrict its commercial use. In my personal opinion, a less viral license will need to be established at some point for many commercial organisations to use OSM data.

Over time we will see other commercial distributions of OSM data, and other services set up that compete with Cloudmade, which will be another positive step in moving open geodata forward.

This will be because they will no doubt have different perspectives and may suggest changes to the project, and to the licensing of its data, that take the OpenStreetMap project in different directions.

This may well be a painful process, just look at the history of other large open source projects, but it may be a necessary step for OSM to, as SteveC quoting Geoffrey Moore says, “Cross the Chasm” into mainstream adoption.

Written and submitted from the Google Office, London

OpenStreetMap all grown up and serious..

At last weekend’s State of the Map (SOTM) conference it was clear that the OpenStreetMap project is growing up and trying to position itself as a real alternative to commercial geodata suppliers, and not just a fun project for people who love maps and making them.

Perhaps it is the experience of Cloudmade, or of the numerous iPhone application developers using OpenStreetMap, that has brought the necessary focus to the boring stuff of data quality, consistency and currency.

To actually use OpenStreetMap in many applications, the data attributes need to improve; as Steve Coast himself noted, even where there is near complete coverage of streets, such as in London for example, many of the streets are not attributed with street names. Given a focus on fixing this particular aspect, such problems are relatively easy to solve, but the key point is that the project leadership now recognises that a guiding hand is needed to help the community complete the task.
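To make the attribution gap concrete, here is a minimal sketch, my own illustration rather than anything from the project, that counts how many highway ways in a small OSM XML extract are missing a name tag. The extract file name is hypothetical.

```python
# Count highway ways in an OSM XML extract that lack a "name" tag.
# Illustrative sketch only; the extract file name below is hypothetical.
import xml.etree.ElementTree as ET

def unnamed_street_counts(osm_file):
    """Return (unnamed, total) counts for ways tagged as highways."""
    tree = ET.parse(osm_file)
    unnamed = total = 0
    for way in tree.getroot().iter("way"):
        # Collect the key/value tags attached to this way
        tags = {t.get("k"): t.get("v") for t in way.iter("tag")}
        if "highway" not in tags:
            continue
        total += 1
        if "name" not in tags:
            unnamed += 1
    return unnamed, total

if __name__ == "__main__":
    unnamed, total = unnamed_street_counts("london-extract.osm")
    print(f"{unnamed} of {total} highway ways have no street name")
```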

In terms of spatial accuracy, Muki Haklay has made a specialism of assessing OSM data quality, and his latest results presented at SOTM suggest that, using the UK as an example, OSM data is better than the equivalent business geographics product produced by the OS, and in some cases comparable to OS MasterMap ITN data, a product that costs over £100,000 per year to license.
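For those curious what such an assessment involves in practice, a common approach in this literature, and I believe the general spirit of Haklay’s comparisons, is a buffer-overlap measure: what fraction of the OSM road length falls within some tolerance of the reference network. The sketch below is my own illustration, not his code; the geometries and the 20 m tolerance are hypothetical, in projected metre coordinates, using the shapely library.

```python
# Buffer-overlap sketch of positional accuracy: the fraction of OSM
# road length lying within a tolerance buffer of a reference network.
# Illustrative only; geometries and tolerance below are hypothetical.
from shapely.geometry import LineString
from shapely.ops import unary_union

def within_buffer_fraction(osm_lines, reference_lines, tolerance_m=20.0):
    """Fraction of total OSM line length within tolerance_m of the reference."""
    # Merge the buffered reference lines into one polygon
    reference_buffer = unary_union(
        [line.buffer(tolerance_m) for line in reference_lines]
    )
    total = sum(line.length for line in osm_lines)
    inside = sum(line.intersection(reference_buffer).length for line in osm_lines)
    return inside / total if total else 0.0

# Toy example: an OSM street drifting up to 5 m from the reference line.
osm = [LineString([(0, 0), (100, 5)])]
reference = [LineString([(0, 2), (100, 2)])]
print(within_buffer_fraction(osm, reference))  # ~1.0 at a 20 m tolerance
```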

Alongside the increased awareness of the importance of data quality, the other clear indication that OpenStreetMap is becoming more businesslike was the dedicated business track day, and the long-needed work to produce a new “fit for purpose” license for OpenStreetMap data in the form of the new ODbL.

Some may not like aspects of the new license (myself included), but the awareness of the problem and the willingness to address it show that the project has reached a real level of maturity. The licensing of community-sourced geodata is still a novelty; we now have the mirror of a GPL-like license for geodata, and other licenses I’m sure will follow.

If there are still people out there who believe that community-generated geodata is just a joke, it’s time to wake up!

OpenStreetMap, Google’s MapMaker and Tele Atlas with its Map Share programme all demonstrate, in different ways, that spatial data capture from the bottom up is a valid alternative to traditional mapping agencies and data providers, and in many parts of the world is the only practical solution.

Congratulations to the local organising team for putting together another great conference!

Written and submitted from home, using my home 802.11 network

StateoftheMap 2009 Call for Papers

The call for papers for what is always one of the most interesting conferences has just been announced. StateoftheMap 2009 is the conference to discuss all things to do with the groundbreaking OpenStreetMap project, and this year it will be held in Amsterdam in July.

This is much more than a gathering of people who like to ride bikes with GPS units taped to their handlebars, however; the conference is an excellent forum for discussions of new types of cartography, data access policy, and legal issues around open source data.

Indeed this year I’m sure one of the hottest topics will be the licensing of data, not something which appeals to everybody, but it’s an indication that the project has reached a level of maturity at which the issue needs to be addressed.

Written and submitted from the 11:45 London-Cardiff Train, near Bristol.

UK government starts to get open source

As the Guardian Technology blog notes, the UK government is once again trying to push government departments into looking at open source software solutions, at least as an alternative to the proprietary software we all know and love.

This is not the big stick approach which has been used in some other countries; here the policy, from a procurement perspective, is simply to make sure open source solutions are viewed on an equal footing, taking into account the total cost of ownership of new systems and recognising the many years of support and maintenance that will follow the initial purchase.

This I hope will not just be seen as the simplistic religious debate of Windows v Linux, Microsoft Office v Open Office, or MySQL v Oracle, because it is not actually in packaged software where the real benefits can be found.

The really big costs in government IT projects go into bespoke software development: customising or building additional functionality around off-the-shelf software like Oracle or SAP, or, from the GI perspective, ArcGIS.

This is where there is massive potential, for much of the code developed solves very similar problems for different departments and agencies across government. As things currently stand, none of this code is reused, and each department pays for similar code to be developed for it, often, I’m afraid to say, by the same vendors.

So for example, in the GI world, the data management systems developed to build and maintain the maps of the Ordnance Survey are not so different from those needed by the UK Hydrographic Office, and at a larger scale the tools used by the Land Registry to maintain your title deed plans are not so different from what is needed to build and maintain OS MasterMap.

If the code developed to meet these needs was made open source, the initial code base could be used and maintained by all government agencies, each benefiting from improvements developed by the others, and the taxpayer would never have to fund the same reinvention again.

There is one small hitch with this. Companies like Google are very open about their use and support of open source software tools, which form the backbone of their back office systems and which can be maintained and extended internally by skilled engineers.

Over the last 10 years most of the IT expertise has left government departments, meaning that very few actual software engineers or developers are left within government; they have all been outsourced. This reduces the potential benefit, since internal maintenance and development of code cannot happen in house. Another reason, perhaps, that government should think about re-skilling in IT?


Written and submitted from home, using my home 802.11 network.