
OpenStreetMap and the Rabbit Phone problem..

This week the innovative guys at Nestoria have launched an experimental version of their great real estate aggregation site using OpenStreetMap mapping as an alternative to the usual Google Map tiles. This is a great vote of confidence for OpenStreetMap, but it also highlights some of the problems of creating geodata from the cloud.
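For anyone wondering what swapping tile providers actually involves, here is a minimal, hypothetical sketch (not Nestoria's code, and the coordinates are approximate) of the standard slippy-map tile scheme that OSM renders to: a latitude, longitude and zoom level resolve to a numbered 256×256 tile on tile.openstreetmap.org, so a site can switch map suppliers largely by switching the URL template it requests tiles from.

```python
import math

def deg2num(lat_deg, lon_deg, zoom):
    """Convert WGS84 lat/lon to slippy-map tile indices (the standard OSM scheme)."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom
    xtile = int((lon_deg + 180.0) / 360.0 * n)
    ytile = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return xtile, ytile

# Example: the tile covering part of Chelsea, London at zoom 15 (coordinates approximate)
x, y = deg2num(51.487, -0.169, 15)
print(f"https://tile.openstreetmap.org/15/{x}/{y}.png")
```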

Where OSM data is comprehensive the map tiles are often more detailed than those Google supply, which are based on commercially available datasets. For example, this section of tiles covering my childhood neighbourhood in London is truly beautiful, in many ways a better map than the commercial websites offer.

Chelsea

However, the problem is that OSM coverage is not yet complete, and for this type of application incomplete coverage is a real issue. Look at these examples from Wokingham, west of London and part of the UK’s silicon valley.

The Google Maps tiles, using Tele Atlas data, are pretty much complete..

teleatlas

However the OSM version has a lot of missing detail..

OSM

Mapping data really does need to offer complete coverage to be truly useful. Some may remember that in the early days of mobile phones in the UK there was an alternative system based on local hotspots called Rabbit. This failed because you had to be within 100m or so of a hotspot, unlike the wider coverage of the early analogue mobile systems.

Mapping data needs to be just as comprehensive, with no coverage gaps. What is great about Nestoria’s early exposure of the data in a real application is that it highlights where more volunteer work needs to be done to complete the map.. If the OSM community achieves this, the critics of open source geodata will be silenced.
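As a rough illustration of how a coverage gap like Wokingham’s could be quantified, here is a hypothetical sketch using the Overpass API (a query service that post-dates this post; the bounding boxes below are approximate and purely illustrative). Counting the ways tagged as highways inside two similarly sized boxes gives a crude proxy for how much of the road network volunteers have mapped so far.

```python
import requests  # assumes the third-party 'requests' package is available

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

def count_highways(bbox):
    """Count OSM ways tagged 'highway' inside bbox = 'south,west,north,east'."""
    query = f'[out:json][timeout:25];way["highway"]({bbox});out count;'
    response = requests.post(OVERPASS_URL, data={"data": query})
    response.raise_for_status()
    # Overpass returns a single 'count' element whose tags hold the totals
    return int(response.json()["elements"][0]["tags"]["total"])

# Approximate bounding boxes for the two areas discussed above (illustrative only)
print("Wokingham:", count_highways("51.39,-0.87,51.43,-0.81"))
print("Chelsea:  ", count_highways("51.47,-0.19,51.50,-0.15"))
```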

Written and submitted from home, using my home 802.11 network.

21 replies on “OpenStreetMap and the Rabbit Phone problem..”

Jon,

I agree that OSM is fundamentally different from other types of geodata in the way it is collected (in a good way), but are the applications of OSM really that different?

Interesting parallels with Linux: is OSM without Wokingham, for example, the same as a Linux distribution without a driver for your webcam? 🙂

You know, I used to work in Wokingham. Sadly it was before the advent of OSM, or I’d have mapped it! But I did sketch a few lines for the proto-OSM site I was playing around with, Geowiki – there’s even a reference to those few lines in this Guardian article.

Anyway.

In the last three months, I think, OSM has reached the tipping point. It used to be the case that if you did a Google (natch) blog search for OpenStreetMap, you got the Further Adventures of Steve – people at conferences the world over saying “I went to this conference, Steve Coast did a talk, it sounds like an interesting idea”.

Recently it’s changed. Now, the blog postings are all about “hey, I’ve just discovered OpenStreetMap, this is cool and I’ve started to map my city”, or “I’m working on this new program which uses OSM data”. The exponential growth in users and data in the last five months is another illustration. OSM has moved from something you respect, to something you participate in.

So Wokingham will get its open map. All it needs now is some decent pubs and it might even become a half-decent place to visit. 😉

It is an interesting point regarding completeness in terms of the base detail of a map. The question here surely is: is it possible to have a single set of digital maps that meets everyone’s requirements? No is the clear answer! In this respect the base detail of a map will vary depending on the intended use and the overlay information. With this in mind, should Google and others, as well as producing a single web-based raster system, look at producing web-based raster systems for specific markets and their corresponding base-map detail requirements?

Whilst Google Maps is an excellent product, it does have limitations in this respect. I have found in relation to road data that for certain tasks there is too much data, and for other tasks there is too little detail. Of course, when it comes to insufficiently detailed map data, Google Maps and others are constrained by OS pricing here in the UK, which is a hot topic of debate.

In this respect, if Google Maps is to capture a share of the more lucrative markets in the GIS world, then it is going to have to be more flexible in terms of its web-based raster map offerings in relation to base map data. With regard to OpenStreetMap, the point I make is equally valid, as it is for Google or, for that matter, Virtual Earth.

I do agree that using the crowd to correct and improve data would be attractive, especially for a company with a user base the size of Google’s. By the way, I did do a survey of OSM users about whether they would contribute data to commercial providers, and I think it is fair to say the answer was ‘no’! The value of this survey may however be limited unless it can be proved that the respondents were a typical cross-section of society.

A serious question about the take-up of commercial contribution services, though, is who owns the data and how the contributions are used. We could well end up with all the mapping data owners and service providers thrusting update services on their users and then keeping the results to themselves (that’s the old model anyway). I believe that TomTom didn’t share its updates with Tele Atlas (who provided their data), who didn’t share their changes with the OS, so it might be pretty unattractive to the community.

If Google delivered the improved data back to their data provider in return for lower usage costs then that would be much more interesting for the community (but possibly not for G), but there’s the bind. Oh yes, here are my survey questions and the responses (follow the links): http://lists.openstreetmap.org/pipermail/talk/2007-November/019702.html

Rabbit wasn’t in competition with cell phones. It was in competition with payphones.

It failed because it couldn’t receive calls at all, ever, no matter how near an access point you were. It was, in short, a very stupid product.

I seem to remember that it wasn’t that long ago that Google Maps had North America and the UK floating in an unusually large, world-covering ocean. OSM may well still be at that stage of having patchy coverage, but it’s catching up *fast*…

I think the coverage of the OSM project is not the main problem. I agree with some of the comments posted earlier, focusing on the time factor.
In my opinion data quality is also a serious problem. I’m not familiar with all the details of the project, but there should be mechanisms to ensure the quality and correctness of the data. Metadata could help to make the data more useful and transparent to users. By the way, some metadata for Google Maps (and GE) would also be great.

But of course these initiatives are really great for the GIS sector.

*cough* Isle of Man *cough*
So Ed, pray tell, when will Google Maps have equal or better coverage than OSM of that particular bit of the Commonwealth? Before or after OSM has equal or better coverage than Google Maps has of Wokingham? 😉

