If you can’t link to it… does it exist?

“If a tree falls in a forest and no one is around to hear it, does it make a sound?”

So goes the well-known philosophical thought experiment. Rather than a discourse on observation and perception, however, I’d like to hijack the experiment for an argument I have been making on and off for the last couple of years, one that was well summarised in a tweet last month.

Does information published on the web that is not easily linkable actually exist?

Well of course, if I choose to publish my large spatial database of whatever using a Web Feature Service or some other application server, the data actually exists. But as far as users of the web are concerned, does it exist if they cannot find it using web search or, more importantly given the way the web works, cannot link to it?

Making the so-called Deep Web more discoverable remains challenging; efforts such as the Sitemaps protocol have had only limited impact.

I would argue that the geospatial community in particular needs to take a more fundamental look at how we make information accessible and linkable on the web. We need to start from a basic use case, common if you think about it, but seemingly radical in the GIS world:

I need to let people link to each record in my spatial database and to share that link.

This requires a much more granular approach to making spatial data available, something that nearly got started with OS MasterMap but which, for many reasons, was never fully implemented.

Rather than publishing a database of railway station locations in the Netherlands online and expecting a user to then query it for “Amsterdam Centraal Station”, publish the database giving each record a URI, so that Amsterdam Centraal Station, for example, becomes:

https://brt.basisregistraties.overheid.nl/top10nl/id/gebouw/102625209 

Now this is something I can paste into an email, a tweet, or even share on Facebook!
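It is also something a script can dereference. As a rough sketch (in Python), assuming the endpoint supports HTTP content negotiation and offers a machine-readable format such as JSON-LD (worth checking against the live service), a client could fetch the record like this:

# Minimal sketch: dereference a per-record URI and ask for a
# machine-readable representation via HTTP content negotiation.
# The Accept header value is an assumption; the endpoint may offer
# different formats (Turtle, RDF/XML, HTML, ...).
import urllib.request

uri = "https://brt.basisregistraties.overheid.nl/top10nl/id/gebouw/102625209"
request = urllib.request.Request(uri, headers={"Accept": "application/ld+json"})

with urllib.request.urlopen(request) as response:
    print(response.status, response.headers.get("Content-Type"))
    print(response.read()[:500])  # first few hundred bytes of the record

The point is not the particular format returned, it is that one stable URI serves both people and software.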

Kudos to the Dutch Kadaster for taking this approach and providing this example. Ordnance Survey, you could do the same?

This approach also results in such data becoming part of the “mainstream” web, indexable and searchable, but I would argue the key benefit is the “linkability”.

The Spatial Data on the Web Best Practices document, which of course I recommend taking a longer look at, provides many excellent practical pointers for taking this type of approach.

Maybe this is really just an issue of semantics: rather than publishing spatial data, should we be talking about sharing spatial data?

 

Looking for a Google Earth Server?

Then look no further than the latest release of GeoServer, which has fantastic new KML serving capabilities on a par with Google Earth Enterprise Server.

The key new capability here is streaming vector and raster data to Google Earth as the user zooms or pans, making sure that only the minimum amount of information is transferred, thereby giving the great performance you expect from Google Earth.

This release of GeoServer can also extrude KML geometry using a height attribute, allowing users to stream simple 3D model data to Google Earth.

GeoServer 1.7.1 serves 3D KML
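As a rough illustration of how you might pull KML out of such a setup, here is a minimal Python sketch using GeoServer’s KML reflector; the host, port and layer name are placeholders, and the exact reflector path can vary between versions, so check the documentation for your release.

# Illustrative sketch only: fetch KML for a layer from a local GeoServer
# via the KML reflector. Host, port and layer name are placeholders;
# check your GeoServer version's documentation for the exact endpoint.
import urllib.parse
import urllib.request

base = "http://localhost:8080/geoserver/wms/kml"   # reflector endpoint (may vary)
params = {"layers": "topp:states"}                 # hypothetical layer name

url = base + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as response:
    kml = response.read()

# The result can be saved and opened directly in Google Earth, or referenced
# from a NetworkLink so that data streams as the user pans and zooms.
with open("states.kml", "wb") as f:
    f.write(kml)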

 

GeoServer continues to develop into a serious enterprise application, which no doubt is getting the attention of the guys in Redlands and is providing much-needed competition in the market. From a KML perspective, it is now possible for an organisation to self-publish almost any type and size of geospatial database using an open technology stack.

And it runs nicely on my Mac!!!

Written and submitted from the Google Office, London.

Where’s the cheese – OGC moving forward

St Louis

I’ve spent much of this week, along with some of the other guys from Google, at the Open Geospatial Consortium (OGC) Technical Committee meeting in St. Louis. KML is hopefully just a few weeks away from becoming an adopted standard, and the OGC as a community, I’m pleased to say, is increasingly taking an interest in geospatial technologies developed outside of its traditional membership.

So amongst the continued detailed work on the W*S standards we know so well, there was much debate about the potential of RESTful interfaces and the use of lightweight technologies such as GeoJSON and AtomPub as realistic alternatives for creating transactional web services beyond mash-ups.

It becomes really interesting when the new and the existing are combined. One of the slickest demos I saw used an extension to the existing SLD standard to control the server-side creation of KML data from a component WMS service, populating attribute data into KML ExtendedData tags.
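To make the shape of that concrete, here is a small Python sketch of an ordinary WMS GetMap request that asks for KML back and delegates styling to an external SLD document; the endpoint, layer and SLD URL are hypothetical, and whether attribute data ends up in ExtendedData depends on the server-side extension being demonstrated rather than on anything standard.

# Sketch of a WMS GetMap request asking for KML output, with styling
# delegated to an external SLD document. All names and URLs here are
# hypothetical; the ExtendedData behaviour depends on server support.
from urllib.parse import urlencode

wms_endpoint = "http://example.com/geoserver/wms"      # placeholder WMS
params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "demo:stations",                          # hypothetical layer
    "sld": "http://example.com/styles/stations.sld",    # external SLD document
    "bbox": "4.8,52.3,5.0,52.4",
    "srs": "EPSG:4326",
    "width": "512",
    "height": "512",
    "format": "application/vnd.google-earth.kml+xml",   # ask for KML back
}

print(wms_endpoint + "?" + urlencode(params))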

There is also growing recognition that, reflecting these new technologies, new approaches to creating standards may be needed; after all, the pace at which technologies are introduced and adopted by the mass market is much faster than the traditional standards process can keep up with.

Perhaps a new approach is needed where standards are defined at the same time as new applications and functionality are developed, so that the standards process is driven by individuals and organisations implementing new functionality, which is standardised once demonstrated to be both stable and useful!

This new approach, which focuses more on user needs, was nicely summarised in a presentation from NASA with a picture of a cheese stall: “I’d like some cheese… but I’d rather not know how it’s made”.

Google’s release of the libkml open source library should be seen in this context, as it allows developers to quickly get started creating well-formed KML files and to experiment by actually writing code. Want to write an FDO provider to read and write KML? Then libkml is a great starting point; likewise, if you want to write a new iPhoto geo-tagging plug-in, libkml deals with most of the basic requirements. In both cases any extensions or changes that might be needed to KML can be tested and proven in a practical sense before becoming standardised.
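For a feel of the boilerplate involved, here is a tiny sketch that writes one well-formed Placemark; it deliberately uses Python’s standard library rather than libkml’s own API, so treat it purely as an illustration of the sort of output libkml makes easy to produce and parse.

# Sketch: write a minimal, well-formed KML Placemark. libkml's own API
# (C++ with language bindings) handles this plus parsing and the more
# exotic elements; this stdlib version just shows the output shape.
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", KML_NS)

kml = ET.Element(f"{{{KML_NS}}}kml")
document = ET.SubElement(kml, f"{{{KML_NS}}}Document")
placemark = ET.SubElement(document, f"{{{KML_NS}}}Placemark")
ET.SubElement(placemark, f"{{{KML_NS}}}name").text = "Amsterdam Centraal Station"
point = ET.SubElement(placemark, f"{{{KML_NS}}}Point")
# approximate lon,lat,alt, purely for illustration
ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = "4.9003,52.3791,0"

ET.ElementTree(kml).write("placemark.kml", xml_declaration=True, encoding="UTF-8")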

I have been to perhaps half a dozen Technical Committee meetings over the last few years, and I leave St. Louis feeling more optimistic than ever before that the OGC can remain the positive influence on the industry it has been up to now. Change is needed, but that’s recognised.

Written and submitted from the Westin Hotel, St. Louis, using its broadband network.

Welcome back.. Microsoft rejoins the OGC

In another example of the overlap between neo- and paleogeography, Microsoft announced yesterday on their Virtual Earth blog that they have rejoined the Open Geospatial Consortium, the industry standards body for “professional” GIS. Along with the new support for KML in Virtual Earth, I would say the geoweb is beginning to develop quite nicely!

Written and submitted from the IRLOGI Conference 2007, Dublin, using the bitbuzz 802.11 network.

Neogeography.. it was just a dream..

Imagine waking up in the beautiful Portuguese city of Porto and finding out the past two years of your life were a dream… All that talk of GeoRSS, map mash-ups, KML, user-generated My Maps, the GeoWeb and Paris Hilton was all part of a dream.

Well, it felt a bit like that on the first day of the annual European Commission GI and GIS Workshop. Over 200 GIS users from public sector organisations, and a few private sector ones, are meeting together to discuss the impact of the INSPIRE directive now that it has been passed by the European Parliament.

ECGIS workshop

During this first day, the Web 2.0 buzzwords of neogeography were notable by their absence.

Now I am actually less disappointed than I might have been; let me explain why…

INSPIRE is, contrary to all of the fuss drummed up by some in the UK last year, quite tightly focused on the supply of harmonised environmental geospatial data to the institutions of the European Commission by public sector organisations in the member states. There is no “public” interface here, and citizens are not seen as major customers for INSPIRE services.

As such, you can think of this as a complex back-office system for European government, as much an enterprise GIS for Brussels as a Spatial Data Infrastructure. So key to success will be a clear definition of requirements and a well-specified system design.

Now here is the rub: despite the fact that much of the INSPIRE directive is not expected to be implemented until at least 2010, it is being designed now and must use well-specified and recognised standards, things like the ISO 19100 series and the standards developed by the Open Geospatial Consortium.

It’s not difficult to appreciate the problem: REST-based interfaces, KML, GeoJSON, GeoRSS, etc. might actually be the best technologies to use today and would be the tools of choice for many; however, like many other government IT projects, INSPIRE needs to follow the low-risk route of SOAP, WSDL, WMS, WFS, etc.

So we may find that organisations will use OGC-style interfaces to communicate with other public sector organisations and the Commission, while using lighter-weight technologies to publish information to their citizens. This is no bad thing!

I am, however, disappointed by the continued focus on metadata-driven catalogue services as the primary mechanism for finding geospatial data. I don’t believe this will work, as nobody likes creating metadata, and catalogue services are unproven.

INSPIRE needs GeoSearch !!

Written and Submitted from the Le Meridien Hotel, Porto using the in-room broadband network.