As a follow-up to the Google Developer Day event last week, Silicon.com has a great article on the adoption of mash-up culture at the BBC and their brilliant Backstage project.
For other content-owning organisations, BBC Backstage sets a great example, highlighting the way that forward-thinking licensing of information can really aid the innovation process and help develop truly useful new content-driven sites.
During my time at the OS, we made no secret of the fact that Backstage was an inspiration to the OpenSpace project.
Just imagine the mashups that could be created if other content owners in the public sector were as open. Indeed, in just the last week we have begun to see a more innovative approach to publishing content, with reports of Brent's use of Google Maps and the Maritime and Coastguard Agency's decision to publish some of its charts in KML.
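For context, publishing a chart in KML usually amounts to wrapping a georeferenced image in a GroundOverlay element. Below is a minimal Python sketch of what that might look like; the chart name, image URL and bounding box are illustrative assumptions, not the MCA's actual data.

```python
# Minimal sketch: wrap a chart image in a KML GroundOverlay.
# The chart name, image URL and bounding box below are illustrative
# placeholders, not the Maritime and Coastguard Agency's actual data.

KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{name}</name>
    <Icon>
      <href>{image_url}</href>
    </Icon>
    <LatLonBox>
      <north>{north}</north>
      <south>{south}</south>
      <east>{east}</east>
      <west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>
"""

def chart_to_kml(name, image_url, north, south, east, west):
    """Return a KML document that drapes the chart image over its bounding box."""
    return KML_TEMPLATE.format(name=name, image_url=image_url,
                               north=north, south=south, east=east, west=west)

if __name__ == "__main__":
    # Hypothetical example: a chart covering part of the Solent.
    print(chart_to_kml("Example coastal chart",
                       "http://example.com/charts/solent.png",
                       north=50.85, south=50.70, east=-1.10, west=-1.60))
```

Any KML viewer such as Google Earth can then drape the image over the terrain, which is all "publishing a chart in KML" really requires.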
A point I made during last week's London keynote (about 30 mins in, but it's well worth listening to the great Chris DiBona) is that there are still huge amounts of geographic information to be liberated from existing corporate silos, in addition to the user-generated content we are beginning to provide the tools to create.
When you think about it, this is less a technology issue than it has ever been: this is now really about information policy.
Written and Submitted from the Google Office, London.
4 replies on “Google Developer Day – BBC leads the mash-ups revolution..”
I am still concerned about much of the user-generated geographic content, because it seldom contains useful metadata (a problem all around, I suppose) and because it's rarely checked, verified or otherwise quality controlled. Several mashups I've seen create pretty pictures, but I'm not sure the substance really matches.
I do concur that there is a lot more good to be had from a large user-driven "application."
Keep up the good posts : )
Casey
As I stated to a friend who also produced a world database: "Everyone wants something for nothing, and then there's those who want to make something from doing nothing. Which, you know, kind of disturbs me." (In relation to being asked on occasion by users of my site how to pirate a cache from any one of the various visualization apps available.)
As for why metadata really needs to be available to the masses, other than to pirate a cache: I don't think it does in a map-centric visualization environment (where the metadata is already being applied!). I'm sure there are people out there who disagree with me, though, based on Casey's concerns about accuracy.
That information could, however, be listed in a manner appropriate to assure quality, if that's the end goal of the publisher's mash-up in the visualization app it's being made available in.
I guess I move beyond the pretty pictures to see if a map or spatial analysis is really saying what people think it's saying. James Fee just posted about a similar topic on his blog. http://www.spatiallyadjusted.com/2007/06/05/geocommons-the-future-of-mapping-or-geo-splog
It would be nice if we could just assume people had good intentions and did good work, but that's not a safe assumption. I do see your point about exposing metadata for the masses and actually applying it, but isn't the power of Google in the masses of people developing?
Perhaps I'm just jaded from working for the government and not being able to leverage much of this new web mapping technology : )
Both,
This is an interesting debate; I think we have some way to go before an appropriate level of metadata is agreed upon for the Geoweb. What we have now is too little, while what is proposed via things like ISO 19115 is maybe too much.
ed