I’m currently going through the preliminary stages of looking to move house, which means I seem to be coming across an awful lot of pushpin maps and Google mashups on a variety of property web sites. Most of these sites are very good at aggregating data into a simple map view displaying a number of data sources. Typically you choose an area to search and a bunch of parameters, and all matching properties are displayed on a map. Some sites go a little further and add in demographic data, showing what the area you are looking at is like.

London house prices aside, none of these sites seems to do what I need by letting you search for areas in general. What I’d like is to ask:

 find me all the areas within an hour’s drive (at rush hour) of my office and 30 minutes’ public transport from my wife’s work, with a demographic of people I’d like to live near (I’m not so bothered if they want to live near me!).

This is actually a pretty difficult question to answer, and certainly not one that can be quickly solved by a combination of pushpin maps and a tiled street map. To answer it you need both data (a multimodal transport network, demographics, historic traffic density) and some geoprocessing capability to generate service areas and combine them with the demographics.

ArcGIS Server, combined with the right data, would be the perfect answer to this. Geoprocessing functionality based either on road networks or on travel-time grids can be used to generate areas that meet the criteria.

You can’t pre-generate the data as you can in the tiled map model, because everyone’s query is different; perhaps I’d rather calculate my travel area based on CO2 emissions, or to different locations.
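To make the idea concrete, here is a minimal sketch of how two travel-time grids might be combined into a service area. All the grids, thresholds and the function name are made up for illustration; a real implementation would work against proper network or raster data.

```python
# Sketch: combine two hypothetical travel-time grids into a "live here" mask.
# Each grid cell holds the travel time in minutes to a destination; a cell
# qualifies if it is within 60 minutes' drive of the office AND 30 minutes'
# public transport of the other workplace.

def service_area(drive_times, transit_times, max_drive=60, max_transit=30):
    """Return a boolean grid marking cells that satisfy both constraints."""
    return [
        [d <= max_drive and t <= max_transit
         for d, t in zip(drive_row, transit_row)]
        for drive_row, transit_row in zip(drive_times, transit_times)
    ]

# Toy 2x3 grids of travel times (minutes), entirely invented:
drive = [[45, 70, 55],
         [30, 90, 65]]
transit = [[25, 20, 40],
           [35, 10, 15]]

mask = service_area(drive, transit)
# Only cells meeting both criteria come out True.
```

The same cell-by-cell test extends naturally to extra criteria (demographics, CO2), which is exactly why it can’t be pre-computed: the thresholds and destinations are different for every user.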

Solutions to parts of this problem are available on the web, but not combined together. Transport Direct does a great job of giving driving and public transport travel times, though based on single journeys rather than a service area (which is harder to calculate). Chris Lightfoot did some fantastic work turning this data into travel time maps.

There don’t seem to be many sites that actually do on-demand processing of data to generate custom service areas. One example that does is the Business Analyst Online service, which, among other things, will create drive times from a business location and also generate a demographic or business report. It’s a good example of what can be done with geoprocessing capability in the background. The ArcGIS Server development blog also has a nice demo of building geoprocessing into a web application.

None of the above sites gets me any closer to finding the perfect location to live in, though. I definitely need more than just pushpins.

I happened to be working in Canary Wharf on Tuesday, so on the way home I made a last-minute decision to detour to the mashup* event, which was themed ‘On Location’. I’d made a mental note of it a while ago but then completely forgotten; anyway, I’m glad I went as it was pretty interesting. There was a good talk from TeleAtlas about what they are up to, particularly in the Internet mapping and wireless mapping space. I was most interested in the information they presented about the commercial penetration of the different map sites around the world. Google was the top one, but the interesting bit was that in most places Google was first and a local provider (such as Multimap in the UK) was second. I’m not sure why this would be; maybe a language thing, or historical, or perhaps it’s a cartography and map presentation issue. I know my wife always uses Multimap when looking for addresses, as she says they look like proper maps! They also mentioned that Nokia is planning to launch a new global (tile-based) mapping platform and API for use with their devices.

The talk was followed by a panel discussion, which focused mostly on the privacy and security issues surrounding things like Google Street View, and also on locational information kept, for example, by your phone service provider. Do you care that your phone company, your employer or your spouse can locate you at any time? Or do the benefits of you and trusted entities knowing your location outweigh the issues?

One thing the talks, and the interesting applications many of the presenters and demonstrators were discussing, made me think about is the extent to which the ubiquity of high-quality reference data in web mapping sites and APIs leads to a downward spiral in the perceived value of data. I guess this is a question that TeleAtlas and the other data providers probably worry about too.

Most of the demonstrators were showing property-search web sites, which seems to be where the money/talk is at the moment, but being a geek at heart I was more interested in Widr and their location API. It’s a similar concept to Navizon: location by wireless networks. It’s pretty cool, and the model of revenue sharing with the people who contribute the location data is nice. I guess the key is how much coverage they get, which will partly depend on how much data contributors collect. The problem I found originally with Navizon, when it was just a mobile application, is that to collect data you need a GPS, but if you have a GPS you (mostly) don’t need Navizon. Having a web site API opens it up a bit, as your desk- or laptop-bound web site users may not have a GPS, though how important location is to laptop users versus mobile users I’m not so sure. It’s still a cool idea though.

***Update*** 

I should probably point out at this point that the tool currently linked from here does not work with the current XML format of OpenStreetMap. I will post a new script soon that does, although you can probably tell from how long ago the format changed that I have had a few other real-life things get in the way of blogging recently.

***End Update***

A while back I started getting interested in the OpenStreetMap project, initially just filling in some detail around the area that I happen to live in. Access to free detailed data has always been a bit of an issue in the UK, and this looks like a great project that seems to be getting a lot of traction. One thing to note is that the project has been approached very differently from a typical GIS data project.

Ease of data capture has been the main goal, and rightly so, as capturing the data in the first place is obviously the biggest challenge. However, this does mean that if you approach the data expecting something similar to, say, Navteq or O.S. data, you are in for a bit of a shock. There is a documented XML structure, but it lets you put pretty much whatever content you want into the data. This leads to a wonderfully rich set of attribution, but it also poses some challenges to using the data from a software point of view. There are some guidelines on attribute feature coding here, and there have been lots of heated debates on the discussion list about whether a new data model should be adopted, whether a topological model is better, or if a spatial index is needed on ways and segments. If you are used to commercial data, the main thing you will notice is that there is very little consistent metadata at the feature level. That said, I’m sure the data has plenty of uses above and beyond cartographic display.

Most of the current tools are aimed at data capture tasks, or at processing the data into nicely rendered map tiles for use in a Google-style slippy map. There is now a pretty large set of data to work with, and it would be great for people to start putting the data to other interesting uses. To help with that process I’ve just released a copy of a Python script I wrote last year (as an exercise in learning Python) for loading the data into a file geodatabase for use in ArcGIS. I was planning on rewriting it, but never seem to have the time or enthusiasm, so I’ve released it as is.

The OpenStreetMap data can be downloaded as a planet dump file containing the current set of data. The data consists of nodes (both points and vertices), segments (a linear segment between two vertices) and ways (a linear or possibly area feature made up of an ordered list of segments, plus attributes). The loader constructs the ways from the nodes and segments and loads them into polyline or polygon layers. It also loads those nodes that have attributes as a point data set. Attributes in OpenStreetMap are based on key–value pairs such as k=”highway” v=”motorway”; you can think of k as the attribute name and v as the attribute value. The loader creates attribute fields for all the “standard” key values and loads them into a separate table that can be joined easily to the nodes and ways based on id. The non-standard key values are loaded into another table with node_id, name and value columns. You can use a relate in ArcGIS to join these to the ways or nodes.
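The way-building step above can be sketched in a few lines of Python. This is not the released loader’s actual code, just an illustration of the data model, with made-up node and segment ids and coordinates:

```python
# Sketch of assembling a (2007-era) OSM way from nodes and segments.
# Nodes map id -> (x, y); segments map id -> (from_node, to_node); a way
# is an ordered list of segment ids. All data here is invented.

nodes = {1: (0.0, 0.0), 2: (1.0, 0.0), 3: (1.0, 1.0), 4: (0.0, 1.0)}
segments = {10: (1, 2), 11: (2, 3), 12: (3, 4), 13: (4, 1)}

def build_way(segment_ids):
    """Chain segments into a coordinate list; a closed ring is an area."""
    # Start at the from-node of the first segment...
    coords = [nodes[segments[segment_ids[0]][0]]]
    # ...then append each segment's to-node in order.
    for seg_id in segment_ids:
        coords.append(nodes[segments[seg_id][1]])
    is_polygon = coords[0] == coords[-1]
    return coords, is_polygon

coords, is_polygon = build_way([10, 11, 12, 13])
# This way chains back to its start node, so it would load as a polygon.
```

A real loader also has to cope with segments listed out of order or reversed, which is part of what makes the free-form data model awkward for GIS software.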

The tool runs as a geoprocessing tool from the toolbox, and was written so that its memory use does not grow as the data volume increases, so it should carry on working even when all the TIGER data is imported. The OpenStreetMap data is usually downloaded as a compressed XML file. This tool does not use an XML parser, as in the past the data has not always had correct UTF-8 encoding; it also means you don’t need to decompress the data beforehand, which should save a bit of space. If you are interested in different tags you can just edit the script to use a different list of standard or ignored tags.
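A minimal sketch of that streaming approach, reading a gzipped dump line by line with a regex rather than an XML parser: the tag lists and the in-memory sample are made up, and the released script’s actual code will differ.

```python
# Sketch: pull k/v tags out of a gzipped OSM XML dump without an XML
# parser, splitting "standard" keys from the rest as the loader does.
import gzip
import io
import re

TAG_RE = re.compile(rb'<tag k="([^"]*)" v="([^"]*)"')
STANDARD_KEYS = {b"highway", b"name"}   # become attribute fields
IGNORED_KEYS = {b"created_by"}          # dropped entirely

def split_tags(stream):
    standard, other = {}, {}
    for line in stream:                 # streamed, never fully in memory
        for key, value in TAG_RE.findall(line):
            if key in IGNORED_KEYS:
                continue
            (standard if key in STANDARD_KEYS else other)[key] = value
    return standard, other

# A small gzipped sample in memory, standing in for planet.osm.gz:
sample = (b'<tag k="highway" v="motorway"/>\n'
          b'<tag k="created_by" v="JOSM"/>\n'
          b'<tag k="surface" v="paved"/>\n')
stream = gzip.GzipFile(fileobj=io.BytesIO(gzip.compress(sample)))
standard, other = split_tags(stream)
```

Working in bytes with a regex sidesteps the broken UTF-8 problem, and gzip streaming means the dump never needs decompressing to disk.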

If you’re interested in exploring the data, give the loader a go and see what you can do with OpenStreetMap. You can download the loader from ArcScripts.

I was lucky enough recently to see a great demo of the giant TouchTable, and I had forgotten, since seeing it three years ago at the user conference, what a very cool piece of kit it is. With Microsoft recently announcing their forthcoming Surface platform, I got to wondering what would be involved if I wanted to build one myself. The thing that makes these devices so compelling to use is the multitouch and collaborative aspect of the interface. Pretty much all the devices available to consumers, such as tablet PCs, or to businesses, such as smart boards and kiosk PCs, are single-touch only; they just emulate normal mouse interaction with a single cursor. The TouchTable, Surface PC and Perceptive Pixel devices all allow you to interact with multiple touches at the same time, either with both hands or with several people at once. This bit is the most challenging, both from a hardware sensing and a software processing perspective. I’m not too sure how the TouchTable and Perceptive Pixel devices work, but the Surface PC works using IR sensing, as described here.

Anyway, after a bit of searching I found this site dedicated to building your own touch table using the same principles. There is a DIY guide to putting together the hardware, as well as links to an open-source effort to build the software to interact with it.

All I need to do now is think of a good reason for making one!

The agenda for GIS Tech is up on the website. The green tracks are the more technically oriented sessions, mostly given by people from the consultancy group. There are some good AGS talks, particularly on the second day; I’ve had a preview of some of the demos that Matt and Dan are doing in their Ajax/Web 2.0 talk, and that should be worth seeing. On the first day I’ll probably stick with the green sessions too, as they all look pretty interesting.

Registration is now open for GIS Tech 2007. This year we are going to be in Nottingham, although the format is going to be pretty similar: starting with a plenary session, followed by various streams of one-hour technical sessions covering pretty much all the technology, so you will be able to see all the 9.2 stuff. The details of the event and how to register can be found here.

It’s a great way to get more detail about products and technology that you may be using or planning to use, as there is a wide range of presentations; I think I’ve been roped in to doing one. It’s also a good opportunity to meet many of the technical staff at ESRI (UK).

You can now download the free OSTN02 transformation tools for ArcPad 7 and ArcGIS 9.1. To download them you need to log in to http://www.myesriuk.com/esriuk/members/downloads.asp. They are listed as OSTN02.

Obviously you need to be registered with myesriuk.com to do this. The download adds a new transformation to ArcMap that you can use with OSGB36-based data. It also includes an installer to support OSTN02 in ArcPad 7.