May 2006


If you are not on the beta program and haven't been to a user conference or to GIS tech, you might be wondering what's coming in ArcGIS 9.2. There's not much to find out on the web about it as it's still in beta, but there is a well-hidden page on the ESRI site that lists a lot of the new things coming in 9.2.

There's lots of new stuff, but if I could only choose three things from 9.2, it would be these:

Geodatabase History, which lets you store the history of all changes to your data and visualise and query how it has changed over time; also great for auditing purposes in enterprise systems.
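To illustrate the idea behind history queries (this is a conceptual sketch, not the actual 9.2 API or table layout): a history store is essentially a set of rows stamped with valid-from/valid-to dates, and an "as of" query filters on those stamps. All the data and names below are invented:

```python
from datetime import datetime

# Hypothetical history rows: (parcel_id, owner, valid_from, valid_to).
# A currently-live row has valid_to = None. This mimics the from/to
# time stamps a geodatabase archive keeps for every edited row.
history = [
    ("P1", "Smith", datetime(2005, 1, 1), datetime(2005, 9, 30)),
    ("P1", "Jones", datetime(2005, 9, 30), None),
    ("P2", "Brown", datetime(2005, 3, 15), None),
]

def as_of(rows, moment):
    """Return the rows that were current at the given moment."""
    return [
        (pid, owner)
        for pid, owner, start, end in rows
        if start <= moment and (end is None or moment < end)
    ]

# Who owned each parcel in mid-2005?
print(as_of(history, datetime(2005, 6, 1)))  # [('P1', 'Smith'), ('P2', 'Brown')]
```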

ArcGIS Server support for geoprocessing models. You can build models and tools as you can now, but then publish them to ArcGIS Server and allow people to access and run them via the web (there's a quick scripting sketch after the third item); great for centralising your data management and processing tasks.

The new ADF for ArcIMS and ArcGIS Server. The new developer framework gets all ajaxy (seamless pan and zoom, etc.) and works with ArcIMS servers as well as ArcGIS Server, WMS and so on; to top it all, it comes in both Java and .NET flavours.
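On the geoprocessing point: the same model that you'd publish to the server can be driven from a script. A minimal local sketch using the 9.2-style arcgisscripting geoprocessor; the toolbox path, alias ("mytools") and model name ("ClipAndBuffer") are hypothetical stand-ins:

```python
# Minimal sketch: run a model tool with the ArcGIS 9.2 geoprocessor.
# The toolbox path, alias and model name are invented examples of
# what you might then publish to ArcGIS Server for web access.
import arcgisscripting

gp = arcgisscripting.create()            # 9.2-style geoprocessor object
gp.AddToolbox(r"C:\models\MyTools.tbx")  # load the custom toolbox

# Custom tools are called as <ToolName>_<toolbox alias>.
gp.ClipAndBuffer_mytools(r"C:\data\roads.shp", r"C:\data\out.shp", "100 Meters")
```

Publishing that toolbox to ArcGIS Server is what then exposes the same tool to web clients, which is where the centralisation win comes from.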

There's lots of other stuff too, so it's worth reading the page and the other pages it links to.


For anyone (like me) who has installed IE7 Beta 2, you've probably noticed that ArcToolbox no longer works in ArcCatalog or ArcMap.
Anyhow, there is now a patch here.

Seems to do the trick. 

This is a really interesting post about the challenges faced in moving the O.S. forwards, in particular how to lower the entry barriers for users of OS data and help drive the kind of innovation that has been fuelled by Google Maps. I've always thought that the biggest "wow" factor of Google Maps for UK users is not really the UI or the API (these are very good, but not unique) but the availability of the detailed data. Making data available is all about the simplicity of the licensing.

At last year's GIS tech I did a session on security and how to build secure web mapping sites. As a worked example of how the web allows users to build custom applications that utilise functionality from your web site in their own applications, I built a Google Maps client that loaded the street mapping layer as a dynamic background layer into ArcGIS. This was before the API was released, and after looking at Google's license it was fairly obvious that this use was on the bounds of breaking it.

However, the thing this highlighted for me was the simplicity of the Google terms of use when compared to the O.S. MasterMap or Landline licensing agreements. I worked on our Searchflow system, which delivered first Landline and then MasterMap via the web, as vector data that could be manipulated by users. The hoops we were required to jump through and the complexity of the licensing agreement would have been a major barrier to smaller organisations or individuals.

There seem to be two key areas where the licensing is complex:

The first is the complexity of the charging/royalty scheme for MasterMap and Landline for commercial customers: we had to build a whole set of back-office tools to track, calculate and audit usage and royalty payments. To be fair, the scheme was trying to be flexible and aimed to charge users by type of usage, but the flexibility led to a lot of complexity. If services are made available free, perhaps leading to chargeable ones later, it is vital that the charging model is simple and easy for developers to implement.
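As an entirely hypothetical illustration of what "easy to implement" could mean: a flat per-feature royalty is a one-liner, whereas a usage-type scheme like the one we worked under needs tracking, rate tables and an audit trail. Every figure and name here is invented:

```python
# Entirely hypothetical: contrast a simple flat-rate royalty with a
# usage-type scheme like the one we had to build back-office tooling for.

FLAT_RATE = 0.01  # pounds per feature served (invented figure)

def simple_royalty(features_served):
    """A charging model a developer can implement (and audit) in one line."""
    return features_served * FLAT_RATE

# Once charges vary by usage type, every request needs to be logged,
# classified and reconciled against a rate table:
RATES_BY_USAGE = {"view": 0.002, "extract": 0.05, "resell": 0.20}  # invented

def complex_royalty(usage_log):
    """usage_log: list of (usage_type, feature_count) events to be audited."""
    return sum(RATES_BY_USAGE[u] * n for u, n in usage_log)

print(simple_royalty(10000))                                  # 100.0
print(complex_royalty([("view", 8000), ("extract", 2000)]))   # 116.0
```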

The second issue, which I believe is more important, is that currently most of the O.S. agreements seem to contain technical restraints on the usage of data. This is a major hurdle for innovation, as there are constraints in the license that could prevent new uses of the data. Some of them seem fairly innocuous, such as providing a copyright statement and license number, but even this could be an issue when, for example, trying to provide data as a service with no map or UI. More important is the ability to deliver vector data, for example as SVG, GML or GeoRSS; currently the letter of most agreements doesn't really allow for this. I think the key issue is that lawyers tend to work to a different timeline than developers, and trying to build technical constraints into license agreements is always going to lag behind what people want to do with technology, and so stifle innovation. Any constraints should be focussed on business models or commercial issues rather than technical uses.
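To show why map-centric license terms struggle here: serving a feature as GeoRSS, with no map or UI anywhere in sight, is only a few lines of code. A minimal sketch of a GeoRSS Simple item (the feature values are invented):

```python
# Minimal sketch: emit one item of a GeoRSS Simple feed for a point
# feature. GeoRSS Simple encodes a point as "lat lon" in the
# http://www.georss.org/georss namespace; the data below is invented.
def georss_item(title, lat, lon):
    return (
        '<item xmlns:georss="http://www.georss.org/georss">'
        "<title>%s</title>"
        "<georss:point>%f %f</georss:point>"
        "</item>" % (title, lat, lon)
    )

print(georss_item("Survey point 42", 51.5074, -0.1278))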

There is great value in MasterMap data, but many of the traditional users may not need or utilise this additional value. It will take the sort of explosion in innovation that making the data free for experimentation by individuals would drive to realise the full value of the data.

Microsoft Research have released MapCruncher. This looks like a great tool for georeferencing data to Virtual Earth data and then creating a mashup that you can easily publish on a web site. It's starting to introduce concepts that were previously the realm of GIS professionals to consumers who want to build simple mapping sites and shouldn't have to worry about the fact that Virtual Earth data is in a Mercator projection, or that their image map is not correctly georeferenced.
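To give a feel for what the tool hides from the user: overlaying an unprojected image on Mercator tiles means pushing every coordinate through the forward projection. This is the standard spherical Mercator formula, not MapCruncher's own code:

```python
import math

EARTH_RADIUS = 6378137.0  # metres, spherical Mercator convention

def lonlat_to_mercator(lon_deg, lat_deg):
    """Forward spherical Mercator: degrees in, projected metres out."""
    x = EARTH_RADIUS * math.radians(lon_deg)
    y = EARTH_RADIUS * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

# London comes out at roughly (-14 km, 6.71 million m) in Mercator metres.
print(lonlat_to_mercator(-0.1278, 51.5074))
```

MapCruncher's point is that a consumer publishing a mashup never has to see any of this.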

The UK Oracle Spatial Special Interest Group is meeting on Thursday this week; it's always interesting to see what people are doing with Oracle Spatial. It tends to be a mixture of user presentations and something from Oracle (usually Albert), and is always informative. It also tends to be a bit more technical/practical than some of the AGI SIGs. I'm going this week, mostly as a number of our customers tend to go and often have questions about using Oracle Spatial with ESRI tools; it's also a good place to meet up and chat with people using, or involved in delivering, systems built on Spatial. I'm not sure if they are full yet, but it's worth going along if you get the opportunity.

Dave Bouwman of ArcDeveloper fame has started a thread on the software development processes used by GIS consultancy companies, so I thought I'd describe a little of how ours has evolved.

Many years ago, when our consultancy group got going, most of our projects were relatively small and standalone, and historically we have tended to do most of our work following the DSDM method. This is a RAD approach which allows us to focus on the key requirements first and to time-box deliveries to our customers, while maintaining flexibility as the project progresses. It has been pretty successful for us on small to medium-sized projects; however, DSDM does raise a couple of issues.

One of the main criticisms of DSDM is that it does not scale to very large systems as well as some other approaches do. The key reason is its focus on prototyping requirements early and expanding them through the lifetime of the project, which tends to mean the overall architectural framework may initially get ignored, potentially leading to scalability problems later. It works well on projects where the framework is already well known or dictated by the products chosen. Developing an extension to ArcGIS, for example, would suit this approach: it is mostly about the client functionality and the UI, which are easy to prototype and then extend, and there is not much need to think about architectural scalability.

The second issue with adopting a RAD approach is that it can often be hard to fit with the expectations of the customer. Delivering software in an iterative manner requires resources from the customer to receive the deliveries, test them, and provide constructive, timely feedback. A large percentage of our customers are public sector organisations, who are often under-resourced, so it's vital with this approach that all parties know exactly what they are getting and when, and what resources will be required to make the project a success.

There have been a number of trends in the last few years that have forced us to evolve our development approach. The size and technical complexity of projects has tended to grow, which means scalability and architecture are much more of an issue than ever before. Integration with other IT systems is much more common, which means projects tend to be based around building interfaces and components rather than end-to-end systems. And the shape and architecture of GIS projects is now infinitely more varied than it used to be, with desktop, web, mobile and database systems being mixed together in different combinations.

To deal with these issues we have now moved to a process based on RUP. This lets us keep the flexibility of an iterative approach while focusing more heavily on getting the architecture correct to begin with. It's not a huge step from DSDM, but it concentrates on the bigger picture first, before breaking the project down into smaller iterations. It's good to have a flexible approach that can cope with change during the project, but it's important to get the fundamentals right near the beginning, as it can be expensive to change the entire architecture once the project is well under way.

I'm also a fan of some of the agile development processes, particularly around agile modelling, and of the AUP, which attempts to simplify the full RUP approach into something a little more lightweight and flexible.

It's interesting to see what other organisations are up to, and what the constraints on adopting other approaches are.

There is a new white paper describing how to use ITN data with Network Analyst. It's fairly lengthy, but gives a good overview of the ITN GML data model and how it is translated into a geodatabase model. It then explains how to pre-process the data so that you can use it within Network Analyst to do routing, and how to integrate it with the route restriction data.
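For a flavour of the pre-processing involved: ITN is delivered as GML in which each road link references its start and end nodes, and a network build needs that topology extracted. A rough sketch with xml.etree.ElementTree; the sample GML is simplified (real ITN uses the osgb namespace and a richer structure), so treat the element names here as illustrative assumptions:

```python
# Rough sketch of pulling link/node topology out of ITN-style GML.
# The sample is heavily simplified compared to the real OS schema.
import xml.etree.ElementTree as ET

SAMPLE = """
<RoadLinkSet xmlns:xlink="http://www.w3.org/1999/xlink">
  <RoadLink fid="osgb4000000000001">
    <directedNode orientation="-" xlink:href="#osgb4000000000101"/>
    <directedNode orientation="+" xlink:href="#osgb4000000000102"/>
  </RoadLink>
</RoadLinkSet>
"""

XLINK_HREF = "{http://www.w3.org/1999/xlink}href"

root = ET.fromstring(SAMPLE)
for link in root.iter("RoadLink"):
    # Each directed node carries an orientation and a reference to a node fid.
    nodes = [(n.get("orientation"), n.get(XLINK_HREF).lstrip("#"))
             for n in link.iter("directedNode")]
    print(link.get("fid"), nodes)
# osgb4000000000001 [('-', 'osgb4000000000101'), ('+', 'osgb4000000000102')]
```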

ITN in Network Analyst
