I have been involved in a project looking at detecting urban change from radar imagery and, specifically, the construction and demolition of buildings. In order to validate the work we are doing, it struck me that MasterMap would be an ideal product to use. TOIDs are used to model features in the landscape and are added, modified and deleted. These can be rolled out to users for an AOI so that updates to the area can be applied. Of course, the corollary of such a service is that (in Asda speak) you could “roll back” the landscape to a prior point in time to see what it looked like. Or, put another way, you could “print” a map for a certain point in time. This is potentially a powerful way of looking at urban change and would be very interesting to exploit, except…
I don’t believe OS (or EDINA) offer such a service. When acquiring data it is only ever for the “current release”. This astounds me (although please correct me if I’m wrong) as there must be 101 uses for (recent) historic data. So it would seem that the only way to get temporal snapshots is to get a printed map. Hmmmmm… the wonders of the digital age.
GIS Lounge have a nice article entitled Why ArcView 3.x is Still in Use. And the funny thing is that it describes (by inference) all the problems with AV3 and then goes on to say why it can still be better than ArcMap. Not really a glowing recommendation for ESRI. And yes: incredibly poor performance, high implementation overheads, draconian licensing and the arse-about-face way of doing things. Yup, it is usually quicker to do things in ArcView 3.x (create a new shapefile, delete vertices, etc.).
ESRI have been busy bees in the 3D geospatial world and one of the “new” features in ArcGIS 9.2 (OK, I know 9.2 isn’t that new with SP5 out and 9.3 imminent!) is the ability to create “Terrains”. Clearly ESRI realise that a huge new sector of data capture is via LiDAR (initially airborne, but increasingly terrestrial) and, given that performance in ArcGIS (generally) is, as Ian would say, as good as a chocolate teapot, they needed to come up with some ways of improving data visualisation (re-drawing speeds). Not only that, but given that many people don’t use server-side databases, the personal geodatabase was increasingly looking long in the tooth: a bespoke format that was just not designed for the job (i.e. it can’t store more than 2 GB). LiDAR data, and raster data more generally, can surpass this limit quite quickly.
So terrains have been born, which essentially take the idea of pyramid layers (i.e. subsampling) and apply it to vector data. And by vector data we are primarily talking about LiDAR point clouds and their conversion to TINs, but that can then include things like breaklines, drainage, masks, etc. So it’s not exactly revolutionary, but very much a step in the right direction. The ESRI Instructional Series has two useful introductions on terrains.
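To make the pyramid idea concrete, here is a minimal sketch (my own illustration, not ESRI’s actual terrain algorithm) of thinning a point cloud onto progressively coarser grids, so that coarse zoom levels only have to draw a fraction of the points. The function names are hypothetical:

```python
def thin_points(points, cell_size):
    """Keep one (x, y, z) point per cell_size-by-cell_size grid cell."""
    kept = {}
    for x, y, z in points:
        cell = (int(x // cell_size), int(y // cell_size))
        kept.setdefault(cell, (x, y, z))  # first point in the cell wins
    return list(kept.values())

def build_pyramid(points, cell_sizes=(1, 4, 16)):
    """One thinned copy of the cloud per pyramid level, finest first."""
    return [thin_points(points, c) for c in cell_sizes]
```

At draw time you would pick the level whose point density matches the current map scale, exactly as raster pyramids pick a subsampled image.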
And to create them? Well, you need 3D Analyst installed and, you would hope, there would be a single “whizzy” button that says “Create Terrain”. However hope is too much in the world of ESRI and it has to be more complicated than that. Firstly you need to create a file geodatabase (in Catalog). Yes, thankfully ESRI have seen the error of their ways with personal geodatabases and finally have a format that is easily transportable and unlikely to corrupt. It can also store up to 1 TB per table, which is handy. Once you have said geodatabase, you need to create a feature dataset inside it (note it’s a dataset, as in a collection of classes) and then put your terrain data inside the dataset. So if your LiDAR has come as a 3D shapefile then this needs to be exported into the feature dataset. Only then can you run the Terrain wizard to create your terrain (right-click on the feature dataset, go to New and then Terrain). Note that while the wizard automates the terrain creation process, the individual tools are also available in ArcToolbox. The 3D Analyst PDF has some further info.
OK, it’s actually SoftGrid and ArcGIS. Bought by Microsoft recently, SoftGrid provides a virtualisation environment where you can “push” an application across a network to a client sitting on a PC. It *looks* seamless and means the application doesn’t have to be installed locally. This is really useful for our computing support people, as our GIS software is site-licensed through the faculty, but we have students wanting to use it in university labs. SoftGrid is a great solution and works with everything we have thrown at it (ERDAS Imagine usefully!) except, you’ve guessed it, ArcGIS. This is the one application we really wanted to get working, but with little success. It initially loads, but with very slow performance and frequent hanging it becomes unusable.
And (thanks Kris for the info) it appears that the problem may be related to the archaic port of ARC/INFO to the PC, the appallingly poor use of registry entries (apparently 50-70 MB just for ArcGIS!) and Microsoft’s slow registry performance. All of this adds up to a lot of registry accesses, which kill performance.
Solution? ESRI needs to rewrite ArcGIS properly and get rid of the bad back end. ESRI needs to use *anything* other than the registry for most of its settings. Microsoft needs to sort out registry performance.
None of which is likely to happen!
For those that have taken advantage of the tumbling subscription rates for mobile broadband through the likes of 3, T-Mobile etc (as cheap as £5 per month), the different data rates might prove a little confusing. Paul Ockenden has a useful article in PCPro this month that summarises and gives a little history on the topic. And the main data rates you will come across are:
CSD: 14.4 Kb/s
GPRS: 114 Kb/s
EDGE: 230 Kb/s
3G: 384 Kb/s
HSDPA: 1.8, 3.6 or 7.2 Mb/s
The interesting thing here is that 3G only ever vaguely approached the slowest ADSL line speeds (512 Kb/s) and we certainly wouldn’t consider it fast today. Also note how close EDGE is to 3G. However it is HSDPA that is really pushing the envelope, and the data rates really can get this fast (depending upon contention). Note that 3G/HSDPA are separate from the other standards, which run on standard GSM: 3G technologies operate at different frequencies and on a different network. Interestingly, both EDGE and HSDPA can supposedly be deployed using firmware upgrades to existing equipment.
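To get a feel for what those headline rates mean in practice, here is a quick back-of-the-envelope calculation (nominal rates only; real-world throughput will be lower once contention and protocol overheads bite, and the helper name is my own):

```python
# Nominal data rates from the list above, in kilobits per second.
RATES_KBPS = {
    "CSD": 14.4,
    "GPRS": 114,
    "EDGE": 230,
    "3G": 384,
    "HSDPA (7.2)": 7200,
}

def transfer_seconds(size_mb, rate_kbps):
    """Seconds to move size_mb megabytes at rate_kbps kilobits per second."""
    return (size_mb * 8 * 1024) / rate_kbps

for name, rate in RATES_KBPS.items():
    print("%-12s %8.1f s for a 5 MB file" % (name, transfer_seconds(5, rate)))
```

A 5 MB email attachment takes the best part of two minutes on 3G and roughly three quarters of an hour on CSD, which is why HSDPA feels like such a leap.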
I’ve finally taken the plunge and upgraded to Firefox 3, although through the portableapps.com route. This has the benefit of being able to run both FF2 and FF3. Having tinkered with FF3 when it was first released, I was impressed by the speed at rendering sites; noticeably faster than FF2. IE7 appears glacial in comparison! The address bar has some really neat features and we finally get tagging for bookmarks, which nicely does away with the need for folders.
The problem? The fact that I run around 30 add-ons. Perhaps not best for stability, but they all serve a specific purpose and really do extend the functionality. Much to my surprise nearly all of them were FF3 compatible, although a few key ones weren’t (Copy Plain Text, Auto Copy, Minimize To Tray). Add-ons are downloaded as XPI files, which are simply ZIP files with the requisite additional files to add to your profile. FF does a compatibility check against install.rdf to see which version is supported (against the maxVersion field). These add-ons can often be got to work: rename the XPI to ZIP and extract the RDF file. Simply change the maxVersion field to version 3, put it back in the ZIP and rename it to XPI. Clearly you don’t want to do this for all add-ons (particularly complex ones!) as FF3 clearly has changed under the hood. However for simpler add-ons this will get them working again.
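The rename-and-edit dance can be scripted. Here is a rough Python sketch: `bump_max_version` is my own hypothetical helper, the exact `em:maxVersion` markup varies between add-ons, and the same caveat applies that complex add-ons will still genuinely break:

```python
import re
import zipfile

def bump_max_version(xpi_path, new_max="3.*"):
    """Patch em:maxVersion inside install.rdf in an XPI (a ZIP archive).

    Writes a patched copy alongside the original and returns its path.
    """
    out_path = xpi_path.replace(".xpi", "-patched.xpi")
    with zipfile.ZipFile(xpi_path) as src, \
         zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            data = src.read(item.filename)
            if item.filename == "install.rdf":
                text = data.decode("utf-8")
                # Rewrite <em:maxVersion>...</em:maxVersion> in place.
                text = re.sub(r"(<em:maxVersion>)[^<]*(</em:maxVersion>)",
                              r"\g<1>" + new_max + r"\g<2>", text)
                data = text.encode("utf-8")
            dst.writestr(item, data)  # everything else copied verbatim
    return out_path
```

This just automates the manual steps; Firefox will still perform its own compatibility check against the edited field.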
OK, so there are potentially cognitive thresholds which inhibit the progression of (willing!) students in their learning. Sandy Gilkes went on to suggest four primary areas that engage students and assist in their learning (and I quote):
-that they are recognised
-what “learning” is
-what you expect of them
-how they are doing
-critical writing looks like this
-reflective writing looks like this
-an argument looks like this
-a dissertation looks like this
-models of reflection
-prompt questions for reflection
-prompt questions for evaluation
-vehicles for practice
-tools to develop skills
-open learning resources
Sandy finished with a very nice quote from Pestalozzi (1746-1827):
“Education is the development of an innate power, the formation of an abiding habit, that constitutes its true value.”
The Free Our Data Campaign report on the Show Us a Better Way “competition” from the Power of Information Taskforce. In short, getting people to come up with good ideas for the use of public data. What’s nice is that they have a brief list of data available and have posted new datasets. Makes for interesting reading even if you don’t post any ideas!
I spent a day this week at the NERC Field Spectroscopy Facility receiving some training in the use of a GER1500 that is being used to study loess profiles. The facility is extremely well run, with the training not only thorough but also very practical indeed. They have a good pool of equipment (GER, ASD etc) that can be used in a variety of environments (e.g. marine) with a rolling programme of upgrades. They are also very active in both the development of field spec techniques and their application, and are in regular contact with the manufacturers. As with any NERC equipment (e.g. ARSF), an application form should be completed, with deadlines of 1 June and 1 November. This should provide a supporting science statement and details of the methodology. It’s not onerous, but clearly the FSF has to be sure it is funding appropriate projects.
I came across a recent USGS publication today, Annotated Definitions of Selected Geomorphic Terms and Related Terms of Hydrology, Sedimentology, Soil Science and Ecology. This is a geomorphologically focussed “dictionary”, although there is a bias towards fluvial environments. Possibly not as useful as the Penguin Dictionary of Physical Geography, but nonetheless well worth a look and with a more interdisciplinary focus.