WorldKit 3.1 Released

Saturday, 30 December, 2006

Mikel Maron has released version 3.1 of WorldKit, the very functional, lightweight web GIS client that we use at the Journal of Maps. Flash Player 8 is now supported, along with the addition of select, zoom and pan buttons. There are three things that would really help our use of WK though:

  • a scale-dependent renderer for displaying points and polygons (i.e. a point when zoomed out and a polygon when zoomed in);
  • when hovering over an object the description text is displayed, but if this is too long it is cut off at the edge of the display;
  • being able to point it at a (MySQL) database to read objects, rather than using an XML file (see the sketch below).
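
On the last point, WorldKit can already be pointed at an XML/RSS feed, so a workable stopgap is a small script that regenerates that feed from the database on a schedule. A minimal sketch in Python (the map_objects table, the column names and the geo: elements are my assumptions; check the WorldKit documentation for the exact schema it expects):

    # Hypothetical bridge between a MySQL table and the XML/RSS feed WorldKit
    # reads. Table name, column names and the geo: elements are assumptions;
    # adjust to match the schema WorldKit actually expects.
    # (No XML escaping here: fine for a sketch, not for production.)
    import MySQLdb

    conn = MySQLdb.connect(host="localhost", user="jom", passwd="secret", db="maps")
    cur = conn.cursor()
    cur.execute("SELECT title, description, lat, lon FROM map_objects")

    items = []
    for title, description, lat, lon in cur.fetchall():
        items.append(
            "  <item>\n"
            "    <title>%s</title>\n"
            "    <description>%s</description>\n"
            "    <geo:lat>%.6f</geo:lat>\n"
            "    <geo:long>%.6f</geo:long>\n"
            "  </item>" % (title, description, lat, lon)
        )

    feed = (
        '<?xml version="1.0"?>\n'
        '<rss version="2.0" xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#">\n'
        "<channel>\n%s\n</channel>\n</rss>" % "\n".join(items)
    )

    open("worldkit_feed.xml", "w").write(feed)

Run from cron (or the Windows Task Scheduler) every few minutes, this keeps the file WorldKit loads in step with the database without touching the Flash client at all.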



Converting old word processor files

Wednesday, 27 December, 2006

A while back I had occasion to look at some old files I had created as part of my MSc thesis. These were generated in WordPerfect 5.1, a fantastic, state-of-the-art DOS word processing package that really could “do it all”. And I really did shell out the ~£250 for it. Well, hard times fell on WordPerfect; it eventually passed from Novell to Corel and is still plugging away in the background. This brings up the first lesson of format conversion: if a new version of the product exists, it should be able to import all the old files pretty well. And at less than $100, WordPerfect is pretty cheap.

Of course you may not want to shell out any cash, which brings up the second lesson (and warning!) of data conversion: use something else! Microsoft Word has always vaunted its WordPerfect conversion facility (ever since the days of Word 2); however, it is far from perfect and makes a right dog’s dinner of most of my documents.

The final lesson in format conversion is actually to use the software itself! If you still have it sitting on an old PC then you might be able to fire it up, edit the document, print it out or whatever. The alternative is to re-install it on your nice new shiny PC. Which would work if the PC were running DOS (or the DOS-based Windows 3.x, 95, 98 or ME), but NOT (without problems) if it is running Windows NT, 2000, XP or Vista. Whilst the command window is certainly “DOS-like”, it is not DOS and there are many compatibility issues. Indeed my install of WordPerfect half worked, but would do some very strange things, then keel over and die. The alternative, used in many other areas of computing, is to get a DOS emulator and run WordPerfect inside that. And games fanatics, keen on maintaining playability for some of the classic DOS games, have come to the rescue in the form of DOSBox. It’s a DOS emulator (doh!) and you just run your word processor from inside it. Simple to install and it worked perfectly!!

So that gets your original word processor working on your new machine, from which you can load your original document and, indeed, edit it. What do you then do with it?? Well, you could export it into another format, but the real issue is preservation of the file content and layout. Which means you need a page-layout document and, yes, PDF is the ideal candidate. Of course PDF is a close descendant of PostScript, and all the decent word processors from the DOS era had PostScript printer drivers. All you have to do is install a PostScript printer driver (I went for an HP LaserJet), print the document to a file and then use Acrobat Pro to convert this to a standard PDF. It all works rather neatly and you can see the results in my MSc thesis!
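
If you don’t have Acrobat Pro to hand, Ghostscript will do the same PostScript-to-PDF conversion for free via its ps2pdf wrapper. A quick sketch in Python (assuming Ghostscript is installed and the .ps print files sit in the current directory; file naming is just an illustration):

    # Convert every PostScript print file (produced by the WordPerfect PS
    # printer driver inside DOSBox) to PDF using Ghostscript's ps2pdf wrapper.
    import glob
    import subprocess

    for ps_file in glob.glob("*.ps"):
        pdf_file = ps_file[:-3] + ".pdf"
        subprocess.check_call(["ps2pdf", ps_file, pdf_file])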

Firefox 2

Sunday, 24 December, 2006

I’ve held off upgrading to Firefox 2 for a while now, simply due to lack of time and the availability of Portable Firefox. Well, both have now coincided, so I made the jump and very well worthwhile it has been too. Nothing is staggeringly different, but the interface is fresher (I’m using the QuBranch theme) and the slightly buggy memory leaks are less of a problem. The nice additions (for me) are the spell checker (very useful for blogging) and session restore. The latter reloads any open pages from your last browsing session. Searching is supposedly easier, with “suggestions” made by the search engines; I find this intensely irritating and there is no menu option for turning it off. This can only be achieved on the about:config page (accessed by typing it into the address bar), which gives you access to loads of “hidden” Firefox settings. Extensions are now called Add-ons (why?!), an anti-phishing filter has been added (I’ve turned it off) and RSS feeds are handled better (although I use Sage). So all in all a well worthwhile download.
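
For anyone else irritated by the search suggestions, the hidden preference to hunt down in about:config is (if I remember rightly) the suggestion toggle:

    browser.search.suggest.enabled = false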

Academics For Academic Freedom

Friday, 22 December, 2006

After a lengthy absence due to illness, I thought I would give a plug for the “Academics For Academic Freedom” campaign that recently received some press from the BBC entitled Academics seek right to offend. The title is perhaps a little misleading in that the campaign is seeking academic freedom, “the responsibility to speak your mind and challenge conventional wisdom”, and so contribute to open debate within society as a whole. As AFAF says on its website, “In today’s political climate it is harder than ever for academics to defend open debate.” Whilst the campaign has been running for a couple of months, it has perhaps reached the wider press as a result of the release of David Irving (the BBC again). And I wholly support the campaign, as it is vital that there is academic freedom to openly, and critically, debate current issues. In the current climate of political correctness, it is hard to imagine the strength of character required of the likes of Galileo (does the Earth go round the Sun?) and Darwin (On the Origin of Species).

Fingerprint of Error

Friday, 8 December, 2006

I bought a rather natty Sony Vaio TX3XP laptop recently. Not only is it small (20x25cm) and light (1.25kg; it’s made from carbon fibre), but it also has an incredible battery life (~10 hours). Anyway, one of the features introduced on this model is fingerprint scanning to log on. It really works rather well and is a damn sight easier than typing in a user name and password. That was, until I had a bath. Afterwards the skin on my fingers shrivelled slightly and I was refused access to the laptop!!

Inn on the Park, St Albans

Tuesday, 5 December, 2006

Another occasional haunt is the Inn on the Park in St Albans. Really a cafe-cum-coffee bar, it’s situated on the north side of the park in Verulamium and is a rather pleasant place to go in winter. The food is home cooked, the coffee good (naturally!) and it’s just mellow. The same cannot be said for it in the summer, when it gets very busy, particularly as the “splash park” is only 50 yards away; avoid at all costs if you want peace and quiet!!

Ultra-Light KAP Rig

Saturday, 2 December, 2006

As a result of my KAP experiences in China, I spent a while thinking about a new ultra-light camera rig. The current rig, copying Scott Haefner’s design, weighs in at over 500g with the picavet cross. For conditions such as I experienced in Wuhan, it’s very difficult to get the rig/camera to fly, even with something like the Dopero 125. I’m not prepared to reduce the quality of the camera (and the Coolpix 8400 is very good), so the next alternative was the rig.

First off, nearly all of our images are verticals; the only adjustment we need is rotation, and that is actually only required through 90 degrees (to align the frame). So the new rig (right; very many thanks to Martin Abbott for the effort here) has a single-piece carbon plate picavet with a lightweight servo for rotation and the arm of the rig attached directly to the servo. Scott’s rig was very clever in that he had designed a simple gearbox to remove any stress on the servo (i.e. the rig hung off the gearbox, not the servo), but it added to the weight. The rig arm is curved to allow the camera to balance. In addition, the camera can be hung at the top or bottom of the arm and, if you want, it can be moved to a fixed pan or tilt position. We also added a much lighter 6-channel receiver (not shown), with the whole lot coming in at a staggering 91g. We haven’t tested it yet, so the concern is how strong the servo is; however, just to be on the safe side there is a safety strap so the camera doesn’t plummet to earth mid-flight! Watch this space for some more results.

Earth and Life: the birth of a new journal…

Friday, 1 December, 2006

I received a circular last week for a new journal from Beijing entitled Earth and Life. In today’s very crowded marketplace I either applaud those that can find a niche that isn’t being serviced or sit back and scratch my head in wonder. Anyway, Earth and Life is aiming to be a weekly publication covering everything in the geosciences and, I think, uniquely has a “self-review” system. That is, the author goes off and gets someone to review their paper and this review is then emailed in. Quite an intriguing system in terms of reducing administration, but will it be taken seriously, as it’s obviously open to abuse? Time will tell, I guess.

The other aspect of Earth and Life that made me smile was that of the 69 pages in the first issue, 68 were authored by the editor. I think that must be some kind of record!

Applied Geomorphological Mapping Working Group

Friday, 1 December, 2006

I have recently become the vice-chair of a new working group of the International Association of Geomorphologists on applied geomorphological mapping. Geomorphological mapping (the mapping of surface features based upon their morphology) saw a rise in interest in the late 1950s, with an explosion in application in the 1960s and 1970s. Since that time work has been much more muted, but ongoing. The last decade, however, has seen a resurgence in landform mapping, with the working group one expression of this interest.

The group’s general objectives include:

  • Develop and deepen the theoretical basis of applied geomorphological mapping;
  • Develop standards, specific mapping procedures and legend systems for different applications and scales;
  • Disseminate the importance and effectiveness of geomorphological mapping as a basic tool for those who deal with the physical environment;
  • Build bridges between our community and other scientific and professional communities.

Some of the intended outputs of the group include a professional handbook, a digital atlas and at least one journal special issue. The target conference where primary reporting will take place is the IAG’s 2009 International Geomorphology Conference in Melbourne.

The group is keen to involve individuals from across academia and industry, including those interested in geomorphology (!), engineering geology/mapping surficial materials and geomorphometry. Please subscribe to the AppGeMa mailing list if you are interested.

More Editorial Musings

Tuesday, 28 November, 2006

Some more musings on the role of editors in academic journals, as a result of a paper I submitted to the Journal of the American Society for Information Science and Technology last year. In its original form, two reviewers highlighted both the strong and weak aspects of the paper. One suggested it would sit well if re-submitted as a short communication (and actually noted that the topic was “excellent”. Warm glow!), and on this basis the editor recommended a re-write. After six months (yes, I really should have done it sooner) I sat down, shortened the paper and re-submitted it. It then took another six months for the review to be completed before it was finally rejected. What surprised me was that the paper was reviewed from scratch and the original reviewers’ comments were abandoned, with one of the new reviewers stating it is “not likely to be of interest to the readers” (definitely not a warm glow on that count!).

So what is going on in all of this?? Well, I would normally re-submit a paper, addressing the points raised by the reviewers in an attached letter. An editor would be expected to check that these points had been correctly addressed and then either accept or reject on this basis. For whatever reason, the paper went out for a second review, which was not favourable. Clearly this placed the editor in a difficult position. Two sets of reviews, the first generally positive and the second generally negative. Which are “better”? In the end the paper was rejected but it clearly highlights both the role of the editor in the whole process and, more importantly, the careful selection of referees (something also highlighted by the IJRS article retractions). And it is referees that are both the strong and weak link in the whole review process. You need “experts” in a field of study, but can you find them? Are they expert across the scope of a whole paper? Are they biased? And will they do it?! Ultimately, these things need to be balanced.

Online Backup

Friday, 24 November, 2006

I blogged last week about having a reliable backup routine for data on a PC. In this I mentioned that I have five copies of my data, including archives and an offsite backup. Whilst it is relatively simple to set up a backup routine to another hard disk drive (internal or external, or indeed both!) using something like Second Copy or Microsoft’s free SyncToy, offsite backup is a little more complicated. This would traditionally have been performed to a tape, which would have been taken away at night. People have more recently used CDs and DVDs, and the new generation of Blu-ray discs.
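
For the local copy, the logic behind tools like Second Copy and SyncToy is simple enough to sketch in a few lines; a minimal one-way mirror in Python (paths are placeholders, and unlike the real tools there is no deletion handling or versioned archive):

    # One-way mirror of a data directory to an external drive: copy files
    # that are new or have a newer modification time than the backup copy.
    import os
    import shutil

    SRC = r"C:\Data"
    DST = r"E:\Backup\Data"

    for root, dirs, files in os.walk(SRC):
        rel = os.path.relpath(root, SRC)
        target_dir = DST if rel == "." else os.path.join(DST, rel)
        if not os.path.isdir(target_dir):
            os.makedirs(target_dir)
        for name in files:
            src_file = os.path.join(root, name)
            dst_file = os.path.join(target_dir, name)
            if (not os.path.exists(dst_file)
                    or os.path.getmtime(src_file) > os.path.getmtime(dst_file)):
                shutil.copy2(src_file, dst_file)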

There are two problems with this approach; the first one is actually remembering to do it, making sure there is a disc in the drive, and taking it offsite! The second is disc capacity. I have 40GB of data, which won’t easily fit on a DVD (and I don’t want to be sat at a PC slotting discs in and out).

The simplest method is actually online backup. This ranges from free space at places like Box.net through to the rather neat Firefox extension called GSpace that allows you to dump files into free space within a Gmail account. Ultimately though, these are only of the order of 1-2GB, so suitable for some things but not a total solution.

The cheapest of the online storage solutions is Carbonite (my referral URL), offering unlimited space for $5 a month. This in itself is great, but for me it’s the software that totally sells the solution. Operating as an extension within Windows Explorer, it simply monitors the directories you select and automatically backs up your data, compressed and encrypted, to Carbonite. You don’t have to think about it. And once it’s done the initial upload, it simply copies file changes. The restoration of files is painfully simple, again using Windows Explorer to access your remote data and marking the files you want to restore. All in all it’s a brilliant solution that I can heartily recommend.

Firefox EXIF Extension

Wednesday, 22 November, 2006

Following on from my blog about the potential use of EXIF headers in JPEGs, I came across an extension for Firefox called EXIF Viewer. It does what it says on the tin, in that it allows you to view EXIF information for JPEGs. This is an early version so it’ll be interesting to see how it develops.
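
Not the extension itself, but for anyone wanting the same information programmatically, Python’s imaging library (PIL/Pillow) will pull the EXIF header out of a JPEG in a handful of lines (the file name is just a placeholder):

    # Dump the EXIF header of a JPEG. Requires the Pillow imaging library.
    from PIL import Image, ExifTags

    img = Image.open("photo.jpg")
    for tag_id, value in img.getexif().items():
        tag_name = ExifTags.TAGS.get(tag_id, tag_id)
        print(tag_name, value)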

Exporting References from Endnote

Friday, 17 November, 2006

I was putting together a relational database recently that needed to contain a table of references. The references themselves were sitting in Endnote, so I thought it would be straightforward to export them into something like a CSV or tab-delimited file. It’s not though!! The “Export” feature doesn’t do what you expect: it only supports TXT, RTF or XML, exporting using the currently selected output style.

The solution is to create an output style using the (text) format you prefer. I like working with CSV files because they are straightforward to manipulate. Whilst a tab-delimited output style is supplied as part of Endnote, a CSV one is not, so I created a very simple CSV output style to generate a CSV file. With this output style selected, you make sure all the references are highlighted and then go to Export in the File menu. A new TXT file will be generated that is a CSV and can be dumped straight into Excel or a database.

Note: I only created the output style for “Reports” and “Journals” and, for some strange reason, Endnote wouldn’t put a comma after the author field (but did after all the others). I changed this to a * and then did a simple find-and-replace in my text editor to put the commas back in (see the sketch below).
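
That last fix is trivially scripted too; a sketch in Python, assuming the Endnote export was saved as endnote_export.txt and the * placeholder doesn’t appear anywhere else in the records:

    # Swap the '*' placeholder (used because Endnote refused a comma after
    # the author field) back to a comma and save the result as a CSV.
    text = open("endnote_export.txt").read()
    open("references.csv", "w").write(text.replace("*", ","))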

IJRS Journal Article Retractions

Thursday, 16 November, 2006

Being a journal editor, I am concerned about the quality of the articles we publish, but have to balance this against maintaining a throughput of appropriate material. Add into the mix the management of one internal reviewer, one cartographic reviewer and two external reviewers, and it all makes for a lot of effort to publish one article.

The whole “quality” issue came starkly into focus recently with the publication of a “Statement of Retraction” by the International Journal of Remote Sensing. If you read the statement you will see that not one, but three, papers have been retracted from publication (side note: not sure if you can physically retract something that’s already published; I guess it’s more like disowning) where the same (ish) group of authors substantially reproduced material that had already been published (i.e. plagiarised). This really does put the whole peer-review process under the spotlight. It’s not perfect by any means, but it does provide a good way of assessing the “worthiness” of research. So it is a case of selecting reviewers with care and passing a careful editorial eye over the results. What I find slightly strange is that several of the papers plagiarised were themselves published (earlier) by IJRS. Not quite sure what was going through the minds of the authors….

Backup your data ;)

Wednesday, 15 November, 2006

Ontrack Data Recovery have a rather amusing selection of “Top 10 Data Loss Disasters” from 2006. They include the usual “dropped from a helicopter”, “run over” and “packed in a wash bag” incidents that seem to crop up regularly. I found the “banana on the external HDD” rather good, whilst my favourite definitely has to be the university researcher who sprayed his HDD with WD-40 to stop it squeaking!!!

Of course this is all designed to highlight Ontrack’s data recovery services, whilst suggesting that backup might actually be quite a good idea. Comedian Dom Joly had 5,000 photos, 6,000 songs and a half-written book on his dropped laptop. All I can say is more fool him for not having a backup. My own data has four automated backups, with one off-site, and includes archiving so that I can access previous versions of files (and I use the excellent Second Copy for most of this). With the proliferation of digital media (audio, video, photos), backup really is very important, but very few people (and companies!) actually do it.

LaTeX: LaTable

Monday, 13 November, 2006

One of the other bits of LaTeX “support” software I’ve come across in recent weeks is LaTable. Laying out tables is, to quote Robbie Coltrane in “Nuns on the Run” explaining the Holy Trinity, “a bit of a bugger”. Whilst most things in LaTeX are generally straightforward, tables are not; I have sweated many hours trying to get the layout to look right. Anyway, LaTable is a small Windows utility that allows you to lay out a table in a spreadsheet fashion, inserting rows/columns, deleting rows/columns, merging cells etc. It has a “code” view so that you see the raw LaTeX code ready for copying straight into your document (see the example below). It also supports importing CSV files, which is handy. It’s not the panacea for table layout, but it does take the pain out of all the initial hard work.
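
For anyone who hasn’t fought with one before, the sort of tabular boilerplate it saves you typing looks something like this (the contents are just an illustration, not LaTable’s exact output):

    % A basic LaTeX table of the kind LaTable helps you build and edit.
    \begin{table}[htbp]
      \centering
      \caption{Sample site elevations}
      \label{tab:elevations}
      \begin{tabular}{lrr}
        \hline
        Site & Easting & Elevation (m) \\
        \hline
        A    & 512340  & 126 \\
        B    & 512875  & 131 \\
        \hline
      \end{tabular}
    \end{table}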

Planetary Geosciences, Geological Society

Thursday, 9 November, 2006

I’ve just spent a couple of days at the Geological Society in Piccadilly at a conference on Planetary Geosciences. As a side note, the GS has the original (very large!) William Smith geological map of England and Wales. It’s the first ever geological map of a whole country and was published in 1815. Because of decoration work it’s been on full display in the lower library; rather pleasant during coffee!!

Anyway, for “geosciences” read “geology”. However, the breadth of the conference was much, much wider than this. Dave Rothery (OU) should be congratulated on putting together a good conference package that was thorough, complete and relaxing! And whilst geology was clearly the major theme, there were delegates from remote sensing, GIS, astrobiology, astronomy and technology. The keynote speech was given by Steve Squyres (Cornell and PI on the Mars Exploration Rovers), and there were talks from science teams on sensors such as C1XS, MIXS and the ExoMars PanCam. Somewhat gratifying from my perspective was the continual use of (and requirement for) topographic data by many speakers (e.g. Lionel Wilson, Lancaster, on magma-venting dikes). DEMs are now central to many process studies, with MOLA taking centre stage for planetary work. However, we are starting to see much HRSC data coming on tap (thanks to some of the work from Jan-Peter Muller’s team), as well as the promise of HiRISE data.

My own talk (audio and slides) was a broad review of GIS and how data, software and technology are converging such that there is increasing usage within planetary geosciences. However, there are still barriers to “entry”, so many remain unable to use GIS. I finished up with the suggestion that, if it didn’t already exist, a GIS Special Interest Group might well be a good idea. Something I hope to follow up (although feel free to comment).

Copyright or database right… Does it matter?

Friday, 3 November, 2006

One of the follow-on outputs to the GRADE Project from my report on the Use Case Compendium of Derived Geospatial Data deals with the legal aspects relating to geospatial repositories. In particular there was a need to look at the framework for designing a licensing strategy for the sharing and re-use of data submitted to a repository. This part of the project was led, and reported on, by Charlotte Waelde at the AHRC Research Centre for Studies in Intellectual Property and Technology Law, Edinburgh University. Whilst my compendium highlighted the problems relating to data re-use, particularly with respect to the Ordnance Survey (as this is where UK HE has the greatest experience), Charlotte has taken a step back and assessed the legal basis of the terms and conditions under which the data are used. And the conclusion that she has come to:

geospatial data (generally) does not come under copyright, but rather database right. The former covers original, creative pieces of work (and includes things like photos and maps), whilst the latter is designed to protect databases that have been collated. Her argument (and you need to read the complete document) is that products such as MasterMap are covered by database right only.

This has some important implications, but don’t read this as a free-for-all grab at everyone’s geospatial data; it’s not. I would like to highlight the following point that Charlotte makes in her report (and I quote):

A lawful user of the database (e.g. the researcher or teacher in an educational institution) may not be prevented from extracting and re-utilising an insubstantial part of the contents of a database for any purposes whatsoever.

This has the following implications:

  • 1. If you’re a licensed user (e.g. a Digimap user) you can use an insubstantial part of the database as you see fit (although Charlotte explains that the term “insubstantial” is still a little vague, but possibly <50%).
  • 2. This includes re-distribution of that insubstantial part, creation and re-distribution of derivative data, and publication of figures directly relating to the utilisation of the data or any derivatives.
  • 3. Any terms and conditions applied in relation to the original license are null and void for the insubstantial part.

Of course, does any of this really matter to anyone? Well, on one level you could argue no. Those that are happy with the status quo, utilise geospatial data and publish within the “restrictions” are not affected in any way. There are those that find some of the current restrictions irritating and just want a “sensible” license. Finally there are those that want to get well on the way to “freeing our data”.

Whatever our outlook is, these are important and highly relevant conclusions to draw and will affect us all in the geospatial community. Indeed Europe as a whole (as this relates to the European Database Directive) is going to have to take a deep breath and work out the next step. Not least, the HE community’s two biggest bugbears (with the JISC-OS license at least) are addressed in that, theoretically:

  • 1. No copyright subsists in derivative products and these would be freely distributable (as long as they are not “substantial”)
  • 2. For the same reason, academics would be more or less free to publish whatever diagrams they see fit



So really the big question is:

what happens next?

GIS File Formats: how do we distribute data?

Wednesday, 1 November, 2006

One of the other discussions that cropped up at the GRADE meeting was file formats for spatial data in relation to repositories. In the back of my mind I’m also thinking about the Journal of Maps, as there is interest in publishing data. So what file formats do we use for data distribution, particularly bearing in mind the need for ready accessibility as well as preservation for future use? You have your ECW, IMG, SHP, TIF etc etc. What is good? Why? Will it work? What doesn’t it support?

In terms of formats we are really talking about raster, vector and attributes. At the lowest level these are all that are needed to import data for use in any processing system. But that is all they are; low level. There is no preservation of symbology, for instance. I think this is a good starting point (and it’s where I am going to start!), but I’m happy to be contradicted.

So, what formats?
Vector
Well, SHP is good because it’s well understood and there are plenty of tools to deal with it (and I think it’s important that support from projects like GDAL/OGR is maintained). OK, it’s not topological, but it is very flexible. The same case could be made for DXF as well; is this worth including? On the topological front E00 might well be worthwhile (in that it’s simple). How widely supported is it? Are there any other formats worth considering??
Raster
GeoTIFF is really the dominant player for open formats, and JPG should be included because it is also a standard photographic format (along with the EXIF data).
Attributes
So that deals with the spatial side; we then have attributes. Is DBF the most apt? It is well supported. It might also be useful to add CSV for tabular data, particularly as it’s ASCII based.
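
As an aside, part of the appeal of these low-level formats is how little code is needed to get at them; a sketch using the GDAL/OGR Python bindings that dumps a shapefile’s attribute table to CSV (file names are placeholders):

    # Read a shapefile with OGR and write its attribute table out as CSV.
    import csv
    from osgeo import ogr

    ds = ogr.Open("rivers.shp")
    layer = ds.GetLayer(0)
    defn = layer.GetLayerDefn()
    fields = [defn.GetFieldDefn(i).GetName() for i in range(defn.GetFieldCount())]

    with open("rivers_attributes.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(fields)
        feature = layer.GetNextFeature()
        while feature:
            writer.writerow([feature.GetField(name) for name in fields])
            feature = layer.GetNextFeature()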

What I haven’t mentioned here at all is GML. It is still very early days, with limited support, but this has the potential to be the way forward. Even in the standard XML arena only limited inroads have been made, although with both OpenOffice and MS Office now supporting it in their office apps things could change rapidly.

Data Publishing

Tuesday, 31 October, 2006

The GRADE project, which I am a collaborative partner on, is concerned with scoping geospatial repositories. The project has principally been tackling legal and technical issues regarding their establishment and, I think, has made some very good progress. Yet behind all this work you do actually need people to deposit data for inclusion in a repository. And this is where the rub is. At the moment we have data centres (the buzzword of 5-10 years ago) and we are now seeing the increased establishment of institutional repositories. Yet what/where is the impetus for actually depositing data?? I suspect that this is partly subject specific. My impression is that subjects such as physics have a greater tendency to share data. In the geosciences it’s usually a case of keeping what you have collected and only ever publishing the results, not the data itself. To be fair, this is beginning to change, with the research councils in the UK requiring the deposition of data from funded work. But how much data (from research) actually results from research council funding? My impression is less than half (although if anyone has any figures that would be interesting).

So we have a situation where there is a “top down” establishment of repositories, but no one is actually interested in using them. We have researchers collecting data (for research), but it is research publications that drive the agenda (NOT the data). I know that I see absolutely no reason why I should share primary data and, indeed, I like to discuss potential uses with people before sharing. Then of course we have the vested interests of the institutions that employ researchers. They are directly or indirectly funding much of this research and there is increased interest in “monitoring potential assets” (although quite to what extent institutions have a claim to IPR is another matter).

So where does that actually leave things?? Well, Mahendra Mahey (at the GRADE meeting this week) provided a summary of repository work in the UK and (briefly) summarised some points that Pete Burnhill (Director of EDINA) has been making along these lines: that data should be published. As a community, academics need to be sold on the positive aspects of data sharing and encouraged to see this as an opportunity to publish. Indeed, one could argue that data publication should be seen as a valid publication route, and in the same way that journal articles are peer reviewed, so data should be too. This is a route that we have been toying with at the Journal of Maps. Several articles have data published with them (e.g. Stokes et al.); these have been checked for appropriateness but not explicitly reviewed in the same manner as the article itself. I am currently reviewing how useful this “service” is, with the potential to ask reviewers to comment on submitted data, as well as having a separate data reviewer. This actually raises a whole host of other questions concerning data preservation (as opposed to a repository) which I won’t comment on at this moment.

With the above comments, I think it is clear that I’m in favour of data publication, but I am inclined to think at the moment that the data should follow the research (hence the reason for publishing the data with the article at the Journal of Maps). The problem with separating data and content is that maintaining the explicit link between the two becomes more complex (just look at journals from the 19th century to see how effective immediacy is). It also makes the peer review process much simpler. That isn’t to say that data can’t be stored in a repository, but that, in the first instance, it might be better placed with the article. Indeed, I could see the research councils requirement for copies of publications and data deposition taken a stage further and requiring research articles to have data published with them. Clearly the emphasis is then shifted to the journals many of whom will not be placed to deal with it. However the whole research publication ethos is changing (e.g. open access) and it is time that journals become proactive. Indeed, with Wiley and Elsevier being so prominent (and supporting things like permanent electronic archives), it would only require these two organisations to support such an initiative for it to really take off. Whilst in principle it sounds a reasonable idea, there are many barriers. Not least the sheer volume of some data sets within a web based infrastructure where most journals struggle to offer more than a static PDF.