I attended the International Glaciological Society’s British Branch meeting a couple of weeks ago at the School of Geosciences in Edinburgh. As an aside, living close to Luton airport, it is quite convenient to cycle to the airport and fly up. It actually took me longer to get the bus into the city centre! And the web check-in at Luton really is very good. What it did bring home to me is that, whereas in the past Scottish universities were marginalised by their relative “remoteness”, this remoteness has now shifted to universities in northern England such as Newcastle, Durham and York. It took 4 hours to get to Durham, compared to 2 hours to Edinburgh.
This is my second British Branch meeting and it was again a mellow affair with plenty of post-graduate presentations. These are always very good to see; not only do you get a feeling for where “current” research is going, but they provide a great forum for the students to “practise”. Long may this tradition continue. Most of the glaciology centres had some form of presence (Aberystwyth, Bristol, Edinburgh, Queen Mary, Swansea, Scott Polar, BAS), although Durham, Leeds and Sheffield were notable for their absence, so it was a well represented meeting. Being interested in the cross-over between glaciology, remote sensing, GIS and “terrain”, Tavi Murray’s talk on volume changes of glaciers on Spitsbergen was of particular interest. NERC ARSF have had a couple of summer campaigns collecting, amongst other things, LiDAR. These campaigns formed part of a larger NERC project that developed long-baseline GPS techniques to enable the economic collection of data. The LiDAR data were then used alongside the Norwegian historic archive of airborne survey imagery. Whilst the imagery had been flown with appropriate stereoscopic overlaps, it had never been processed. The advent of digital photogrammetry made this economically viable, although GCPs for rectification were still required. And this is where the LiDAR came in: because the LiDAR were accurately georeferenced using DGPS and an onboard INS, the terrain data were accurately positioned. The intensity data could then be used to provide GCPs for the digital photogrammetry, allowing DEMs to be generated from this archive imagery. To the project’s surprise, there was sufficient contrast on the glaciers to allow the extraction of terrain elevations and so quantify volume changes of ice for a number of glaciers. All in all, a very neat integration of technology and multi-epoch data.
I’ve recently returned from a trip to the Geography Department at the University of Durham, where I gave two research seminars on some of my close-range remote sensing and DEM visualisation work. There is a group there that is particularly interested in DEM visualisation for landform mapping and I was helping them get going on this. Durham is a very interesting place to visit, with quaint cobbled streets, a cathedral and a castle. The castle is actually part of the university, which is nominally split into colleges in a manner similar to Oxbridge. If you are lucky enough to be part of Castle College then you can get lodgings in the castle and, as a member of staff, go to the Senior Common Room for coffee. All rather reminiscent of days gone by (and the coffee is OK).
Which brings me to the final part of my trilogy: the dodgy wifi on the GNER train. I guess I should be impressed that GNER actually have this installed, and I should say that it does work. At £3 for 30 minutes it’s not desperately expensive, although I gather that from later this year it will actually be free. Anyway, getting connected is easy, but the service is slow. Downloading a 30Kb email takes up to 10 seconds, which is painful! You don’t want to be downloading 5Mb email attachments. The service also had the habit of dropping out, particularly in stations; I’m not sure exactly how the service is run, but to be honest it’s borderline as to whether it’s commercially chargeable.
If you’ve read any of my blog entries over the last year, then you will realise that I’m a regular LaTeX user, primarily for typesetting at the Journal of Maps. Until recently, however, I had never used LaTeX to write a research article for submission to a journal, and my initial experience is that I won’t be repeating it! Much of my academic writing with co-authors is performed remotely. It is therefore essential, when sending a manuscript back, to keep versions and identify where any changes have been made. MS Office (and OpenOffice for that matter) both have change tracking and it is a simple matter to work through the changes, accepting or rejecting them. LaTeX is a typesetting system and, as far as I’m aware, is not designed for change tracking. That’s not to say it isn’t possible with PDFs (and I would be interested to hear of any LaTeX packages that might help this process), as Adobe Acrobat Pro has long supported annotating PDFs. However, the PDF is not the source document, so you then need to reintegrate these changes into the original file. It’s fine for sending proofs to authors, but not good when writing a paper. This whole process was brought home when a reviewer for one paper I submitted noted that the paper felt rushed and slightly disjointed. This wasn’t intentional but, looking back at the paper, was a result of “round-tripping” the article using LaTeX. I’d be interested to hear if anyone else has experiences along these lines.
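One candidate I’ve since come across (though not yet tried in anger) is latexdiff, a Perl script that ships with most TeX distributions. Given two revisions of a .tex file it produces a third, marked-up file showing what changed; the filenames below are placeholders for your own versions.

```shell
# Compare two revisions of a manuscript and produce a marked-up version.
# old.tex and new.tex are placeholder names for the two drafts.
latexdiff old.tex new.tex > diff.tex

# Compile the marked-up file as normal; with the default style,
# additions are underlined and deletions struck through.
pdflatex diff.tex
```

It isn’t the same as accepting or rejecting individual changes in Word, but it does at least make the differences between drafts visible to co-authors.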
JPEG2000 is rapidly becoming one of the most popular image formats due to its high compression, lossless/lossy modes and open specification. NASA use it for distributing HiRISE images, whilst I recently downloaded some photogrammetrically scanned air photos from NERC in JP2. Whilst quite a lot of packages now support JP2 (both remote sensing packages such as Imagine and graphics packages such as Photoshop), they can be painfully slow. Thankfully IAS Viewer is a free Java applet that can view both online and offline imagery VERY quickly. Well worth using if you need to browse imagery.
And if you really are wedded to Photoshop (or other graphics package) then try the j2k plugin. It’s also pretty fast and works well; in Paint Shop Pro it loaded a JP2 image that the native importer wouldn’t.
I was speaking at the Society of Cartographers’ 43rd Annual Summer School this week (a gentle introduction to spaceborne DEMs) and, whilst I was only there for the day, a thoroughly relaxed and pleasant affair it was too.
The speaker after me was Daniel Thomas from the Institute of Cosmology and Gravitation at Portsmouth. He gave an excellent introduction to astronomy and how he is involved in the massive Sloan Digital Sky Survey to map more than a quarter of the entire sky using a 2.5m telescope and 120MP digital camera. As you can imagine, it generates quite a bit of data!! Anyway, as part of the project the team want to classify the 1 million or so galaxies that will be imaged. Computers are not particularly good at this type of classification (spirals, ellipses, edges, mergers and unknowns!) so the group has set up Galaxy Zoo, a public website that allows anyone to classify galaxies for them. A brief introduction allows you to see examples of the types of things they want to classify. This is followed by a brief test to make sure you’ve understood, then away you go. Well worth a try!