I’ve blogged before about processing RAW digital imagery, although that was in relation to extracting raw values that can be calibrated to give radiance. More recently there has been increased interest in workflow processing of digital photography. Whilst I am a big fan, and user, of Adobe’s Photoshop Elements, primarily because it includes their photo organiser, it is still a little weak at pulling photos off the camera in a professional manner. Currently it will pick up RAW or JPEG images and import them into my collection (automatically storing them in a date-named directory). From there I can “tag” them so that the collection is easy to navigate. It also allows one-click lossless rotation, red-eye removal etc for quick image processing, based around a versioning system so that you always keep previous versions of the image. All good stuff, although note that unless you take care setting the system up it is quite difficult to back up without doing a full system backup. In particular, all tagging and photo attributes are stored in an Access database within your profile. Keep it safe! And woe betide you should you wish to move all your photos to another directory, as absolute pathnames are hard-coded into the database. Whilst it is possible to change them, it requires getting your hands dirty.
Anyway, digression. Several companies had noticed a gap in the market here, particularly for professional/amateur photographers. In order to get the best out of your imagery you need to shoot in RAW mode; this mode only stores the raw image data, but allows subsequent manipulation at a very low level. And your image processing software is far better at doing this than your camera, which is severely time-limited when taking a shot. Pixmantec are of note in this area and produced free and “pro” versions of their workflow software. This allows you to bring imagery into a directory of your choice, tagging the “quality” of the shot and then applying standard pre-processing techniques during import: in particular automated white balance, contrast stretching and sharpening. Whilst the free version didn’t include some of these features (notably contrast stretching), these are available in most image processing packages, so it still allowed access to the others. I’m currently playing with it as it offers a wealth of flexibility. Notice I said “produced”: Adobe has now bought the company and the developers are porting much of the RAW processing capability to Adobe’s Lightroom product, which is currently in beta. So Lightroom is the product to watch (and you can download a Mac or Windows beta), but for the time being you can still get the free version of Pixmantec. Grab it whilst you can!
I attended an RSA public debate on access to data collected by public bodies. This was co-sponsored by The Guardian as part of their Free Our Data campaign. The speakers were Vanessa Lawrence (CEO Ordnance Survey), Paul Crake (RSA Programme Director), David Vaver (Director, Oxford Intellectual Property Research Centre), Charles Arthur (Technology Editor, The Guardian) and Carol Tullo (Director, Office of Public Sector Information), chaired by Derek Wyatt MP (audio available here). There were probably more people “in” the industry forming the audience than were on the panel. A quick scan of faces saw people from the BBC (Bill Thompson), UK Hydrographic Office (CEO Wyn Williams), Groundsure, AGI (Angela Baker), Ordnance Survey (Ed Parsons), OpenStreetMap, National Archives, GDC and many others.
The evening started with a 5 minute statement from each speaker, followed by a question and answer session. Whilst the academic value of these events is sometimes questionable (no question or answer is ever free from rhetoric and based entirely upon fact), they are very useful in making the debate far more public. Views ranged from completely free access to publicly collected data (Charles Arthur) to making the most commercial use of said data (Vanessa Lawrence). It was perhaps a little unfair to have the OS “in the spotlight”, but as they were present much focus was upon them. It should be said that no one questions the quality of their products, and it so happens that they are one of the most successful public bodies for generating money. What interested me was the apparent progress that public bodies have made in making their data available (Carol Tullo), particularly through the adoption of “Click-Use” licences (National Statistics being a good example). The National Archives also highlighted the fact that they make their historic census data available “at cost” to commercial providers, who are able to re-sell the material at about a tenth of the cost that the NA would have to charge. Paul Crake made the statement that there needs to be a balance between copyright (the “protection” of ideas) and freedom, which drives innovation (something the RSA has been involved in for over 250 years). This was taken up by David Vaver, who gave a good historical summary of the reasons copyright law is so different between the US and UK, making the point that whilst the Freedom of Information Act guarantees us access to data, it doesn’t mean that it cannot be charged for, and it may well be copyrighted.
The overarching conclusion I took away was that free access to data is a good thing that will drive innovation. However it is not appropriate for all bodies (much of the UK Hydrographic Office’s non-UK data, for instance), and if there is no commercial charging then the government needs to fund that body (and maintain standards at the highest level). Vanessa Lawrence is clearly not in favour of such a move for the OS. The debate goes further than this though. It is not a decision for individual bodies as they follow the direction of government; it is a government (and hence political) issue that needs people like Derek Wyatt to take on board. As David Vaver stated, the UK begins from the premise that all government data should be copyright, with exclusions applying. Really this should be turned on its head; such a radical move would have truly positive, ground-breaking repercussions for the UK, its economy and, ultimately, its population.
I had a trip to Stonehenge today. A unique experience, as normally you are not allowed within 200m of the stones; however, certain groups can book to actually wander amongst the stones for a private viewing prior to opening. I was with the Young Archaeologists’ Club and had got permission to fly my kite and camera. Unusually for Salisbury Plain, there was virtually zero wind, which gave me the perfect opportunity to test my new Dopero125 from KAPShop.
For those not in the know, framed kites can fly at lower wind speeds, at steeper angles, and gently sail to the ground when the wind drops. All good traits. Never having flown a Dopero before, it was a baptism of fire. Quite simply it was stupendous; the quality from Jones Airfoils was excellent and it was brilliantly stable with very good lift. My rig and camera come in at 1kg and there were no problems here. First results to the right!
Remotely sensed images are an important data source for the mapping of glacial landforms and the reconstruction of past glacial environments. However the results produced can differ depending on a wide range of factors related to the type of sensors used and the characteristics of the landforms being mapped. This paper uses a range of satellite imagery to explore the three main sources of variation in the mapped results: relative size, azimuth biasing and landform signal strength. Recommendations include the use of imagery illuminated with low solar elevation, although an awareness of the selective bias introduced by solar azimuth is necessary. Landsat ETM+ imagery meets the requirements for glacial landform mapping and is the recommended data source. However users may well have to consider alternative data in the form of SPOT, Landsat TM or Landsat MSS images. Digital elevation models should also be considered a valuable data source.
My main system at home died this week so I have been working off my old laptop (a Vaio N50SX designed for Windows 98!). Being just a little decrepit (no battery and 128Mb RAM) it has ArcView 3.2 on it but no ArcMap. I’ve therefore been relearning all those old ArcView skills. I use ArcView less and less nowadays and really only come back to it for running the odd Avenue script. Anyway, I have been transcribing some x,y coordinates in Excel and needed to do the following:
1. Import x,y data
2. Add projection information to shapefile
3. Overlay on to basemap
Whilst this would have been fairly quick and easy in ArcMap (because I know how to do it!), I had to figure out the “ArcView way”. Which, surprisingly, is quicker and more intuitive than ArcMap!! That’s life I guess. Anyway, the solutions to these three tasks were:
1. I exported my coordinates from Excel as a DBF file and then added it as a table to ArcView. The “Add Event Theme” menu option then takes the coordinates and creates a point shapefile.
2. Use the Projection Utility Wizard (make sure it’s enabled as an extension) to add projection information.
3. ArcView doesn’t (I believe!) support on-the-fly reprojection. My basemap of the world was in geographic Lat/Long, so I used the Projection Utility Wizard to add projection information and then convert it into the local coordinate system.
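As an aside on step 1: the DBF file that Excel exports (and that ArcView reads as a table) is just the venerable dBASE III format, and it is simple enough to write by hand. Here’s a minimal sketch in Python of producing a two-column x,y DBF directly, should you not have Excel handy. The field names and coordinate values are purely illustrative.

```python
import struct
import datetime

def write_dbf(path, fields, rows):
    """Write a minimal dBASE III (.dbf) table of numeric columns.
    fields: list of (name, width, decimals); rows: tuples of numbers.
    This is a bare-bones sketch, not a full dBASE implementation."""
    record_len = 1 + sum(w for _, w, _ in fields)   # +1 deletion flag
    header_len = 32 + 32 * len(fields) + 1          # +1 0x0D terminator
    today = datetime.date.today()
    with open(path, "wb") as f:
        # File header: version 0x03, date (Y-1900, M, D), record count,
        # header length, record length, 20 reserved bytes.
        f.write(struct.pack("<BBBBLHH20x", 3, today.year - 1900,
                            today.month, today.day,
                            len(rows), header_len, record_len))
        # One 32-byte descriptor per field; type 'N' = numeric.
        for name, width, dec in fields:
            f.write(struct.pack("<11sc4xBB14x",
                                name.encode("ascii"), b"N", width, dec))
        f.write(b"\x0d")                            # end of header
        for row in rows:
            f.write(b" ")                           # "not deleted" flag
            for value, (_, width, dec) in zip(row, fields):
                # Numbers are right-justified, space-padded ASCII.
                f.write(("%*.*f" % (width, dec, value)).encode("ascii"))
        f.write(b"\x1a")                            # end-of-file marker

# Hypothetical x,y pairs as they might come out of a spreadsheet:
write_dbf("points.dbf",
          [("X", 12, 4), ("Y", 12, 4)],
          [(412345.0, 131234.5), (413000.25, 131500.0)])
```

The resulting points.dbf can then be added as a table and turned into a point shapefile via “Add Event Theme” as in step 1.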
Whilst it took a little while to work out all these steps it was somewhat intuitive. Still plenty of life left in the old dog as they say!!
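On the reprojection point in step 3: because ArcView converts coordinates physically rather than on the fly, it is worth remembering what such a conversion actually does. As a toy illustration only (this is emphatically not what the Projection Utility Wizard computes, which is a proper datum-aware transformation), here is a simple equirectangular conversion from geographic lat/long in degrees to metres on a local grid; the centre point and test coordinates are made up.

```python
import math

def to_local_grid(lat, lon, lat0=0.0, lon0=0.0, radius=6371000.0):
    """Toy equirectangular projection: geographic degrees to metres
    on a grid centred at (lat0, lon0), using a spherical Earth.
    Illustration only -- real tools use datum-aware transformations."""
    x = radius * math.radians(lon - lon0) * math.cos(math.radians(lat0))
    y = radius * math.radians(lat - lat0)
    return x, y

# Two hypothetical lat/long points converted to a grid centred near
# Salisbury Plain:
for lat, lon in [(51.178, -1.826), (51.5, -0.12)]:
    x, y = to_local_grid(lat, lon, lat0=51.0, lon0=-2.0)
    print("%.1f, %.1f" % (x, y))
```

One degree of latitude comes out at roughly 111km, which is a handy sanity check when eyeballing any converted dataset.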
I’ve just returned from an exhausting three days at the British Geomorphological Research Group’s Annual Conference in Loughborough. The BGRG is the national professional society for geomorphologists in the UK and this conference (incorporating the AGM) was important because it oversaw the transition from being a “research group” under the Royal Geographical Society to becoming a separate charitable entity and expanding its scope. As a result it has now been renamed the British Society of Geomorphologists (BSG; not to be confused with the BGS!). So it’s a new era.
The conference itself was expertly organised by Steve Rice (Loughborough) and Mark Macklin (Aberystwyth) and was almost a “who’s who” of geomorphology (well, in the UK at least!). There was also excellent overseas attendance which added hugely to the scope of the conference. There was a large poster area and one series of talks (i.e. no parallel sessions) organised thematically with a “lead” presentation. The theme for the conference was geomorphology and Earth Systems Science; Keith Richards kicked off with a philosophical presentation about the “state” of the subject and suggested that ESS is really LESS, which is actually more. You had to be there!
Bill Dietrich received a “services to geomorphology” award and gave two excellent and thought-provoking presentations (“a geomorphological signature of life”, following work presented in Nature). The other eye-opener for me was a presentation by Steve Tooth (Aberystwyth) talking about a river system in Botswana (Okavango Delta). Two notable points: the first was the change from a straight channel to excessive meandering as a result of a change in substrate (i.e. granite to sandstone). The second was the effect of hippos on both channel erosion and discharge in a wetlands environment. Quite astounding!! To be honest, whilst Heather Viles (Oxford) noted the lack of “interest” in biogeomorphology early on in the conference, it was actually probably the main underlying theme, particularly highlighted by Bill Dietrich. Biota are fundamental to all geomorphic processes and it is clearly an area that is receiving much attention.
My only complaint was that there was probably too much; you can always take it or leave it though. All in all an excellent three days, and I can thoroughly recommend next year’s conference (Birmingham) to those interested in geomorphology, physical geography, earth systems science and allied areas.