It is with great pleasure that I am able to announce the award of the 2016 “Best Map” to Bernhard Jenny (RMIT University), Johannes Liem (City University London), Bojan Savric (Esri Inc) and William M. Putman (Goddard Space Flight Center) for their animated map visualizing a year of changes to Earth’s CO2, titled “Interactive video maps: A year in the life of Earth’s CO2”. When the map first loads it appears as an animated map of the world, showing just how dynamic this part of the Earth system is. But interact with the map - you’ll find it’s pannable and zoomable - and all other ways of interacting with 4D data seem mundane in comparison.
The awards committee noted the remarkable interactive animation: something that both tells a story and allows you to investigate, a big leap forward for interactive cartography that draws the viewer in and allows them to formulate potential global implications. For these reasons it is a deserving winner of this year’s award.
As you will see with this Editorial, it has been a year of intense activity at the Journal of Maps (JoM). The most important announcement is the move of JoM back to an Open Access (OA) publishing model, which was effective from 1st September 2016.
Art-geoscience: exploring interdisciplinary representations of space and place
We would like to invite contributions to a special issue of the Journal of Maps devoted to interdisciplinary collaborations between the arts and sciences, with a specific focus upon an exploration of a location using, at least in part, some form of mapping and ideally involving the collaboration of artists and scientists.
The fundamental basis for this special issue is the growing interest in interdisciplinary collaboration, and in particular the crossover between the arts and sciences. Art is seen as an important component in exploring and explaining science, whilst science offers new avenues for creative investigation and recording of phenomena. This is a general call for a special issue entitled ‘art-geoscience: exploring interdisciplinary representations of space and place’ and provides an opportunity for collaborative researchers to present their work.
Recent years have seen increased collaboration between the arts and sciences, with conferences, exhibitions and residencies devoted to exploring the inspirations and mutual benefits that can arise from activities that bridge the two spheres. Subjects such as biology, chemistry, and global climate change commonly feature prominently in such collaborations, but many of the geosciences (e.g. geomorphology, geology, geophysics) are less well represented.
Despite rapid movements towards global connectedness, with people, goods, services and scientific data now moving at speed over vast distances, space and place still retain great power in shaping the world. Many visual art forms can help to document and represent such themes, especially when combined with various forms of mapping.
Without constraining the range of topics that are potentially suitable for inclusion in the special issue, we offer the following as examples:
- use of scientific methods or techniques specifically for an artistic investigation of a location;
- scientific data already collected for a location-based project that are re-used or re-purposed for artistic means;
- artistic data or outputs that are re-purposed and re-used for a location-based, scientific project;
- use of artistic techniques to investigate phenomena and/or enhance presentation and communication of scientific data.
The artistic medium can be anything that can be reasonably explained or presented within the journal. Beyond the inclusion of traditional mapping products (see below), we are keen to see submissions that may also use 3D models, video or audio to enable space- and place-based representations, or videos that present and explore the artistic work itself.
All papers are expected to consist of a map or series of maps (loosely and broadly defined to include various forms of spatial representation) accompanied by brief explanatory text. Papers should be bespoke, and the mapping of good quality. All papers in this special issue will be peer reviewed. To submit a paper, authors should do the following:
1. Submit a short draft (500 word limit) outlining the key themes and scope of the paper, where possible including example mapping, by 28 February 2017.
Abstract selection will be by the special issue editorial team. You will receive a notification by 31 March 2017.
2. Submit a completed paper (4000 word limit) by 30 June 2017.
3. The special issue will be published in 2018.
Ideally, the work would involve the collaboration of artists and scientists.
The special issue editorial team are happy to discuss ideas for papers and their suitability with potential contributors prior to the short draft submission stage. Please email Mike Smith (email@example.com) or Stephen Tooth (firstname.lastname@example.org) in the first instance.
All submissions should be made via the Journal of Maps website (http://www.tandfonline.com/toc/tjom20/current) where further guidance on all aspects of submission can be found. Please note the journal is open access, with an article processing charge of £400.
Stephen Tooth, Aberystwyth University, UK
Mike Smith, Kingston University, UK
Heather Viles, University of Oxford, UK
Flora Parrott, Tintype, London, UK
Kingston University put out a press release about our Reading-Landscape Project, which we presented at the RGS-IBG Annual Conference in the summer. It’s a nice piece that the press office has put together, taking some quotes from me and Flora and offering a little more reflection upon the overall achievements of the whole group.
I was recently teaching a class on introductory cartography where we were using a range of different socio-economic datasets, including the 2011 counties and middle layer super output areas (MSOAs) of the UK from the UK Data Service. These are (helpfully) made available in a range of different formats, including the ubiquitous shapefile. They are useful for choropleth mapping of socio-economic (census) data, for use as location maps, and for clipping other datasets to include topographic data on maps (e.g. Meridian 2).
One student wanted to generalise the polygons for the location map - thinking this would be easy, he went ahead and ran the toolbox tool but ended up with lots of sliver polygons as a result. Crucially, as a shapefile doesn’t store topological relationships, the tool was generalising each polygon separately, resulting in a very poor output. This was exacerbated by the fact that the borders were provided pre-generalised.
The obvious solution is to use a topological version of the data - which isn’t provided. The next step is therefore to create the topology in ArcGIS before generalising it. Whilst not difficult, it is a little convoluted to achieve! I found this page particularly helpful and it provided the core of the processing (and remember, as with all computing instructions, you need to follow it to the letter!), which can be carried out in ArcCatalog. In short, the steps are:
1. Create a new geodatabase (either file or personal)
2. Create a new feature dataset within that
3. Import the shapefile into the feature dataset
4. Create new topology in the feature dataset
4a. For the topology you will need to use two rules: (a) “Must Not Have Gaps” and (b) “Must Not Overlap”
4b. This will throw an error where you have coastlines because (obviously) you have a gap!
5. At this point you now have built topology for the dataset and you can proceed to simplify/generalise the borders. Note that there will be multipart polygons present and if (like me) you want to delete any small islands to clean up data for use as a location map then you will need to run the “multipart to singlepart” toolbox tool.
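As a toy illustration of the island clean-up in step 5, here is a minimal pure-Python sketch (my own helper functions, not ArcGIS code; the area threshold is an assumption you would tune to your map scale) that explodes a multipart polygon into single parts and drops any part below a minimum area, using the shoelace formula:

```python
def ring_area(ring):
    """Area of a simple polygon via the shoelace formula.

    ring is a list of (x, y) vertices; the closing vertex is implied.
    """
    total = 0.0
    for (x1, y1), (x2, y2) in zip(ring, ring[1:] + ring[:1]):
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

def drop_small_islands(parts, min_area):
    """Explode a multipart polygon and keep only parts >= min_area.

    parts is a list of rings, one per single-part polygon.
    """
    return [p for p in parts if ring_area(p) >= min_area]
```

In ArcGIS the equivalent after running “multipart to singlepart” would be a selection on a calculated area field, deleting the selected slivers and islets.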
This all proved a little more long-winded than I was expecting, but such is the price of topology! It did make me wonder if I could (easily) do this in QGIS, and my initial research suggests not. Yes, the latest versions of QGIS have the built-in Topology Checker plugin, which checks topology (doh!), but as far as I’m aware there is not an open source file format that supports topology. The grown-up solution would be to use a PostGIS/PostgreSQL database, but this isn’t particularly useful when you want to distribute data. If anyone knows better (or can correct me) then please do get in touch!
… and the stories they tell. The Washington Post ran a nice story earlier this month mapping the extent of infrastructure in the US, in response to Donald Trump’s (sketchy) plans to invest in infrastructure projects. This was subsequently followed up with a nice blog post on how the maps were created and, in particular, the sources of data and an idea of the data wrangling going on behind the scenes. What’s telling here is the simplicity of the rendering, and that the journalists use QGIS because it’s free, but that Photoshop and Illustrator (rather than GIMP and Inkscape) are still the graphic artists’ tools of choice. I wonder if this would be any different if there was GIS expertise on their teams to support the graphic designers…
My big lesson was the importance of a simple message, and saying it the same way over and over. If you’re going to change it, change it in a big way, and make sure everyone knows it’s a change. Otherwise keep it static.
…and that goes for any type of communication. Keep the core message simple to understand because whilst the implications may be profound, your target audience needs to be able to take it in and interpret it unequivocally.
A nice overview and comparison of the shapefile, personal geodatabase and file geodatabase from the guys at GIS Geography. It’s a good, succinct summary and review of the pros and cons. They do note that the file geodatabase is proprietary (to Esri), but not that the shapefile is too. And whilst (quite a while ago) Esri published a whitepaper detailing the specification of the shapefile, it’s worth noting that they have released an API to the file geodatabase as well.
Sense About Science kicked off their Evidence Matters campaign earlier this year and this month held a meeting in Parliament to push the importance of policy decisions based upon factual evidence. That is, making decisions that have an impact upon society for the benefit of all, not simply to push a political agenda or to follow what politicians believe when it isn’t what the evidence shows. The corollary is not ignoring evidence: once it has been collected and presented, don’t dismiss it simply because you don’t like it (the so-called “post-truth society”). It’s critically important for the community we share and the environment we inhabit. There shouldn’t be elitist strongholds on decision making, but egalitarian approaches that value all.
Way back in 2009 I published a paper on the Cookie Cutter which outlined a method (and accompanying script) for calculating the volume of drumlins. This worked in ArcGIS 9.2 using the Python interface to a number of ArcGIS toolbox tools. Fast forward 9 years since I first wrote the script and, not too surprisingly, it doesn’t work (thanks for telling me Arturs!).
I finally sat down a few weeks ago to bug-fix the script, which was actually easier than I thought it would be. It actually comprises two scripts - the first sets up some working directories and takes an input shapefile, splitting it into a number of new shapefiles (one per drumlin). The second script then performs the volume calculation on each drumlin. It turns out (given I’m pretty much only calling Toolbox tools) that there wasn’t much to fix… a third-party script splitting the initial shapefile had to be removed, a bug in the command adding a new field fixed, and the paths updated to reference ArcGIS 10.4. For those wanting to use it, please download the attached files and follow the notes below.
- I use WinPython 2.7 and the excellent Spyder IDE to run the scripts
- in Spyder you need to change the Python console to the path of the one that ArcGIS has installed. Go to Tools -> Preferences -> Console -> Advanced Settings, then change “Use the following Python Interpreter”. It should be something like:
- at the top of the cookie_setup script set the project_directory to the location of the main input shapefile. For outlines, set the name of the input shapefile
- Press F5 to run the setup script, creating the working directories and adding a new field to the shapefile
- the next part needs to be performed manually (I haven’t had time to add in and test the Toolbox call)… add a new text field to the outlines attribute table called “split” and consecutively number each row from A1 to An (i.e. your last row). In QGIS the expression in the field calculator is
concat('A', @row_number)
- save the file then use the ArcGIS Toolbox tool Analysis Tools->Extract->Split
- use this to split the outlines shapefile based upon the “split” field you just created. Specify the “Target Workspace” as the “input” directory that has been created in your project directory
- now load Cookie_Cutter into Spyder and again specify the following 5 inputs:
project_directory : the project directory
nextmap_dsm_img : the input DEM
gp.cellSize : the DEM cellsize (in metres)
tension_parameter : leave this as it is
buffer_parameter : the distance to buffer your drumlins (in metres). The example shows 20m, for a 5m DEM
- on lines 66-68 you *might* need to change the path to the listed toolboxes. This is specified for 10.4 at the moment
- RUN IT! The console pane in Spyder should show you a whole load of information as it processes each drumlin. There will be a counter showing you which drumlin you are on
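For reference, the consecutive “split” labels from the manual step above (A1 to An) are trivial to generate in plain Python too; a minimal sketch (my own helper, not part of the scripts) mirroring the QGIS field calculator expression:

```python
def split_labels(n_rows):
    # Consecutive labels A1, A2, ..., An - one per attribute-table row,
    # equivalent to concat('A', @row_number) in the QGIS field calculator.
    return ["A{}".format(i) for i in range(1, n_rows + 1)]
```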
The key output is the Volume_Master.dbf table, which you can open in Excel. It contains the zonal statistics from ArcGIS for each individual drumlin (subtracted from the cookie-cut DEM). The critical value is the SUM column, which gives the total height of all pixels within the drumlin. Multiply this by 25 (for a 5 m × 5 m pixel) to give the drumlin volume.
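To make the arithmetic explicit, here is a one-line sketch (my own helper, not part of the distributed scripts) of the SUM-to-volume conversion:

```python
def drumlin_volume(height_sum, cell_size):
    # height_sum: the SUM value from Volume_Master.dbf, i.e. the total of
    # the per-pixel residual heights (in metres) inside one drumlin outline.
    # cell_size: DEM cell size in metres; each pixel covers cell_size**2 m^2.
    return height_sum * cell_size ** 2
```

For the 5 m DEM in the example, cell_size**2 = 25, matching the “multiply by 25” rule of thumb.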
UPDATE: If you can’t (or don’t want to) use Spyder you can just run the py script directly from the command line using the interpreter that ships with ArcGIS.