A couple of nice blog posts from the OS… the first looks at the Top 10 mapping moments in OS history. A nice little primer highlighting some of their products, their data and their history. You might find something useful!
The second, Recreating historic maps: interview with Christopher Wesson, is an interview with Chris Wesson, one of their cartographers, on the recreation of one of the OS’s old map styles, but using new data. In his words: “I decided to take the earliest OS cartographic representations I could find for that scale [1”] for each feature used on the map, so it is a hybrid of maps from the 19th century as opposed to a direct copy of the 1801 Kent map style.” The interview walks through the overarching method (and software) used to produce it. Really good to see how heavily the design and data sides intermingle in its production. Good read.
An interesting article over at Amateur Photographer picks up an Olympus press release about the development of an RGB/NIR sensor for use in consumer-grade cameras. The use of digital cameras for NIR imaging (e.g. my dead leaf photo) has been common for many years and is achieved by using a longer exposure (as the sensor is less sensitive to NIR) and placing an IR-pass filter in front of the lens (e.g. Hoya R72). Specialists such as Advanced Camera Services will even convert your camera to IR by removing the internal IR-cut filter. senseFly use a modified Canon S110 for the eBee UAV which can image in RGB, NIR or red edge. Which is why the Olympus announcement is interesting (for the lightweight/low-cost UAV sector), as I’m not aware of a major manufacturer developing a single sensor for imaging four bands. The traditional approach is to place a Bayer array over a sensor sensitive to RGB and then interpolate (demosaic) the image into three RGB layers. Olympus appear to have extended this to four bands by developing realtime demosaicing to support it. The sensor is probably a standard one, albeit perhaps more sensitive to NIR. Lead time could be a while as this is in development, but it clearly shows the direction of travel.
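To make the demosaicing idea concrete, here is a minimal sketch of how a 2x2 RGB/NIR mosaic could be interpolated into four full-resolution bands using nearest-neighbour interpolation. The filter layout (R, G over NIR, B) is purely my assumption for illustration — Olympus have not published their pattern, and real demosaicing uses more sophisticated interpolation than this:

```python
import numpy as np

def demosaic_rgbn(mosaic):
    """Split a 2x2-pattern RGB/NIR mosaic into four full-resolution bands.

    Assumed (hypothetical) pattern per 2x2 cell:
        R   G
        NIR B
    Each band is filled by nearest-neighbour interpolation: every pixel
    in a 2x2 cell takes the single measured value of that band in the cell.
    """
    h, w = mosaic.shape
    # (row offset, col offset) of each band's sample within the 2x2 cell
    offsets = {"R": (0, 0), "G": (0, 1), "NIR": (1, 0), "B": (1, 1)}
    bands = {}
    for name, (dy, dx) in offsets.items():
        samples = mosaic[dy::2, dx::2]                      # one value per cell
        full = np.repeat(np.repeat(samples, 2, axis=0), 2, axis=1)
        bands[name] = full[:h, :w]                          # trim to original size
    return bands

def ndvi(bands):
    """NDVI from the interpolated bands: (NIR - R) / (NIR + R)."""
    nir = bands["NIR"].astype(float)
    red = bands["R"].astype(float)
    return (nir - red) / (nir + red + 1e-9)
```

A single-sensor four-band capture like this is exactly what makes the announcement attractive for UAV work: one exposure yields co-registered RGB and NIR, so indices such as NDVI need no band-to-band alignment.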
It’s Impact Factor time!! All the movers and shakers will be looking through the citation reports from Thomson Reuters to see how the different publications are performing and find out who’s up and who’s down!! Yet again it’s very pleasing to report an increase at the Journal of Maps, this time from 1.19 to 1.44. As I said last year, the 1.0 boundary is a watershed as it is the point at which there are more citations than articles published. This year’s editorial summarises this performance, showing that the big change was an extra issue, which increased the number of articles from 62 (2014) to 72. Given the in-built lag in the 2-year Impact Factor I would ordinarily expect a decrease, so this shows that we have a “rising roll” of incoming citations. In short: excellent performance. Downloads finished the year at 33,000, up from 26,000.
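The arithmetic behind the 2-year Impact Factor is simple: citations received this year to items published in the previous two years, divided by the number of citable items published in those two years. A minimal sketch — the citation and item counts below are illustrative only, not the Journal of Maps’ actual figures:

```python
def impact_factor(citations, citable_items):
    """2-year Impact Factor: citations received in year Y to items
    published in years Y-1 and Y-2, divided by the number of citable
    items published in those two years."""
    return citations / citable_items

# Illustrative only: 124 citable items over two years receiving 179
# citations gives an IF of ~1.44. Anything above 1.0 means more
# citations came in than items were published.
print(round(impact_factor(179, 124), 2))  # → 1.44
```

This also makes the extra-issue effect clear: adding articles enlarges the denominator two years later, which is why a rising IF alongside a larger article count implies citations are growing faster still.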
And don’t forget to look at this year’s “Best Map” winner which is available for free download from Taylor and Francis.
A nice primer on R Project for Maps which provides a good introduction to the use of R for a range of GI applications, be they webmapping or raster analyses. The list of libraries and the discussion of Microsoft R Open are particularly good. Worth delving into.
1. Bloom’s Two Sigma Problem: Benjamin Bloom tested the improvement in student learning under three styles of teaching: conventional lecture; “mastery” learning (incorporating staged learning, peer assistance and strong feedback loops); and one-to-one tuition. The last gave the best performance, but the second gave a one-sigma increase in performance for limited resource input (Donald Clark covers this in part). Don’t take anyone else’s word for it, read the paper!!
2. Spaced Repetition: closely related to deliberate practice. The brilliant Bounce by Matthew Syed covers this - I blogged about it in detail. Practise repetitively, at the edge of your ability, with feedback. You will become exceptional.
3. Nudge Analytics: interesting to see this listed here and I fully agree. This is about small changes in environment (“nudges”) that can lead to big changes in behaviour and performance. Expect this one to have increasingly profound effects.
There is something for everyone to take away here - education is a shared experience between teachers and learners. We need teaching that is passionate and focused upon learners’ needs. But equally we need learners who want to learn; otherwise these strategies will have limited impact.
Thinkwhere provide a nice summary of A Digital Humanities Primer for English Students, a guide written by MA student Jenna Herdman. It’s a series of relatively straightforward tutorials, pulling together the relevant skillsets for the spatial and literary analysis of various texts. It’s both useful and shows how the “digital humanities” (interesting title, but that’s a whole new blog post!) are pushing subject boundaries. What I found interesting was the lack of engagement with the underlying geographic information and the reliance on cloud-based services. The first is a fairly natural transposition of ideas - not too dissimilar to the way, for example, statistics is used in various subject areas. The latter is perhaps more interesting as it reflects the way the software industry has moved many applications to online services - but it also reflects how younger researchers rely on online applications. QGIS is arguably a significantly better resource for this type of geospatial use, but is probably largely unknown outside of the specialist sector.
Well, just when you thought the Finch Report had had the last word in open access publication and access to government funded research… we find that the government itself is, errr, withholding access to government funded research! Yesterday Sense About Science published its report entitled Missing Evidence.
Yes, it’s true, but rather than premeditated political malfeasance to withhold information and evidence, what the report has found is
“weak rules and chaotic systems. It turns out that we don’t know what has become of millions of pounds of government-commissioned research. Government itself doesn’t know: some departments have no idea how much research they have commissioned, whether it was published, or where it all is now.”
Not surprisingly, then, the recommendations are for a standardised central register, clear definitions of “external research” and prompt publication. All very sensible: it is desirable both to bring government into line with the rest of the sector and, more importantly, to make evidence openly available in a timely manner so that it can inform public debate and so democracy. It is part of the checks and balances of open government, allowing elected officials to be held accountable for the decisions they make.