A really nice PR blog entry over at SSTL on the decommissioning of Alsat-1, SSTL's first DMC satellite. It's a great example of a fit-for-purpose satellite, new low-cost technology and meeting the environmental needs of developing nations. Well worth a read.
Surface roughness is an important geomorphological variable which has been used in the earth and planetary sciences to infer material properties, current/past processes and the time elapsed since formation. No single definition exists; however, within the context of geomorphometry we use surface roughness as an expression of the variability of a topographic surface at a given scale, where the scale of analysis is determined by the size of the landforms or geomorphic features of interest. Six techniques for the calculation of surface roughness were selected for an assessment of the parameter's behaviour at different spatial scales and dataset resolutions. Area ratio operated independently of scale, providing consistent results across spatial resolutions. Vector dispersion produced results with increasing roughness and homogenisation of terrain at coarser resolutions and larger window sizes. Standard deviation of residual topography highlighted local features but did not detect regional relief. Standard deviation of elevation correctly identified breaks-of-slope and was good at detecting regional relief. Standard deviation of slope (SDslope) also correctly identified smooth sloping areas and breaks-of-slope, providing the best results for geomorphological analysis. Standard deviation of profile curvature identified the breaks-of-slope, although not as strongly as SDslope, and was sensitive to noise and spurious data. In general, SDslope offered good performance at a variety of scales, whilst its simplicity of calculation is perhaps its single greatest benefit.
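To illustrate the SDslope idea, here's a minimal sketch in Python/NumPy (not the paper's actual implementation; the window size, cell size and finite-difference slope formula are my assumptions):

```python
import numpy as np


def sd_slope(dem, cellsize, window=3):
    """Surface roughness as the standard deviation of slope (SDslope),
    computed in a square moving window over a DEM (a 2-D array of
    elevations). `window` sets the scale of analysis."""
    # Slope magnitude in degrees from finite-difference gradients.
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

    half = window // 2
    out = np.full(dem.shape, np.nan)  # borders stay NaN
    for i in range(half, dem.shape[0] - half):
        for j in range(half, dem.shape[1] - half):
            out[i, j] = slope[i - half:i + half + 1,
                              j - half:j + half + 1].std()
    return out
```

A uniformly sloping plane comes out with zero roughness everywhere, while a break-of-slope produces a band of high values, which is why the metric picks out smooth sloping areas and breaks-of-slope so cleanly. Varying `window` (or resampling the DEM) changes the scale at which roughness is measured.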
Came across the AeroPress coffee maker recently (thanks bro). It looks a very interesting take on coffee making, using a mix of a filter and a plunger. However, the plunger is an air plunger, creating even pressure as the brewed coffee passes through the filter paper. This supposedly produces a smooth filtered taste, but with greater flavour. And, to boot, it's portable.
OK, perhaps that’s a little uncharitable, but there have been a few hackles raised at poor A-level students who don’t get university places. As ever, Mike Baker has a level-headed commentary on the topic. Some of the selected highlights include the fact that 183,000 students are in clearing, compared to 135,000 last year, and that includes overseas students and re-applicants, as well as UK-based school leavers. 158,000 didn’t get places last year. And also remember it’s not the universities’ fault; they are capped as to the number of students they can recruit, receiving a penalty if they go over or under. And the coalition government cut the number of extra places from 20,000 to 10,000.
What is undeniable is that this is perhaps one of the most challenging years for students: increasing participation in HE for a limited number of places and a very difficult job market. Interestingly, there are a number of well-qualified A-level students who don’t get places, and this seems to be attributable to either poor careers advice or academic snobbery (or both). Whilst we might want to go to the best university we think we can achieve, please, please make sure you have an “insurance” place: somewhere you can definitely go if you really make a mess of things. In fact, think carefully about where you want to go. The Russell Group offer a certain type of education and not necessarily the best teaching; however, the career opportunities are undeniable. Also, do you want to be middling in a class of extremely bright people, or top of a class of well-qualified people? Students thrive in different educational environments.
And, briefly, it’s worth pondering what a private HE system might be like in the UK. BBC News have an interesting article on fees in the US, where costs in excess of $50,000 PER YEAR for a four-year course are common. The UK is good value in comparison! But if we did go for a part-private system, how many of our current students would opt for it? Particularly if they were guaranteed a place?
OSGeo have announced the WMS benchmarking for 2010, which has seen a long list of products taking part. Last year it was MapServer and GeoServer, with ESRI having to pull out due to time commitments. This year it’s MapServer, GeoServer, CadCorp GeognoSIS, Constellation SDI, ERDAS Apollo, Mapnik, Oracle MapViewer and QGIS mapserver. Great to see such an extensive line-up.
I’ve been playing around with postcodes a little more recently, particularly since the OS released CodePoint under the open licensing model. I don’t often have recourse to using postcodes given most of my work is with imagery or DEMs, but wanted to do some geocoding. Postcodes aren’t as straightforward as you might at first think and they have a history behind them as well. EDINA has a good post on some of the postcode features.
Address Point provides coordinates for every single delivery address in the UK and these are aggregated into postcodes by the Royal Mail. Address Point is then used to define a centroid for each individual postcode. CodePoint Open gives the postcode unit centroid, which is good for geocoding, but given postcodes notionally represent areas, it raises the question of postcode boundaries. However, whilst postcodes are area-based, they represent points (i.e. addresses), so how do you define a boundary? From EDINA:
Postcode unit boundaries are a type of synthetic boundary. The postcode unit is a collection of delivery points with the same postcode, while the ‘boundary’ can be drawn anywhere, so long as it contains all delivery points. The postcode unit boundary does not exist in the real world. Where the boundary lines fall makes no difference to the postman, only the delivery points matter. In other words, there is no such thing as a correct postcode boundary.
The Code-Point unit boundaries however, have been created to best represent the postcode unit footprint in a way that allows the dataset to be used for many different applications. ADDRESS-POINT delivery points with sufficient positional quality were used to create the postcode unit polygons, which have been edited to follow major physical features but still enclose each delivery point in the correct postcode unit. Some postcode units do not contain enough delivery points with sufficient positional quality to create an acceptable polygon ‘footprint’. These have been left out of the Code Point polygon dataset and are listed separately in a ‘Discards’ look-up table.
So postcode boundaries are defined, but are synthetic. They are incredibly useful given the extensive utilisation of postcodes, but they are not in CodePoint Open. However, for those in HE they are available through Digimap.
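For the geocoding use case mentioned above, a minimal sketch of a centroid lookup built from a Code-Point Open style CSV. The column positions and file layout are assumptions (check them against the header of your actual download):

```python
import csv


def load_codepoint(csv_path, pc_col=0, east_col=2, north_col=3):
    """Build a postcode -> (easting, northing) lookup from a
    Code-Point Open style CSV. Column indices are assumptions and
    may differ between releases of the data."""
    lookup = {}
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            # Normalise: strip spaces, uppercase, so "ab1 2cd" matches.
            pc = row[pc_col].replace(" ", "").upper()
            lookup[pc] = (float(row[east_col]), float(row[north_col]))
    return lookup


def geocode(lookup, postcode):
    """Return the unit centroid (OS National Grid easting/northing)
    for a postcode, or None if it isn't in the data."""
    return lookup.get(postcode.replace(" ", "").upper())
```

Note this only ever returns the unit centroid, which is exactly the limitation discussed above: it tells you nothing about the (synthetic) boundary of the postcode area.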
Good catch over at LiDAR News on an article by Mike Pinkerton, a surveyor in New Zealand reviewing the change that laser scanning is having on the industry. Worth a read.
GIM have a nice roundup of current terrestrial laser scanners which is well worth a look. Select from the (limited!) list and see how comparable they are.
Interesting article over at All Points on the CEO of Intermap stepping down. The author has declared an interest in being a shareholder, but takes a fairly hard line on the state of Intermap, whose share price is currently pretty low and potentially ripe for takeover. They have put a lot of momentum behind NextMap and the collection of data for the USA and Europe. Is this the right direction to take? Time will tell, although it has seen hugely successful take-up in the UK and perhaps this was part of the reason for expanding collection over much larger areas.