James Killick on the problem of geodata standardisation:
The lack of common, broadly adopted geospatial data exchange standards is crippling the geospatial industry. It’s a bit like going to an EV charger with your shiny new electric vehicle and discovering you can’t charge it because your car has a different connector to the one used by the EV charger. The electricity is there and ready to be sucked up and used, but, sorry — your vehicle can’t consume it unless you miraculously come up with a magical adaptor that allows the energy to flow.
Standards exist for public-transport information but are missing for many other types of geodata. The commercial premise for these domains is different.
For public-transport organisations, their data is not the product. Trains and buses moving through a city are. Network and schedule data is a means to get more people to use public transport, so you want to get this information in front of as many people as possible: on displays at stations, on a website, in third-party applications. And you want to integrate with other transport authorities’ data to provide a seamless service. All of this is best accomplished through shared interfaces and data models.
On the other hand, road-network and address data isn’t a vehicle to sell a product; it usually is the product. You license it because you offer a service (delivery, navigation) that requires this information. The companies providing that data often survey and maintain it themselves. The idea that you could swap out or merge their data with someone else’s using the routines and data models you have already built is a threat to that business model. They don’t want interoperability; they want lock-in, so you keep paying them, not somebody else.
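The contrast is easiest to see on the public-transport side. The post doesn’t name a specific standard, but GTFS is the canonical example: every participating agency publishes the same plain CSV files with the same column names, so one small piece of code can consume any agency’s feed. A minimal sketch, assuming a GTFS feed has been unpacked into a local directory (the path is made up):

```python
import csv

# GTFS feeds are ZIP archives of CSV files with fixed column names.
# stops.txt and its stop_id/stop_name/stop_lat/stop_lon columns are part
# of the GTFS spec; the directory name below is invented for illustration.
with open("agency_feed/stops.txt", newline="", encoding="utf-8-sig") as f:
    for stop in csv.DictReader(f):
        print(stop["stop_id"], stop["stop_name"], stop["stop_lat"], stop["stop_lon"])
```

The same loop works unchanged for any other agency that publishes GTFS, which is exactly the kind of interoperability the road-data vendors have little commercial incentive to offer.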
Iván Sánchez Ortega, reporting on his work at the latest OGC code sprint:
when pygeoapi is asked for a coverage by a GIS client (preferring image/tiff or application/ld+json or the like), the raw data is returned. But when it’s a web browser (preferring text/html), then a webpage with a small viewer is returned.
It’s an interesting deep dive into HTTP content negotiation, how it relates to geodata problems, and what OGC API implementations could do better.
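To see what that negotiation looks like on the wire, here is a minimal sketch using Python’s requests library against a hypothetical OGC API - Coverages endpoint (the server URL and collection name are made up); the only thing that changes between the two calls is the Accept header.

```python
import requests

# Hypothetical OGC API - Coverages endpoint; any pygeoapi instance
# exposing a coverage collection would look similar.
COVERAGE_URL = "https://example.org/oapi/collections/dem/coverage"

# A GIS client asks for raw data and sends a matching Accept header ...
gis_response = requests.get(
    COVERAGE_URL,
    headers={"Accept": "image/tiff; application=geotiff"},
)
print(gis_response.headers.get("Content-Type"))  # e.g. image/tiff; application=geotiff

# ... while a web browser prefers HTML, so the same URL can return a
# small viewer page instead of the raw coverage.
browser_response = requests.get(
    COVERAGE_URL,
    headers={"Accept": "text/html"},
)
print(browser_response.headers.get("Content-Type"))  # e.g. text/html
```

Returning HTML to browsers and raw data to everything else is the behaviour Iván describes, and it falls out of standard HTTP content negotiation rather than anything geo-specific.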
A new book documents the history and activities of YouthMappers, a global movement that engages university students in local mapping, using modern technology to collect data and organise activities. Open Mapping towards Sustainable Development Goals looks at YouthMappers chapters worldwide and how their work contributes to achieving the Sustainable Development Goals.
But this isn’t your average Springer publication with contributions from tenured university professors who only leave their offices to spend the summer uninterrupted in their holiday homes to write. All chapters were written by YouthMappers activists:
[T]his book aims to document and share insights about this movement’s emergence from the first-person voices of the very students themselves who are among those at the forefront of creating our new people’s map of the world. […] Each chapter puts forward the voices of students and recent graduates in countries where YouthMappers works, all over the world. Many of them hail from countries where expertise in geospatial technologies for the SDGs is nascent and needed.
Each chapter is written in the context of a primary and a secondary SDG, identified by each chapter’s authors. The topics covered are as wide-ranging as the SDGs and as diverse as the book’s contributors, spanning city planning, agriculture, gender equality, and ethics.
The ebook version is available for free as PDF or ePub, the paperback is EUR 39.99, and the hardcover is EUR 49.99.
The recordings of all 22 talks from this year’s PostGIS Day are up on YouTube now. That’s almost eleven hours of PostGIS content to keep you warm this winter.
Google Maps is no longer served from the subdomain maps.google.com but from a path on Google’s main domain (google.com/maps). Because browsers scope the location permission to the origin, allowing Google Maps to use your location now means Google can use it on every site served under google.com.
Jonathan Crowe, writing on The Map Room, has a better understanding than I had of TomTom’s new Maps Platform:
TomTom plans to do so by combining its own map data with third-party sources, sensor data, and OpenStreetMap. I’ve been around long enough to know that combining disparate map data sources is neither trivial nor easy. It’s also very labour intensive. TomTom says they’ll be using AI and machine learning to automate that process. It’ll be a real accomplishment if they can make it work. It may actually be a very big deal. I suspect it may also be the only way to make this platform remotely any good and financially viable at the same time.
This sounds very ambitious. Automated data fusion has been a popular research topic amongst PhD students for years. Maybe TomTom will be the first organisation to create a viable product this way; who knows?
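To make “neither trivial nor easy” a little more concrete, here is a deliberately naive sketch of the matching step in map data fusion: pairing features from two sources by distance and name similarity. Everything in it, data and thresholds included, is invented for illustration; real conflation also has to reconcile differing geometries, attribute schemas, and conflicting edits, which is the part TomTom hopes to automate with machine learning.

```python
import math
from difflib import SequenceMatcher

# Toy records standing in for features from two map data sources.
# All names and coordinates are invented for illustration.
source_a = [
    {"name": "Main Street", "lon": 4.9001, "lat": 52.3702},
    {"name": "Station Road", "lon": 4.9100, "lat": 52.3750},
]
source_b = [
    {"name": "Main St", "lon": 4.9003, "lat": 52.3701},
    {"name": "Harbour Way", "lon": 4.9200, "lat": 52.3800},
]

def distance_m(a, b):
    """Rough equirectangular distance in metres; adequate at city scale."""
    dx = math.radians(b["lon"] - a["lon"]) * math.cos(math.radians(a["lat"]))
    dy = math.radians(b["lat"] - a["lat"])
    return 6371000 * math.hypot(dx, dy)

def name_similarity(a, b):
    """Crude string similarity between two feature names, 0..1."""
    return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()

# Naive conflation: accept a pair when the features sit close together
# and their names look alike. The 50 m and 0.6 thresholds are arbitrary.
for fa in source_a:
    for fb in source_b:
        if distance_m(fa, fb) < 50 and name_similarity(fa, fb) > 0.6:
            print(f"match: {fa['name']!r} <-> {fb['name']!r}")
```

Even this toy version already has to guess at thresholds and name variants; doing it reliably across an entire road network is the hard part.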