Geospatial 2.0 Workshop

“What is the intersection of machine learning and Earth Observation?” was the question asked this week at the Satellite Applications Catapult Geospatial 2.0 event at the Harwell campus near Oxford. https://sa.catapult.org.uk/news-events-gallery/events/geospatial-2-0-workshop/


There were a number of excellent presentations, from suppliers of data to companies trying to derive value from the ever increasing volume of satellite data. DigitalGlobe have built an online tool called GBDX https://developer.digitalglobe.com/gbdx/. It provides access to a huge archive of satellite imagery (around 80 petabytes) through a web interface. DigitalGlobe see the market moving away from desktop software used by remote sensing specialists (termed Geospatial 1.0) towards data in the cloud, accessed through APIs and SDKs that put the data in the hands of developers building applications (Geospatial 2.0). The Satellite Applications Catapult offers discounted non-commercial licenses to access this data.
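
To give a flavour of what that API-driven model looks like in practice, here is a minimal sketch of searching the GBDX catalogue with gbdxtools, DigitalGlobe's Python SDK. It assumes you already have GBDX credentials configured, and the area of interest and date range are placeholders I have invented (a rough polygon around Harwell).

```python
# A minimal sketch of searching the GBDX catalogue with gbdxtools,
# DigitalGlobe's Python SDK. Assumes GBDX credentials are already
# configured (e.g. in ~/.gbdx-config); the AOI polygon and dates
# below are invented placeholders.
from gbdxtools import Interface

gbdx = Interface()

# A rough area of interest around Harwell, as WKT (lon lat order)
aoi = "POLYGON((-1.33 51.57, -1.29 51.57, -1.29 51.60, -1.33 51.60, -1.33 51.57))"

results = gbdx.catalog.search(
    searchAreaWkt=aoi,
    startDate="2016-01-01T00:00:00.000Z",
    endDate="2016-12-31T23:59:59.000Z",
)

for record in results:
    print(record["identifier"], record["properties"].get("sensorPlatformName"))
```

No desktop software, no manual download: a few lines of Python against an API, which is precisely the Geospatial 2.0 pitch.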

 “You are old fashioned if you use ArcMap”

Quite a statement from one of the presenters, and I can see where they were coming from. However, I still see a place for desktop GIS. I am sure ESRI still sell plenty of ArcGIS Desktop licenses, and QGIS is going from strength to strength (http://gis.stackexchange.com/questions/66599/how-many-active-qgis-users-are-there-in-total). For example, in a remote field location (like a seismic crew) the bandwidth over a VSAT link still makes accessing data in the cloud a virtual impossibility. I don’t think the desktop is dead: ArcGIS Pro is gaining traction and ArcGIS has plenty of imagery tools, though I suspect this is not widely enough known.

QGIS also has plenty of excellent classification tools for EO data, such as the Semi-Automatic Classification Plugin: http://fromgistors.blogspot.com/p/semi-automatic-classification-plugin.html
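
For those working outside QGIS, the same style of supervised classification is easy to prototype in Python. The sketch below is not the plugin's method, just a generic illustration: it trains a scikit-learn random forest on labelled pixels from a multiband raster, and the file names and band layout are assumptions.

```python
# A generic sketch of pixel-based supervised classification in Python,
# in the same spirit as the QGIS Semi-Automatic Classification Plugin
# (but not its actual method). "image.tif" (multiband reflectance) and
# "training.tif" (integer class labels, 0 = unlabelled) are hypothetical.
import numpy as np
import rasterio
from sklearn.ensemble import RandomForestClassifier

with rasterio.open("image.tif") as src:
    image = src.read()            # shape: (bands, rows, cols)
with rasterio.open("training.tif") as src:
    labels = src.read(1)          # shape: (rows, cols)

bands, rows, cols = image.shape
X = image.reshape(bands, -1).T    # one row of band values per pixel
y = labels.ravel()

# Fit on the labelled pixels only, then classify the whole scene
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[y > 0], y[y > 0])
predicted = clf.predict(X).reshape(rows, cols)
```

A random forest is just one choice of classifier; the point is that classification no longer has to live inside a desktop remote sensing package.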

“You still need humans”

The presentation that most resonated with me was given by Prof Steven Reece from the University of Oxford.

He highlighted common problems across all areas of machine learning:

  • Missing data (a simple imputation sketch follows this list)
  • Faulty data
  • Biased data
  • Heterogeneous data
  • Timeliness of data (or how soon it becomes out of date)
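
Missing data in particular has well-worn first steps. As a hypothetical illustration (the numbers are invented, and SimpleImputer is just one of many approaches), here is a gap-filling step in scikit-learn:

```python
# A minimal, hypothetical sketch of handling missing data with
# scikit-learn. np.nan marks gaps (think cloud-masked pixels or
# dropped sensor readings); the numbers are invented.
import numpy as np
from sklearn.impute import SimpleImputer

X = np.array([
    [0.12, 0.34, np.nan],
    [0.10, np.nan, 0.55],
    [0.11, 0.30, 0.52],
])

imputer = SimpleImputer(strategy="mean")  # fill each gap with its column mean
print(imputer.fit_transform(X))
```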

It is also worth noting that he stated machine learning can:

  • Aggregate data
  • Model data
  • Perform anomaly detection (a short sketch follows this list)
  • Make predictions and forecasts
  • Produce information and intelligence
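
Of these, anomaly detection is perhaps the easiest to demonstrate in a few lines. The sketch below is purely illustrative, using scikit-learn's IsolationForest on invented band values:

```python
# A short, purely illustrative sketch of anomaly detection with
# scikit-learn's IsolationForest. The "pixels" are invented: 200
# typical samples plus one obvious outlier (e.g. a sensor glitch).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=0.3, scale=0.02, size=(200, 4))
outlier = np.array([[0.9, 0.9, 0.9, 0.9]])
X = np.vstack([normal, outlier])

detector = IsolationForest(contamination=0.01, random_state=0)
flags = detector.fit_predict(X)   # -1 = anomaly, 1 = normal
print("anomalous samples:", np.where(flags == -1)[0])
```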

What has changed is the quantity and quality of data; machine learning itself hasn’t changed much in 20 years. It feels like the EO industry is moving towards a subscription-based data model, so are we going to end up with the same results, only cheaper and faster? That is Geospatial 2.0. I look forward to attending the next meeting.

Want to learn more about GIS and EO for Oil and Gas? My page contains all my blog posts, plus case studies and links: http://gis.acgeospatial.co.uk/
