Building virtual rasters from STAC

I’ve been writing a lot recently about STAC, COG and Python. These are some of my recent posts:

COG and Landsat in QGIS

COG and Landsat with Python

STAC, COG, Python and QGIS

Smiley Face, STAC and COG

I am very much aware that the majority of work with Earth Observation data still happens within a project setting, and this post tries to imagine a more practical application of that. I am going to create a series of virtual raster (VRT) files. These are tiny files that are essentially pointers to the raw data. A project could easily collect tens of these, and rather than manually searching for and downloading scenes, I will use STAC and COGs to keep things super simple.

Before I jump into the code, don’t discount open data cubes. They are incredibly powerful and on large-scale projects may just be the perfect solution. Please go and check them out.

Setup

I think at this point, if you are not familiar with STAC, COGs and Python, please revisit my previous posts. There are no additional setup steps beyond what I have already covered there, and this post builds on that previous work.

Build your search

I am reusing code from previous posts to build my search using sat-search. The only change is that this time I am adding a query to return only scenes with cloud cover below 5%.

from satsearch import Search
import geopandas as gpd

# Read the area of interest and take its bounding box
gdf = gpd.read_file('map.geojson')
bounds = gdf.bounds
boundary = bounds.values.tolist()

# Search Earth Search for Sentinel-2 L2A COGs with less than 5% cloud cover
search = Search(bbox=boundary[0],
                datetime='2020-05-01/2020-07-30',
                collections=['sentinel-s2-l2a-cogs'],
                url='https://earth-search.aws.element84.com/v0',
                query={'eo:cloud_cover': {'lt': 5}})
print('bbox search: %s items' % search.found())
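It can be worth eyeballing the matches before building anything. Here is a minimal sanity check, assuming the search above has run (sat-search items expose the underlying STAC metadata through item.properties, and item.date comes from sat-stac, if I recall correctly):

items = search.items()
for item in items:
    # Print the id, date and cloud cover of each matching scene
    print(item.id, item.date, item.properties.get('eo:cloud_cover'))

The next snippet grabs search.items() again, so you can skip this step if the count from search.found() is all you need.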

Next I am going to build a dictionary called url_dict, mapping a filename (key) to the asset URL (value) for each item that falls on the tile I am interested in (30UXB).

items = search.items()
url_dict = {}
for item in items:
    # The MGRS tile code sits at characters 4-9 of the item id
    if item.id[4:9] == "30UXB":
        file_url = item.asset('visual')['href']  # URL of the True Colour Image COG
        fileout = item.id + '.vrt'
        url_dict[fileout] = file_url
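If you would rather not rely on string slicing, the same filter can be written against the item properties. This is a rough sketch, and the property names (sentinel:utm_zone, sentinel:latitude_band, sentinel:grid_square) are my assumption of what the Earth Search sentinel-s2-l2a-cogs items carry, so inspect item.properties on your own results before trusting it:

# Alternative filter using STAC properties rather than slicing item.id
# (property names are assumptions; check item.properties if they differ)
def on_tile(item, zone=30, band='U', square='XB'):
    p = item.properties
    return (p.get('sentinel:utm_zone') == zone and
            p.get('sentinel:latitude_band') == band and
            p.get('sentinel:grid_square') == square)

url_dict = {item.id + '.vrt': item.asset('visual')['href']
            for item in items if on_tile(item)}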

Then, finally, I am going to iterate over that dictionary and call gdalbuildvrt on each entry to create a VRT file.

import subprocess

for fileout, file_url in url_dict.items():
    # Pass the arguments as a list so no shell is needed; the /vsicurl/ prefix
    # lets GDAL stream the remote COG over HTTP
    subprocess.call(['gdalbuildvrt', fileout, '/vsicurl/' + file_url])
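As an aside, if you prefer to stay inside Python rather than shelling out, the GDAL bindings offer gdal.BuildVRT, which should do the same job in-process. A minimal sketch, assuming the osgeo package is installed alongside GDAL:

from osgeo import gdal

for fileout, file_url in url_dict.items():
    # Build the VRT in-process; setting the dataset to None flushes it to disk
    vrt = gdal.BuildVRT(fileout, ['/vsicurl/' + file_url])
    vrt = None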

In the associated notebook I’ve created a bespoke directory to save this output to; go and have a look if you are not sure how this is done. In the end I get a directory of VRT files:

all my Sentinel-2 True Colour Images for my date range

Usage

These files can be loaded into QGIS and used like any other raster. You can even perform raster analysis on them, or use them as a basemap if you so wish.
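For example, here is a quick sketch of reading one of the VRTs back with rasterio (my choice of reader, not part of the original workflow; any GDAL-backed tool will do). The pixels are streamed from the remote COG on demand, and the filename is a placeholder for any file in your output directory:

import rasterio

# 'my_scene.vrt' is a placeholder; use one of the generated VRT filenames
with rasterio.open('my_scene.vrt') as src:
    print(src.count, src.width, src.height, src.crs)
    # Read band 1 at roughly 1/100th resolution for a quick look
    overview = src.read(1, out_shape=(src.height // 100, src.width // 100))
    print(overview.mean())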

All the code for this blog post is available here. I hope this has been of use.

Image credit https://unsplash.com/photos/wKhhk6r90Ek

I am a freelancer able to help you with your projects. I offer consultancy, training and writing. I’d be delighted to hear from you.

Feel free to connect or follow me; I am always keen to talk about Earth Observation.

I am @map_andrew on twitter