Normalizing and classifying raster data using gdal_translate and gdal_calc

No satellite-borne instrument can fully deal with external artifacts and interference. Let's take a look at how gdal_translate and gdal_calc.py can be used to normalize and classify noisy real world data.

When you take a picture 800 kilometres away from the subject, chances are the picture is out of focus, has motion blur, lacks light, or something gets between you and your subject. Imaging instruments and other sensors on satellites do a marvelous job with known and predictable parameters, but no instrument can fully deal with external artifacts and interference. Even the highest grade satellite data will be riddled with noise. It is very hard, or even impossible, to get a ground truth from orbit.

At Terramonitor, we always apply atmospheric and radiometric correction to satellite data to mitigate noise. This is as close to the truth as we can get, and it is the basis of our Analysis Ready product, for instance. But sometimes we are happy with indicative results and can sacrifice resolution for a direct answer: instead of very accurate measurements, we simply say yes or no. This is especially true for non-visual products, where the class of an observation is more important than the observation itself.

gdal_translate and gdal_calc.py

Let's take a look at how gdal_translate and gdal_calc.py can be used to normalize and classify noisy real-world data. gdal_translate is mainly used for converting raster data from one format to another, but it can be used for scaling pixel values as well. gdal_calc.py is a raster calculator that runs arithmetic operations on each pixel of the input raster, enabling much finer-grained control over how pixel values are scaled. Both are included in the open source GDAL toolbox.
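
As a minimal example of the format-conversion use case, converting a GeoTIFF to a PNG is a one-liner (the file names here are just placeholders; note that the PNG driver only accepts 8- or 16-bit data, which is why the commands below also pass -ot Byte):

gdal_translate -of PNG input.tiff output.png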

Building Height normalization with gdal_translate

Our input is an image from central Paris. The value of each pixel is the observed height in metres. The ground sampling distance is 10 metres, which means buildings generally consist of multiple pixels with possibly different values due to noise.
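
Before scaling anything, it helps to check the actual value range of the raster. Assuming the input file is named paris_part.tiff, as in the commands below, gdalinfo can compute and print the minimum, maximum and mean pixel values:

gdalinfo -stats paris_part.tiff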

For the first try, we will scale the brightest pixel to the tallest building in Paris, the Eiffel Tower at 324 metres. Here is the gdal_translate command that does that:

gdal_translate -scale 0 324 0 255 paris_part.tiff -ot Byte paris_part_scaled.png
"Raw" output of the first attempt at scaling. The pixel values are shown in different shades of gray, so that bright pixels represent tall buildings. The Eiffel Tower is the bright cluster of pixels at the top left (northwest).

With this scaling, we can make out only the tallest buildings: full brightness corresponds to 324 metres, 50% brightness to 162 metres, and a 64-metre building reaches only 20% brightness. This is not optimal, because practically all the buildings in Paris are 50 metres tall or less! A typical 20-metre building, for example, maps to roughly 6% brightness and is nearly invisible. Let's try scaling the maximum to 50 metres instead, so that all buildings 50 metres or taller are displayed at full brightness.

gdal_translate -scale 0 50 0 255 paris_part.tiff -ot Byte paris_part_scaled_more.png
Second attempt at scaling, where the maximum brightness is reached at 50 metres.

This produces a much more interesting image: you can still pick out the tallest buildings, but now every other building is visible as well. You can even see that there is a cluster of clearly lower buildings around the Vanves–Malakoff station in the lower left (southwest) corner.

Some further fine-tuning of the upper and lower scaling limits could be done with gdal_translate, but as far as this tutorial is concerned, this is the best we can do.
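
If you do want to hunt for better limits, one practical starting point (again assuming the input file name used above) is to print a per-band histogram straight from the command line and see where the bulk of the values sits:

gdalinfo -hist paris_part.tiff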

Building Height classification with gdal_calc.py

One problem with the image above is that many buildings contain several different observed values. While this may or may not reflect reality, it often makes more practical sense to force a single height value for each building. gdal_calc.py allows us not just to scale pixels, but to classify them as well.

Partial histogram for the input image

Looking at the histogram of the input image, we see clear low points at 2-4, 14-18, and 40, so we will use these as border values for our classes. gdal_calc.py requires these expressions to be written in NumPy syntax. The possibilities are enormous, but we only need very basic functionality.

Value range    Mapped class    gdal_calc syntax
Less than 3    0               0*(A<3)
3 to 15        1               1*logical_and(A>=3,A<15)
15 to 40       2               2*logical_and(A>=15,A<40)
40 or more     3               3*(A>=40)

Because each pixel satisfies exactly one of the conditions above, exactly one term in the sum is non-zero, and the sum evaluates to the class number itself. Putting these together, the gdal_calc.py command looks like this:

gdal_calc.py -A paris_part.tiff --outfile=paris_part_4_classes.tiff --calc="0*(A<3)+1*logical_and(A>=3,A<15)+2*logical_and(A>=15,A<40)+3*(A>=40)"
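
A quick sanity check before converting the result for viewing is to confirm that the classified raster only contains values in the range 0 to 3:

gdalinfo -stats paris_part_4_classes.tiff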

To get a viewable PNG file, we also need to run a gdal_translate command similar to the ones above:

gdal_translate -scale 0 3 0 254 paris_part_4_classes.tiff -ot Byte paris_part_4_classes.png
Result of building height classification with gdal_calc.py

There is far less noise and a clearer distinction between the different classes of building heights. For the purposes of this tutorial, the image above is where we will stop, but gdal_calc.py can apply functions and any number of logical expressions, so the classification possibilities are endless; one possible extension is sketched below.
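
For example, gdal_calc.py accepts several input rasters at once. As a minimal sketch (footprints.tiff is a hypothetical building footprint mask, not part of the data used in this tutorial), the height classes could be restricted to known building footprints like this:

gdal_calc.py -A paris_part_4_classes.tiff -B footprints.tiff --outfile=paris_part_masked.tiff --calc="A*(B>0)"

Pixels outside the footprint mask fall back to class 0, while pixels inside keep their height class.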

Want to know more?

Terramonitor is a space data company processing terabytes of satellite and remote sensing data daily. We develop automated, scalable pipelines for processing not just optical satellite data, but also single-band observations like the one seen here.

If you are a GIS developer looking to create your own classification model, get in touch and ask about our Analysis Ready products.

If you are an application developer interested in integrating geospatial data into your service, we are more than happy to do the heavy lifting for you. Get in touch and ask about our Land Cover API (to be released in Q4/2020).