
How do I access GeoTransform array from gdal on the command line?



I'm using the gdal library via OSGeo for Windows, and I was just wondering how to access the geotransform array for a .ecw file.


This can be found from gdalinfo, except that it has three forms:

  1. If the rotation / shear coefficients (adfGeoTransform[2] and adfGeoTransform[4]) are zero, the output is simplified:
    Origin = (%.15f,%.15f) % adfGeoTransform[0], adfGeoTransform[3]
    Pixel Size = (%.15f,%.15f) % adfGeoTransform[1], adfGeoTransform[5]

  2. If the rotation / shear coefficients are non-zero, all six coefficients are shown:
    GeoTransform =
    %.16g, %.16g, %.16g % adfGeoTransform[0], adfGeoTransform[1], adfGeoTransform[2]
    %.16g, %.16g, %.16g % adfGeoTransform[3], adfGeoTransform[4], adfGeoTransform[5]

  3. If there is no GeoTransform information, then neither of the above two forms is shown.

The order of coefficients used by GDAL is important, and is documented here.


In C# you can access GDAL provided you have the libraries installed. Reference gdal_csharp.dll. This tutorial helps a little.

using OSGeo.GDAL;

// later inside the class
OSGeo.GDAL.Gdal.AllRegister(); // IMPORTANT or nothing will work

// open the ECW dataset in GDAL
OSGeo.GDAL.Dataset pInDS = Gdal.Open(ECW_Path, OSGeo.GDAL.Access.GA_ReadOnly);

// get the GeoTransform from the dataset
double[] pGT = new double[6];
pInDS.GetGeoTransform(pGT);

// use the parameters
CellWidth = pGT[1];
CellHeight = pGT[5];
OriginX = pGT[0];
OriginY = pGT[3];

// rows and columns for the raster
Rows = (ulong)pInDS.RasterYSize;
Cols = (ulong)pInDS.RasterXSize;

The GeoTransform object is:

adfGeoTransform[0] /* top left x */
adfGeoTransform[1] /* w-e pixel resolution */
adfGeoTransform[2] /* 0 */
adfGeoTransform[3] /* top left y */
adfGeoTransform[4] /* 0 */
adfGeoTransform[5] /* n-s pixel resolution (negative value) */

This is the same for every supported GDAL raster format.
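The same values can be read through GDAL's Python bindings; a minimal sketch, assuming GDAL's Python package is installed (the file path is a placeholder):

from osgeo import gdal

ds = gdal.Open("input.ecw", gdal.GA_ReadOnly)   # placeholder path
gt = ds.GetGeoTransform()                       # the six coefficients listed above

origin_x, origin_y = gt[0], gt[3]
pixel_width, pixel_height = gt[1], gt[5]        # pixel_height is normally negative
cols, rows = ds.RasterXSize, ds.RasterYSize
print(gt, cols, rows)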


GDAL (CentOS 8) Install Failing

For the best part of today, I've been trying to get my head around how to install GDAL on my CentOS 8 server.

I've researched many different answers and solutions across different sites and across StackOverflow, and nothing seems to be working! (I'm probably missing something obvious somewhere.)

I'm trying to install GDAL using the command pip3 install gdal

This, in turn, produces an error. For reference, I've already installed the following:

  • Proj 6
  • GEOS
  • epel-release (For CentOS 8)
  • gdal-libs
  • PowerTools (I've also enabled powertools)

The error I've spotted within the output from the above is cpl_port.h: No such file or directory, and I have cross-checked this error on other forums, which advise running the following command:

sudo apt-get install libgdal-dev

And finally, running the pip3 install gdal command.

However, since I'm running CentOS 8, I don't have access to "apt-get" and have instead tried "yum", but haven't had any luck getting the above proposed solution to work.


The geotransform is used to convert between map and pixel coordinates using an affine transformation. The 3rd and 5th parameters are used (together with the 2nd and 6th) to define the rotation if your image doesn't have 'north up'.

But most images are north up, and then both the 3rd and 5th parameter are zero.

The affine transform consists of six coefficients returned by GDALDataset::GetGeoTransform() which map pixel/line coordinates into georeferenced space using the following relationship:

    Xgeo = GT(0) + Xpixel * GT(1) + Yline * GT(2)
    Ygeo = GT(3) + Xpixel * GT(4) + Yline * GT(5)

See the section on affine geotransform at: http://www.gdal.org/gdal_datamodel.html

Reprojecting coordinates using GDAL library

The problem is that you need to swap your axis order to use Cartesian X/Y (or Lon/Lat) order, not "Lat/Lon" order. Setting this should work:

double x = -45.49159617; // Lon
double y = -23.57014667; // Lat

The difference you saw from your round-trip conversion was from projecting outside.

Error in using gdal java binding

I tried executing your source code and I got the error Native library load failed. java.lang.UnsatisfiedLinkError: no osrjni in java.library.path. So just adding gdal.jar to your project build path is not sufficient for executing your program; there are some external native library dependencies. Check out this link: https://trac.osgeo.org/gdal/ticket/3598. Here it is.

Working with rasters in file geodatabase (.gdb) with GDAL

Currently both FileGDB and OpenFileGDB drivers handle only vector datasets. Raster support is not part of Esri's FGDB API. You will need to use Esri tools to export the rasters to another format, such as GeoTIFF.

Convert .dbf .prj .shp .shx to GeoJson

Collectively, those files are called an "ESRI Shapefile". The minimum required are .shp, .shx and .dbf, but there may also be .prj, .sbn, .sbx, .fbn, .fbx, .ain, .aih, .ixs, .mxs, .atx, .shp.xml, or .cpg files (no kidding!). Typically, you only refer to the .shp file, but the other files must be present in the same directory.

Python gdal projection information

One idea is that you could modify gdal_polygonize.py to re-project the result. Use a MEM driver to store the interim polygonised result from dst_layer, then reproject using the osr module. Another idea that is much simpler is to make a database VIEW that projects a table's geometry column with SRID=4326.

Django SerializerDoesNotExist 'geojson'?

You need to include the geojson serializer in settings.py: SERIALIZATION_MODULES = { "geojson": "django.contrib.gis.serializers.geojson" }.

Max() implemented with basic operators

Another one: (A+100)*(A>-100) - 100. Here the min value will be displaced to 0 to match the lower bound, then displaced back to -100.

Can't connect to postgres with GDAL ogr2ogr remotely

The syntax of the connection string is important, particularly the use of quotation marks. See the PostgreSQL driver documentation for an example of what this should look like. For example, try: $ ogrinfo PG:"host=172.17.2.176 port=5432 user='user' password='password' dbname='dbname'" Also, if you frequently connect to this database, you can use environment variables.

Libgdal.so android error: could not load library “libgdal.so.1”

After several days of trials and errors (mainly errors) I have found a solution to my problem. I'm writing it as an answer to my own question so that it may be helpful to others. The problem was during the link time of GDAL. Briefly: the linker created the "real name".

How to extract the name of the input file for the output file when using Python?

You can specify a starting directory and, based on the extension, process all the files in it. Here is what I would do:

import os
import fnmatch

filenamels = []
inputExtension = ".nc"
outputExtension = ".img"
inputDir = "C:\\Users"
for filename in os.listdir(inputDir):
    if fnmatch.fnmatch(filename, '*' + inputExtension):
        filenamels.append(filename[:-len(inputExtension)])
for fn.

Use topojson to project shapefiles with different extents

Don’t use --width and --height; specify the projection’s scale and translate instead, and you’ll get a fixed projection that is the same for all inputs. In fact, since the default d3.geo.albers scale and translate are designed to fit in a 960×500 viewport, you can simply say: topojson -o.

Authorizing a PHP web app + GDAL app to use Google services via OAuth2

You are correct: you are trying to use a server-side authentication key with the client-side authentication flow, and this won't work. What you need to do instead is pass your access token to the GDAL GME driver. You can do it in one of the following ways: Just add a.

Description of parameters of GDAL SetGeoTransform

The geotransform is used to convert from map to pixel coordinates and back using an affine transformation. The 3rd and 5th parameters are used (together with the 2nd and 6th) to define the rotation if your image doesn't have 'north up'. But most images are north up, and then both.

Python gdal stopped working when using ogr Within, Contains or other

The crash is most likely a well documented gotcha. Similar work can be done with Shapely, which doesn't have that gotcha. Also, the query in the for loop can be done faster by using an Rtree index, there are a few Q&A's related to shapely + Rtree out there.

Installing GDAL with ECW support

This works for GDAL 1.11.2, but it should work back to 1.10.0. Download the latest version of the ECW library from here (currently 5.2.1): http://download.intergraph.com/download-portal

$ unzip erdas-ecwjp2sdk-v5.2.1-linux.zip
$ chmod +x ERDAS_ECWJP2_SDK-5.2.1.bin
$ ./ERDAS_ECWJP2_SDK-5.2.1.bin

Choose Desktop Read-Only and accept the license. A directory named hexagon is extracted. Copy that to.

Replicating a choropleth in D3.js using d3.tsv

Your code expects counties to be indexed by prop id, e.g. rateById.set(d.id, +d.rate). However, your tsv calls them "Mun", as in:

"Mun"    "rate"
"01001"  350058.5
"01002"  224305
"01003"  132115

So change d.id to d.Mun (in 2 places), or rename "Mun".

Setting up Django with GeoDjango Support in AWS Beanstalk or EC2 Instance

So I now have a working ebextensions workflow on 2013.09 stack ami ami-35792c5c. For the 2014.09 stack see the other solution. The solution below works with postgis by installing the needed gdal component, 00_repo.config needs to look like this: files: "/etc/yum.repos.d/pgdg-93-redhat.repo": mode: "000644" owner: root group: root content: | [pgdg93].

Converting DTED to other format

Okay, I think I figured it out: unless going through a virtual raster, I shouldn't use raster bands but just CreateCopy. Here's working code for me:

#include <iostream>
#include "gdal_priv.h"
#include "cpl_conv.h" // for CPLMalloc()
using namespace std;

int main(int argc, char *argv[]) {
    QApplication a(argc, argv).

Unable to install rgdal in ubuntu 14.04 (undefined reference to 'pj_ctx_fclose')

Finally, I solved this issue. First, remove proj4-4.8 with apt-get:

sudo apt-get remove libproj-dev

Second, add my compiled proj4-4.9 lib path to a new conf file (such as rgdal.conf) under '/etc/ld.so.conf.d/':

cd /etc/ld.so.conf.d
sudo touch rgdal.conf
echo 'the lib path of proj4-4.9' | sudo tee -a.

Python gdal undefined symbol GDALRasterBandGetVirtualMem

If a gdal version is already installed, this problem will occur even if you have linked against the version installed in ~/.local. A solution is given in a Planet MySQL post here: in this case, we can tell the linker to preload our newer 1.11.0 library in our shell like this.

Installing gdal to use with topo json and D3

The easiest way to install GDAL on Windows is using the OSGEO4W Installer. Just use the "Advanced" option and you can choose GDAL from "Commandline_Utilities"-"gdal". The installer also enables you to have multiple GDAL versions side by side, should you ever need that.

Error installing fiona in OSX 10.9

If you're using XCode 5.1, the problem is likely unrecognised options in Clang. See here for details on how to fix this: http://bruteforce.gr/bypassing-clang-error-unknown-argument.html

How to set up a GDAL 1.9.2 CPP project on Windows VS C++?

I got the setup working; just sharing so someone can save time. Step 1: Download this project from here. This project is just a project file; you need to copy the GDAL source code into it in the next steps. Step 2: Extract it. Step 3: Now copy/paste all folders/files from gdal-1.11.1.

Missing projection file in R's rgdal package for geospatial analysis

I had a quick Google and came across this post which suggests this was an OS X error where the packages were built incorrectly. On my Mac system, I wasn't able to reproduce your error, so for what it's worth I installed my versions of rgdal and rgeos from http://www.kyngchaos.com/software:frameworks.

Can not open include file expat.h GDAL? How to setup expat XML library?

I have solved it; just sharing for knowledge. Download and install the Expat XML library to the C drive and give its path in the include directories via View > Property Manager: C:\Expat-2.0.0\Source\lib

Convert features of a 'multifeature' GeoJSON into R spatial objects

You can use the require_geomType parameter for various GDAL functions to extract the features that you need:

library(rgdal)
ogrListLayers("test.geojson")
## [1] "OGRGeoJSON"
## attr(,"driver")
## [1] "GeoJSON"
## attr(,"nlayers")
## [1] 1
# This fails but you can at least see the geoms it whines about
ogrInfo("test.geojson", "OGRGeoJSON")
## Error.

GDAL function to split large image into specific chunks

You can split the image into chunks/tiles with gdal2tiles. You could also split it into parts depending on the amount of pixels for each chunk using gdal_translate with the -srcwin option. This however would require you to write a bash/batch script to loop through your 30 image chunks. If your.

Can not open input file libpng15-vc9.lib

You should give the absolute path d:/libpng-16/libpng16.lib, or add the path (d:/libpng-16) in Linker -> General -> Additional Library Directories. The same applies to all the other libraries.

GTiff mask with shapefile in python with gdal, ogr, etc

This functionality is already incorporated into the gdal command line utilities. Given your case, I don't see any reason why you would want to do this yourself in Python. You can loop over the geometries with OGR and for each one call gdalwarp with the appropriate parameters. import ogr import subprocess.

How do I get the coordinates of a cell in a geotiff?

Use the affine transformation matrix, which maps pixel coordinates to world coordinates. So for example, use the affine package. (There are other ways to do the same, using simple math.) from affine import Affine fname = '/path/to/raster.tif' Here are two ways to get the affine transformation matrix, T0. E.g., using.

Numpy Read as Array fails when using float 32

How would you address more than 4 GB with a 32-bit process? In fact, you are usually limited to 2 GB. It might be configured to be more on unix machines, but on Windows, 32-bit processes are always limited to 2 GB. Switch to 64-bit Python instead.

Convert a PNG with 4 bands to any format with 1 band and a color table

Maybe this command: convert E8.png -colors 256 E8-256colors.bmp gets you closer to what you want? It is a bit large, though, this bitmap (129 MByte). So this one should be smaller: convert E8.png -type palette -colors 256 E8-palette-256colors.bmp The last one is only 16 MByte. Your headline says 'any format'.

Writing compressed netCDF4 files with raster

ncdf files are not written via GDAL because the rgdal package (at least the binary version on windows) does not come with the ncdf driver. Instead, writeRaster uses package ncdf or (preferably) ncdf4, so you would have to use arguments provided by the ncdf4 package (in the ncvar_def function). That.

No module named osgeo.ogr

Problem is solved in this way:

(VIRTUAL_ENV)$ pip install --no-install GDAL==1.11.2

1.11.2 because my GDAL version is 1.11.2:

(VIRTUAL_ENV)$ gdal-config --version
1.11.2

next:

(VIRTUAL_ENV) ~/.virtualenvs/VIRTUAL_ENV/build/GDAL$ python setup.py build_ext --include-dirs=/usr/include/gdal/

(VIRTUAL_ENV) ~/.virtualenvs/bamap/build/GDAL$ pip install --no-download GDAL --include-dirs=/usr/include/gdal/

for me is correct.

How to fix symbol lookup error: undefined symbol errors in a cluster environment

After two dozen comments to understand the situation, it was found that libhdf5.so.7 was actually a symlink (with several levels of indirection) to a file that was not shared between the queued processes and the interactive processes. This means even though the symlink itself lies on a shared.

Error creating a spatial database using EXTENSIONS

The issue was that it was using the sqlite3 library that is installed by default by OS X. After linking the brewed library using brew link sqlite3 --force I was able to create the postgis extension.

How to convert point data collected at grid interval to a georeferenced dataset in r?

Without knowing at least the projection and datum of the dataset (but hopefully more info such as resolution and extent), there is no easy way to do this. If this is a derived map, try to find what was used to generate it. With this information you can then use.


1 Answer

For a north up image with no rotation (i.e. padfTransform[4] == padfTransform[2] == 0), the formula simply becomes:

    Xgeo = padfTransform[0] + P * padfTransform[1]
    Ygeo = padfTransform[3] + L * padfTransform[5]

i.e., add the starting X (or Y) to the column (or row) number multiplied by the pixel width (or height).

The rotation terms may be what's tripping you up. When a pixel is rotated, the Y skew needs to be considered when calculating the X map coordinate and the X skew needs to be considered when calculating the Y map coordinate. There's a quite detailed explanation in the wikipedia article on world files.
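To illustrate the full relationship, including the rotation/skew terms, here is a small Python sketch (the raster path is a placeholder):

from osgeo import gdal

ds = gdal.Open("raster.tif")       # placeholder path
gt = ds.GetGeoTransform()

def pixel_to_map(P, L, gt):
    """Convert column P and row L into georeferenced X/Y coordinates."""
    X = gt[0] + P * gt[1] + L * gt[2]
    Y = gt[3] + P * gt[4] + L * gt[5]
    return X, Y

# map coordinates of the centre of the pixel in column 10, row 20
print(pixel_to_map(10 + 0.5, 20 + 0.5, gt))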

That there is rotation is what was tripping me up. So can you tell me what P and L stand for? I naively supposed that P was the pixel dimension and L was the line dimension (the lines between pixels). Yes, that is as foolish as a sophomore reasoning with a senior. In that vein, what do those letters stand for?

@philologon P is column number, and L is row number. These are "coordinates" on the raster image.

Thanks. I now see it is right there in the documentation. I always find a way to embarrass myself in these kinds of things, usually by skimming too fast.


Dealing with geospatial files from the shell

Raster data is found in a wide variety of formats: “raw” binary, GeoTIFF, HDF, NetCDF, etc. GDAL offers an abstraction that deals with a very large variety of formats in a consistent fashion, simplifying data access. As far as GDAL is concerned, a raster file consists of one or more bands (two-dimensional data arrays). Each element in the array (pixel) represents an area. The location of that particular pixel is encoded by its position (row, column) in the array, together with the dataset's geotransform and projection information. The geotransform converts a pixel's row and column position into coordinates in a reference system. This system is further characterised by its projection properties (for example, for mapping positions on the surface of the Earth to a plane). The geotransform needs to know about the top left hand corner of the image, as well as the pixel spacing. This information is encoded in a six-element vector:

  1. Top left x position (Easting)
  2. E-W pixel resolution (horizontal pixel resolution)
  3. Rotation. 0 if image is “North up”
  4. Top left y position (Northing)
  5. Rotation. 0 if image is “North up”
  6. N-S pixel resolution (vertical pixel resolution. Note that this value is usually negative. Can you guess why?)
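As a quick illustration of how these six numbers are used, the following sketch computes the corner coordinates of a north-up image from the geotransform and the raster size (the file name is a placeholder):

from osgeo import gdal

ds = gdal.Open("example.tif")          # placeholder path
gt = ds.GetGeoTransform()
cols, rows = ds.RasterXSize, ds.RasterYSize

# for a north-up image (gt[2] == gt[4] == 0)
upper_left  = (gt[0], gt[3])
lower_right = (gt[0] + cols * gt[1], gt[3] + rows * gt[5])
print("Upper left:", upper_left, "Lower right:", lower_right)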

Let’s see how this works in a real example. In order to do this, we shall use the gdalinfo command line utility:

The syntax is very straightforward: you give the filename as the first argument and gdalinfo reports the properties of the dataset. This particular file is a GeoTIFF (see how GDAL tells us that much in the Driver line). It then repeats the name of the file, and reports its size (in this case, 200 by 200 pixels). The coordinate system is a UTM/33N projection using a WGS84 datum, and the following lines are a complete description of the projection and all the parameters that are needed. The origin line is the origin location of the dataset, i.e., the coordinates of the top left hand corner of the image, followed by the pixel spacing. If any metadata (data that describes the data, such as author, parameters used in the production of the dataset, etc.) is present in the file, it is reported. In this case, two elements of metadata are given: AREA_OR_POINT and INTERLEAVE . Following this, we have the bounding box of the data (in projected coordinates as well as longitude/latitude degrees), as well as the center of the dataset. Finally, a list of the bands, the data type of each, and the color interpretation scheme (if present) are given.

A typical requirement in many applications is to crop, resample and reproject data from a file. The gdalwarp utility does all this (and quite a bit more; see the command line help for a taste of its many abilities). In this example, we'll use landcover data from the MODIS instrument, and extract, reproject and resample a region of interest. The area of interest is the Kakadu National Park, located in the Northern Territory in Australia (the Park extends approximately from longitudes 131.88E to 133E and latitudes 14S to 12S). The output projection is the Map Grid of Australia, with GDA94 datum.

The MODIS data are provided in square tiles, each representing a 10 degree (approximately 1200 x 1200 km) area on the surface of the Earth. Data are gridded using the MODIS sinusoidal projection. All the MODIS landcover tiles are available on the UCL system (in /data/geospatial_10/ucfajlg/MOD12/ ), but you need to find out which one covers Kakadu. The following website allows you to look up which tile that is. It will be in the format hXXvYY, where XX is the horizontal tile and YY is the vertical tile.

Having found the tile, identify the MODIS HDF file that contains the data, and run gdalinfo on it. Note the metadata items, and the subdatasets. HDF files contain different spatial datasets in them. In the case of the MODIS land cover product, these are different land cover schemes, as well as auxiliary data regarding uncertainty etc. For this exercise, we want to use the LandCover 1 product. This is done by using the subdataset name instead of the filename. GDAL understands this, and looks for the file and selects the required dataset. They are of the form HDF4_EOS:EOS_GRID:"/path/to/hdf/file.hdf":MOD12Q1:Land_Cover_Type_1 . Note that you need to escape the " in the shell.
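If you prefer to discover the subdataset names programmatically, here is a rough sketch with the Python bindings (the HDF path is a placeholder):

from osgeo import gdal

hdf = gdal.Open("/path/to/hdf/file.hdf")        # placeholder path
for name, description in hdf.GetSubDatasets():  # (subdataset_name, description) pairs
    print(name, "->", description)

# the land cover subdataset can then be opened directly by its full name, e.g.
lc = gdal.Open('HDF4_EOS:EOS_GRID:"/path/to/hdf/file.hdf":MOD12Q1:Land_Cover_Type_1')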

It is more convenient to use EPSG codes to specify the target projection. They are shorter, easier to remember and less prone to error. The simplest way is to search for GDA94 / MGA zone 53 on spatialreference.org, and note down the EPSG code (a number!).

We now calculate the boundary of the park in the target spatial reference system using gdaltransform:

In the above piece of code, we have used a conversion from latitude/longitude values in WGS-84 (EPSG code 4326) to the target EPSG code (marked here as XXXX). We are now ready to transform the data using gdalwarp:
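As a rough sketch of this step using the GDAL Python bindings instead of the command line (wrapped in a function so that the EPSG code, extent and resolution you work out yourself remain parameters rather than being filled in here):

from osgeo import gdal

def warp_landcover(src_subdataset, target_epsg, bounds, res, dst="output_file.tif"):
    """Reproject, crop and resample a MODIS land cover subdataset.

    target_epsg : the 'EPSG:XXXX' code you looked up on spatialreference.org
    bounds      : (xmin, ymin, xmax, ymax) in the target SRS, from gdaltransform
    res         : output pixel size in metres
    """
    return gdal.Warp(dst, src_subdataset,
                     dstSRS=target_epsg,
                     outputBounds=bounds,
                     xRes=res, yRes=res,
                     resampleAlg="near")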

Check that indeed output_file.tif corresponds to the extent, projection and resolution that you expect. Also, try gdalinfo -stats output_file.tif to get a quick idea of the distribution of raster cell values (this is a useful quick check).


Threshold Based Raster Classification

Next, we will create a classified raster object. To do this, we will use the numpy.where function to create a new raster based on boolean classifications. Let's classify the canopy height into four groups:

  • Class 1: CHM = 0 m
  • Class 2: 0 m < CHM <= 20 m
  • Class 3: 20 m < CHM <= 40 m
  • Class 4: CHM > 40 m
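A minimal sketch of this classification with numpy.where, assuming chm already holds the canopy height model as a NumPy array (random placeholder data is used here):

import numpy as np

chm = np.random.uniform(0, 60, size=(100, 100))   # placeholder canopy heights in metres

classified = np.where(chm == 0, 1,                # Class 1: CHM = 0 m
             np.where(chm <= 20, 2,               # Class 2: 0 m < CHM <= 20 m
             np.where(chm <= 40, 3,               # Class 3: 20 m < CHM <= 40 m
                      4)))                        # Class 4: CHM > 40 m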


  • Development:
    • The gdal-dev mailing list should serve as the central point for initial discussions
    • Ticket #4294 should serve for technical aspects and patch and test file repository.
    • Code and the history of changes are in a git repository at https://github.com/etiennesky/gdal-netcdf/ (netcdf-dev branch)
    • Anyone who wishes to contribute can post patches here or be given write access to the repository.
    • Since the modifications above involved a lot of related changes to significant parts of the driver, we managed the actual development process using a Github repository for integration and testing.
    • The autotest suite had been modified (in svn trunk) to test the improvements - including using an online cf-checker to test conformance when available.
    • The code will be committed to svn trunk, and if possible (dependent on feedback) will be back-ported to the 1.8 series (should a 1.8.2 release occur after implementation and sufficient testing).
    • The commit will happen as soon as we get approval from members of the gdal-dev list or RFC vote (if needed)

    DESCRIPTION

    The coordinate reference systems that can be passed are anything supported by the OGRSpatialReference.SetFromUserInput() call, which includes EPSG Projected, Geographic or Compound CRS (i.e. EPSG:4296), a well known text (WKT) CRS definition, PROJ.4 declarations, or the name of a .prj file containing a WKT CRS definition.

    Starting with GDAL 2.2, if the SRS has an explicit vertical datum that points to a PROJ.4 geoidgrids, and the input dataset is a single band dataset, a vertical correction will be applied to the values of the dataset.

    A source SRS must be available for reprojection to occur. The source SRS will be by default the one found in the input dataset when it is available, or as overridden by the user with -s_srs



    If not specified (or not deduced from -te and -ts), gdalwarp will generate an output raster with xres=yres, even when using gdalwarp in scenarios not involving reprojection.

    The available resampling methods are:

      • near: nearest neighbour resampling (default, fastest algorithm, worst interpolation quality).
      • bilinear: bilinear resampling.
      • cubic: cubic resampling.
      • cubicspline: cubic spline resampling.
      • lanczos: Lanczos windowed sinc resampling.
      • average: average resampling, computes the weighted average of all non-NODATA contributing pixels.
      • rms: root mean square / quadratic mean of all non-NODATA contributing pixels (GDAL >= 3.3).
      • mode: mode resampling, selects the value which appears most often of all the sampled points.
      • max: maximum resampling, selects the maximum value from all non-NODATA contributing pixels.
      • min: minimum resampling, selects the minimum value from all non-NODATA contributing pixels.
      • med: median resampling, selects the median value of all non-NODATA contributing pixels.
      • q1: first quartile resampling, selects the first quartile value of all non-NODATA contributing pixels.
      • q3: third quartile resampling, selects the third quartile value of all non-NODATA contributing pixels.
      • sum: compute the weighted sum of all non-NODATA contributing pixels (since GDAL 3.1).

    The creation options available vary by format driver, and some simple formats have no creation options at all. The options supported for a format can be listed with the --format command line option, but the documentation for the format is the definitive source of information on driver creation options. See the format-specific raster driver documentation for the legal creation options for each format.
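    One way to inspect a driver's creation options from the Python bindings, shown here for GTiff as a sketch:

    from osgeo import gdal

    driver = gdal.GetDriverByName("GTiff")
    # XML description of the creation options the driver supports
    print(driver.GetMetadataItem("DMD_CREATIONOPTIONLIST"))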

    Mosaicing into an existing output file is supported if the output file already exists. The spatial extent of the existing file will not be modified to accommodate new data, so you may have to remove it in that case, or use the -overwrite option.

    Polygon cutlines may be used as a mask to restrict the area of the destination file that may be updated, including blending. If the OGR layer containing the cutline features has no explicit SRS, the cutline features must be in the SRS of the destination file. When writing to a not yet existing target dataset, its extent will be the one of the original raster unless -te or -crop_to_cutline are specified.

    When doing vertical shift adjustments, the transformer option -to ERROR_ON_MISSING_VERT_SHIFT=YES can be used to error out as soon as a vertical shift value is missing (instead of 0 being used).

    Starting with GDAL 3.1, it is possible to use as output format a driver that only supports the CreateCopy operation. This may internally imply creation of a temporary file.


    Internal nodata masks

    TIFF files can contain internal transparency masks. The GeoTIFF driver recognizes an internal directory as being a transparency mask when the FILETYPE_MASK bit value is set on the TIFFTAG_SUBFILETYPE tag. According to the TIFF specification, such internal transparency masks contain 1 sample of 1-bit data. Although the TIFF specification allows for higher resolutions for the internal transparency mask, the GeoTIFF driver only supports internal transparency masks of the same dimensions as the main image. Transparency masks of internal overviews are also supported.

    When the GDAL_TIFF_INTERNAL_MASK configuration option is set to YES and the GeoTIFF file is opened in update mode, the CreateMaskBand() method on a TIFF dataset or rasterband will create an internal transparency mask. Otherwise, the default behaviour of nodata mask creation will be used, that is to say the creation of a .msk file, as per RFC 15

    Starting with GDAL 1.8.0, 1-bit internal mask bands are deflate compressed. When reading them back, to make conversion between mask band and alpha band easier, mask bands are exposed to the user as being promoted to full 8 bits (i.e. the value for unmasked pixels is 255) unless the GDAL_TIFF_INTERNAL_MASK_TO_8BIT configuration option is set to NO. This does not affect the way the mask band is written (it is always 1-bit).
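    A small sketch of creating such an internal mask from Python, assuming a GeoTIFF opened in update mode (the file name is a placeholder):

    from osgeo import gdal

    # store the mask inside the TIFF rather than in a separate .msk file
    gdal.SetConfigOption("GDAL_TIFF_INTERNAL_MASK", "YES")

    ds = gdal.Open("input.tif", gdal.GA_Update)   # placeholder path
    ds.CreateMaskBand(gdal.GMF_PER_DATASET)       # per-dataset transparency mask
    mask = ds.GetRasterBand(1).GetMaskBand()      # exposed as 8-bit (0/255) on read
    ds = None                                     # flush and close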



    3.6 Reconciling projections


    3.6.1.2 Changing Projections

    3.6.2.1 Run the pre-requisite scripts

    3.6.3 Reconcile the datasets

    3.6.3.1 load an exemplar dataset

    3.6.3.2 get information from source file

    3.6.3.6 Putting this together

    This section of notes is optional to the course, and the tutor may decide not to go through this in class.

    That said, the information and examples contained here can be very useful for accessing and processing certain types of geospatial data.

    In particular, we deal with obtaining climate data records from ECMWF that we will later use for model fitting. These data come in a netcdf format (commonly used for climate data) with a grid in latitude/longitude. To ‘overlay’ these data with another dataset (e.g. the MODIS LAI product that we have been using) in a different (equal area) projection, we use the gdal function

    where wkt stands for well known text and is a projection format string.

    Other codes we use are ones we have developed earlier.

    In these notes, we will learn:

    We will then save some datasets that we will use later in the notes. For this reason, it's possible to skip this section, and return to it later.

    For various reasons, different geospatial datasets will come in different projections.

    Consider, for example, satellite-derived data from Low Earth Orbit (LEO): the satellite sensor will typically obtain image data in a swath as it passes over the Earth's surface. Projected onto the Earth's surface, this appears as a strip of data:

    but in the satellite data recording system, the data are stored as a regular array. We call such satellite data ‘swath’ (or ‘swath-like’) data (in the satellite imager coordinate system) and we may obtain data products in anything up to Level 2 in such a form.

    These data are often difficult for data scientists to deal with. They generally prefer to have a dataset mapped to a uniform space-time grid, even though this may involve some re-sampling, which can sometimes result in loss of information. The convenience of a uniform space-time grid means that you can, for example, look at dynamic features (information over time).

    The properties of the ‘uniform space-time grid’ will depend on user requirements. For some, it is important to have an equal area projection, one where the ‘pixel size’ is consistent throughout the dataset, even if this is not convenient for viewing some areas of the Earth (map projections are very political!).

    Or other factors may be more important, such as user familiarity with a simple latitude/longitude grid typically used by climate scientists.

    For others, a conformal projection (preserving angles, at the cost of distance distortion) may be vital.

    We have seen that MODIS data products, for example, come described in an equal area sinusoidal grid:

    but the data for high latitudes and longitudes appears very distorted.

    We must accept, then, that dealing with geospatial data involves some understanding of projections, as well as, practically, how to convert datasets between different projections.

    Earth shape

    One factor that can make life even more complicated than just using different projections is the use of different assumptions about the Earth's shape (e.g. sphere, spheroid, radius variations). Often, the particular assumptions used by a group of users are just a result of history: they are what has ‘traditionally’ been used for that purpose. It can be seen as too bothersome or expensive to change this.

    Since we can convert between different projections though, we can also deal with different Earth shape assumptions. We just have to be very clear about what was assumed. If at all possible, the geospatial datasets themselves should contain a full description of the projection and Earth shape assumed, but this is not always the case.

    The datasets we will mostly be dealing with are in the following projections:

    MODIS Sinusoidal (tested), which assumes a custom spherical Earth of radius 6371007.181 m. In cartopy this is given as Sinusoidal.MODIS:

    In the MODIS data hdf products, the projection information is stored directly. Extracted as a wkt, this is:

    According to SR-ORG, the MODIS projection uses a spherical projection ellipsoid but a WGS84 datum ellipsoid. This is not quite the same as the definition in the wkt above.

    It is also defined by SR-ORG with the EPSG code 6974 for software that can use semi_major and semi_minor projection definitions.

    Some software may use the simpler 6965 definition (or the older 6842).

    The MODIS projection 6974 is given as:

    None of these codes are defined in gdal (see the files in $GDAL_DATA/*.wkt for details), so to use them, we have to take the file from SR-ORG.

    For the datasets we are using, it makes no real difference whether the projection information from the file is used instead of MODIS projection 6974, so we will use that from the file. For other areas and especially for any higher spatial resolution datasets, it is worth investigating which is more appropriate.

    ECMWF netcdf format (derived from GRIB) ERA Interim climate datasets (1979-Present). These are geographic coordinates (latitude/longitude) in a custom spheroid with a radius 6371200 m.

    This information can be obtained from any example of a GRIB file, as we shall see below. As a wkt, this is:
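    One way to pull this projection description out of an example file with the Python bindings (a sketch; the file name is a placeholder):

    from osgeo import gdal, osr

    ds = gdal.Open("example.grib")                 # placeholder: a GRIB/netCDF example file
    srs = osr.SpatialReference(wkt=ds.GetProjection())
    print(srs.ExportToPrettyWkt())                 # the projection, formatted as WKT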

    A more common spheroid to use is WGS84, although even in that case there are multiple ‘realisations’ available (used mainly by the DoD). Users should generally use the realisation given in EPSG code 4326, used by the GPS system, for example.

    3.6.1.2 Changing Projections

    We can conveniently use the Python cartopy package (https://scitools.org.uk/cartopy/docs/v0.16/) to explore projections.

    We download an image taken from the satellite sensor (SEVIRI):

    The sensor builds up images of the Earth disc from geostationary orbit, actioned by the platform spin.

    In the code below, we plot the dataset in the ‘earth disk’ (Orthographic) projection, then re-map it to the equal area Sinusoidal projection.
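    As a minimal, self-contained sketch of setting up those two projections with cartopy (the SEVIRI data handling itself is omitted here):

    import matplotlib.pyplot as plt
    import cartopy.crs as ccrs

    # 'earth disk' view, roughly the geostationary viewpoint over 0 degrees longitude
    ax1 = plt.subplot(1, 2, 1, projection=ccrs.Orthographic(central_longitude=0.0))
    ax1.coastlines()
    ax1.set_global()

    # the equal area sinusoidal projection used by the MODIS products
    ax2 = plt.subplot(1, 2, 2, projection=ccrs.Sinusoidal.MODIS)
    ax2.coastlines()
    ax2.set_global()

    plt.show()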

    Exercise 3.6.1 Extra Homework

    • Explore some different types of projection using cartopy and make a note of their features.
    • Read up (follow the links in the text above) on projections.
    • make sure we have the MODIS LAI dataset locally
    • read it in for a given country
    • register with ECMWF, install ecmwfapi
    • get the temperature dataset from ECMWF for 2006 and 2017 for Europe
    • get the country borders shapefile

    Set up the conditions

    3.6.2.1 Run the pre-requisite scripts

    Make sure you register with ECMWF:

    • register with ECMWF and install the API

    Sort data prerequisites:

    • Run the codes in the prerequisites section

    3.6.3 Reconcile the datasets

    In this section, we will use gdal to transform two datasets into the same coordinate system.

    To do this, we identify one dataset with the projection and geographic extent that we want for our data (a MODIS sub-dataset here, the ‘exemplar’).

    We then download a climate dataset in a latitude/longitude grid (netcdf format) and transform this to be consistent with the MODIS dataset.

    3.6.3.1 load an exemplar dataset

    Since we want to match up datasets, we need to produce an example of the dataset we want to match up to.

    We save the exemplar as a GeoTiff format file here.

    3.6.3.2 get information from source file

    Now, we pull the information we need from the source file (the netcdf format t2 dataset).

    • the data type
    • the number of bands (time samples in this case)
    • the geotransform of the dataset (the fact that it’s 0.25 degree resolution over Europe)

    and access these from the source dataset.

    Now, set up a blank gdal dataset (in memory) with the size, data type, projection etc. that we want, then reproject the temperature dataset into this.
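    A rough sketch of that step with the Python bindings, assuming src is the opened netcdf temperature dataset and exemplar is the exemplar GeoTiff saved above (both names are placeholders):

    from osgeo import gdal, gdalconst

    def reproject_like(src, exemplar, n_bands, data_type):
        """Create an in-memory dataset shaped like `exemplar` and reproject `src` into it."""
        mem = gdal.GetDriverByName("MEM").Create(
            "", exemplar.RasterXSize, exemplar.RasterYSize, n_bands, data_type)
        mem.SetGeoTransform(exemplar.GetGeoTransform())
        mem.SetProjection(exemplar.GetProjection())
        gdal.ReprojectImage(src, mem,
                            src.GetProjection(), mem.GetProjection(),
                            gdalconst.GRA_Bilinear)
        return mem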

    The processing may take some time if the LAI dataset is large (e.g. France).

    The result will be of the same size, projection etc as the cropped LAI dataset.

    Finally, we crop the temperature dataset using gdal.Warp() and save it to a (GeoTiff) file:

    Now let’s look at the time information in the metadata:

    The time information is in hours since 1900-01-01 00:00:00.0 . This is not such a convenient unit for plotting, so we can use datetime to fix that:
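    For instance, a small sketch of that conversion, assuming hours_since_1900 is one of the time values read from the metadata (the value below is a placeholder):

    from datetime import datetime, timedelta

    hours_since_1900 = 1024248.0     # placeholder value taken from the band metadata
    timestamp = datetime(1900, 1, 1) + timedelta(hours=hours_since_1900)
    print(timestamp.isoformat())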

    3.6.3.6 Putting this together

    We can now put these codes together to make a function match_netcdf_to_data() :

    In this section, we have learned about projections, and have reconciled two datasets that were originally in different projections. They were also defined with geoids using different Earth radius assumptions.

    These issues are typical when dealing with geospatial data.

    This part of the notes is non-compulsory, as the codes and ideas are quite complicated for people just beginning to learn coding. We have included it here to allow students to revisit this later. It is also included because we want to develop some interesting datasets for modelling, so we need to deal with reconciling datasets from different providers in different projections.

    In this section, we have developed the following datasets:

    Exercise 3.6.2 Extra Homework

    Go carefully through these notes and make notes of the processes we have to go through to reconcile datasets such as these.

    Learn what issues to look out for when coming across a new dataset, and how to use Python code to deal with it. Try to stick to one geospatial package as far as possible ( gdal here) as you can make problems for yourself by mixing them.

