Static Vector Tiles II: Openlayers with Custom Projection

Building off the work from my previous post on vector tiles, I wanted to develop a second process for use in Openlayers. There are many tutorials out there demonstrating the use of vector tiles with predefined styles and tutorials explaining how to build a fully vector tile map online. My aim instead is to demonstrate how to:

  1. Overlay a vector tile set on a raster tile cache
  2. Integrate a custom style.json
  3. Do all of this in a custom projection using Openlayers

As always, there is more than one method to do any of this. What is shown below can definitely be improved upon. Please take this and make it better.

GitHub repository is here.

Website example is here.

This example uses Openlayers 6. The map is served in the NZTM projection (EPSG:2193). No effort has been made to order the labels; they display as the Openlayers decluttering arranges them. Included in the JS code is a basic pop-up window demonstrating how to get information from the vector tiles.
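
As a rough idea of what that pop-up looks like, here is a minimal sketch (not the exact code from the repository): it assumes an ol.Map named map, an HTML element with id "popup" wrapped in an ol.Overlay, and a name attribute on the tile features.

// Minimal pop-up sketch: query the vector tile features under a click
var popup = new ol.Overlay({ element: document.getElementById('popup') });
map.addOverlay(popup);

map.on('click', function (evt) {
  var feature = map.forEachFeatureAtPixel(evt.pixel, function (f) { return f; });
  if (feature) {
    popup.getElement().innerHTML = feature.get('name'); // 'name' is an assumed attribute
    popup.setPosition(evt.coordinate);
  } else {
    popup.setPosition(undefined); // hide the pop-up when nothing is hit
  }
});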

The project is built as a static setup. The vector tile cache is built directly into the website. THIS IS NOT OPTIMAL, but it does demonstrate the principle. Ideally, you would serve your tile cache from a location like AWS S3.

In order to use a custom projection, you will need to build an XYZ tile cache. MBTiles do not handle projections other than Web Mercator (EPSG:3857).

Basic steps

  1. Download or reproject the shapefile in NZTM
  2. Upload shapefile to PostgreSQL database with PostGIS extensions
  3. Tile PostgreSQL table into NZTM (EPSG:2193) XYZ tile cache using TRex
  4. Construct Openlayers 6 JS for tile consumption

Sample Data

https://data.linz.govt.nz/layer/50280-nz-geographic-names-topo-150k/

Note:

The Geographic Names layer is clipped and filtered for this example. I clipped it to the Wellington Region and filtered the data to use only:

desc_code = BAY, METR, LOC, POP, TOWN, SBRB

Upload to PostgreSQL

shp2pgsql -s 2193 /data/wellyRegion_townBay.shp public.wellyRegion_townBay_nztm | psql -h localhost -d <yourDatabaseHere> -U <yourUserNameHere>

TRex Tiling

TRex will create an XYZ tile cache in the projection of your choosing. You will need to know the resolutions and bounding box of your projection in order to make this work. I was fortunate to have this information at hand thanks to a great tutorial from LINZ.

TRex uses a config file for tiling. The config used in this example is here.

The command used to run TRex:

t_rex generate --progress true --maxzoom=14 --minzoom=0 --extent=174.627603,-41.613839,176.259896,-40.737190  --config /configpsql_points.toml

TRex will generate gzipped .pbf files. If you prefer to unzip them and restore the .pbf extension:

find . -type f | xargs -n1 -P 1 -t -I % gzip -d -r -S .pbf %
find . -type f | xargs -n1 -P 1 -t -I % mv % %.pbf

Openlayers JS

The Openlayers version used here is 6. The <link> and <script> tags needed are:

<link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/openlayers/openlayers.github.io@master/en/v6.5.0/css/ol.css" type="text/css">

<script src="https://cdn.jsdelivr.net/gh/openlayers/openlayers.github.io@master/en/v6.5.0/build/ol.js"></script>

<script src="//cdnjs.cloudflare.com/ajax/libs/proj4js/2.3.15/proj4.js"></script>

<script src="https://unpkg.com/ol-mapbox-style@6.3.2/dist/olms.js" type="text/javascript"></script>

For the full JS example

NZTM Construct in Openlayers

Building the projection for Openlayers

// set NZTM projection extent so OL can determine zoom level 0 extents.
// Define NZTM projection
proj4.defs("EPSG:2193","+proj=tmerc +lat_0=0 +lon_0=173 +k=0.9996 +x_0=1600000 +y_0=10000000 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs");

// Register projection with OpenLayers
ol.proj.proj4.register(proj4);

// Create new OpenLayers projection
var proj2193 = new ol.proj.Projection({
	code: 'EPSG:2193',
	units: 'm',
	extent: [827933.23, 3729820.29, 3195373.59, 7039943.58]
});

// NZTM tile matrix origin, resolution and matrixId definitions.
var origin = [-1000000, 10000000];
var resolutions = [ 8960, 4480, 2240, 1120, 560, 280, 140, 70, 28, 14, 7, 2.8, 1.4, 0.7, 0.28, 0.14, 0.07 ];
var matrixIds = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16];

Applying to Raster and Vector Tiles

Raster Tiles

Another great tutorial from LINZ covers the setup of an XYZ tile service for raster tiles.

// Tile Services Map
var urlTemplate =
  "https://tiles.maps.linz.io/nz_colour_basemap/NZTM/{z}/{x}/{y}.png";

// Set raster layer
var layer = new ol.layer.Tile({
  source: new ol.source.XYZ({
    url: urlTemplate,
    projection: proj2193,
    attributions: ['<a href="http://data.linz.govt.nz">Data from LINZ. CC BY 4.0</a>'],
    tileGrid: new ol.tilegrid.TileGrid({
      origin: origin,
      resolutions: resolutions,
      matrixIds: matrixIds,
      extent: [827933.23, 3729820.29, 3195373.59, 7039943.58]
    })
  })
});
Vector Tiles

Set up the vector layer to use the custom projection:

// Set vector layer
var placeSource = new ol.source.VectorTile({
  cacheSize: 0,
  overlaps: true,
  tilePixelRatio: 1, // oversampling when > 1
  tileGrid: new ol.tilegrid.TileGrid({ 
    origin: [-1000000, 10000000],
    maxZoom: 16,
    tileSize: 4096,
    extent: [827933.23, 3729820.29, 3195373.59, 7039943.58],
    resolutions: resolutions,
  }),
  extent: [827933.23, 3729820.29, 3195373.59, 7039943.58],
  format: new ol.format.MVT(),
  projection: ol.proj.get('EPSG:2193'),
  url: 'https://xycarto.github.io/static.vector.tiles.openlayers.nztm/tiles/wellyRegion_townBay_nztm/{z}/{x}/{y}.pbf'
});

var vectorMap = new ol.layer.VectorTile({
  declutter: true,
  source: placeSource,
  renderMode: 'vector',
  zIndex: 10
  })
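
With the projection, tile grids, and layers defined, the map and view tie everything together. Below is a minimal sketch; the target id and the centre coordinate (an approximate Wellington-area easting/northing in EPSG:2193) are assumptions rather than values taken from the linked repository.

// Assemble the map with the custom projection and NZTM resolutions
var map = new ol.Map({
  target: 'map', // assumes a <div id="map"> in the HTML
  layers: [layer, vectorMap],
  view: new ol.View({
    projection: proj2193,
    resolutions: resolutions, // constrain the view to the NZTM zoom levels
    center: [1749000, 5428000], // approximate Wellington-area coordinate in EPSG:2193
    zoom: 9
  })
});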

Styling

For the style file example

  1. The method in this example loads the vector tiles and overlays them on a raster tile cache. To accomplish this, the vector tile layer must be added to the map first, THEN the rules from the style JSON are applied (a pared-down example of such a style follows this list) using:
fetch('./styleText.json').then(function(response) {
  response.json().then(function(glStyle) {
    olms.applyStyle(vectorMap, glStyle, 'wellyRegion_townBay_wgs');
  });
});
  2. The above uses olms.applyStyle. To access this function you will need to add the script tag to your HTML:
<script src="https://unpkg.com/ol-mapbox-style@6.3.2/dist/olms.js" type="text/javascript"></script>
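
For reference, here is a pared-down sketch of what such a style JSON can look like when passed to olms.applyStyle. It is an illustration only: the source-layer name and the name attribute are assumptions, and the real style file linked above is more involved.

// Minimal Mapbox GL style sketch for labelling the points layer
var glStyle = {
  version: 8,
  sources: {
    'wellyRegion_townBay_wgs': { type: 'vector' } // tiles come from the OL source, so no url needed
  },
  layers: [
    {
      id: 'place-labels',
      type: 'symbol',
      source: 'wellyRegion_townBay_wgs',
      'source-layer': 'wellyRegion_townBay_nztm', // assumed layer name from the T-Rex config
      layout: {
        'text-field': '{name}', // assumed attribute holding the place name
        'text-font': ['Open Sans Regular'],
        'text-size': 12
      },
      paint: { 'text-color': '#333333' }
    }
  ]
};
olms.applyStyle(vectorMap, glStyle, 'wellyRegion_townBay_wgs');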

Notes

  1. Not fully complete; working example only
  2. Runs slowly in Safari
  3. This is in no way the only way to do this. My hope is someone takes this and makes it better.
  4. Still needs label work for use on mobile devices.

RGB Elevation Creation for 3D Online Mapping (Terrain RGB)

Mt. Ruapehu, Ngauruhoe, and Tongariro

The following is about how to build an elevation file to work with procedural-gl.js. I just really wanted to build my own elevation dataset and thought it would be helpful to share how I did it.

Procedural-gl.js has changed the game in attainable 3D online mapping. When I came across this project, I was very excited and wanted to get to mapping ASAP.  There is so much potential in just a few lines of JS code; just add a base map, an elevation file and boom, you have a smooth functioning 3D map, mobile ready. I cannot thank Felix Palmer enough for putting this out there. You can check out the source code and some really great maps he developed here.

I’m not one to leave well enough alone. I wanted to develop a method to use New Zealand specific elevations, with my goal being to eventually incorporate New Zealand’s LiDAR elevations. However, before I get to building elevation models for LiDAR, I wanted to test building an elevation model with the LINZ 8m DEM. This dataset covers the entire nation and was lightweight enough to test on the ol’ home Linux box.

Method

As mentioned above, to run procedural-gl.js in its most basic form, you’ll need a tile cache for your base map and a tile cache for your elevation file. The base map I have covered, but I was unfamiliar with what was needed for the elevation tile cache. Fortunately, I came across the SyncPoints tutorial on how to do this.
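
For orientation, the basic procedural-gl.js wiring looks roughly like the snippet below. This is a sketch only: Procedural.init and Procedural.displayLocation are the library's entry points, but the exact shape of the datasource object varies between versions, so treat it as a placeholder and check the procedural-gl README and the repo linked at the end of this post for the real configuration.

// Rough shape of a procedural-gl.js page; the datasource contents are placeholders
var container = document.getElementById('container');
Procedural.init({
  container: container,
  datasource: {
    // point the library at your base map (imagery) and Terrain RGB (elevation) tile caches here
  }
});
Procedural.displayLocation({ latitude: -39.28, longitude: 175.56 }); // roughly Mt Ruapehu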

I am not going to rewrite their entire process. They did a really good job explaining the what, why, and how. Instead, I will lay out what I did for the NZ-specific data. Much of this process can be further refined, and this is really just a proof of concept.

Two quick notes before I get started:

  1. For now, this process is only done in Web Mercator.  I always want to push the limits and do my online work in NZTM, but for now, I am going to work with what I have.
  2. I am not going to talk about building a raster tile cache for the base map. I was fortunate to have a Web Mercator tile cache for NZ available.

We want to create an elevation file, in PNG rgb format, rendered into an XYZ tile cache directory, with 512x512px tiles.

The basic steps are this:

  1. Remove all errant noData pixels from a geotiff.  
  2. Convert your elevation geotiff to rgb (single band to three band), in Byte format.
  3. Render your rgb geotiff into xyz raster tiles. Use a directory format.
  4. Consume through procedural-gl.js

List of Tools

  • GDAL (gdalbuildvrt, gdalwarp and, for the tiling, gdal2tiles from GDAL 3.1+)
  • rio rgbify (a Rasterio plugin for the RGB encoding)
  • Docker (for a GDAL build with the Python bindings)
  • Procedural-gl.js
  • A local web server, such as the npm http-server, for testing

Data

The data I used for this was the full LINZ 8m DEM elevation. Make sure your data is in Web Mercator projection (EPSG:3857). See some earlier blogs about how to reproject across all the tiles using BASH and gdalwarp.

LINZ 8m DEM

Process

1. Build a VRT to gather all the elevation tiles under one file name.

gdalbuildvrt elevation.vrt *.tif

2. Remove nodata

Using the VRT, we can perform the noData conversion across all the elevation tiles as if they were a mosaic. Better yet, the output from this command will produce a mosaic out the other side. GDAL is magic.

gdalwarp  -t_srs EPSG:3857 -dstnodata None  -co TILED=YES  -co COMPRESS=DEFLATE  -co BIGTIFF=YES  elevation.vrt elevation_noData_mosaic.tif

3. RGB-ify the elevation mosaic.

rio rgbify -b -10000 -i 0.1 /elevation_noData_mosaic.tif elevation_noData_mosaic_rgb.tif
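
The -b (base) and -i (interval) values follow the common Terrain RGB convention, so an elevation can be recovered from a pixel as base + (R * 256 * 256 + G * 256 + B) * interval. A small sketch of that decode, handy for sanity checking a tile (not part of the original workflow):

// Decode a Terrain RGB pixel back to an elevation in metres.
// base = -10000 and interval = 0.1 match the rio rgbify flags above.
function rgbToElevation(r, g, b) {
  return -10000 + (r * 256 * 256 + g * 256 + b) * 0.1;
}

console.log(rgbToElevation(1, 134, 160)); // 0, i.e. sea level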

4. Create your XYZ tile cache in EPSG:3857

GDAL2Tiles is a quick way to render XYZ tile caches if you have a single layer and are working in Web Mercator. Be sure to use GDAL 3.1 or greater to get XYZ caching and tile size control; you need both. In the end, I used the GDAL Docker image developed by perrygeo and got access to GDAL 3.2. It was a lifesaver; however, I did need to modify the Dockerfile to add GDAL’s Python bindings (--with-python) and rebuild the image.

gdal2tiles.py --s_srs=EPSG:3857 --zoom=0-16 --xyz --tilesize=512 --processes=7 elevation_noData_mosaic_rgb.tif /data/tile-cache

Localhost Testing

Once I built the tile cache, I built a quick site on my localhost to test. I embedded the tile cache into the site (place the tile cache in the same directory as your web files), then I ran it under a local server.  The advantage of embedding your tile cache for testing is that it allows the local server to serve out the raster tiles as well. There is no need to set up a tile server. The local server does this for you.

I switched it up this time and used the npm http-server to avoid some CORS issues I was encountering.

http-server . --cors -p 8000

Code and Website

You can view the code for the HTML and JS here:

https://github.com/xycarto/3D_Aotearoa/

The running website can be found here:

https://xycarto.github.io/3D_Aotearoa/

It works great on mobile devices, but will happily display on your desktop.

Static Vector Tiles for Small Projects

Contours Wellington

Update 04/05/2021: This method has been improved; the newer approach can be found here: https://xycarto.com/2021/04/30/static-vector-tiles-ii-openlayers-with-custom-projection/

The following covers a method for visualizing a complex vector tile dataset using a static server.  This particular blog is more for beginners and reduces the technical terms as much as possible. I wouldn’t recommend this method for large complex vector tile data sets, but instead for quick one-off sites that are looking to incorporate a small number of vector tiles where the GeoJSON might be too large/complex to handle.

If you just want to skip to the map and code, visit the Github page created for this blog.

Here is the big reveal at the beginning. If you want to serve a basic vector tile locally or with a static server:

  1. Create a tile cache with uncompressed .pbf files
  2. Place that tile cache on something like GitPages, e.g. embed the tile cache directly along with your website or embed the cache on your localhost along with the other site files
  3. In Leaflet, access the tile cache using ‘L.vectorGrid.protobuf()’ (a minimal sketch follows this list)
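
Below is a minimal Web Mercator sketch of step 3, assuming Leaflet and the Leaflet.VectorGrid plugin are loaded, the tile cache built later in this post sits alongside the site files, and the layer inside the tiles is named after the input GeoJSON (all assumptions; the working code lives in the linked repository).

// Basic Leaflet consumption of an uncompressed .pbf tile cache (Web Mercator)
var map = L.map('map').setView([-41.29, 174.78], 12); // roughly Wellington

L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
  attribution: '&copy; OpenStreetMap contributors'
}).addTo(map);

L.vectorGrid.protobuf('./contoursWebmer/{z}/{x}/{y}.pbf', {
  vectorTileLayerStyles: {
    wellyDEMContour: { color: '#8a6d3b', weight: 1 } // key = assumed layer name inside the tiles
  }
}).addTo(map);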

If you’d like to read more in depth:

I work on a Ubuntu OS; however, the methods below will just as easily work in macOS and Windows environments. The parts for building the site, e.g. the HTML, JS, CSS and GitPages, are OS agnostic.

If you want to follow the steps outlined you will need to have installed:

  • GDAL: I am only including this since I used GDAL to extract the contours from the original elevation file.
  • OGR2OGR: Easy tool for converting your shapefile to GeoJSON.
  • Tippecanoe: Excellent tool for generating your tile caches in Web Mercator.
  • T-Rex: Necessary for building your tile cache in a custom projection like NZTM. I tried a number of methods and T-Rex seemed to work best. Sadly, as of Nov 2020, ogr2ogr cannot currently create an MVT in NZTM projection, but it can do projections where the resolutions are halved at each zoom level.
  • Leaflet: Your friendly JS library for building your map in a web environment.
  • SimpleHTTPServer: if you’d like to do localhost testing, you can run your site with this.

You will also need a Github account and a basic knowledge of building a webpage.

Here are the two stacks depending on your needs:

  • Web Mercator: GDAL, OGR2OGR, Tippecanoe, Leaflet, GitPages
  • NZTM: GDAL, OGR2OGR, PostGIS, T-Rex, Leaflet, GitPages

The basic premise is this: with your vector file, you need to construct a tile cache with uncompressed PBF files. The tile cache is the same as any XYZ tile cache you would use for raster tiles, except you are filling it with .pbf files. That tile cache will need to reside in a location accessible by your Leaflet application. In this example we are using GitPages as our server and embedding our tiles directly along with the website we built. Technically, Gitpages is acting as the server. If you are testing and serving on localhost, just embed your tile cache with your web files.

I am going to keep the code light for this blog and instead layout the steps. You can find full code examples in my Github repository here.

Building Your Own Contours

Download Elevation in NZTM
https://data.linz.govt.nz/layer/53621-wellington-lidar-1m-dem-2013/

If you download the entire dataset, you can create the contour lines from the VRT. This saves a lot of time by not creating a mosaic.

Build VRT

gdalbuildvrt dem.vrt *.tif

Contour (@ 50m intervals)

gdal_contour -a elev -i 50 dem.vrt wellyDEMContour.shp

Web Mercator Tile Cache

If you are building a Web Mercator site, you can use Tippecanoe to render the tile cache. You need to create a GeoJSON of your shapefile first.

Shape to GeoJSON

ogr2ogr -f GeoJSON -s_srs EPSG:2193 -t_srs EPSG:3857 wellyDEMContour.json wellyDEMContour.shp

Tile the GeoJSON

tippecanoe --no-tile-compression --projection=EPSG:3857 --minimum-zoom=12 --maximum-zoom=16 --output-to-directory "static.vector.tiles/contoursWebmer" wellyDEMContour.json

NZTM Tile Cache

If you are building an NZTM site you will need to use T-Rex to generate the NZTM tile cache for the vector tiles. T-Rex works best if you can give it a PostGIS table to work from. I’d also recommend simplifying complex contour data sets; your tile generation will be much faster.

Upload to PostgreSQL (you can upload the original NZTM shapefile)

shp2pgsql -s 2193 wellyDEMContour_NZTM.shp public.contournztm | psql -h localhost -d dbName -U userName

T-Rex Config

See full config here

Note the tile size you are setting. You will need this later for your Leaflet application

[grid.user]
width = 4096
height = 4096
extent = { minx = -1000000, miny = 3087000, maxx = 3327000, maxy = 10000000 }
srid = 2193
units = "m"
resolutions = [8960.0,4480.0,2240.0,1120.0,560.0,280.0,140.0,70.0,28.0,14.0,7.0,2.8,1.4,0.7,0.28,0.14,0.07]
origin = "TopLeft"

T-Rex Tile Cache

t_rex generate --progress true --maxzoom=12 --minzoom=9 --extent=160.6,-55.95,-171.2,-25.88 --config /static.vector.tiles/trexConfig/configpsql_contour.toml

Decompress PBF Files

find . -type f | xargs -n1 -P 1 -t -I % gzip -d -r -S .pbf %
find . -type f | xargs -n1 -P 1 -t -I % mv % %.pbf

Important bit for Leaflet

See the GitPages site for how to set up the Leaflet JS in NZTM


// Access the tile cache
var vector = L.vectorGrid.protobuf(vectorURL, styles);
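
The key piece for NZTM is a custom CRS built with Proj4Leaflet from the same origin and resolutions as the T-Rex grid config above. A sketch of that wiring, assuming proj4, Proj4Leaflet and Leaflet.VectorGrid are loaded (treat the details, particularly the tile size handling, as assumptions and check the linked repo for the working setup):

// Custom NZTM CRS built from the grid.user values in the T-Rex config
var crs = new L.Proj.CRS(
  'EPSG:2193',
  '+proj=tmerc +lat_0=0 +lon_0=173 +k=0.9996 +x_0=1600000 +y_0=10000000 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs',
  {
    origin: [-1000000, 10000000], // TopLeft origin from the config
    resolutions: [8960, 4480, 2240, 1120, 560, 280, 140, 70, 28, 14, 7, 2.8, 1.4, 0.7, 0.28, 0.14, 0.07]
  }
);

// Hand the CRS to the map; the vector layer from the snippet above is added as usual.
// Remember the tile size note earlier: the vectorGrid options (the styles object) should
// also carry tileSize: 4096 so requests line up with the T-Rex grid.
var map = L.map('map', { crs: crs }).setView([-41.29, 174.78], 10); // roughly Wellington
vector.addTo(map);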

Have fun!

Basic GRASS GIS with BASH, plus GDAL

As a follow-up to the last blog, I thought it would be helpful to demonstrate how next to break up the elevation example into individual watersheds. The reason being, in my last example I demonstrated the process using a square raster tile. Hydrological processes are not accurate when run on square tiles. It is best to run further processes, like stream order extraction, on complete watersheds.

If you are reading this and want to follow along, it is very helpful to go back and read this blog. The following builds off that last example.

I’ll break this into two parts:

1. Creating multiple watershed boundaries of different sizes with GRASS and using a basic loop in BASH for the process.
2. Clipping the original raster by the watershed boundaries using GDAL and SQL with a basic loop in BASH.

Creating Watershed Boundaries with Varying Sizes

This section will modify the script created in the last post.

Notice in the previous script, when I used r.watershed, I set the threshold value to 100000. You can use this value to set the minimum size of your watersheds in cell units. I see this as the minimum number of pixels per watershed, since your cell unit is the size of your pixel.

threshold=100000
accumulation=${rasterName}_accumulation
drainage=${rasterName}_drainage
stream=${rasterName}_stream
basin=${rasterName}_basin
r.watershed elevation=$fillDEM threshold=$threshold accumulation=$accumulation drainage=$drainage stream=$stream basin=$basin --overwrite

Have a play with setting the threshold to different values, like 2000000 or 5000. You will get very different outputs in your watersheds vector layer. Try to determine the uppermost threshold value you can input before GRASS can no longer determine a watershed.

Warning: really small threshold values will take a very long time to process and give very poor results, like thousands of very tiny watersheds of little to no value in analysis.

Once you have found your uppermost limit for your threshold, determine a few lesser values and build them into a list. Maybe take the upper limit, halve it, then halve it again.

Make that list into a variable:

$ list=$(echo 2000000 1000000 500000)

Then loop the list to iterate through your threshold values when running r.watershed in GRASS. All we change from the last script is the threshold value, which now comes from the list. Note we add the threshold value as part of the output names too.

Use the standard form of a BASH loop:

for i in $list
do
    echo $i 
done

Loop with GRASS watershed commands:

list=$(echo 2000000 1000000 500000)

for i in $list
do
    # Run watershed operation on fill sink raster
    threshold=$i
    accumulation=${rasterName}_accumulation_${i}
    drainage=${rasterName}_drainage_${i}
    stream=${rasterName}_stream_${i}
    basin=${rasterName}_basin_${i}
    r.watershed elevation=$fillDEM threshold=$threshold accumulation=$accumulation drainage=$drainage stream=$stream basin=$basin --overwrite

    # Convert Basin (watershed) to vector format
    basinVect=${rasterName}_basinVect_${i}
    r.to.vect input=$basin output=$basinVect type=area column=bnum --overwrite

    # Export catchment to vector format
    basinVectOut=${outDir}/${rasterName}_basinVectOut_${i}.shp
    v.out.ogr input=$basinVect output=$basinVectOut type=area format=ESRI_Shapefile --overwrite
done

Full script example here

Your outputs this time will be a collection of watershed shapefiles, one for each threshold value.

Clipping your Raster by Individual Watersheds using GDAL, OGR and SQL

The watershed layers are full of nice watersheds; however, you cannot simply clip the elevation raster by the full vector file. For the clip, you need to select each individual watershed first. The following will demonstrate how to list each of those watersheds and use them as a clip file via a loop.

To start, set up a test directory for your processed raster outputs:

$ mkdir grass_test/raster_watersheds

The basic command for clipping a raster with a vector in GDAL is:

$ gdalwarp -of GTiff -cutline input_shapefile_for_cut.shp input_raster_to_be_cut.tif output_raster.tif

You could create a shapefile for each watershed and clip your raster by that watershed individually using the command above, but the process would be fraught with errors and take a very long time.

It is much easier to automate the process using a combination of ogrinfo, SQL and GDAL.

We’ll do this with an example using JM_basinVectOut_2000000.shp you just created.

JM_basinVectOut_2000000.shp

In the example, we’ll:

1. Read and list individual watersheds from the vector layer, by listing all the ‘cat’ values leveraging the ogrinfo -sql switch.
2. Using the list, we’ll clip the original raster by each individual watershed

First, list the ‘cat’ values in the vector watershed layer.

The basic ogrinfo command is:

$ ogrinfo /grass_test/JM_basinVectOut_2000000.shp

For this operation, however, we need to list the information of each row in the attribute table:

$ ogrinfo -geom=NO -q -sql "SELECT cat FROM JM_basinVectOut_2000000" /grass_test/JM_basinVectOut_2000000.shp

The above ogrinfo command lists all of the rows for the ‘cat’ column, excluding any geographic information for the file. To do this, we use the ‘-sql’ switch available in ogrinfo.

Note when using the ‘-sql’ switch you need a layer name, ‘JM_basinVectOut_2000000’ from the shapefile. The layer name is just the name of the shapefile with the .shp removed.

Use grep to select the parts from the output you want and sed to clean up the outputs for a tidy list.

$ ogrinfo -geom=NO -q -sql "SELECT cat FROM JM_basinVectOut_2000000" /home/ireese/grass_test/JM_basinVectOut_2000000.shp | grep 'cat (Integer)' | sed s/'cat (Integer) =//g'

Make it a list variable:

$ watershedList=$(ogrinfo -geom=NO -q -sql "SELECT cat FROM JM_basinVectOut_2000000" /home/ireese/grass_test/JM_basinVectOut_2000000.shp | grep 'cat (Integer)' | sed s/'cat (Integer) =//')

Before we loop the process in GDAL, let’s look quickly at why this list was created.

In gdalwarp we are going to leverage the ‘-csql’ switch in order to select each individual watershed row and clip only by that row. We’ll be writing a basic SQL query similar to the above, but filtering by each watershed:

SELECT cat FROM watershed WHERE cat=$i

So, putting all the bits together, using clip and SQL, in gdalwarp we get:

$ gdalwarp -of GTiff -dstnodata -9999 -cutline $inputVector -csql "SELECT cat FROM $inputVectorLayerName where cat='$i'" -crop_to_cutline input_raster_to_be_cut.tif output_raster.tif

Note, I am using ‘-crop_to_cutline’ and ‘-dstnodata’. Basically I’m saying, crop the output raster to the bounding box of the cutline feature and set a ‘nodata’ value of -9999 for the pixels outside the clip area.

Now we need to put this all together in a script to loop the process.


#!/bin/bash

# set base path
outDir=grass_test
outDirRast=grass_test/raster_watersheds

# Set raster as variable
raster=${outDir}/lds-tile-jm-GTiff/JM.tif
rasterName=$( basename $raster | sed 's/.tif//g' )

#prep your input vectors
inputVector=/home/ireese/grass_test/JM_basinVectOut_2000000.shp
inputVectorLayerName=$(basename $inputVector | sed 's/.shp//')

#create your watersheds list
watershedList=$(ogrinfo -geom=NO -q -sql "SELECT cat FROM $inputVectorLayerName" $inputVector | grep 'cat (Integer)' | sed s/'cat (Integer) = //')

for i in $watershedList
do
    gdalwarp -of GTiff -dstnodata -9999 -cutline $inputVector -csql "SELECT cat FROM $inputVectorLayerName where cat='$i'" -crop_to_cutline $raster $outDirRast/${rasterName}_${i}.tif
done

Copy the above script, check your file paths, and save it in a text editor as:

rasterClipByWatershed.sh

 

To run the script:

$ bash grass_test/rasterClipByWatershed.sh

Final Note

In the above process, it may be necessary to run some clean-up operations to remove invalid geometries in the watersheds vector layer. I did not include this, but it is helpful to resave the watersheds shapefile first by running a buffer operation with a buffer value of ‘0’. This will remove the invalid geometries. You can run this in GRASS, or with ogr2ogr using the SQLite/SpatiaLite dialect. Here is an ogr2ogr example:

$ ogr2ogr -f "ESRI Shapefile" output_vector.shp input_vector.shp -dialect sqlite -sql "select id, ST_buffer(Geometry,0) as geom from input_vector" -overwrite

All the scripts from the past two posts can be found here.

Let me know if this was helpful. I could potentially move on to doing some more basic hydro analysis with GRASS and BASH, but these things take time and energy. It would be good to know if these sorts of blogs have value to the wider geospatial world.

Open Source Stack for Raster Tiling in Custom Projections

This post is a high-level look at the recent stack I built for a raster tiling setup. I am working out some kinks in my online and network delivery of cartographic products, so I thought it was time to set up a raster tiling service to access XYZ and WMTS services from my raster tile caches. I’ll be adding maps and zoom levels in the future, so check back now and again. Antarctica is on its way soon!

nz_from_basemap_service

Basic Demo Service using NZTM projection is here: https://xycarto.github.io/

See below for WMTS links

Raster tiling is not the only method, but it is still a viable choice for delivering nice-looking maps online, serving across networks, and designing with raster data. I am particularly enamored with the quality of the visual outputs. For me, it is akin to the difference between music in vinyl and digital formats. In addition, the process is well documented and fairly straightforward. By virtue of having been around for a while, raster tiling has a wealth of information and standards to work with, delivery from S3 is a robust process, and there is nice integration with QGIS, Leaflet and Openlayers.

I break the stack into three areas: analysis, rendering, and delivery.

Analysis
QGIS: Sketching, QC, and general geospatial work.

GDAL: Processing raster data. Configuring your rasters in an optimal format from the beginning will greatly improve your rendering speeds. I recommend creating a good set of overviews and gathering everything into a virtual raster (VRT).

Postgres/PostGIS: Handling your vector data. Pulling all your data from a database significantly improves rendering speeds. Don’t forget to index!

Rendering
Tilemill/Mapnik XML: Yes, I still design using CartoCSS when working with raster data. I love the simplicity of the language. Tilemill is easy enough to containerize these days too. Tilemill exports into the Mapnik XML format, essential for my process further down the line. Here is how to hack Tilemill to work in a custom projection.

Mapnik: Support for using Mapnik XML

Mapnik with Python Bindings: Necessary for using Mapnik XML documents in MapProxy

MapProxy: MapProxy is a map server and tile renderer. It is easy to build on your machine, though I recommend using a container like Docker. Specifically, I use a hack provided by PalmerJ at Github to increase rendering speeds through multi-threading.

Delivery
Amazon S3: Simple Storage Service. Amazon is pretty cheap, free in many cases, and a good place for storing your tile cache. You get an easily accessed URL for your tiles and a home for your WMTS GetCapabilities document.

WMTS: For me, the real power in a base map service is the WMTS, so below are two links to the WMTS service for you to set up in QGIS if you’d like to have a play. Here is a quick tutorial about how to set up WMTS if you are unfamiliar.

https://s3-ap-southeast-2.amazonaws.com/basemaps.temp/nz_colour_basemap/WMTSCapabilities.nz_colour_basemap.xml
https://s3-ap-southeast-2.amazonaws.com/basemaps.temp/nz_topo_basemap/WMTSCapabilities.nz_topo_basemap.xml

XYZ: Building a web map? If your tile cache is in S3, in a TMS structure, and public, you should be able to access it via a simple XYZ request like so:

https://{s3-your-region-here}/{your_bucket}/{project_name}/{projection}/{z}/{x}/{y}.png

Leaflet: Leaflet will handle all the XYZ requests to the server and allow for custom projections. Have a look here for the basic HTML, CSS and JS setup.
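
As a minimal illustration of that request pattern (the bucket and project names below are hypothetical placeholders, and a Leaflet map object is assumed; for a custom projection such as NZTM you would also supply a Proj4Leaflet CRS, as in the earlier posts):

// Hypothetical XYZ tile layer pointed at an S3 tile cache
L.tileLayer('https://s3-ap-southeast-2.amazonaws.com/your-bucket/your-project/EPSG3857/{z}/{x}/{y}.png', {
  attribution: 'Your attribution here'
}).addTo(map);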