Hi, I’ve implemented a custom terrain tile cutter behind a web controller that serves Cesium quantized-mesh .terrain files on demand and caches them for subsequent requests. I understand that the layer.json file identifies the available X/Y tiles for each LOD.
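For context, the cache-on-demand pattern I mean is roughly the following sketch (names like `build_terrain_tile` are illustrative stand-ins, not my actual code; the real cutter emits quantized-mesh bytes):

```python
# Minimal sketch of the cache-on-demand pattern described above.
# build_terrain_tile() is a hypothetical stand-in for the real
# quantized-mesh cutter.
from functools import lru_cache

def build_terrain_tile(z: int, x: int, y: int) -> bytes:
    # Placeholder: the real cutter samples the DEM and emits
    # quantized-mesh bytes for tile (z, x, y).
    return f"terrain:{z}/{x}/{y}".encode()

@lru_cache(maxsize=4096)
def get_terrain_tile(z: int, x: int, y: int) -> bytes:
    # First request for a Z/X/Y builds the tile; subsequent
    # requests are served from the cache.
    return build_terrain_tile(z, x, y)
```

So a tile only ever gets built (and cached) when something actually asks for it.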
What I’m seeing seems odd, though, so I’m hoping for a deeper explanation. When I clear my cache, start the service, and launch a browser, the default view is at Level 6. However, I’m seeing network traffic to my controller requesting terrain tiles for LODs above 6, all the way up to the maximum LOD defined in my layer.json.
So, say my layer.json defines LODs 0–12. Even when the map initializes at Level 6, I see requests for terrain tiles up to Level 12.
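For concreteness, my layer.json has the usual quantized-mesh shape, something like this (values trimmed and illustrative, not my actual file; the real `available` array has one entry per level, 0 through 12):

```json
{
  "tilejson": "2.1.0",
  "format": "quantized-mesh-1.0",
  "version": "1.0.0",
  "scheme": "tms",
  "tiles": ["{z}/{x}/{y}.terrain?v={version}"],
  "minzoom": 0,
  "maxzoom": 12,
  "available": [
    [{ "startX": 0, "startY": 0, "endX": 1, "endY": 0 }]
  ]
}
```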
Are these higher LODs really being loaded by Cesium all the way up to the max defined in layer.json?
Part of why I’m asking is that the terrain cutting currently works in conjunction with my hill-shading and other tile operations: when a Z/X/Y GET request is handled, I only have to poll my source GeoTIFF with GDAL once to get the projected DEM, and that DEM then feeds each process that needs it. As such, any given .terrain file only becomes available once the map specifically requests that tile.
But if Cesium uses layer.json to probe for all tiles at every listed level, even tiles that have never been requested for visualization, I’ll need to duplicate the terrain-cutting functionality in that controller and fetch the DEM again.
Any help with understanding more deeply what’s going on would be appreciated!