How to create animated heatmaps?

@sean_lilley, I’m making more progress. Haven’t got all our algorithms for calculating noise impact implemented yet, but I’ve got enough to begin to get an idea of the results.

(Sorry for the crummy video clip, but I couldn’t find something online that did a very good job.)

There’s some strange positioning of the heatmaps happening, so I’m going to need to investigate that. But it’s obvious that anything I can do to improve efficiency is a good idea. You mentioned:

This loop is pretty slow and can be optimized. A better approach might be to
map noise values to colors in the shader. Would need to upload noise values directly
as a float texture and also pass in the gradient stops/colors as a uniform array.

Could you point me to an example or walk me through in more detail how to accomplish this? I’d really like to improve performance if possible.

Here’s an example of what I had in mind for that. The noise values are uploaded directly to the GPU into a floating point texture instead of being mapped to colors in JavaScript. The shader now does the mapping.
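To illustrate the gradient lookup the shader performs, here is a plain JavaScript sketch of the same mapping. The stop positions and colors are made-up example values, and the function names are mine; in the actual shader this interpolation runs per-fragment, with the noise value sampled from the float texture and the stops/colors passed in as uniform arrays.

```javascript
// Example gradient: blue at 0.0, green at 0.5, red at 1.0.
// These stops/colors are illustrative, not the values from the Sandcastle.
const gradientStops = [0.0, 0.5, 1.0];
const gradientColors = [
  [0, 0, 255],   // blue
  [0, 255, 0],   // green
  [255, 0, 0],   // red
];

function mapNoiseToColor(noise) {
  // Clamp into the gradient's domain, as the shader would with clamp().
  const t = Math.min(Math.max(noise, 0.0), 1.0);
  // Find the surrounding pair of stops and interpolate linearly --
  // the same mix() the fragment shader would perform.
  for (let i = 0; i < gradientStops.length - 1; i++) {
    if (t <= gradientStops[i + 1]) {
      const f = (t - gradientStops[i]) / (gradientStops[i + 1] - gradientStops[i]);
      return gradientColors[i].map((c, k) =>
        Math.round(c + f * (gradientColors[i + 1][k] - c))
      );
    }
  }
  return gradientColors[gradientColors.length - 1];
}
```

Doing this lookup on the GPU means the JavaScript side only re-uploads a small float texture each frame instead of recomputing a full RGBA color array.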

This requires that the user has OES_texture_float support. The Can I use… support table for "OES_texture_float" reports 96% coverage across devices, so it should be relatively safe to use. If not, there are ways to encode floats as RGBA8 textures (clipping planes and elevation band materials are examples of code that does that).
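For reference, a minimal sketch of both paths: feature-detecting the extension at runtime, and a simple base-256 packing of a [0, 1] float into four bytes as an RGBA8 fallback. The packing scheme here is illustrative only; the encoding Cesium's clipping-plane code actually uses may differ.

```javascript
// Feature-detect OES_texture_float on a WebGL 1 context.
// gl.getExtension returns null when the extension is unsupported.
function supportsFloatTextures(gl) {
  return gl.getExtension("OES_texture_float") !== null;
}

// Fallback sketch: pack a float in [0, 1] into four bytes (one RGBA8 texel).
function packUnitFloatToRgba(v) {
  const x = Math.round(Math.min(Math.max(v, 0), 1) * 0xffffffff);
  return [
    Math.floor(x / 0x1000000) & 0xff,
    Math.floor(x / 0x10000) & 0xff,
    Math.floor(x / 0x100) & 0xff,
    x & 0xff,
  ];
}

// Inverse of the packing above (the shader would do the matching decode).
function unpackRgbaToUnitFloat([r, g, b, a]) {
  return (r * 0x1000000 + g * 0x10000 + b * 0x100 + a) / 0xffffffff;
}
```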


@sean_lilley, this seems to work very well! Thank you very much for all the help with this!


Hi @sean_lilley, how can I replicate this example to make it work with terrain elevation data, so that I can shade different parts of the terrain (within the rectangle, not the whole globe) based on elevation information?

I cannot visualize how height will map to the texture. How can I make elevation information available in the shader? In this example we use st to pick the colour, but what about elevation?

Also, it would be helpful if you could explain what this texture coordinate (st) is and how it differs from the fragment coordinate. How do texture coordinates and the texture work together?

Hi @atul-sd, I’m not sure if it’s possible to do exactly what you’re looking for but here are some ideas.

Elevation is only available to globe materials, like in the example below.

var viewer = new Cesium.Viewer("cesiumContainer", {
  terrainProvider: Cesium.createWorldTerrain(),
});

var source =
  `czm_material czm_getMaterial(czm_materialInput materialInput) {
      czm_material material = czm_getDefaultMaterial(materialInput);
      float height = materialInput.height;
      float minHeight = 0.0;
      float maxHeight = 9000.0;
      float normalized = (height - minHeight) / (maxHeight - minHeight);
      normalized = clamp(normalized, 0.0, 1.0);
      vec4 minColor = vec4(1.0, 0.0, 0.0, 1.0);
      vec4 maxColor = vec4(1.0, 1.0, 0.0, 1.0);
      vec4 color = mix(minColor, maxColor, normalized);
      material.diffuse = color.rgb;
      material.alpha = color.a;
      return material;
  }`;

var material = new Cesium.Material({
  fabric: {
    source: source,
  },
});
viewer.scene.globe.material = material;

viewer.camera.flyTo({
  destination : new Cesium.Cartesian3(1375618.7733043884, -6137772.839146217, -1197945.3787250787),
  orientation : new Cesium.HeadingPitchRoll(0.28379159317977365, -0.44149770835724755, 0.00008304058741526177),
  endTransform : Cesium.Matrix4.IDENTITY
});
Unfortunately there is no built-in way to limit this to a single rectangle.

The other idea is to call sampleTerrainMostDetailed multiple times to get a grid of elevations that you can create a texture with. Then you can follow a similar approach to the sandcastles posted earlier in this thread.
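A sketch of that idea follows. `Cesium.sampleTerrainMostDetailed` is the real API, but the helper name, grid size, and the commented usage are my own untested illustration: generate a regular lon/lat grid over the area, sample it, then keep the heights for the texture.

```javascript
// Build a regular cols x rows grid of lon/lat points covering a rectangle.
// Rows run south->north, columns west->east; cols and rows must be >= 2.
function makeGrid(west, south, east, north, cols, rows) {
  const points = [];
  for (let row = 0; row < rows; row++) {
    for (let col = 0; col < cols; col++) {
      points.push({
        longitude: west + (east - west) * (col / (cols - 1)),
        latitude: south + (north - south) * (row / (rows - 1)),
      });
    }
  }
  return points;
}

// Intended Cesium usage (not executed here; coordinates in radians):
// const positions = makeGrid(-1.32, 0.69, -1.31, 0.70, 16, 16)
//   .map((p) => new Cesium.Cartographic(p.longitude, p.latitude));
// const updated = await Cesium.sampleTerrainMostDetailed(terrainProvider, positions);
// const heights = new Float32Array(updated.map((p) => p.height));
```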

Sean, thanks for the reply. I’ve also thought of creating texture using an elevation grid for the drawn polygon. I have a question -

I’ve been reading about texture coordinate mapping and I’m uncertain how texture coordinate mapping will work if I generate the texture from elevation data. Consider the following image -

Suppose I start reading elevation data from the top row (below point A), left to right, and populate the colour texture matrix in the same order. Is it possible that my texture won't map to the polygon as-is? Think of it like a sheet of texture paper: could the generated texture end up rotated (say, by 90°) when applied to the polygon? In that case all the colour mapping would be wrong, which I want to avoid.

I’m planning to generate the grid and get elevation for each block (within the polygon) from the backend server (as an array of elevations). Please let me know if you see any potential problems with this approach.

PS: I’m very new to WebGL and graphics, this might be a novice question so please bear with me.

I don’t have a lot of experience with this myself, but I do know that PolygonGraphics has an stRotation property that orients the texture relative to north. There are some good Sandcastle examples about using textures / materials on various primitives.

While I have @sean_lilley 's attention, maybe you could take a look at this post I made way back about a different drawing issue I had? Not sure if I could solve that with a custom material as well. (Don’t want to derail the discussion here, just get extra eyes on a post that nobody seems to have noticed.)

@atul-sd the texture is mapped to the containing geographic rectangle like below. So you’ll want to arrange your elevation texture data so that the most south-west elevation sample is first, followed by the elevation sample slightly east of that, and so on. In other words the texture data is row major, increasing from west to east and south to north. For texels that don’t overlap the polygon you may want to assign them a sentinel value like 0.0 to indicate that they represent no data. Ultimately those texels won’t get rendered but they still need to exist.
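A small sketch of that layout (the helper names are mine, not Cesium's): index 0 holds the south-west sample, columns advance west to east, rows advance south to north, and texels outside the polygon keep a 0.0 sentinel.

```javascript
// Row-major index into the elevation texture: row 0 is the southernmost
// row, column 0 the westernmost column, so (0, 0) is the south-west corner.
function texelIndex(row, col, cols) {
  return row * cols + col;
}

// Fill a cols x rows texture from sparse samples; texels the polygon
// doesn't cover stay at the 0.0 "no data" sentinel.
function buildElevationTexture(samples, cols, rows) {
  const data = new Float32Array(cols * rows).fill(0.0);
  for (const s of samples) {
    data[texelIndex(s.row, s.col, cols)] = s.height;
  }
  return data;
}
```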

The stRotation property as @James_B brought up may help as well if you don’t like the default west to east and south to north orientation.


Thanks, @James_B and @sean_lilley, I’ll try it out and post my results here.

Hi @sean_lilley. The option you mentioned, using "…some classes in the private Renderer API like Texture and Sampler", is very useful (example). Are you planning to document it in the future?

@sean_lilley the heatmaps have been working well in general, but we’ve got some graphical anomalies that we haven’t been able to figure out so far. Wanted to post a few pictures and describe what we’re seeing in hopes you may have some ideas.

The screenshots below are of the same aircraft. The ONLY thing that changes between them is the altitude of the camera. When zoomed out we typically see no anomalies; the further we zoom in, the more anomalies appear.

We’ve gone over the calculation of the grid/array that defines the heatmap, and the number of rows/columns seems correct. The way the code is written, changing the camera altitude doesn't affect any of the calculations, so we're puzzled about what might be causing this.

Any ideas about what may be causing this and how we can fix it?

Camera further out

Camera zoomed in a little

Camera zoomed in further

@Rob I’m not sure why that’s happening. Are you able to reproduce the problem with the sandcastle I posted earlier (How to create animated heatmaps? - #22 by sean_lilley)?

I’ll have to experiment and see if I can get it to happen with your example. It may require modifying the example to fly a bunch of flights. I’ll let you know if I can reproduce it there.