How to create animated heatmaps?

@sean_lilley, I’m curious about the size (width x height) of the coplanar polygon geometry versus the size of the defined grid. My grids are a pretty consistent size of around 10,400 x 10,400 meters (which we determined to be a sufficient size to hold the generated noise footprints), and I divide the grid based on the desired “resolution” (which we like to run by default in our WorldWind application at about 217-meter intervals… but this appears to require too much processing in the browser, so I’m defining the grid in intervals of about 434 or 868 meters at this point for performance’s sake).

I don’t see latitude/longitude being used to define the grid in your example, so I’m curious about a couple of things…

  1. When we use the modelMatrix to position the primitive I assume the “center” of the primitive is what’s being positioned, correct?
  2. How is the size of the primitive defined?
    a) I’m assuming the geometry object defines the size of the primitive, correct?
    b) If so, I’m assuming it’s important that I define the size in meters of the coplanar polygon geometry to match the size in meters of the width and height of the grid. Do they need to match exactly? What happens if one is larger than the other?

I found a way to work around the depthTestAgainstTerrain issue. There’s even more esoteric stuff going on now but the summary is that depth test is disabled on the heatmap primitive so that it always renders above terrain. To prevent the heatmap from rendering above everything else I modified some of the primitive’s render settings so that it always renders in the opaque pass but still does alpha blending. The important thing to note is that heatmap primitives need to be added to the scene before all other primitives, or else the heatmaps will render above the other primitives. I kept it simple in my sandcastle but a more complicated app might want to create a PrimitiveCollection for all the heatmaps and add that collection to scene.primitives first.

I can’t promise this approach doesn’t have its own drawbacks but I think it gets closer.


The longitude/latitude are only used to compute the model matrix which will position the primitive. It’s the center of the primitive that’s being positioned.

The size of the primitive is based on both the size of the geometry and the model matrix, but since the model matrix has unit scale it’s just the geometry that matters.

I think you want the size of the coplanar polygon and the grid to match exactly. But if they don’t match the texture will scrunch to fit the geometry shape.
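
To keep the two sizes in sync, one option is to derive the polygon's corner offsets from the same grid definition. Here is a minimal arithmetic sketch using the numbers mentioned earlier in the thread (the variable names are mine; the offsets are meters in the local east-north-up frame that the modelMatrix positions):

```javascript
// Hypothetical sketch: derive the coplanar polygon's corner offsets from the
// grid definition so the geometry extent matches the grid extent exactly.
// 10,400 m extent and 434 m intervals are the numbers mentioned above.
var gridExtentMeters = 10400; // width and height of the noise grid
var intervalMeters = 434;     // chosen grid resolution
var cellsPerSide = Math.round(gridExtentMeters / intervalMeters); // ~24 texels per side
var half = gridExtentMeters / 2;

// Corner offsets relative to the primitive's center, which is the point the
// modelMatrix places on the globe. In Cesium these offsets would become the
// polygon positions expressed in the local frame.
var cornerOffsets = [
  [-half, -half],
  [ half, -half],
  [ half,  half],
  [-half,  half]
];
```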


@sean_lilley, I did as you suggested and created a primitive collection for the heatmaps that gets added to scene.primitives when the app first loads. This seems to do the trick, causing the heatmaps to appear above the terrain but below the other primitives we’ve added to the scene. So far this seems like a great approach. Thank you!

@sean_lilley, I’m making more progress. Haven’t got all our algorithms for calculating noise impact implemented yet, but I’ve got enough to begin to get an idea of the results.

[video: footprints]
(Sorry for the crummy video clip, but I couldn’t find something online that did a very good job.)

There’s some strange positioning of the heatmaps happening, so I’m going to need to investigate that. But it’s obvious that anything I can do to improve efficiency is a good idea. You mentioned:

This loop is pretty slow and can be optimized. A better approach might be to map noise values to colors in the shader. Would need to upload noise values directly as a float texture and also pass in the gradient stops/colors as a uniform array.

Could you point me to an example or walk me through in more detail how to accomplish this? I’d really like to improve performance if possible.

Here’s an example of what I had in mind for that. The noise values are uploaded directly to the GPU into a floating point texture instead of being mapped to colors in JavaScript. The shader now does the mapping.

This requires that the user has OES_texture_float support. According to Can I use, OES_texture_float has about 96% coverage across devices, so it should be relatively safe to use. If not, there are ways to encode floats as RGBA8 textures (clipping planes and elevation band materials are examples of code that does that).
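
For reference, a rough sketch of that idea follows. The grid size, uniform names, and gradient stops below are illustrative assumptions, not the exact sandcastle code:

```javascript
// Hypothetical sketch: upload raw noise values as a float texture
// (OES_texture_float) and let the fragment shader map them to colors,
// instead of building an RGBA color array in JavaScript each frame.
var gridWidth = 24;
var gridHeight = 24;

// Raw noise values; this Float32Array would be the texture source.
var noiseValues = new Float32Array(gridWidth * gridHeight);

// Gradient passed to the shader as uniform arrays.
var gradientStops = [0.0, 0.25, 0.5, 0.75, 1.0];

var fragmentShaderSource = `
uniform sampler2D u_noiseTexture;
uniform float u_stops[5];
uniform vec4 u_colors[5];
varying vec2 v_st;
void main() {
  float noise = texture2D(u_noiseTexture, v_st).r;
  vec4 color = u_colors[0];
  for (int i = 1; i < 5; i++) {
    float t = clamp((noise - u_stops[i - 1]) / (u_stops[i] - u_stops[i - 1]), 0.0, 1.0);
    color = mix(color, u_colors[i], t);
  }
  gl_FragColor = color;
}`;
```

The per-frame work then reduces to updating the float texture; the gradient lookup happens on the GPU.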


@sean_lilley, this seems to work very well! Thank you very much for all the help with this!


Hi @sean_lilley, how can I replicate this example to make it work with terrain elevation data, so that I can shade different parts of the terrain (within the rectangle, not the whole globe) based on elevation information?

I cannot visualize how height will map to the texture. How can I make elevation information available in the shader? In this example, we use st to pick the colour, but what about elevation?

Also, it would be helpful if you could explain what this texture coordinate (st) is and how it differs from the fragment coordinate. How do texture coordinates and the texture work together?

Hi @atul-sd, I’m not sure if it’s possible to do exactly what you’re looking for but here are some ideas.

Elevation is only available to globe materials, like in the example below.

// Viewer with world terrain so materialInput.height contains real elevations
var viewer = new Cesium.Viewer("cesiumContainer", {
  terrainProvider: Cesium.createWorldTerrain()
});

// Fabric shader: normalize the terrain height into [0, 1] and mix between
// two colors (red at minHeight, yellow at maxHeight)
var source =
  `czm_material czm_getMaterial(czm_materialInput materialInput) {
      czm_material material = czm_getDefaultMaterial(materialInput);
      float height = materialInput.height;
      float minHeight = 0.0;
      float maxHeight = 9000.0;
      float normalized = (height - minHeight) / (maxHeight - minHeight);
      normalized = clamp(normalized, 0.0, 1.0);
      vec4 minColor = vec4(1.0, 0.0, 0.0, 1.0);
      vec4 maxColor = vec4(1.0, 1.0, 0.0, 1.0);
      vec4 color = mix(minColor, maxColor, normalized);
      material.diffuse = color.rgb;
      material.alpha = color.a;
      return material;
  }`;

var material = new Cesium.Material({
  fabric: {
    source: source
  },
});

// Apply the material to the whole globe
viewer.scene.globe.material = material;

viewer.scene.camera.setView({
  destination : new Cesium.Cartesian3(1375618.7733043884, -6137772.839146217, -1197945.3787250787),
  orientation : new Cesium.HeadingPitchRoll(0.28379159317977365, -0.44149770835724755, 0.00008304058741526177),
  endTransform : Cesium.Matrix4.IDENTITY
});

Unfortunately there is no built-in way to limit this to a single rectangle.

The other idea is to call sampleTerrainMostDetailed multiple times to get a grid of elevations that you can create a texture with. Then you can follow a similar approach to the sandcastles posted earlier in this thread.
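
A rough sketch of building such a sample grid follows. The bounds and grid size are made-up values, and in real code each point would be a Cesium.Cartographic:

```javascript
// Hypothetical sketch: build a row-major grid of sample points covering a
// rectangle, to pass to Cesium.sampleTerrainMostDetailed.
var west = -105.0, south = 39.0, east = -104.9, north = 39.1;
var columns = 16, rows = 16;

var samplePoints = [];
for (var row = 0; row < rows; row++) {
  for (var col = 0; col < columns; col++) {
    var lon = west + (east - west) * (col / (columns - 1));
    var lat = south + (north - south) * (row / (rows - 1));
    // In Cesium: samplePoints.push(Cesium.Cartographic.fromDegrees(lon, lat));
    samplePoints.push({ longitude: lon, latitude: lat });
  }
}

// Then (in Cesium): sampleTerrainMostDetailed(viewer.terrainProvider, samplePoints)
//   .then(function (updated) { /* updated[i].height now holds the elevation */ });
```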

Sean, thanks for the reply. I’ve also thought of creating a texture using an elevation grid for the drawn polygon. I have a question -

I’ve been reading about texture coordinate mapping and I’m uncertain how texture coordinate mapping will work if I generate the texture from elevation data. Consider the following image -

Let’s suppose that I started reading elevation data from the top row (below point A), left to right, and populating the colour texture matrix in the same order. Is it possible that my texture doesn’t map to the polygon as-is? Think of it like a sheet of texture paper: could the generated texture get rotated (say by 90°) when applied to the polygon? In that case all the colour mapping would be wrong, which I don’t want.

I’m planning to generate the grid and get elevation for each block (within the polygon) from the backend server (as an array of elevations). Please let me know if you see any potential problems with this approach.

PS: I’m very new to WebGL and graphics, this might be a novice question so please bear with me.

I don’t have a lot of experience with this myself, but I do know that PolygonGraphics has an stRotation property that orients the texture relative to north. There are some good Sandcastle examples about using textures / materials on various primitives.

While I have @sean_lilley 's attention, maybe you could take a look at this post I made way back about a different drawing issue I had? Not sure if I could solve that with a custom material as well. (Don’t want to derail the discussion here, just get extra eyes on a post that nobody seems to have noticed.)

@atul-sd the texture is mapped to the containing geographic rectangle like below. So you’ll want to arrange your elevation texture data so that the most south-west elevation sample is first, followed by the elevation sample slightly east of that, and so on. In other words the texture data is row major, increasing from west to east and south to north. For texels that don’t overlap the polygon you may want to assign them a sentinel value like 0.0 to indicate that they represent no data. Ultimately those texels won’t get rendered but they still need to exist.
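
A small sketch of arranging the data that way, assuming the elevation rows arrive north-first from a server (the data and the point-in-polygon test here are placeholders):

```javascript
// Hypothetical sketch: flip north-first elevation rows into the row-major,
// south-west-first order the texture expects, using 0.0 as a "no data"
// sentinel for texels outside the polygon.
var columns = 4, rows = 3;
var NO_DATA = 0.0;

// rowsNorthFirst[0] is the northernmost row, as it might arrive from a server.
var rowsNorthFirst = [
  [210, 220, 230, 240],
  [110, 120, 130, 140],
  [ 10,  20,  30,  40]
];

// Placeholder: real code would test the texel center against the polygon.
function insidePolygon(row, col) {
  return true;
}

var textureData = new Float32Array(columns * rows);
for (var row = 0; row < rows; row++) {
  var sourceRow = rowsNorthFirst[rows - 1 - row]; // flip so the south row comes first
  for (var col = 0; col < columns; col++) {
    textureData[row * columns + col] =
      insidePolygon(row, col) ? sourceRow[col] : NO_DATA;
  }
}
// textureData[0] is now the south-west sample; the last texel is north-east.
```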

The stRotation property as @James_B brought up may help as well if you don’t like the default west to east and south to north orientation.


Thanks, @James_B and @sean_lilley, I’ll try it out and post my results here.

Hi, @sean_lilley. This option “…some classes in the private Renderer API like Texture and Sampler” is very useful (example). Are you going to document it in the future?

@sean_lilley the heatmaps have been working well in general, but we’ve got some graphical anomalies that we haven’t been able to figure out so far. Wanted to post a few pictures and describe what we’re seeing in hopes you may have some ideas.

The screen shots below are of the same aircraft. The ONLY thing that has changed in each screen shot is the altitude/elevation of the camera position. When zoomed out we typically have no anomalies. The further we zoom in the more anomalies we get.

We’ve gone over the calculation of the grid/array that defines the heatmap, and the number of rows/columns seems correct. And the way the code is written, changing the altitude of the camera doesn’t change any calculations. So we’re puzzled about what might be causing this.

Any ideas about what may be causing this and how we can fix it?

Camera further out

Camera zoomed in a little

Camera zoomed in further

@Rob I’m not sure why that’s happening. Are you able to reproduce the problem with the sandcastle I posted earlier (How to create animated heatmaps? - #22 by sean_lilley)?

I’ll have to experiment and see if I can get it to happen with your example. It may require me modifying the example to fly a bunch of flights. I’ll let you know if I can reproduce it there.

Thanks for this, Sean. I am writing software to display satellite infrared imagery using the example you’ve shown here. I have it working successfully in my sandcastle demo. However, when I try running it in my node project using npm install cesium, I get the following error: Module '"cesium"' has no exported member 'Texture'. I have the same issue when trying to import Sampler as well.

Do you know how I could go about resolving this issue so that I can use this method in a project which uses the npm version of Cesium? Thanks for the help!

Your “has no exported member” error sounds like TypeScript to me. Both Texture and Sampler are private APIs, so there are no type definitions. You can still access private stuff by augmenting the declaration:

declare module "cesium" {
  class Texture {
    constructor(obj: any);
  }

  class Sampler {
    constructor(obj: any);
  }    
}

Typically I put such augmentations in their own .d.ts file, which I add to the include section of tsconfig.json.
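
For example, assuming the augmentation is saved as types/cesium-private.d.ts (the file name and layout here are arbitrary), tsconfig.json could pick it up like this:

```json
{
  "include": ["src/**/*", "types/cesium-private.d.ts"]
}
```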

Does that help?
