How to create animated heatmaps?

@sam.rothstein @Rob This is not my code. I found it and just sent it to you 🙂


@fation

Understood - thanks for clarifying. Where is the code from?

-Sam

@sean_lilley, I’ve made some progress (thanks to you!). But I have a number of things to work out.

Each flight has its own instance of the Material, MaterialAppearance, CoplanarPolygonGeometry, and the Primitive that is created from those.

However, there are 3 issues I’m running into immediately:

  1. It appears that all of the heatmaps are identical, as if something is being cached and all the aircraft are using the cached data. I know glTF models have a “cacheKey”, and models that share a cache key share textures. Is that what’s happening here?
  2. There are places where the map tiles occlude the heatmaps. See the heatmap in the lower-right portion of the image above. This happens in various places on the map/globe. Is there a setting to make the heatmaps visible above the terrain?
  3. When I turn the heatmaps on, I instantly get results for all the aircraft. But after some amount of time (it seems to vary, but maybe 20 to 30 seconds) all of the heatmaps go away. If I wait long enough they come back, and I’m not seeing any errors. I don’t know if this is a memory issue, or whether a caching problem is causing all flights to share the same texture, so that when the flight owning the shared texture goes out of view the rest disappear with it.

I’d really appreciate any thoughts/suggestions you might have (particularly regarding the first 2 issues)!

@Rob it’s great to see the fast progress you’ve made here. Hopefully I can help with some of these questions.

  1. I couldn’t reproduce this issue in my own sandcastle here. Could you double check that you’re not using the same texture for each material?
  2. I thought this problem might come up since the coplanar polygon geometry isn’t clamped to ground like the rectangle geometry is. Could you try adding viewer.scene.globe.depthTestAgainstTerrain = false;? This will make geometry below the terrain visible.
  3. This one could be related to the first issue or some other issue in the app. I can’t think of anything in CesiumJS that would cause this behavior. If you can put together a sandcastle that isolates the problem that would be helpful.

@sean_lilley, thanks for the quick response!

  1. I was wrong. I was accidentally using the same texture. Got this sorted out. Thanks!
  2. Setting depthTestAgainstTerrain = false does seem to address the issue; however, see the note below.
  3. I think this issue was related to issue number one. I’m not seeing it occur now, so I believe it’s been addressed.

Note regarding depthTestAgainstTerrain:
While this did appear to address the issue, I’m hoping there’s something else we can do. We set depthTestAgainstTerrain = true in this application for other reasons. The biggest is that we draw walls trailing each aircraft along its path for a period of time (typically the past 60 seconds’ worth of the path). These walls help to see the path on the ground.

But when the aircraft are located somewhere where the ground elevation is several hundred feet or more above mean sea level, we run into a problem. For performance reasons we don’t want to sample the ground elevation at each flight’s track points, so we use the local airport’s ground elevation, subtract some amount to be safe, and use that as the lower extent of our walls. Of course this means the walls extend below the surface, which causes a parallax effect when you tilt the scene, since the walls appear to slide across/through the globe.

Since it appears we know why the coplanar polygon objects are being occluded… is there another way to address this issue?

@sean_lilley, I’m curious about the size (width × height) of the coplanar polygon geometry versus the size of the defined grid. My grids are a fairly consistent size of around 10,400 × 10,400 meters (which we determined to be sufficient to hold the generated noise footprints), and I divide the grid based on the desired “resolution”. Our WorldWind application defaults to intervals of about 217 meters, but that appears to require too much processing in the browser, so for performance’s sake I’m currently defining the grid at intervals of about 434 or 868 meters.
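For concreteness, the interval sizes above work out to grid dimensions like this (plain arithmetic using the numbers quoted in this post; purely illustrative):

```javascript
// Approximate cell count along one side of a square grid, given its
// extent and the sampling interval (both in meters).
function gridCells(extentMeters, intervalMeters) {
  return Math.round(extentMeters / intervalMeters);
}

const extent = 10400;
console.log(gridCells(extent, 217)); // → 48 (WorldWind default resolution)
console.log(gridCells(extent, 434)); // → 24
console.log(gridCells(extent, 868)); // → 12
```

So halving the resolution quarters the number of texels the browser has to fill, which is why the coarser intervals perform so much better.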

I don’t see latitude/longitude being used to define the grid in your example, so I’m curious about a couple of things…

  1. When we use the modelMatrix to position the primitive I assume the “center” of the primitive is what’s being positioned, correct?
  2. How is the size of the primitive defined?
    a) I’m assuming the geometry object defines the size of the primitive, correct?
    b) If so, I’m assuming it’s important that I define the size in meters of the coplanar polygon geometry to match the size in meters of the width and height of the grid. Do they need to match exactly? What happens if one is larger than the other?

I found a way to work around the depthTestAgainstTerrain issue. There’s even more esoteric stuff going on now but the summary is that depth test is disabled on the heatmap primitive so that it always renders above terrain. To prevent the heatmap from rendering above everything else I modified some of the primitive’s render settings so that it always renders in the opaque pass but still does alpha blending. The important thing to note is that heatmap primitives need to be added to the scene before all other primitives, or else the heatmaps will render above the other primitives. I kept it simple in my sandcastle but a more complicated app might want to create a PrimitiveCollection for all the heatmaps and add that collection to scene.primitives first.

I can’t promise this approach doesn’t have its own drawbacks but I think it gets closer.
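A minimal sketch of the add-order idea (assumes a standard CesiumJS viewer; makeHeatmapPrimitive is a hypothetical factory for the heatmap primitives discussed above):

```javascript
// Primitives render in the order they are added, so the heatmap collection
// must go into scene.primitives before anything else in the app.
const heatmapCollection = new Cesium.PrimitiveCollection();
viewer.scene.primitives.add(heatmapCollection); // added first, renders first

// Later, per flight: heatmaps go into the collection; everything else is
// added to scene.primitives as usual and will render above the heatmaps.
heatmapCollection.add(makeHeatmapPrimitive(flight)); // hypothetical factory
viewer.scene.primitives.add(aircraftPrimitive);
```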


The longitude/latitude are only used to compute the model matrix which will position the primitive. It’s the center of the primitive that’s being positioned.

The size of the primitive is based on both the size of the geometry and the model matrix, but since the model matrix has unit scale it’s just the geometry that matters.

I think you want the size of the coplanar polygon and the grid to match exactly. But if they don’t match the texture will scrunch to fit the geometry shape.


@sean_lilley, I did as you suggested and created a primitive collection for the heatmaps that gets added to scene.primitives when the app first loads. This seems to do the trick, causing the heatmaps to appear above the rest of the terrain but below the other primitives we’ve added to the scene. So far this seems like a great approach. Thank you!

@sean_lilley, I’m making more progress. Haven’t got all our algorithms for calculating noise impact implemented yet, but I’ve got enough to begin to get an idea of the results.

(video: “footprints”)
(Sorry for the crummy video clip, but I couldn’t find something online that did a very good job.)

There’s some strange positioning of the heatmaps happening, so I’m going to need to investigate that. But it’s obvious that anything I can do to improve efficiency is a good idea. You mentioned:

This loop is pretty slow and can be optimized. A better approach might be to map noise values to colors in the shader. Would need to upload noise values directly as a float texture and also pass in the gradient stops/colors as a uniform array.

Could you point me to an example or walk me through in more detail how to accomplish this? I’d really like to improve performance if possible.

Here’s an example of what I had in mind for that. The noise values are uploaded directly to the GPU into a floating point texture instead of being mapped to colors in JavaScript. The shader now does the mapping.

This requires that the user has OES_texture_float support. Can I Use reports 96% coverage across devices, so it should be relatively safe to use. If not, there are ways to encode floats as RGBA8 textures (clipping planes and elevation band materials are examples of code that does that).
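The per-texel gradient lookup the shader performs is just piecewise-linear interpolation over the stop/color arrays. A plain-JavaScript reference version (the stop/color layout here is illustrative; the actual sandcastle may organize it differently) looks like:

```javascript
// Map a normalized noise value to an RGBA color using gradient stops,
// mirroring what the fragment shader would do with its uniform arrays.
// stops: ascending values in [0, 1]; colors: matching [r, g, b, a] entries.
function noiseToColor(value, stops, colors) {
  if (value <= stops[0]) return colors[0].slice();
  const last = stops.length - 1;
  if (value >= stops[last]) return colors[last].slice();
  // Find the surrounding pair of stops and interpolate linearly.
  let i = 1;
  while (stops[i] < value) i++;
  const t = (value - stops[i - 1]) / (stops[i] - stops[i - 1]);
  return colors[i - 1].map((c, k) => c + t * (colors[i][k] - c));
}

const stops = [0.0, 0.5, 1.0];
const colors = [
  [0, 0, 1, 1], // blue at 0.0
  [0, 1, 0, 1], // green at 0.5
  [1, 0, 0, 1], // red at 1.0
];
console.log(noiseToColor(0.25, stops, colors)); // → [0, 0.5, 0.5, 1]
```

Doing this in the shader means the JavaScript side only uploads raw noise floats once; re-coloring (e.g. changing the gradient) no longer requires rebuilding the texture.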


@sean_lilley, this seems to work very well! Thank you very much for all the help with this!


Hi @sean_lilley, how can I replicate this example to make it work with terrain elevation data, so that I can shade different parts of the terrain (within the rectangle, not the whole globe) based on elevation information?

I can’t visualize how height will map to the texture. How can I make elevation information available in the shader? In this example we use st to pick the colour, but what about elevation?

Also, it would be helpful if you could explain what this texture coordinate (st) is and how it differs from the fragment coordinate. How do texture coordinates and the texture work together?

Hi @atul-sd, I’m not sure if it’s possible to do exactly what you’re looking for but here are some ideas.

Elevation is only available to globe materials, like in the example below.

var viewer = new Cesium.Viewer("cesiumContainer", {
  terrainProvider: Cesium.createWorldTerrain()
});

// Globe material: materialInput.height provides the terrain height in meters.
var source =
  `czm_material czm_getMaterial(czm_materialInput materialInput) {
      czm_material material = czm_getDefaultMaterial(materialInput);
      float height = materialInput.height;
      float minHeight = 0.0;
      float maxHeight = 9000.0;
      float normalized = (height - minHeight) / (maxHeight - minHeight);
      normalized = clamp(normalized, 0.0, 1.0);
      vec4 minColor = vec4(1.0, 0.0, 0.0, 1.0);
      vec4 maxColor = vec4(1.0, 1.0, 0.0, 1.0);
      vec4 color = mix(minColor, maxColor, normalized);
      material.diffuse = color.rgb;
      material.alpha = color.a;
      return material;
  }`;

var material = new Cesium.Material({
  fabric: {
    source: source
  },
});

viewer.scene.globe.material = material;

viewer.scene.camera.setView({
  destination : new Cesium.Cartesian3(1375618.7733043884, -6137772.839146217, -1197945.3787250787),
  orientation : new Cesium.HeadingPitchRoll(0.28379159317977365, -0.44149770835724755, 0.00008304058741526177),
  endTransform : Cesium.Matrix4.IDENTITY
});

Unfortunately there is no built-in way to limit this to a single rectangle.

The other idea is to call sampleTerrainMostDetailed multiple times to get a grid of elevations that you can create a texture with. Then you can follow a similar approach to the sandcastles posted earlier in this thread.
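A sketch of that second idea (the bounds and grid dimensions below are placeholders). The pure-JS helper builds the sample positions in the same west-to-east, south-to-north order the texture will eventually need:

```javascript
// Build a row-major grid of { lon, lat } sample points in degrees, ordered
// west→east within a row and south→north across rows.
function gridPositions(west, south, east, north, cols, rows) {
  const positions = [];
  for (let j = 0; j < rows; j++) {
    const lat = south + (north - south) * (j / (rows - 1));
    for (let i = 0; i < cols; i++) {
      const lon = west + (east - west) * (i / (cols - 1));
      positions.push({ lon, lat });
    }
  }
  return positions;
}

// Usage with CesiumJS (sketch): convert to Cartographic, then sample terrain.
// const carto = gridPositions(-75.2, 39.9, -75.1, 40.0, 64, 64)
//   .map((p) => Cesium.Cartographic.fromDegrees(p.lon, p.lat));
// const samples = await Cesium.sampleTerrainMostDetailed(terrainProvider, carto);
```

The sampled heights can then be written into a float texture exactly like the noise values in the earlier sandcastles.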

Sean, thanks for the reply. I’ve also thought of creating texture using an elevation grid for the drawn polygon. I have a question -

I’ve been reading about texture coordinate mapping and I’m uncertain how texture coordinate mapping will work if I generate the texture from elevation data. Consider the following image -

Let’s suppose that I start reading elevation data from the top row (below point A), left to right, and populate the colour texture matrix in the same order. Is it possible that my texture doesn’t map to the polygon as-is? Think of it like a sheet of texture paper: could the generated texture get rotated (say, 90 degrees) when applied to the polygon? In that case all the colour mapping would be wrong, which I don’t want.

I’m planning to generate the grid and get elevation for each block (within the polygon) from the backend server (as an array of elevations). Please let me know if you see any potential problems with this approach.

PS: I’m very new to WebGL and graphics, so this might be a novice question; please bear with me.

I don’t have a lot of experience with this myself, but I do know that PolygonGraphics has an stRotation property that orients the texture relative to north. There are some good Sandcastle examples about using textures / materials on various primitives.

While I have @sean_lilley 's attention, maybe you could take a look at this post I made way back about a different drawing issue I had? Not sure if I could solve that with a custom material as well. (Don’t want to derail the discussion here, just get extra eyes on a post that nobody seems to have noticed.)

@atul-sd the texture is mapped to the containing geographic rectangle like below. So you’ll want to arrange your elevation texture data so that the most south-west elevation sample is first, followed by the elevation sample slightly east of that, and so on. In other words the texture data is row major, increasing from west to east and south to north. For texels that don’t overlap the polygon you may want to assign them a sentinel value like 0.0 to indicate that they represent no data. Ultimately those texels won’t get rendered but they still need to exist.
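If your backend returns the elevation grid top row first (north to south, as you’d read it off a map), flipping the row order before creating the texture gives the south-west-first layout described above. A minimal plain-JavaScript sketch:

```javascript
// Reorder a flat, row-major elevation array from north-first rows
// (top-to-bottom reading order) to the south-first row order the
// texture expects. cols is the number of samples per row.
function flipRows(values, cols) {
  const rows = values.length / cols;
  const out = new Array(values.length);
  for (let j = 0; j < rows; j++) {
    const src = (rows - 1 - j) * cols; // take rows from the bottom up
    for (let i = 0; i < cols; i++) {
      out[j * cols + i] = values[src + i];
    }
  }
  return out;
}

// 2 columns × 3 rows, read north row first:
console.log(flipRows([5, 6, 3, 4, 1, 2], 2)); // → [1, 2, 3, 4, 5, 6]
```

With this ordering the south-west sample lands at texel (0, 0), matching the default texture orientation.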

The stRotation property as @James_B brought up may help as well if you don’t like the default west to east and south to north orientation.


Thanks, @James_B and @sean_lilley, I’ll try it out and post my results here.

Hi, @sean_lilley. This option “…some classes in the private Renderer API like Texture and Sampler” is very useful (example). Are you going to document it in the future?