How to create animated heatmaps?

Hi Sam,

Once I read through all the explanations and examples I could find regarding the particle system it didn’t seem like a good fit to me (unless there are some pertinent customization options that I didn’t discover). Runtime performance is definitely a real concern. But if you feel the particle system would provide a good solution I’d love to hear how.

The only way I can imagine using rectangle entities is by creating a procedural texture material that allows the number of rows/columns to be specified and an array of values or colors specified for each “cell” in the resulting grid. However, I have no experience with materials/shaders so I don’t know how feasible this is or what the performance would be like.

It seems CesiumHeatmap hasn’t been updated in over 3 years. I saw performance issues once I began drawing several heatmaps (25 or more), and I can’t get the heatmaps to update (change colors). I also haven’t come up with a good way to move the heatmaps with the aircraft so far. In addition to these problems, heatmap.js seems to take a strange approach to coloring the grids/cells… at least it doesn’t produce the results I expect. Maybe I’ll figure this out with more experimentation.

I’d really appreciate some guidance/suggestions from the Cesium development team regarding the best path forward.


Thank you for the example. I’m trying to understand the creation of the heatmap and how I would utilize this for my needs. I have several questions I’d like to run by you:

  1. It seems the values in the example are all added with a size of 20 and an intensity of 0.05. So what causes the size and color variance in the heatmap results? Is it based on the number of points in a region? I need to calculate the noise level for each point in the heatmap and draw colors based on the noise level value. How would I go about doing this?
  2. There is a canvas object created for the heatmap and then there is a rectangle created in Cesium and an image of the canvas is set as the rectangle’s material. If I’m creating a heatmap for each aircraft will this present a performance issue?
  3. What would be the best approach for updating each heatmap’s colors each animation frame?
  4. What would be the best approach for moving/repositioning the heatmap each animation frame?


Thank you for sharing all of these details with me. Particle systems and rectangular entities clearly both have challenges associated with them. I was not familiar with CesiumHeatmap until you brought it up in your previous post. The library seems somewhat outdated and has clearly not enjoyed great adoption. However, it is promising that you were able to get a project up and running quickly using CesiumHeatmap. With a few dev hours, you should be able to optimize the project to improve the runtime performance. Let’s keep exploring.

We can also continue looking into heatmap.js. Their GitHub page seems to have helpful documentation, and other community members have asked questions in the Issues section, which might be a resource worth exploring as you move forward.
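Since heatmap.js’s coloring came up as a point of confusion earlier in the thread, here is a rough sketch of how libraries in that family typically produce colors: each point contributes a radial falloff to an intensity grid, the grid is normalized against a max value, and the normalized intensity is looked up in a gradient. This is an illustrative simplification with made-up names, not heatmap.js’s actual implementation or API:

```javascript
// Simplified model of heatmap-style rendering (illustrative only).

// Accumulate a radial falloff from each point into an intensity grid.
function accumulateIntensity(width, height, points) {
  const grid = new Float32Array(width * height);
  for (const { x, y, radius, value } of points) {
    for (let gy = 0; gy < height; gy++) {
      for (let gx = 0; gx < width; gx++) {
        const d = Math.hypot(gx - x, gy - y);
        if (d < radius) {
          // Linear falloff for simplicity; real libraries often use a
          // Gaussian or a canvas radial gradient.
          grid[gy * width + gx] += value * (1 - d / radius);
        }
      }
    }
  }
  return grid;
}

// Normalize against a max so that overlapping points read "hotter" even
// when every individual point carries the same value.
function normalize(grid, max) {
  return grid.map((v) => Math.min(v / max, 1));
}
```

This is also why points added with identical size and intensity can still produce color variance: the accumulated density in a region, relative to the configured max, drives the final color.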

@fation thank you very much for sharing this example. I am curious to hear your responses to @Rob’s questions. I took some time to look through your source code. I think it could be a great resource for the rest of the community.

I really appreciate all the energy around this topic. I am looking forward to learning more and continuing to explore heatmaps in CesiumJS.


@sam.rothstein I want to follow up on your comment that you would share this with the CesiumJS development team and report back with their feedback. No one is going to have a better idea of the best path forward than they do, so I’m very eager to get their input.


Hi @Rob, you were on the right track with this comment:

The only way I can imagine using rectangle entities is by creating a procedural texture material that allows the number of rows/columns to be specified and an array of values or colors specified for each “cell” in the resulting grid. However, I have no experience with materials/shaders so I don’t know how feasible this is or what the performance would be like.

We did something similar for another project that required heatmap visualization. I extracted some of the functionality into a sandcastle example:

[Screen recording of the heatmap sandcastle (2021-09-08)]

The idea is to create a textured ground primitive and update the texture whenever the values change. Just note that this example uses some classes in the private Renderer API, like Texture and Sampler. It also relies on some undocumented behavior in the material API that lets you use a Texture object directly instead of an image URI. Check out the comments in the sandcastle for more details about the approach.
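To make the update step concrete, here is a minimal sketch of the general shape of that approach. The valueToColor gradient function is an assumed caller-supplied helper (not a Cesium API), and the copyFrom call shown in the comment belongs to the private Renderer API mentioned above, so it may differ between Cesium versions:

```javascript
// Pack a grid of scalar values into an RGBA byte buffer suitable for
// uploading as a texture source. `valueToColor` maps a value to
// [r, g, b, a] bytes (0-255) and is an assumed helper.
function valuesToRgba(values, valueToColor) {
  const rgba = new Uint8Array(values.length * 4);
  for (let i = 0; i < values.length; i++) {
    const [r, g, b, a] = valueToColor(values[i]);
    rgba[i * 4] = r;
    rgba[i * 4 + 1] = g;
    rgba[i * 4 + 2] = b;
    rgba[i * 4 + 3] = a;
  }
  return rgba;
}

// In the sandcastle, a buffer like this is copied into the texture whenever
// the values change, roughly (private Renderer API, subject to change):
//
//   texture.copyFrom({ source: { width, height, arrayBufferView: rgba } });
```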

I don’t have code for moving the ground primitive. Worst case you can create a new primitive each time it needs to move, as long as this isn’t happening super frequently. You should be able to re-use the same texture.

Let me know if this helps. I’m happy to explain the approach in more detail if anything is unclear.


Hi @sean_lilley, I really appreciate the details in your response along with the working example! I’ve been looking over your Sandcastle example and definitely need to spend some time digesting.

Since these heatmaps will be used to represent “noise footprints” for aircraft that are being animated the ability to move the heatmaps is really critical. I’ll need to take your example and implement these heatmap objects as a feature that can be enabled for each aircraft in the scene.

We try to let the animation run at whatever frame rate the user’s computer/browser supports, so we are often rendering many times per second. For this reason I’m concerned that creating a new primitive each time will be very costly.

It’s been a long time since I worked with primitives. Is moving primitives in this fashion simply not supported? For this to meet our needs I’ll definitely have to come up with a way to move the heatmaps each animation frame. If you have any suggestions for a good way to reposition these heatmaps during animation, along with the aircraft they’re associated with, I’d very much appreciate anything you can offer.


Unfortunately, rectangle geometries need to be recomputed every time they move in order to match the Earth’s curvature at their location. Looking at your use case a bit more, the geometries are small enough that you can probably get away with using a coplanar polygon instead, which can be moved freely with the modelMatrix property. I put together a second sandcastle that shows the heatmap moving.

The noise texture updates every 0.05 seconds, while the geometry itself is updated every frame. The main performance bottleneck is actually the gradient math when updating the noise texture, but I wrote some ideas in the sandcastle for improving that.
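For reference, moving the polygon amounts to assigning a new modelMatrix each frame, for example one built with Cesium.Transforms.eastNorthUpToFixedFrame at the aircraft’s position. Since a model matrix is just a rigid transform applied to the geometry, here is a minimal Cesium-free sketch (with illustrative helper names) of what that assignment does to each vertex:

```javascript
// A 4x4 column-major translation matrix, the same layout CesiumJS uses
// for model matrices. (Cesium would build a full east-north-up frame;
// this sketch only translates.)
function translationMatrix(tx, ty, tz) {
  return [
    1, 0, 0, 0,    // column 0
    0, 1, 0, 0,    // column 1
    0, 0, 1, 0,    // column 2
    tx, ty, tz, 1, // column 3: translation
  ];
}

// Apply the matrix to a point [x, y, z] with w = 1.
function transformPoint(m, [x, y, z]) {
  return [
    m[0] * x + m[4] * y + m[8] * z + m[12],
    m[1] * x + m[5] * y + m[9] * z + m[13],
    m[2] * x + m[6] * y + m[10] * z + m[14],
  ];
}

// Each frame, assigning a new matrix moves every vertex of the primitive
// without recomputing the geometry itself, e.g.:
//
//   primitive.modelMatrix = Cesium.Transforms.eastNorthUpToFixedFrame(
//     Cesium.Cartesian3.fromDegrees(longitude, latitude, height));
```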



@sean_lilley, this is fantastic! I began looking into the modelMatrix this morning as that is how we reposition our aircraft models during animation. The example you put together looks like it should work perfectly. Thank you so much!

I did read your notes about ideas regarding the noise texture…
// This loop is pretty slow and can be optimized. A better approach might be to
// map noise values to colors in the shader. Would need to upload noise values directly
// as a float texture and also pass in the gradient stops/colors as a uniform array.

This is new territory for me, but I would love to pursue this and learn about it. Are there any examples you can point me to? Or can you walk me through what I need to do to pursue this approach?

I really appreciate your help with this!

@sam.rothstein @Rob This is not my code. I found it and just sent it to you. 🙂



Understood - thanks for clarifying. Where is the code from?


@sean_lilley, I’ve made some progress (thanks to you!). But I have a number of things to work out.

Each flight has its own instance of the Material, MaterialAppearance, CoplanarPolygonGeometry, and the Primitive created from them.

However, there are 3 issues I’m running into immediately:

  1. It appears that all of the heatmaps are identical, as if something is being cached and all the aircraft are using the cached data. I know glTF models have a “cacheKey”, and models that share a cache key share textures. Is that what’s happening here?
  2. There are places where the map tiles occlude the heatmaps. See the heatmap in the lower-right portion of the image above. This happens in various places on the map/globe. Is there a setting to make the heatmaps visible above the terrain?
  3. When I turn the heatmaps on I instantly get results for all the aircraft, but after some amount of time (it seems to vary, maybe 20 to 30 seconds) all of the heatmaps disappear. If I wait long enough they come back, and I’m not seeing any errors. I don’t know if this is a memory issue. Or, if a caching issue is causing all flights to share the same texture, maybe the flight owning that texture goes out of view and triggers the problem?

I’d really appreciate any thoughts/suggestions you might have (particularly regarding the first 2 issues)!

@Rob it’s great to see the fast progress you’ve made here. Hopefully I can help with some of these questions.

  1. I couldn’t reproduce this issue in my own sandcastle here. Could you double check that you’re not using the same texture for each material?
  2. I thought this problem might come up, since the coplanar polygon geometry isn’t clamped to ground like the rectangle geometry is. Could you try adding viewer.scene.globe.depthTestAgainstTerrain = false;? This will make geometry below the terrain visible.
  3. This one could be related to the first issue or some other issue in the app. I can’t think of anything in CesiumJS that would cause this behavior. If you can put together a sandcastle that isolates the problem that would be helpful.

@sean_lilley, thanks for the quick response!

  1. I was wrong. I was accidentally using the same texture. Got this sorted out. Thanks!
  2. Setting depthTestAgainstTerrain = false does seem to address the issue, however, see note below.
  3. I think this issue was related to issue number one. I’m not seeing it occur now, so I believe it’s been addressed.

Note regarding depthTestAgainstTerrain:
While this did appear to address the issue, I’m hoping there’s something else we can do. We set depthTestAgainstTerrain = true in this application for other reasons. The biggest is that we draw walls trailing each aircraft along its path for a period of time (typically the past 60 seconds’ worth of the aircraft’s path). These walls help viewers see the path on the ground.

But when the aircraft are somewhere the ground elevation is several hundred feet or more above mean sea level, we run into a problem. For performance reasons we don’t want to sample the ground elevation at each flight’s track points, so we use the local airport’s ground elevation, minus a safety margin, as the lower extent of our walls. Of course this means the walls extend below the surface, which causes a parallax effect when you tilt the scene: the walls appear to slide across/through the globe.

Since it appears we know why the coplanar polygon objects are being occluded… is there another way to address this issue?

@sean_lilley, I’m curious about the size (width x height) of the coplanar polygon geometry versus the size of the defined grid. My grids are a fairly consistent size of around 10,400 x 10,400 meters (which we determined to be sufficient to hold the generated noise footprints), and I divide the grid based on the desired “resolution”. Our WorldWind application defaults to intervals of about 217 meters, but that appears to require too much processing in the browser, so for performance’s sake I’m currently defining the grid at intervals of about 434 or 868 meters.

I don’t see latitude/longitude being used to define the grid in your example, so I’m curious about a couple of things…

  1. When we use the modelMatrix to position the primitive I assume the “center” of the primitive is what’s being positioned, correct?
  2. How is the size of the primitive defined?
    a) I’m assuming the geometry object defines the size of the primitive, correct?
    b) If so, I’m assuming it’s important that I define the size in meters of the coplanar polygon geometry to match the size in meters of the width and height of the grid. Do they need to match exactly? What happens if one is larger than the other?

I found a way to work around the depthTestAgainstTerrain issue. There’s even more esoteric stuff going on now but the summary is that depth test is disabled on the heatmap primitive so that it always renders above terrain. To prevent the heatmap from rendering above everything else I modified some of the primitive’s render settings so that it always renders in the opaque pass but still does alpha blending. The important thing to note is that heatmap primitives need to be added to the scene before all other primitives, or else the heatmaps will render above the other primitives. I kept it simple in my sandcastle but a more complicated app might want to create a PrimitiveCollection for all the heatmaps and add that collection to scene.primitives first.

I can’t promise this approach doesn’t have its own drawbacks but I think it gets closer.


The longitude/latitude are only used to compute the model matrix which will position the primitive. It’s the center of the primitive that’s being positioned.

The size of the primitive is based on both the size of the geometry and the model matrix, but since the model matrix has unit scale it’s just the geometry that matters.

I think you want the size of the coplanar polygon and the grid to match exactly. But if they don’t match the texture will scrunch to fit the geometry shape.
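Putting those two answers together: since the model matrix positions the center of the primitive, the polygon’s positions can be laid out symmetrically around the local origin, with extents matching the noise grid so the texture isn’t squashed. A sketch with illustrative names (the actual sandcastle may construct its positions differently):

```javascript
// Corner offsets, in meters, for a grid of widthMeters x heightMeters,
// centered on the local origin that the model matrix will position.
// Matching these dimensions to the noise grid keeps the texture from
// being stretched or "scrunched" to fit the geometry.
function gridCorners(widthMeters, heightMeters) {
  const hw = widthMeters / 2;
  const hh = heightMeters / 2;
  return [
    [-hw, -hh, 0],
    [hw, -hh, 0],
    [hw, hh, 0],
    [-hw, hh, 0],
  ];
}
```

For a 10,400 x 10,400 meter grid this yields corners at +/-5,200 meters from the aircraft position.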


@sean_lilley, I did as you suggested and created a primitive collection for the heatmaps that gets added to scene.primitives when the app first loads. This seems to do the trick causing the heatmaps to appear above the rest of the terrain but below other primitives we’ve added to the scene. So far this seems like a great approach. Thank you!

@sean_lilley, I’m making more progress. Haven’t got all our algorithms for calculating noise impact implemented yet, but I’ve got enough to begin to get an idea of the results.

(Sorry for the crummy video clip; I couldn’t find a recording tool online that did a very good job.)

There’s some strange positioning of the heatmaps happening, so I’m going to need to investigate that. But it’s obvious that anything I can do to improve efficiency is a good idea. You mentioned:

// This loop is pretty slow and can be optimized. A better approach might be to
// map noise values to colors in the shader. Would need to upload noise values directly
// as a float texture and also pass in the gradient stops/colors as a uniform array.

Could you point me to an example or walk me through in more detail how to accomplish this? I’d really like to improve performance if possible.

Here’s an example of what I had in mind for that. The noise values are uploaded directly to the GPU into a floating point texture instead of being mapped to colors in JavaScript. The shader now does the mapping.

This requires that the user has OES_texture_float support. Can I Use reports about 96% coverage across devices, so it should be relatively safe to use. If not, there are ways to encode floats as RGBA8 textures (the clipping planes and elevation band materials in CesiumJS are examples of code that does that).
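For anyone following along, the core of the approach is: upload the raw noise values as a single-channel float texture, pass the gradient stops and colors as uniform arrays, and let the fragment shader map value to color per fragment. Here is a plain JavaScript reference implementation mirroring what that gradient lookup does in GLSL (stop/color names are illustrative, not taken from the sandcastle):

```javascript
// Reference implementation of the gradient lookup the fragment shader
// performs: find the surrounding stops and linearly interpolate colors.
// `stops` is an ascending array of noise values; `colors` is an array of
// [r, g, b, a] components in 0..1. In GLSL these would be uniform
// float/vec4 arrays.
function gradientColor(value, stops, colors) {
  if (value <= stops[0]) return colors[0];
  for (let i = 1; i < stops.length; i++) {
    if (value <= stops[i]) {
      const t = (value - stops[i - 1]) / (stops[i] - stops[i - 1]);
      return colors[i - 1].map((c, k) => c + t * (colors[i][k] - c));
    }
  }
  return colors[colors.length - 1];
}

// The shader does the same thing per fragment after sampling the float
// texture, e.g. (WebGL1 GLSL):
//
//   float noise = texture2D(u_noiseTexture, materialInput.st).r;
```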