How to create animated heatmaps?

@sean_lilley, I’m making more progress. Haven’t got all our algorithms for calculating noise impact implemented yet, but I’ve got enough to begin to get an idea of the results.

[video clip: footprints]
(Sorry for the crummy video clip, but I couldn’t find something online that did a very good job.)

There’s some strange positioning of the heatmaps happening, so I’m going to need to investigate that. But it’s obvious that anything I can do to improve efficiency is a good idea. You mentioned:

This loop is pretty slow and can be optimized. A better approach might be to map noise values to colors in the shader. Would need to upload noise values directly as a float texture and also pass in the gradient stops/colors as a uniform array.

Could you point me to an example or walk me through in more detail how to accomplish this? I’d really like to improve performance if possible.

Here’s an example of what I had in mind for that. The noise values are uploaded directly to the GPU into a floating point texture instead of being mapped to colors in JavaScript. The shader now does the mapping.
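
In rough outline the pieces look like this (a sketch only, not the original Sandcastle: Texture, Sampler, and scene.context are part of the private Renderer API, so the exact option names here are assumptions that may differ between Cesium versions, and the gradient is simplified to mixing two colors in the shader rather than passing a full array of stops):

var viewer = new Cesium.Viewer("cesiumContainer");

// Hypothetical noise grid: width x height values in [0, 1].
var width = 64;
var height = 32;
var noiseValues = new Float32Array(width * height);
for (var i = 0; i < noiseValues.length; i++) {
  noiseValues[i] = Math.random();
}

// Upload the raw floats to a single-channel float texture on the GPU
// (private Renderer API; requires OES_texture_float, see below).
var noiseTexture = new Cesium.Texture({
  context: viewer.scene.context,
  width: width,
  height: height,
  pixelFormat: Cesium.PixelFormat.LUMINANCE,
  pixelDatatype: Cesium.PixelDatatype.FLOAT,
  source: {
    arrayBufferView: noiseValues
  },
  sampler: new Cesium.Sampler({
    wrapS: Cesium.TextureWrap.CLAMP_TO_EDGE,
    wrapT: Cesium.TextureWrap.CLAMP_TO_EDGE,
    minificationFilter: Cesium.TextureMinificationFilter.NEAREST,
    magnificationFilter: Cesium.TextureMagnificationFilter.NEAREST
  })
});

// The fabric material samples the noise value and maps it to a color
// entirely in the shader. Uniforms referenced in the source are declared
// automatically by Cesium.Material.
var source =
  `czm_material czm_getMaterial(czm_materialInput materialInput) {
      czm_material material = czm_getDefaultMaterial(materialInput);
      float value = texture2D(noise, materialInput.st).r;
      vec4 color = mix(minColor, maxColor, value);
      material.diffuse = color.rgb;
      material.alpha = color.a;
      return material;
  }`;

var material = new Cesium.Material({
  fabric: {
    uniforms: {
      noise: Cesium.Material.DefaultImageId,
      minColor: new Cesium.Color(0.0, 0.0, 1.0, 0.5),
      maxColor: new Cesium.Color(1.0, 0.0, 0.0, 0.9)
    },
    source: source
  }
});

// Swap in the float texture (the private API accepts a Texture instance
// as the value of a sampler uniform).
material.uniforms.noise = noiseTexture;

// Drape the material over a rectangle on the globe surface.
viewer.scene.primitives.add(new Cesium.Primitive({
  geometryInstances: new Cesium.GeometryInstance({
    geometry: new Cesium.RectangleGeometry({
      rectangle: Cesium.Rectangle.fromDegrees(-100.0, 30.0, -90.0, 35.0),
      vertexFormat: Cesium.EllipsoidSurfaceAppearance.VERTEX_FORMAT
    })
  }),
  appearance: new Cesium.EllipsoidSurfaceAppearance({
    material: material
  })
}));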

This requires that the user has OES_texture_float support. Can I use reports 96% coverage across devices, so it should be relatively safe to use. If not, there are ways to encode floats as RGBA8 textures (the clipping planes and elevation band materials are examples of code that does that).


@sean_lilley, this seems to work very well! Thank you very much for all the help with this!


Hi @sean_lilley, how can I replicate this example to make it work with terrain elevation data, so that I can shade different parts of the terrain (within the rectangle, not the whole globe) based on elevation information?

I can't visualize how the height will map to the texture. How can I make elevation information available in the shader? In this example we use st to pick the colour, but what about elevation?

Also, it would be helpful if you could explain what this texture coordinate (st) is and how it differs from the fragment coordinate. How do the texture coordinate and the texture work together?

Hi @atul-sd, I’m not sure if it’s possible to do exactly what you’re looking for but here are some ideas.

Elevation is only available to globe materials, like in the example below.

var viewer = new Cesium.Viewer("cesiumContainer", {
  terrainProvider: Cesium.createWorldTerrain()
});

var source =
  `czm_material czm_getMaterial(czm_materialInput materialInput) {
      czm_material material = czm_getDefaultMaterial(materialInput);
      float height = materialInput.height;
      float minHeight = 0.0;
      float maxHeight = 9000.0;
      float normalized = (height - minHeight) / (maxHeight - minHeight);
      normalized = clamp(normalized, 0.0, 1.0);
      vec4 minColor = vec4(1.0, 0.0, 0.0, 1.0);
      vec4 maxColor = vec4(1.0, 1.0, 0.0, 1.0);
      vec4 color = mix(minColor, maxColor, normalized);
      material.diffuse = color.rgb;
      material.alpha = color.a;
      return material;
  }`;

var material = new Cesium.Material({
  fabric: {
    source: source
  },
});

viewer.scene.globe.material = material;

viewer.scene.camera.setView({
  destination : new Cesium.Cartesian3(1375618.7733043884, -6137772.839146217, -1197945.3787250787),
  orientation : new Cesium.HeadingPitchRoll(0.28379159317977365, -0.44149770835724755, 0.00008304058741526177),
  endTransform : Cesium.Matrix4.IDENTITY
});

Unfortunately there is no built-in way to limit this to a single rectangle.

The other idea is to call sampleTerrainMostDetailed multiple times to get a grid of elevations that you can create a texture with. Then you can follow a similar approach to the sandcastles posted earlier in this thread.
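
A rough sketch of that sampling step (the rectangle and grid size below are arbitrary) might look like this:

// Sample terrain heights on a regular grid covering a rectangle, then pack
// them into a Float32Array that could be uploaded as a texture like the
// noise values earlier in the thread.
var terrainProvider = Cesium.createWorldTerrain();
var rectangle = Cesium.Rectangle.fromDegrees(86.5, 27.5, 87.5, 28.5);

var width = 32;
var height = 32;
var positions = [];
for (var row = 0; row < height; row++) {
  for (var col = 0; col < width; col++) {
    var longitude = Cesium.Math.lerp(rectangle.west, rectangle.east, col / (width - 1));
    var latitude = Cesium.Math.lerp(rectangle.south, rectangle.north, row / (height - 1));
    positions.push(new Cesium.Cartographic(longitude, latitude));
  }
}

Cesium.sampleTerrainMostDetailed(terrainProvider, positions).then(function (updated) {
  var elevations = new Float32Array(updated.length);
  for (var i = 0; i < updated.length; i++) {
    elevations[i] = updated[i].height;
  }
  // elevations is now a row-major grid starting at the south-west corner,
  // ready to be uploaded to a float texture.
});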

Sean, thanks for the reply. I've also thought of creating a texture using an elevation grid for the drawn polygon. I have a question:

I've been reading about texture coordinate mapping and I'm uncertain how it will work if I generate the texture from elevation data. Consider the following image:

Let's suppose I start reading elevation data from the top row (below point A), left to right, and populate the colour texture matrix in the same order. Is it possible that my texture doesn't map to the polygon as-is? Think of it like a sheet of texture paper: could the generated texture end up rotated (say by 90 degrees) when applied to the polygon? In that case all the colour mapping would be wrong, which I don't want.

I’m planning to generate the grid and get elevation for each block (within the polygon) from the backend server (as an array of elevations). Please let me know if you see any potential problems with this approach.

PS: I’m very new to WebGL and graphics, this might be a novice question so please bear with me.

I don’t have a lot of experience with this myself, but I do know that PolygonGraphics has an stRotation property that orients the texture relative to north. There are some good Sandcastle examples about using textures / materials on various primitives.
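
For example, something like this (the image and rotation are just placeholders; I haven't tested it):

var viewer = new Cesium.Viewer("cesiumContainer");

viewer.entities.add({
  polygon: {
    hierarchy: Cesium.Cartesian3.fromDegreesArray([
      -109.0, 37.0,
      -102.0, 37.0,
      -102.0, 41.0,
      -109.0, 41.0
    ]),
    material: new Cesium.ImageMaterialProperty({
      image: "heatmap.png"
    }),
    // Rotate the texture 45 degrees counter-clockwise from north.
    stRotation: Cesium.Math.toRadians(45.0)
  }
});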

While I have @sean_lilley's attention, maybe you could take a look at this post I made a while back about a different drawing issue I had? I'm not sure if I could solve that with a custom material as well. (I don't want to derail the discussion here, just to get extra eyes on a post that nobody seems to have noticed.)

@atul-sd the texture is mapped to the polygon's containing geographic rectangle. So you'll want to arrange your elevation texture data so that the most south-west elevation sample comes first, followed by the elevation sample slightly east of that, and so on. In other words, the texture data is row major, increasing from west to east and south to north. For texels that don't overlap the polygon you may want to assign a sentinel value like 0.0 to indicate that they represent no data. Ultimately those texels won't get rendered, but they still need to exist.
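
For example, packing per-cell elevations into that layout (isInsidePolygon and getElevationAt here are hypothetical stand-ins for however you compute those values) could look like:

var width = 32;
var height = 32;
var data = new Float32Array(width * height);
for (var row = 0; row < height; row++) {      // south to north
  for (var col = 0; col < width; col++) {     // west to east
    var index = row * width + col;            // row major
    data[index] = isInsidePolygon(col, row)   // hypothetical helper
      ? getElevationAt(col, row)              // hypothetical helper
      : 0.0;                                  // sentinel for "no data"
  }
}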

The stRotation property that @James_B brought up may help as well if you don't like the default west-to-east, south-to-north orientation.


Thanks, @James_B and @sean_lilley, I’ll try it out and post my results here.

Hi, @sean_lilley. This option “…some classes in the private Renderer API like Texture and Sampler” is very useful (example). Are you going to document it in the future?

@sean_lilley the heatmaps have been working well in general, but we’ve got some graphical anomalies that we haven’t been able to figure out so far. Wanted to post a few pictures and describe what we’re seeing in hopes you may have some ideas.

The screen shots below are of the same aircraft. The ONLY thing that has changed in each screen shot is the altitude/elevation of the camera position. When zoomed out we typically have no anomalies. The further we zoom in the more anomalies we get.

We've gone over the calculation of the grid/array that defines the heatmap, and the number of rows/columns seems correct. And the way the code is written, changing the altitude of the camera doesn't change any calculations. So we're puzzled about what might be causing this.

Any ideas about what may be causing this and how we can fix it?

Camera further out

Camera zoomed in a little

Camera zoomed in further

@Rob I’m not sure why that’s happening. Are you able to reproduce the problem with the sandcastle I posted earlier (How to create animated heatmaps? - #22 by sean_lilley)?

I’ll have to experiment and see if I can get it to happen with your example. It may require me modifying the example to fly a bunch of flights. I’ll let you know if I can reproduce it there.

Thanks for this, Sean. I am writing software to display satellite infrared imagery using the example you've shown here. I have it working successfully in my Sandcastle demo. However, when I try running it in my Node project using npm install cesium, I get the following error: Module '"cesium"' has no exported member 'Texture'. I have the same issue when trying to import Sampler as well.

Do you know how I could go about resolving this issue so that I can use this method in a project which uses the npm version of Cesium? Thanks for the help!

Your "has no exported member" error sounds like TypeScript to me. Both Texture and Sampler are private APIs, so there are no type definitions for them. You can still access private classes by augmenting the module declaration:

declare module "cesium" {
  class Texture {
    constructor(obj: any);
  }

  class Sampler {
    constructor(obj: any);
  }    
}

Typically I put such augmentations in their own .d.ts file, which I add to the include section of tsconfig.json.
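
For example (the file name and paths here are arbitrary), in tsconfig.json:

{
  "compilerOptions": {
    // ...
  },
  "include": ["src/**/*", "types/cesium-augmentations.d.ts"]
}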

Does that help?


Hey James, thanks for the quick and useful response! I am working in a TypeScript environment and think you are correct. It looks like my errors have disappeared after doing what you’ve said.

This is just a way to make TypeScript happy, right? The Texture and Sampler classes are still being pulled from Cesium's source code, and we are just providing a public declaration of the classes? Again, thanks for the help!

Cesium is written in JavaScript. TypeScript lets you import vanilla-JS libraries by incorporating a "typing" file (.d.ts extension) which describes the "shape" of the library. As of a year or two ago, Cesium ships with its own built-in typings. TypeScript detects these (through a package.json entry), and when you write import { Texture } from "cesium", it looks at what the cesium module says it exports and tells the type checker about the object you just imported. Since the bundled types don't describe private APIs, the snippet I showed you above augments those definitions with extra exports. (You could update that blurb to describe the types more accurately if you like; I just threw in an any argument to get it working at all.)

At runtime, all the type information is erased, and depending on what kind of emit is configured in your tsconfig.json, you either get require("cesium").Texture (module: "commonjs") or it leaves the existing import { Texture } from "cesium" in place (module: "esnext", etc.). So, like you said, the type definitions are just there to make the type checker happy.


Just to help anyone else who runs into this problem, here is a complete declaration to get the original example posted by Sean working:

declare module "cesium" {
  class Texture {
    constructor(obj: any);
  }

  class Sampler {
    constructor(obj: any);
  }

  enum TextureWrap {
    CLAMP_TO_EDGE,
  }
}

@sean_lilley, hope you're doing well. I'm also hoping you can help us modify the gradient shader you helped us create for displaying colored heatmaps so that it works with WebGL2.

This is the Sandcastle example you created, which now throws rendering errors with WebGL2.

We’re hoping to get this resolved so that we can have our site take advantage of WebGL2.

Thanks,
Rob