How to pass a dynamic texture to a post-processing stage

1. A concise explanation of the problem you're experiencing.

I want to send a dynamic texture to the fragment shader of a post-processing stage. By "dynamic" I mean the texture will be modified periodically, but I don't know how to do this. The examples in the documentation only show how to send scalar and vector data to the fragment shader.

2. A minimal code example. If you've found a bug, this helps us reproduce and repair it.

The texture needs to be modified, so I suppose I have to use a function that returns the texture I want in the uniforms block?

  viewer.scene.postProcessStages.add(new Cesium.PostProcessStage({
    fragmentShader: fragmentShaderSource,
    uniforms: {
      myTexture: function () {
        // what should I do here?
      }
    }
  }));

3. Context. Why do you need to do this? We might know a better way to accomplish your goal.

I made a particle system in WebGL, and the particles are rendered to a texture (canvas) first. I want to inject these particles by making use of a post-processing stage, because it provides Cesium's depth texture, so I can render the particles correctly.

4. The Cesium version you're using, your operating system and browser.

Cesium 1.53

Could you help me? Thanks.

I know that if you pass a URL to an image there, Cesium will automatically fetch it and create a texture from it (see my answer here: https://groups.google.com/d/msg/cesium-dev/9yJ-rkWf5d4/ZBtKWSc1AwAJ). I’m not sure if passing in a canvas directly will work. Have you tried that?
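For reference, a minimal sketch of the URL approach (the helper name `makeImageStageOptions` and the shader body are illustrative, not Cesium API; the built-in `colorTexture` and `v_textureCoordinates` names match Cesium 1.53-era post-process shaders). When a uniform value is a plain string, Cesium treats it as an image URL and creates a texture bound to that sampler:

```javascript
// Illustrative helper: builds the options object for a PostProcessStage
// whose 'myTexture' uniform is an image URL (a string, not a function).
function makeImageStageOptions(imageUrl) {
  var fragmentShader =
    'uniform sampler2D colorTexture;\n' +   // provided by Cesium: the scene
    'uniform sampler2D myTexture;\n' +      // created from the URL string
    'varying vec2 v_textureCoordinates;\n' +
    'void main() {\n' +
    '  vec4 scene = texture2D(colorTexture, v_textureCoordinates);\n' +
    '  vec4 overlay = texture2D(myTexture, v_textureCoordinates);\n' +
    '  gl_FragColor = mix(scene, overlay, overlay.a);\n' +
    '}\n';
  return {
    fragmentShader: fragmentShader,
    uniforms: {
      myTexture: imageUrl  // a URL string; Cesium fetches and uploads it once
    }
  };
}

// Usage inside the app (not runnable without Cesium):
// viewer.scene.postProcessStages.add(
//     new Cesium.PostProcessStage(makeImageStageOptions('particles.png')));
```

Note this only loads the image once, so it does not by itself solve the "dynamic" part of the question.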

I have tried passing the canvas directly; it does not work, and I get lots of errors:

[.WebGL-09279D88]GL ERROR :GL_INVALID_OPERATION : glDrawElements: Source and destination textures of the draw are the same.
255 WebGL: INVALID_ENUM: bindTexture: invalid target
Cesium.js:91999 WebGL: too many errors, no more errors will be reported to the console for this context.

My code

let fragmentShaderSource = Util.getShaderCode('glsl/postprocessing.frag');
viewer.scene.postProcessStages.add(new Cesium.PostProcessStage({
    fragmentShader: fragmentShaderSource,
    uniforms: {
        particleTrails: function () {
            let canvas = document.getElementById('canvas');
            return canvas;
        }
    }
}));

On Tuesday, January 29, 2019 at 12:11:37 AM UTC+8, Omar Shehata wrote:

So it looks like you can pass a canvas only if you set it directly, without a function wrapping it. I don’t think there’s a way to update it with the public API, but you can do it by manually pushing a uniform to the PostProcessStage._texturesToCreate array.

See this GitHub issue for more details and a code example: https://github.com/AnalyticalGraphicsInc/cesium/issues/7521
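A sketch of that private-API workaround (caveat: `_texturesToCreate` is an internal, undocumented array in Cesium 1.53 and may change in any release; the helper name and the `postRender` hookup are illustrative). Pushing a `{name, source}` entry makes the stage recreate the texture for that uniform:

```javascript
// Illustrative helper: queue a canvas to be (re)uploaded as the texture
// backing the given uniform name on the stage's next update.
// WARNING: relies on the private '_texturesToCreate' array (Cesium 1.53).
function queueCanvasTexture(stage, uniformName, canvas) {
  stage._texturesToCreate.push({
    name: uniformName,   // must match the uniform declared in the shader
    source: canvas       // the canvas the particles are rendered into
  });
}

// Usage in the app, once per frame (not runnable without Cesium):
// viewer.scene.postRender.addEventListener(function () {
//   queueCanvasTexture(stage, 'particleTrails',
//       document.getElementById('canvas'));
// });
```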

I hope this helps, and thanks for bringing up this issue! Can you tell me a bit about what you’re using post processing for? It’s relatively new in CesiumJS so I’m curious what applications people are using it for.

Basically, I want to draw the trails of particles.

You can see the details in this thread: https://groups.google.com/forum/#!topic/cesium-dev/gjjd9TNeY2A. I followed your suggestions in that thread (particle system and primitives in Cesium), but the performance is not good enough when rendering a large number of particles (below 10 fps with 50,000+ particles).

After trying the particle system and primitives, I decided to use raw WebGL. I got satisfying performance with WebGL, but it is not easy to integrate my WebGL code into Cesium. After reading several threads in the forum and the Cesium documentation, it seems that making use of a post-processing stage is the easiest way.

Thanks for your code example. It works, but the fps drops from 40 to 20 on my computer. Is it possible to update the texture without creating a new texture over and over again?


On Tuesday, January 29, 2019 at 10:45:42 PM UTC+8, Omar Shehata wrote:

Yeah, so I think what the application is doing right now is rendering to a canvas, capturing that canvas as a texture, and uploading it back to the GPU to re-render in a post process, which is going to be slow when done every frame.

I think a much faster approach would be to render your particles directly in CesiumJS. Unfortunately there aren’t a lot of available code examples for this, but you could set up a ViewportQuad:

https://cesiumjs.org/Cesium/Build/Documentation/ViewportQuad.html?classFilter=Viewport

With custom vertex and fragment shaders, and set up all your uniforms/attributes there. There’s been a lot of interest in the last few months in doing custom WebGL passes in CesiumJS, so this might be something that gets more support in the coming months. In the meantime, these blog posts might be helpful:
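To make the ViewportQuad suggestion concrete, here is a minimal sketch (assumptions: the helper name `makeParticleMaterialFabric` and the shader body are illustrative placeholders, not a particle renderer; `ViewportQuad` and the Material fabric API are documented Cesium features). The fabric's `source` must define `czm_getMaterial`:

```javascript
// Illustrative helper: builds a Material fabric for a ViewportQuad.
// The shader body below is a placeholder (a UV gradient), standing in
// for whatever the particle fragment shader would compute.
function makeParticleMaterialFabric() {
  return {
    fabric: {
      source:
        'czm_material czm_getMaterial(czm_materialInput materialInput) {\n' +
        '  czm_material material = czm_getDefaultMaterial(materialInput);\n' +
        '  material.diffuse = vec3(materialInput.st, 0.0);\n' +
        '  material.alpha = 1.0;\n' +
        '  return material;\n' +
        '}\n'
    }
  };
}

// Usage inside Cesium (not runnable standalone):
// var quad = new Cesium.ViewportQuad(
//     new Cesium.BoundingRectangle(0, 0, 512, 512));
// quad.material = new Cesium.Material(makeParticleMaterialFabric());
// viewer.scene.primitives.add(quad);
```

The advantage over the post-process route is that the particles are drawn inside Cesium's own render loop, so there is no canvas readback and re-upload each frame.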

Thank you for providing this useful information! I will try custom rendering in the next version of my application, but for the moment I want to use the post-processing stage to put together a demo quickly (20 fps is still OK for a demo).

And now I have a question about the Sandcastle you provided in the GitHub issue https://github.com/AnalyticalGraphicsInc/cesium/issues/7521. It seems that there is a memory leak due to the repeated pushes to the "_texturesToCreate" array. I tried to use _texturesToRelease, but Cesium stopped with the error "texture.destroy is not a function". How can I destroy the previous textures when creating new ones?

My code

    stage._texturesToRelease.push({
        name: 'particleTrails',
        source: document.getElementById('canvas')
    });

I don’t think there should be a memory leak, as long as you’re pushing the same “name”. This is what happens when you push a texture to that array:

It creates a new texture, but it’s stored under the same key as before, which means the old texture no longer has any references to it and should get garbage collected.
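A tiny mock of that mechanism (this is a sketch of the idea, not Cesium's actual internal code): the stage keeps its textures in a map keyed by uniform name, so storing a new texture under an existing key drops the only reference to the old one.

```javascript
// Mock of the stage's internal texture map, keyed by uniform name.
var textures = {};

function storeTexture(name, texture) {
  // Any previous value under 'name' becomes unreachable here,
  // so it is eligible for garbage collection.
  textures[name] = texture;
}

storeTexture('particleTrails', { id: 1 });
storeTexture('particleTrails', { id: 2 });
// Only one entry remains; the { id: 1 } object is now collectable.
```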

Yes, you are right; there is no memory leak. My program just uses a lot of memory for textures before they are garbage collected.