Problem related to the depthTexture

Hi all,

I am working with the PostProcessStages of the CesiumJS library. I want to create an effect that depends on the depth of each pixel: if a pixel's depth is very large, that pixel does not come from geometry, so I color it blue; every other pixel is colored red. However, in the second video below there is a red rectangular area that doesn't correspond to any geometry. I don't know why this area appears. Can someone explain this to me?
My PostProcessStage code snippet:

const fs = `
    uniform sampler2D colorTexture;
    uniform sampler2D depthTexture;
    in vec2 v_textureCoordinates;

    void main() {
        float depth = texture(depthTexture, v_textureCoordinates).x;

        if (depth > 0.99) {
            // This is likely a background pixel (far distance)
            out_FragColor = vec4(0.0, 0.0, 1.0, 1.0);  // Blue for background
        } else {
            // This is likely a geometry pixel
            out_FragColor = vec4(1.0, 0.0, 0.0, 1.0);  // Red for geometry
        }
    }`;
	
import { PostProcessStage, Color } from "cesium";

const MyPostProcessStage = new PostProcessStage({
    fragmentShader : fs,
    uniforms : {
        // Note: this uniform is not declared or read in the fragment shader above.
        backgroundColor : new Color(0.95, 0.95, 0.95, 1),
    }
});
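
The stage only runs once it is added to the scene's post-process collection. A minimal sketch, assuming an existing Viewer instance named viewer:

    viewer.scene.postProcessStages.add(MyPostProcessStage);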

The original model:

The model with MyPostProcessStage:

Thanks,


The depth of the red rectangular area is smaller than 0.99, yet this area does not contain any geometry. Why is its depth small, and how can I fix it?
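
One way to investigate is to visualize the raw depth values instead of thresholding them. This is only a small variation of the shader above, not a fix, but it shows where pixels without obvious geometry still receive depth values below 1.0:

    const debugFs = `
        uniform sampler2D depthTexture;
        in vec2 v_textureCoordinates;

        void main() {
            float depth = texture(depthTexture, v_textureCoordinates).x;
            // Depth values cluster near 1.0, so remap [0.9, 1.0] to [0.0, 1.0]
            // to make differences near the far plane visible (white = far).
            float shade = clamp((depth - 0.9) * 10.0, 0.0, 1.0);
            out_FragColor = vec4(vec3(shade), 1.0);
        }`;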

Two things come to mind:

  1. Cesium's multi-frustum rendering.
  2. Cesium's depth plane. (If I remember correctly, the depth plane is geometry that Cesium renders to cover the ellipsoid below the horizon so nothing shows through the globe, which means it writes depth values even where no visible model exists.)

If I understand correctly, you want to draw your geometry red and everything else blue (simply speaking). Why not use a custom shader and attach it to your geometry?
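
As a rough sketch of that idea: Cesium's CustomShader can be attached to a Cesium3DTileset (a Model has a similar customShader option). The tileset URL and the viewer variable below are placeholders, so adapt them to your own setup. Every fragment the custom shader processes belongs to the geometry, so no depth test is needed to tell geometry apart from background:

    import { Cesium3DTileset, CustomShader } from "cesium";

    const redShader = new CustomShader({
        fragmentShaderText: `
            void fragmentMain(FragmentInput fsInput, inout czm_modelMaterial material) {
                // Only geometry fragments reach this point, so just tint them red.
                material.diffuse = vec3(1.0, 0.0, 0.0);
            }`
    });

    // yourTilesetUrl is a placeholder; use your own asset here.
    const tileset = await Cesium3DTileset.fromUrl(yourTilesetUrl);
    tileset.customShader = redShader;
    viewer.scene.primitives.add(tileset);

The blue for everything else could then come from the scene itself (for example viewer.scene.backgroundColor with the globe and sky box hidden) rather than from reading the depth texture.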