HDR Post Processing

I have not been able to get the initial HDR-rendered image of the Scene with pixelDatatype: PixelDatatype.FLOAT and HDR enabled. Is it possible to obtain the HDR-rendered Scene texture in a PostProcessStage?
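For reference, a minimal sketch of the kind of setup in question, using the public Cesium.PostProcessStage and Scene.highDynamicRange APIs (this is an illustration, not the original test code):

```js
// Minimal sketch: enable HDR and add a passthrough stage that samples the
// colorTexture it is handed.
const viewer = new Cesium.Viewer("cesiumContainer");
viewer.scene.highDynamicRange = true; // HDR rendering on (if supported)

const passthrough = new Cesium.PostProcessStage({
  fragmentShader:
    "uniform sampler2D colorTexture;\n" +
    "varying vec2 v_textureCoordinates;\n" +
    "void main() {\n" +
    "    gl_FragColor = texture2D(colorTexture, v_textureCoordinates);\n" +
    "}\n",
});
viewer.scene.postProcessStages.add(passthrough);
```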

To clarify how I have tried to determine whether the colorTexture uniform in the PostProcessStage fragment shader is HDR or not: I checked each RGB value to see whether it exceeded 1.0 and output vec4(1e6, 1e6, 1e6, 1.0) in the shader if so, black if not. The result was a black image.
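Reconstructed from that description (not the original code), the probe stage would look roughly like this:

```js
// Outputs "white" (1e6) wherever any channel of colorTexture exceeds 1.0,
// black otherwise. With a true HDR input, bright areas should light up.
const hdrProbe = new Cesium.PostProcessStage({
  fragmentShader:
    "uniform sampler2D colorTexture;\n" +
    "varying vec2 v_textureCoordinates;\n" +
    "void main() {\n" +
    "    vec3 color = texture2D(colorTexture, v_textureCoordinates).rgb;\n" +
    "    if (any(greaterThan(color, vec3(1.0)))) {\n" +
    "        gl_FragColor = vec4(1e6, 1e6, 1e6, 1.0);\n" +
    "    } else {\n" +
    "        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);\n" +
    "    }\n" +
    "}\n",
});
viewer.scene.postProcessStages.add(hdrProbe);
```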

When outputting vec4(1.0, 1.0, 1.0, 1.0) the image was white. That shouldn't be the case with gamma 2.2, should it? I'm new to HDR stuff, so please correct me if I'm wrong.

It seems the framebuffer from Scene._view.sceneFramebuffer.getFramebuffer() is what gets passed to PostProcessStage.execute as the colorTexture, and I have checked that it is HALF_FLOAT when HDR rendering is enabled. I do not understand why the values I checked with the method above never exceed 1.0; they all seem to lie between 0.0 and 1.0. Are the HDR values compressed back into that range before being sent to the post-process stages?
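For what it's worth, here is roughly how that check can be made from the console. Note that this goes through private internals (the underscore-prefixed _view, the SceneFramebuffer, and the renderer's Framebuffer/Texture objects), so the exact accessors may differ between Cesium versions:

```js
// Assumes the private renderer API of that era; getColorTexture(0) and
// pixelDatatype are internal and not covered by the public API guarantees.
const framebuffer = viewer.scene._view.sceneFramebuffer.getFramebuffer();
const colorTexture = framebuffer.getColorTexture(0);
// FLOAT is 0x1406 (5126), HALF_FLOAT is 0x8D61 (36193).
console.log(colorTexture.pixelDatatype);
```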

Also asked here: https://github.com/AnalyticalGraphicsInc/cesium/issues/5808#issuecomment-447858922

Can you share the code you used to test this here? It’ll help me look into this.

I’m afraid I did not save the test code, but as far as I can remember it was based on the PostProcessing Sandcastle example with a very simple shader.

I would bump the issue here (https://github.com/AnalyticalGraphicsInc/cesium/issues/5808#issuecomment-447858922) and @-mention Dan. If you do have a Sandcastle showing what you're describing, that'll help a lot too. It might be that the tonemapping happens before the output reaches the post-process stage, so the values get mapped back into the 0–1 range.
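Purely as an illustration of that hypothesis (this is a generic Reinhard-style operator, not necessarily the tonemapper Cesium actually applies): a tonemapping step maps any HDR value back into [0, 1), which would explain why the probe never sees values above 1.0.

```js
// Generic Reinhard-style tone mapping: monotone, and always below 1.0.
function reinhard(x) {
  return x / (1.0 + x);
}
console.log(reinhard(0.5));  // ~0.33
console.log(reinhard(10.0)); // ~0.91 -- an HDR value, now within 0-1
```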