1. Question
I would like to render a 360° panorama in Cesium and blend it with the rendered scene (e.g. the panorama being very far away while geometry/point-cloud is rendered in front, blended based on distance).
1.1 Approaches
After poring over the code and trying to make it work I see different avenues that each have their own challenges:
a) external rendering and passing in the rendered buffer as a texture (cons: a separate rendering context is necessary, potentially slower and more resource-hungry)
b) set the panorama as a texture on a sphere (cons: lack of control over the rendering, how to blend with the other geometry? I would like to do post-processing in a frag shader)
c) render the panorama in the fragment shader (cons: cannot get the math to work correctly, I have issues with calculating the world coordinates correctly)
d) create a new primitive that is some cross-over between SkyBox, ViewportQuad and FrustumGeometry that renders the panorama on a rectangle on the near plane (cons: unsure how to do that well, what constraints to keep in mind)
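For approach c), the equirectangular mapping itself can be prototyped and sanity-checked outside the shader before debugging the GLSL. A minimal sketch in plain JavaScript; the axis convention (z up, u = 0.5 facing +x) is an assumption and must match how the panorama was captured:

```javascript
// Map a unit direction vector to equirectangular texture coordinates.
// Convention (assumed): +z is "up" in the panorama, u = 0.5 faces +x.
function equirectangular(d) {
  const u = Math.atan2(d.y, d.x) / (2 * Math.PI) + 0.5; // longitude -> [0, 1]
  const v = Math.acos(d.z) / Math.PI;                   // polar angle -> [0, 1]
  return { u, v };
}

// Looking along +x lands at the horizontal center of the panorama.
console.log(equirectangular({ x: 1, y: 0, z: 0 })); // { u: 0.5, v: 0.5 }
// Looking straight up (+z) lands on the top edge.
console.log(equirectangular({ x: 0, y: 0, z: 1 })); // { u: 0.5, v: 0 }
```

Testing a few known directions this way makes it easier to tell whether a wrong image comes from the mapping or from the ray reconstruction.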
2. Code Example (for approach 1.1.c)
current fragment shader (the first attempt was rotated in weird ways depending on the position of the camera on the globe; the two likely culprits in the ray reconstruction are fixed and annotated below):

```glsl
uniform sampler2D colorTexture;  // scene color (post-process input)
uniform sampler2D depthTexture;  // scene depth (post-process input)
uniform sampler2D u_panorama;    // equirectangular panorama image
varying vec2 v_textureCoordinates;

vec2 equirectangular(vec3 d)  // unit direction -> panorama UV
{
    return vec2(atan(d.y, d.x) / czm_twoPi + 0.5, acos(d.z) / czm_pi);
}

void main(void)
{
    vec2 screenPos = (gl_FragCoord.xy / czm_viewport.zw) * 2.0 - 1.0;
    vec4 ndcPos = vec4(screenPos, 1.0, 1.0);
    // unproject to eye space; divide by the result's own w, not gl_FragCoord.w
    vec4 eyePos = czm_inverseProjection * ndcPos;
    eyePos /= eyePos.w;
    // rotate the direction only: czm_inverseView would also add the camera's
    // ECEF translation, making the orientation depend on the globe position
    vec3 ray = normalize(czm_inverseViewRotation * normalize(eyePos.xyz));
    vec2 uv = equirectangular(ray);
    vec4 color = texture2D(colorTexture, v_textureCoordinates);
    float depth = texture2D(depthTexture, v_textureCoordinates).x;
    vec4 pano = texture2D(u_panorama, uv);
    // depth is non-linear; a distance-based blend should linearize it first
    gl_FragColor = mix(color, pano, depth);
}
```
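The `mix(color, pano, depth)` blend uses the raw depth-buffer sample, which is strongly non-linear, so the fade would happen almost entirely in the last sliver of the depth range. For a distance-based blend the sample should first be converted back to an eye-space distance (in-shader, Cesium's `czm_windowToEyeCoordinates` built-in can do this). A sketch of the conversion for a standard, non-logarithmic perspective projection; the near/far/fade values are illustrative, and note that Cesium often renders with a logarithmic depth buffer, where this formula does not apply directly:

```javascript
// Convert a [0, 1] depth-buffer sample back to eye-space distance,
// assuming a standard (non-logarithmic) perspective projection.
function linearizeDepth(d, near, far) {
  const zNdc = 2.0 * d - 1.0; // [0, 1] -> NDC [-1, 1]
  return (2.0 * near * far) / (far + near - zNdc * (far - near));
}

// A possible blend factor: fade the panorama in between two distances.
function panoramaWeight(d, near, far, fadeStart, fadeEnd) {
  const dist = linearizeDepth(d, near, far);
  const t = (dist - fadeStart) / (fadeEnd - fadeStart);
  return Math.min(1, Math.max(0, t)); // clamp like GLSL smoothstep's edges
}

console.log(linearizeDepth(0.0, 1, 1000)); // 1    (near plane)
console.log(linearizeDepth(1.0, 1, 1000)); // 1000 (far plane)
```

The same two functions translate line-for-line into GLSL, with `panoramaWeight` replacing the raw `depth` as the `mix` factor.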
3. Context
I want to provide an integrated rendering of a point-cloud and a panorama, in order to allow things like measuring the distance between points seen in the panorama.
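For the measuring use case, a picked panorama pixel has to be turned back into a world-space ray that can then be intersected with the point cloud. A sketch of the inverse equirectangular mapping, under the same assumed axis convention as the shader lookup (z up, u = 0.5 facing +x); the function name is illustrative:

```javascript
// Map equirectangular texture coordinates back to a unit direction
// (inverse of the shader's lookup; assumes z-up, u = 0.5 faces +x).
function uvToDirection(u, v) {
  const lon = (u - 0.5) * 2 * Math.PI; // longitude around the z axis
  const pol = v * Math.PI;             // polar angle measured from +z
  return {
    x: Math.sin(pol) * Math.cos(lon),
    y: Math.sin(pol) * Math.sin(lon),
    z: Math.cos(pol),
  };
}

// The panorama center (u = 0.5, v = 0.5) looks along +x (up to rounding).
console.log(uvToDirection(0.5, 0.5)); // x ≈ 1, y ≈ 0, z ≈ 0
```

The resulting direction still has to be rotated from the panorama's local frame into Cesium's ECEF world frame (e.g. via an east-north-up matrix at the capture position) before intersecting it with the point cloud.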