I don’t know if this is a bug or if I am just not understanding how the dynamic screen space error works.
I used the “3D Tiles Point Cloud” sandcastle example (Cesium Sandcastle) and added the 3D Tiles inspector using:
```javascript
viewer.extend(Cesium.viewerCesium3DTilesInspectorMixin);
const inspectorViewModel = viewer.cesium3DTilesInspector.viewModel;
```
Then in the inspector, I activated the “Dynamic Screen Space Error”.
I expected that tiles further away would be rendered more sparsely or even culled entirely. To see how the values work, I moved the sliders for “Screen Space Error Density” and “Screen Space Error Factor”. But no matter how I move these sliders, nothing happens. I also moved around in the point cloud to test different camera positions. The only slider that actually does something is “Maximum Screen Space Error”, but that is not part of the dynamic SSE.
My use case: in my program, I display very large point clouds, and some users have complained that performance is very bad. We already expose the maximum SSE as a rendering-quality setting. I thought we could optimize this further by “throwing away” tiles that are far away, and all my research pointed to dynamic SSE.
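For reference, here is roughly how I enable it in code. This is a minimal sketch — `applyDynamicSse` is just a hypothetical helper I wrote for this post — but the property names are the ones from the `Cesium3DTileset` API, which as far as I can tell correspond to the inspector sliders:

```javascript
// Hypothetical helper: applies the dynamic SSE settings that the
// 3D Tiles inspector sliders expose, directly on a Cesium3DTileset.
function applyDynamicSse(tileset, density, factor) {
  tileset.dynamicScreenSpaceError = true;           // master switch
  tileset.dynamicScreenSpaceErrorDensity = density; // Cesium default: 0.00278
  tileset.dynamicScreenSpaceErrorFactor = factor;   // Cesium default: 4.0
  return tileset;
}

// In my program it is called roughly like this (tileset comes from
// Cesium3DTileset.fromUrl elsewhere):
// applyDynamicSse(tileset, 0.00278, 4.0);
```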
Is this broken? Or am I misunderstanding the dynamic SSE?
In my program, changing the density does a little of what I am trying to achieve, but it’s like loading 120 tiles instead of 125, which is not enough. So I thought I’d have to adjust the screen space error factor instead, but that changes nothing (I tried values of 1, 4, and 1024).
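From skimming the Cesium source, my current (possibly wrong) understanding of why the factor should matter is that the dynamic error is subtracted from each tile’s SSE using a fog-like falloff. Here is a sketch of that reading — corrections very welcome if I’ve misread it:

```javascript
// My reading of Cesium's dynamic SSE adjustment (CesiumMath.fog-style
// falloff); this may well be a misunderstanding of the source.
function fog(distanceToCamera, density) {
  const scalar = distanceToCamera * density;
  return 1.0 - Math.exp(-(scalar * scalar)); // approaches 1 at large distances
}

// The tile's screen space error appears to be reduced by
// fog(distance, density) * factor, so a larger factor should thin out
// distant tiles more aggressively — which is what I expected to see.
function dynamicallyAdjustedError(baseError, distance, density, factor) {
  return baseError - fog(distance, density) * factor;
}
```

If that reading is right, then with the default density of 0.00278 a tile several kilometers away should have its error reduced by nearly the full factor, which is why a factor of 1024 changing nothing surprises me.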
Any help or advice is very much appreciated!