The Maximum Screen Space Error allows you to control the trade-off between “visual quality” and “rendering performance”. Some details about the meaning of this value are explained in the 3D Tiles overview at 3d-tiles/3d-tiles-overview.pdf at main · CesiumGS/3d-tiles · GitHub (page 5). The basic idea is: each tile has a “geometric error”. From this geometric error, the screen-space error is computed. If this error is too large, then a more detailed representation of the tile is loaded.
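To make that a bit more concrete, here is a minimal sketch of how a screen-space error can be derived from a geometric error, following the projection formula from the 3D Tiles overview. The screen height and field of view here are just example values, and the threshold of 16.0 is only the common default, not something specific to your data:

```python
import math

def screen_space_error(geometric_error, distance,
                       screen_height_px=1080, fov_y=math.radians(60)):
    """Approximate screen-space error (in pixels) of a tile:
    the tile's geometric error projected onto the screen.

    sse = geometricError * screenHeight / (2 * distance * tan(fovY / 2))
    """
    return geometric_error * screen_height_px / (2.0 * distance * math.tan(fov_y / 2.0))

# A tile with a geometric error of 70 (meters), viewed from 5000 meters away:
sse = screen_space_error(70.0, 5000.0)

# If the computed error exceeds the maximum screen-space error threshold,
# the renderer refines the tile, i.e. loads its more detailed children.
needs_refinement = sse > 16.0
```

Note how the distance appears in the denominator: the same building far away produces a small screen-space error and stays at low resolution, while close to the camera it exceeds the threshold and gets refined.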
So… if 16.0 gives good quality but low performance, and 1600.0 gives low quality but good performance, you might want to try 800.0 and see whether the trade-off is right there…
Referring to your example: “Far buildings are always displayed.” - that’s right. But they should be displayed at a low resolution, and thus should hardly affect the performance. Only buildings that are close to the camera should be rendered at high resolution.
But… it might also be that the input data does not have a “sensible” geometric error. For example, if the geometric error was the same for all tiles, regardless of their actual level of detail, then the whole approach wouldn’t work. I don’t know the data set or the related tools. Some insights might already be gained from looking at the tileset.json. But again: if this cannot be figured out here, then we might have to take a closer look at the resulting tileset (if it can be shared publicly).