Rendering quality issue with 1.1 tilesets (vibrating textures)

Hello everyone,

Since we started using 1.1 tilesets, we have noticed a degradation in the visual quality of the rendered images.

Specifically, we see quite a lot of vibration/aliasing in noisy textures (such as grass, for instance). The effect is most noticeable when moving the camera.

The issue is illustrated in the following Sandcastle, which presents three renditions of the same data:

  • 1.1 tileset with KTX compression
  • 1.1 tileset without KTX compression
  • 1.0 tileset (with Draco compression)

The last variant (the 1.0 version) offers a much less noisy, more “stable” experience.

We were wondering what the cause of this effect could be, and whether something could be done to alleviate it.

Best regards,
David Hodgetts


There are a few things coming together here, and some of them may warrant a deeper investigation. For now, a quick summary of points that may be relevant.


I tried to capture the three tilesets. (It’s a GIF, limited to 256 colors - so judging color fidelity from it does not make sense, but the point is that it makes it easier to visually compare the final (rendered) state directly.)

Cesium Forum 38634 Texture quality

From what I see, there are subtle differences, but I’d have a hard time attributing any concept of “quality” to them (i.e. I couldn’t say which one is “the best one”).


In your Sandcastle, you set
const tilesetOptions = { maximumScreenSpaceError: 1.5 };
This is pretty low compared to the default (16). It may be appropriate, depending on the structure of the data.
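For reference, such options are usually passed when the tileset is created. A minimal sketch (assuming the tileset is loaded from Cesium ion; the asset ID is only a placeholder):

// 123456 is a placeholder asset ID
const tileset = await Cesium.Cesium3DTileset.fromIonAssetId(123456, {
  maximumScreenSpaceError: 1.5, // default is 16
});
viewer.scene.primitives.add(tileset);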

This brings up some questions about the tilesets that may be relevant here. It looks like the 1.1 tilesets have been created with Cesium ion. But which tool was used to create the 1.0 tileset?

The 1.0 and 1.1 ones are structurally very different, so it’s hard to make any comparisons here.

For example, the 1.0 one defines a geometric error of 2330 for the tileset, and then geometric errors of (4 / pow(2, level)) for the other tiles. (I.e. 4.0, 2.0, 1.0, 0.5, …).

The 1.1 one defines a geometric error of 1529 for the tileset, and then 44 for the root tile of an implicit tileset (i.e. the errors will be 44, 22, 11, 5.5, …).

In theory, one way to “align” both could be to use different maximumScreenSpaceError values for these tilesets. (It should be roughly a factor of 10, based on 44 vs. 4 at comparable levels, assuming that it behaves linearly…). But this is only one of the differences.
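To sketch that idea (only as an illustration, under the linearity assumption above; the asset IDs are placeholders):

// 111111 is a placeholder asset ID for the 1.0 tileset, whose per-tile geometric errors are ~10x smaller
const tileset10 = await Cesium.Cesium3DTileset.fromIonAssetId(111111, {
  maximumScreenSpaceError: 1.5,
});
// 222222 is a placeholder asset ID for the 1.1 tileset, with a ~10x larger threshold
const tileset11 = await Cesium.Cesium3DTileset.fromIonAssetId(222222, {
  maximumScreenSpaceError: 15,
});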

Other aspects are impossible to sensibly compare. For example, I looked at one texture in the 1.0 data set, and it had a size of 1024x1024. A texture in the 1.1 data set had a size of 1536x1546 (!). But these might have been at different levels of detail (It would take more time to systematically analyze this)


One important detail: When I’m looking at the “not ktx2 compressed” tileset, the viewer eventually prints

The tiles needed to meet maximumScreenSpaceError would use more memory than allocated for this tileset.
The tileset will be rendered with a larger screen space error (see memoryAdjustedScreenSpaceError).
Consider using larger values for cacheBytes and maximumCacheOverflowBytes.

to the console. This basically means that it tries to load too much data (too many textures) to fit into memory. And it will then fall back to using a lower level of detail.

So in theory, it might be that this lower level of detail then uses lower-resolution textures, which may (visually) appear “more washed-out”, which could (depending on your perspective and expectation) also be called “less noisy”…
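As a side note, those cache limits can be raised when the tileset is created, and the effectively used error can be inspected at runtime. A minimal sketch, with placeholder values:

const tileset = await Cesium.Cesium3DTileset.fromIonAssetId(123456, { // placeholder asset ID
  cacheBytes: 1024 * 1024 * 1024,               // allow ~1 GB before tiles are unloaded
  maximumCacheOverflowBytes: 512 * 1024 * 1024, // extra headroom beyond cacheBytes
});
console.log(tileset.memoryAdjustedScreenSpaceError); // the error that is actually used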


Maybe all that was too “low-level”. At least, I hope it’s not distracting…

Hi there,

Thanks for the quick answer!

With a static camera, switching between the tilesets indeed does not show the degradation. The visual “vibrations” appear when you move the camera (for example by zooming in/out). It feels like the textures the engine loads are too high-resolution for the geometric LOD that is loaded.

With 1.1 tilesets, we see this slight noise on the grass and the trees that we didn’t have with the 1.0 tilesets.

The 1.1 tilesets were produced using Cesium ion. The 1.0 tilesets were produced with iTwin Capture Modeler.

We set the maximumScreenSpaceError to 1.5 for the 1.0 tilesets because otherwise we couldn’t load the highest LOD without getting really close to the tileset.
Now that we process the tilesets in 1.1, we set the MSSE to 16 by default, because 1.5 is far too aggressive compared to the default, as you mentioned. But even with an MSSE of 16, we still get this noise/flickering effect. Setting the MSSE higher than 16 doesn’t give a good visual result.

We took a screen video without any compression to get an idea of what we experience. You can download the video here

I see the general point of that “noise”. And avoiding that noise is probably the primary goal for you. Hopefully, someone from the CesiumJS core team can give further hints about possible ways of how to achieve that.


The following part may not be relevant for you.

But I think that in order to resolve this, it might be helpful to have a closer look at some aspects of the data. So I’ll drop some notes/observations here, hoping that it is considered to be (at least interesting, and maybe even) relevant for those who investigate this further.

I had a look at one tile in both the “old” (1.0) and the “new” (KTX-compressed, 1.1) tileset. And I tried to find a tile with similar size and covered area in both tilesets. These tiles, at the time of writing this, are

I extracted the GLB from the B3DM and had a look at both of them in https://gltf.report/, and moved them slightly - I hope the GIF captures that:

Cesium Forum 38634 Texture quality single tile
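(As a side note, for anyone who wants to reproduce that extraction step: the GLB payload starts right after the 28-byte B3DM header and the feature/batch tables. A minimal Node.js sketch, with a placeholder file name:)

const fs = require("fs");

const b3dm = fs.readFileSync("tile.b3dm"); // placeholder file name
// Byte lengths of the feature/batch tables, read from the B3DM header
const featureTableJsonLength = b3dm.readUInt32LE(12);
const featureTableBinaryLength = b3dm.readUInt32LE(16);
const batchTableJsonLength = b3dm.readUInt32LE(20);
const batchTableBinaryLength = b3dm.readUInt32LE(24);
const glbStart = 28 + featureTableJsonLength + featureTableBinaryLength
  + batchTableJsonLength + batchTableBinaryLength;
fs.writeFileSync("tile.glb", b3dm.subarray(glbStart));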

The main point here is: The KTX-compressed one also shows the noise. (So this is not a ~“CesiumJS rendering issue” in the most narrow sense, but somehow ~“inherent” to the data)

I then had a closer look at the textures themselves.

The old one uses a JPEG image with 512x512 pixels.
The new one uses a KTX2 texture with 1736x1732 pixels.

Two points that might be relevant here:

  • The new one is much larger, obviously…
  • The new one has a size that is not a power of two. (So maybe there’s some upscaling going on somewhere in the rendering engines…? Not sure about that…)
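(For what it’s worth, the power-of-two property is easy to check with the usual bit trick:)

// true for 1, 2, 4, ..., 1024, ...; false for e.g. 1736
const isPowerOfTwo = (n) => n > 0 && (n & (n - 1)) === 0;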

Zooming into the same area of these textures, just for comparison:

The shallow observation that I could make now: Maybe there is more noise in the grass because… there just is more noise in the grass? (This refers to the possibility that I mentioned earlier, namely that the old, small, JPEG-compressed texture may just be so “washed out” and “blurred” that there is no noise that could be visible…)

Even when just looking at that single KTX texture and zooming in and out a bit, the noise is clearly visible:

Cesium Forum 38634 Texture quality single texture

Finally, when looking at the JPEG texture (zoomed out pretty far), one can see that there is also some noise, and that it is more prominent when the filter mode is GL_NEAREST (right) than for GL_LINEAR (left):

Cesium Forum 38634 Texture quality single jpg filter

That might appear to be a tangent (because I’m pretty sure that GL_LINEAR is used everywhere). But an underlying point is that some of the noise could be explained by issues with mipmapping. (Roughly: when the JPG textures are mipmapped but the KTX textures are not, or when they use different GL_TEXTURE_LOD_BIAS values for some reason…)
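For context, a plain WebGL sketch (not CesiumJS-specific code, assuming gl is a WebGL rendering context) of the filter/mipmap settings mentioned above:

// Trilinear filtering with mipmaps - usually what one wants for such textures
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.generateMipmap(gl.TEXTURE_2D); // only possible for uncompressed textures

// Nearest-neighbor sampling, which tends to emphasize the noise
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);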


EDIT, literally as a bottom line: It’s not unlikely that the reason for that noise is what you mentioned, namely that

… the textures the engine loads are too high-resolution for the geometric LOD that is loaded.

Thank you @Marco13 for your thorough analysis of our issue, we really appreciate it.

I’d like to clarify a few points: We’re using identical input data for both version 1.0 and 1.1 of the tiler, and we observe the same issue regardless of whether KTX compression is enabled in the 1.1 pipeline.

Based on your examples, it seems we may have identified the root cause: the new reality tiler appears to be applying unnecessarily high resolutions to tiles that don’t require such detail. In your example, when loading a Level 20 tile, we’re receiving texture quality equivalent to Level 23 (raw data), which is higher than needed for proper display. Would you agree with this assessment of the tiler’s behavior?

I’m hoping that someone can confirm these assumptions. There are some unknowns and things that may have to be examined more thoroughly. (For example: I think that the built-in mipmapping of OpenGL/WebGL is not applicable for compressed textures, which would explain why the effect is only visible (or at least much stronger) for KTX. But if it is also happening for non-KTX textures, then there may be an additional issue. Usually, mipmapping aims exactly at preventing that sort of aliasing/noise.) But from my current understanding, and the symptoms so far, it indeed looks like the texture resolution may be too high for the geometry at the size it is displayed with, yes. (Some details are still being examined internally - I’ll try to post updates if there is a conclusion about that.)
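To make the point about compressed textures a bit more concrete, here is a plain WebGL sketch (the actual handling inside CesiumJS may differ; gl, image, compressedFormat and mipLevels are only placeholders): generateMipmap is not allowed for compressed formats, so the mip levels would have to be uploaded explicitly, and if only level 0 exists, a non-mipmapped filter has to be used.

// Uncompressed texture: mip levels can be generated on the fly
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image); // image: placeholder source
gl.generateMipmap(gl.TEXTURE_2D);

// Compressed (e.g. transcoded KTX2) texture: each mip level must be uploaded
// explicitly - calling generateMipmap here would cause an error
for (let level = 0; level < mipLevels.length; level++) { // mipLevels: placeholder array
  const m = mipLevels[level];
  gl.compressedTexImage2D(gl.TEXTURE_2D, level, compressedFormat, m.width, m.height, 0, m.data);
}
// If only level 0 is available, the minification filter must not use mipmaps
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);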

Thank you @Marco13 for investigating our issue. Please don’t hesitate to reach out if you need any additional information or a full dataset to explore this in more detail.

We remain very enthusiastic about the 3D Tiles 1.1 format and deeply appreciate the work on this format update. The performance gains and bandwidth savings we’re seeing are impressive and exactly what we need for our production environment. However, we’re currently blocked from deploying these assets due to the visual artifacts we’re experiencing.