How can we control how many tiles are loaded at any given time? We want to load more tiles closer to us and fewer farther away, or even stop loading tiles beyond a certain distance. Likewise, how can we unload tiles when we leave an area? We are working mostly with the Oculus Quest 2; it works, but we notice the memory usage gets quite high and leads to crashes. We tried unchecking Preload Ancestors and Preload Siblings, but then it becomes too slow, and adjusting the Maximum Simultaneous Tile Loads and Loading Descendant Limit had some effect, but it still crashes even when we set them to, say, 5.
Hi @carlrealvr,
There's an option under the hood called "maximumCachedBytes" which would have the effect you want, but unfortunately it's not exposed in the UI as a UProperty yet. I just wrote an issue to do that:
Kevin
We tried setting this to a lower number in TilesetOptions, even as low as 32 MB instead of 512 MB, and it had some effect but less than we had hoped, and we're still seeing memory issues. Is there anything else we can try? Ideally we'd like to show high LOD in a radius of say 200-300 meters, and anything beyond that should be in low res. Also, it seems like currently it's cumulative, so although at 512 MB we crash quicker, at 32 MB it still uses up memory and crashes after 2-3 minutes of moving through space or turning around in various directions. Again, it seems like "purging" is slower than adding new tiles.
Are you doing this in the Editor? The Editor unfortunately does no garbage collection, so memory will always grow without bound in the Editor. You can force garbage collection every frame by running `gc.CollectGarbageEveryFrame 1` in the console.
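If collecting every frame is too costly, here is a minimal sketch of forcing garbage collection on a fixed interval from game code instead. The actor class and GCIntervalSeconds are hypothetical, not part of the plugin; it just calls Unreal's UEngine::ForceGarbageCollection periodically:

```cpp
// Minimal sketch (hypothetical helper, not part of Cesium for Unreal):
// an actor that forces Unreal's garbage collector to run at a fixed interval,
// as a middle ground between "never in the Editor" and "every frame".
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Engine/Engine.h"
#include "PeriodicGCActor.generated.h"

UCLASS()
class APeriodicGCActor : public AActor
{
    GENERATED_BODY()

public:
    APeriodicGCActor() { PrimaryActorTick.bCanEverTick = true; }

    // How often to force a collection, in seconds (tune for your app).
    UPROPERTY(EditAnywhere)
    float GCIntervalSeconds = 30.0f;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        TimeSinceLastGC += DeltaSeconds;
        if (TimeSinceLastGC >= GCIntervalSeconds)
        {
            // true requests a full purge of objects pending deletion.
            GEngine->ForceGarbageCollection(true);
            TimeSinceLastGC = 0.0f;
        }
    }

private:
    float TimeSinceLastGC = 0.0f;
};
```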
If that doesn't help, you can try setting the maximumCachedBytes property to 0, which will mean no tiles are cached beyond what is currently needed for rendering. If that still doesn't help (it sounds like it won't), the next possibility is that there are simply too many tiles required for rendering. You can adjust that by changing the maximumScreenSpaceError property (at the cost of reduced quality).
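For reference, here is a rough sketch of where those knobs live in cesium-native's TilesetOptions struct (field names as I understand them for this plugin version; since maximumCachedBytes isn't a UProperty yet in v1.6.0, changing it means editing the plugin source where the Tileset is constructed):

```cpp
// Sketch of the cesium-native options discussed in this thread; the exact
// header path/namespace may differ slightly depending on your plugin checkout.
#include <Cesium3DTilesSelection/TilesetOptions.h>

Cesium3DTilesSelection::TilesetOptions options;
options.maximumScreenSpaceError = 16.0;    // raise for fewer, coarser tiles (less memory)
options.maximumCachedBytes = 0;            // 0 = cache nothing beyond what rendering needs
options.maximumSimultaneousTileLoads = 5;  // throttle concurrent tile loads
options.loadingDescendantLimit = 5;        // limit descendants loaded before a tile is shown
options.preloadAncestors = true;           // keep coarse fallbacks available
options.preloadSiblings = false;           // skip speculative sibling loads
```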
What tileset is this, and what did you use to create it? 3D Tiles is very flexible in how the tile pyramid is arranged, so some tools produce very poor pyramids which require a huge number of tiles to be rendered for typical scenes.
You can save some memory by turning off physics mesh generation if you don't need it. It's an option on the Tileset actor.
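If you'd rather toggle it from code than in the Details panel, something like the sketch below should work; I'm assuming the property and setter are named CreatePhysicsMeshes / SetCreatePhysicsMeshes on ACesium3DTileset, so double-check against your plugin version:

```cpp
// Sketch (assumed names, verify against your ACesium3DTileset header):
// disable physics mesh generation on a tileset to reduce memory use.
#include "Cesium3DTileset.h"

void DisablePhysicsMeshes(ACesium3DTileset* Tileset)
{
    if (Tileset != nullptr)
    {
        Tileset->SetCreatePhysicsMeshes(false);
    }
}
```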
And just to make sure, you are using the latest Cesium for Unreal release, v1.6.0, right?
Thanks @Kevin_Ring - to clarify, this is when running on the Oculus Quest 2 with the Melbourne 3D Tiles that Cesium provides, so we can't run the garbage collection because it reduces the performance of the application. We can try the maximumScreenSpaceError, but the ideal situation is to have it render at high quality up close within a given radius, and at low quality beyond that. How do we implement this logic?
Hi @carlrealvr,
Cesium for Unreal automatically uses detailed tiles up close and less detailed tiles far away. That's standard behavior. There's currently no way to make it "even less detailed" far away. It sounds like an appealing feature, but it would probably just be frustrating in practice. For example, "high quality within a radius" would require you to choose that radius. And there's likely no one radius that would look good in a variety of scenes (e.g. different camera heights, different distances to the "interesting" elements of the scene like mountains or buildings). The only situation where I would imagine that would be useful is if camera motion is extremely restricted.
Still, the Oculus Quest 2 should be powerful enough that it shouldn't be running out of memory and crashing. Unfortunately I don't have access to an Oculus Quest 2 to test with. Are you sure OOM is the cause of the crash? Does it always crash when visiting a similar area, or is it just a matter of how long you spend flying around? Is there any possibility there's application logic that is causing a memory leak? It's of course possible that there's a memory leak in Cesium for Unreal itself, but we'd expect to see that on more powerful systems, too (eventually). To my knowledge, we haven't seen that. You are using v1.6.0, right?
Also can you please confirm that you're testing outside the Unreal Engine Editor when you see this crash? I'm not sure it would be possible to run on the Oculus inside the Editor, but I do want to make sure, because memory growth due to lack of garbage collection inside the Editor is a known problem and not something we are able to fix (it has to be taken up with Epic).
Kevin
@Kevin_Ring yes, this is outside of the editor, and yes, this is using v1.6.0. We're using OVR Metrics to measure, and we can see the UMEM climbs to 4300 or so before it crashes. And it seems to be cumulative: as soon as we spawn in the map, it starts climbing until it crashes. Regarding the radius: that's a common m.o. in VR, to have multiple LODs, so it shouldn't be an issue. Much like in the real world, where you'll see things blurry if they're far away, the same happens here. Sounds like you're already doing LODs; we just need to fine-tune them so they're a bit more sensitive. And no, the Quest 2 is not very powerful. Usually at a total of 20-30 MB of textures and say 300K polygons, it will crash… so it's important to make sure that the player never accumulates anywhere close to this threshold at any given time. We just need to purge as fast as we gain, based on a threshold.
> Sounds like you're already doing LODs, we just need to fine tune them so they're a bit more sensitive.
Yep, that's exactly what maximumScreenSpaceError does. If the Quest can only tolerate 20-30 MB of textures, the Melbourne tileset is going to be extremely unlikely to work well at the standard quality, unfortunately.
@Kevin_Ring isn't maximumScreenSpaceError a way to apply lower quality to everything? What if we want to maintain high quality for the immediate surroundings, but with a custom radius? Say yours gives 300 yards, and we need a 100-yard radius at the highest resolution for our use case. Where and how can we control this? We could then tune it so that it doesn't hit the threshold.
That's not an option, because that's really not how 3D Tiles works. There's no notion of "highest quality for a radius." It selects the appropriate quality by targeting a particular pixel error regardless of distance. Lower quality geometry and textures are needed for a particular pixel error in parts (tiles) that are far away, and that's what Cesium for Unreal gives you (automatically). The Maximum Screen Space Error property specifies the pixel error value to target.
The computation of the ScreenSpaceError (SSE) from the GeometricError is the same for the whole screen (i.e. for everything that is visible). And it only depends on the distance of the object (and on the screen size, but we can ignore that for now). So right now, the maximumScreenSpaceError is the main way of adjusting the trade-off between performance and quality, in a rather generic and intuitive way: when the quality is too low, one can decrease the maximumScreenSpaceError. When the performance is too low (or the memory requirements are too high), then one can increase the maximumScreenSpaceError.
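To make that concrete, the usual 3D Tiles screen-space-error estimate (the formula CesiumJS uses; the selection in Cesium for Unreal follows the same idea) looks roughly like the sketch below. A tile gets refined into its children when its SSE exceeds maximumScreenSpaceError:

```cpp
#include <cmath>

// Rough sketch of the standard screen-space-error estimate: a tile's geometric
// error (in meters) projected onto the screen (in pixels). Tiles whose SSE
// exceeds maximumScreenSpaceError get refined into more detailed children.
double computeScreenSpaceError(
    double geometricError,      // tile's geometric error, in meters
    double distanceToTile,      // camera-to-tile distance, in meters
    double screenHeightPixels,  // viewport height, in pixels
    double verticalFovRadians)  // vertical field of view
{
    const double sseDenominator = 2.0 * std::tan(0.5 * verticalFovRadians);
    return (geometricError * screenHeightPixels) / (distanceToTile * sseDenominator);
}
```

For a fixed geometric error, doubling the distance halves the SSE, which is why distant tiles automatically stay at coarser levels of detail without any extra radius logic.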
Going beyond that: one could certainly make a case for other strategies (and I moved that discussion to an issue). But it's certainly not trivial conceptually, due to the many degrees of freedom.
You talked about a "higher resolution for things that are closer to the viewer", and a "threshold" of, say, 100 yards. But the distance will always have to be taken into account, within and outside of the specified threshold. For example, when something is 1000 miles away, you have to use a different computation than for something that is 101 yards away. (This could be addressed with the non-linear distance approach that I mentioned in the linked issue).
Further, it depends on the use case: What you described certainly makes sense for an application where you dedicatedly want to look at a single building (that is close to you), and don't care much about the rest. But imagine looking at a city skyline from far away: All buildings would then be shown in low resolution, but the terrain tiles right in front of you would be shown in high resolution (even though you don't care about the terrain - you want to see the skyline, and it should look nice). One could similarly make a case for a strategy where one uses a higher resolution for things that are near the center of the screen - because that's what you are looking at.
So there are many possible strategies for tweaking the computation of the SSE, depending on different criteria: the distance, user-defined thresholds, view direction, maybe the duration that you've been looking at something, and maybe you even want to dedicatedly "pin down" one certain building/tile to have a higher resolution, regardless of all other criteria. And all this might be part of future developments. But implementing all this involves not only generalizing the algorithms, but also (importantly) offering it in a form that is easy to configure for users (e.g. for defining these "thresholds", and combining the different criteria in a sensible and intuitive way).
@Marco13 @Kevin_Ring Hey guys - thanks for your initial input. We've tried to play with the maximumScreenSpaceError using an exponent as mentioned in the linked GitHub issue. We still wish we could figure out how to "release" tiles from memory faster. It seems like we'd have to wait 1 minute before things get released; if you have an idea, please let us know - maybe there's a way to trigger garbage collection faster? Meanwhile we have a related question on World Terrain Asset ID 1: we're using it as a base over which we place the cities. We noticed, when we go into wireframe mode in the UE editor, that it's loading tiles far beyond the current horizon. Likewise, when we go in-app, there's a high usage of memory allocated for World Terrain, even though we're only using it as a base while actually navigating the 3D City Tiles above it. How can we load tiles in a defined radius? Again, it sounds like we have a particular use case, so we need the ability to dictate zones where tiles won't load.
@Marco13 @Kevin_Ring can you help with this?
A few reasons why it is difficult (for me) to give focused, goal-oriented advice:
It is not yet entirely clear what the actual, underlying reason for the error is. You suggested that it is related to the system running out of memory, and that may indeed be the case. But we don't have the possibility to analyze this in the actual environment where the problem appears. Therefore, every attempt to solve the problem will involve some guesswork. And we have already seen examples of such guesswork:
- The problem might be caused by the garbage collector not running often enough or early enough. You mentioned "…we can't run the garbage collection because it reduces the performance of the application", but… well, when you allocate memory indefinitely, then it will of course crash. At some point, garbage collection (used as a synonym for "freeing memory") has to happen. The questions that have to be answered in order to find a sensible solution are: Who has to release which memory at which point in time under which conditions?
- The original question was about controlling the number of loaded tiles. The maximum screen space error is the main mechanism for controlling that. However, there is no direct relationship between this value and the number of loaded tiles, so it will require some experimentation. You cannot say which value you have to use to reliably prevent the crash (i.e. to reliably stay under a certain total allocation amount).
- The caching that is controlled via the maximumCachedBytes may have an impact, but tweaking it will not solve the underlying problem if the problem is just that too much data is loaded.
- The idea of loading tiles with higher detail in a certain radius will also not solve the problem, because you don't know which radius and which level of detail you can use before it crashes.
Of course, it would be great if we could just insert some `if (occupiedMemory() >= 0.9 * availableMemory()) doNotLoadThatTile();` somewhere, but… it's not that simple, and without a careful analysis of the problem, every attempt to solve it may just consist of exposing parameters that can be tweaked (and where tweaking them might alleviate the problem), but not solve it in a reliable way.