Hi,
I’m hoping for some general advice from the community on optimising the performance of an Unreal Engine project that I intend to run in Play Mode.
The PC I’m planning to run the project on has a high-end GPU (48 GB VRAM), a Xeon processor and plenty of RAM (512 GB!). I want to set the project up in the best possible way to fully utilise these resources, as none of them seem particularly stressed at runtime at the moment.
The project contains two tilesets (Cesium World Terrain and Google 3D Tiles) which I’ve programmed to swap on a button click. I achieved this by toggling each tileset’s visibility with “Set Actor Hidden In Game”.
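For context, the swap boils down to something like this (a minimal C++ sketch of what my Blueprint does; the two tileset pointers are placeholders for the actors in my level):

```cpp
#include "Cesium3DTileset.h"

// Minimal sketch of my current swap logic (the real version is in Blueprint).
// WorldTerrainTileset and GoogleTilesTileset are placeholder references to
// the two ACesium3DTileset actors in the level.
void SwapTilesets(ACesium3DTileset* WorldTerrainTileset,
                  ACesium3DTileset* GoogleTilesTileset,
                  bool bShowWorldTerrain)
{
    if (!WorldTerrainTileset || !GoogleTilesTileset)
    {
        return;
    }

    // Hide one tileset actor and show the other.
    WorldTerrainTileset->SetActorHiddenInGame(!bShowWorldTerrain);
    GoogleTilesTileset->SetActorHiddenInGame(bShowWorldTerrain);
}
```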
The Cesium World Terrain tileset has several “Cesium Ion Raster” and “Cesium Tile Map” overlays, which I switch on and off by changing each overlay’s “Material Layer Key” (e.g., the active one set to “Overlay0” and the rest to “NotShown”).
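The overlay switching is roughly this (again just a C++ sketch of the Blueprint logic; I’m writing the property directly here, and the key names are simply what I happen to use in my material):

```cpp
#include "CesiumRasterOverlay.h"

// Sketch of how I switch overlays: the active overlay gets the key my
// material actually samples ("Overlay0"), everything else gets a key the
// material ignores ("NotShown"). Overlays is a placeholder array of the
// overlay components on the Cesium World Terrain tileset actor.
void ShowOnlyOverlay(const TArray<UCesiumRasterOverlay*>& Overlays,
                     UCesiumRasterOverlay* ActiveOverlay)
{
    for (UCesiumRasterOverlay* Overlay : Overlays)
    {
        if (!Overlay)
        {
            continue;
        }

        Overlay->MaterialLayerKey = (Overlay == ActiveOverlay)
            ? FString(TEXT("Overlay0"))
            : FString(TEXT("NotShown"));

        // I believe Refresh() re-applies the overlay after the key change;
        // if that's not the right call, please correct me.
        Overlay->Refresh();
    }
}
```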
I’m currently running both tilesets and overlays at their default settings (“Max Screen Space Error” = 16, “Max Texture Size” = 2048), and runtime performance isn’t great: there’s a lot of waiting for tiles to load. The network connection might be a bit of a bottleneck in my setup. Can I reduce the demand on it by preloading and caching data to improve performance? (I don’t believe I can set up local datasets.)
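In case it’s relevant, this is roughly how I’ve been experimenting with the screen-space error from code (a C++ sketch; I think the setter is SetMaximumScreenSpaceError in current versions of the plugin, but please correct me if not):

```cpp
#include "Cesium3DTileset.h"

// Sketch of my level-of-detail experiment: lowering Maximum Screen Space
// Error increases detail but also increases the number of tiles requested,
// which is exactly what hurts when the network is the bottleneck.
void SetTilesetDetail(ACesium3DTileset* Tileset, double ScreenSpaceError)
{
    if (!Tileset)
    {
        return;
    }

    // Default is 16.0; lower = more detail = more tiles over the network.
    Tileset->SetMaximumScreenSpaceError(ScreenSpaceError);
}
```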
A few questions:
(1) What general options within the tilesets and overlays will help me fully utilise the available resources?
(2) Are there any settings within Unreal Engine that I need to change to allow it to access more resources? For instance, I’ve read online that I should increase “r.Streaming.PoolSize”, but I don’t notice much of a difference when I do.
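For reference, this is how I’ve been bumping it (as I understand it the value is in MB and 0 removes the cap, but I’m not certain this is even the right knob for my problem):

```cpp
#include "HAL/IConsoleManager.h"

// How I've been raising the texture streaming pool from code. As I understand
// it the value is in megabytes, and 0 removes the cap entirely.
void SetTextureStreamingPoolSizeMB(int32 PoolSizeMB)
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.Streaming.PoolSize")))
    {
        CVar->Set(PoolSizeMB);
    }
}
```

I gather the same variable can also be set under [SystemSettings] in DefaultEngine.ini so it applies from startup.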
(3) Any ideas on better ways to switch between the tilesets or overlays? For instance, “Set Actor Hidden In Game” doesn’t feel like the most performance-optimised approach, but I haven’t been able to come up with a better alternative.
Any assistance or advice would be greatly appreciated!