I’ve been researching performance issues in my Cesium application. The trouble is centered entirely on the loading of tiled imagery data.
I have a timeline (not Cesium’s) that dictates which ImageryLayer is loaded, along with the various other customizations that apply to that layer (I’m dealing with weather data in a highly customizable application). Because the data is temporal, animating through it is a highly desired feature, and I’m also serving real-time data that updates as frequently as every 30 seconds. I manage multiple layers by simply setting alpha to 0 on layers that should no longer be visible. The caveat is that their graphics resources are never destroyed and the layers stay in memory, but that’s actually a plus for me because I need fast recall of each layer for animation. However, I’ve noticed that if I pan or zoom, network requests are still sent for the layers with an alpha of 0. I can’t set show = false on a layer instead, because the layer then takes 1-2 seconds to actually appear after I set show = true. Roughly, the setup looks like the sketch below.
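For reference, this is approximately how the layer management looks. Names like `timestamps`, `urlForTime`, and `onTimelineTick` are simplified stand-ins for my actual timeline and data-service integration; the Cesium calls are the relevant part.

```ts
import { ImageryLayer, UrlTemplateImageryProvider, Viewer } from "cesium";

// Hypothetical stand-ins for my real data service and timeline integration.
const timestamps: number[] = [ /* timesteps supplied by my data service */ ];
function urlForTime(t: number): string {
  return `https://example.invalid/weather/${t}/{z}/{x}/{y}.png`; // placeholder endpoint
}

const viewer = new Viewer("cesiumContainer");

// Pre-create one layer per timestep and keep them all in the collection so
// animation can recall any frame without re-fetching tiles.
const layers = new Map<number, ImageryLayer>();
for (const t of timestamps) {
  const layer = viewer.imageryLayers.addImageryProvider(
    new UrlTemplateImageryProvider({ url: urlForTime(t) })
  );
  layer.alpha = 0.0; // hidden, but tiles stay resident in memory
  layers.set(t, layer);
}

// On each timeline tick, swap frames by toggling alpha. This keeps the swap
// instant, but Cesium still issues tile requests for the alpha = 0 layers
// whenever I pan or zoom.
let current: ImageryLayer | undefined;
function onTimelineTick(t: number): void {
  if (current !== undefined) {
    current.alpha = 0.0; // can't use show = false: re-showing takes 1-2 seconds
  }
  current = layers.get(t);
  if (current !== undefined) {
    current.alpha = 1.0;
  }
}
```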
tl;dr: I end up with too many tile requests because I need to keep layers resident in memory, and Cesium keeps generating tile requests for the hidden (alpha = 0, not show = false) layers. If I use show = false instead of alpha = 0, the layers take extra time to display, which makes the animation incredibly choppy. I need a way to stop tile requests for layers whose alpha is 0 while keeping them in memory.
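For what it’s worth, the behavior I’m after looks something like the sketch below: defer tile requests for a layer while it’s hidden and let them resume once it becomes visible again. I haven’t confirmed that wrapping requestImage like this is a supported approach; `makeDeferringProvider` and `isLayerVisible` are just hypothetical names for the idea. It relies on the documented fact that requestImage may return undefined to mean "retry this tile later."

```ts
import { Request, UrlTemplateImageryProvider } from "cesium";

// `isLayerVisible` is a hypothetical hook into my own layer bookkeeping
// (e.g. "is this layer's alpha currently greater than 0?").
function makeDeferringProvider(
  url: string,
  isLayerVisible: () => boolean
): UrlTemplateImageryProvider {
  const provider = new UrlTemplateImageryProvider({ url });
  const baseRequestImage = provider.requestImage.bind(provider);

  // requestImage is documented to return undefined when a request should be
  // retried later; the hope is that returning undefined while the layer is
  // hidden suppresses the network traffic until the layer is shown again.
  provider.requestImage = (x: number, y: number, level: number, request?: Request) => {
    if (!isLayerVisible()) {
      return undefined; // defer: no network request while the layer is hidden
    }
    return baseRequestImage(x, y, level, request);
  };

  return provider;
}
```

Is there a supported way to achieve something like this, or a better pattern for keeping hidden layers warm without the extra tile traffic?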