Increasing memory usage

We are observing what appears to be continuous memory growth during long runtime sessions while streaming tiles.
What we are using:

  1. Cesium for Unity running on iOS
  2. Sentinel-2 imagery

Over extended runtimes (roughly 2-3 hours), the application's memory usage steadily increases as the camera moves through the world. Even with the tileset cache size reduced to the lowest possible value (near zero), memory continues to grow linearly during runtime. Lower cache values slow the rate of growth but do not stop it. This suggests that some native resources or buffers associated with tiles are not being fully released when tiles are unloaded, and in long-running sessions it eventually leads to memory exhaustion.

To temporarily mitigate the issue, the only reliable workaround we have found is periodically forcing a full tileset refresh (destroying and recreating the tileset), which flushes memory and stabilizes usage again. However, this is obviously not ideal for production applications that need to run continuously.
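For reference, this is roughly how we implement the workaround: a small MonoBehaviour that destroys and re-instantiates the tileset GameObject on a timer. The component name, the interval, and the assumption that destroying the GameObject releases all associated native buffers are ours; only core Unity APIs are used here, with the actual Cesium3DTileset living on the prefab.

```csharp
using System.Collections;
using UnityEngine;

// Periodically destroys and recreates the tileset GameObject to flush
// native tile memory. Interval and structure are illustrative only.
public class TilesetRecycler : MonoBehaviour
{
    public GameObject tilesetPrefab;   // prefab containing the Cesium3DTileset component
    public Transform georeference;     // parent CesiumGeoreference transform
    public float intervalSeconds = 1800f; // arbitrary; tune to your memory budget

    private GameObject _current;

    IEnumerator Start()
    {
        _current = Instantiate(tilesetPrefab, georeference);
        while (true)
        {
            yield return new WaitForSeconds(intervalSeconds);
            Destroy(_current);
            // Wait one frame so Unity finishes tearing down the old
            // tileset before the replacement starts streaming tiles.
            yield return null;
            _current = Instantiate(tilesetPrefab, georeference);
        }
    }
}
```

The obvious drawback, as noted above, is the visible reload hitch each cycle, which is why this is not acceptable for production.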

The behavior seems very similar to the issue discussed here:
https://community.cesium.com/t/ios-memory-crash-with-google-3d-tiles/33037/4
In that thread, it was suggested that some tile resources may persist longer than expected during streaming scenarios.

For context, our use case involves a long-running simulation where the camera continuously moves across large areas, causing constant tile loading and unloading. Over time, the total memory footprint keeps increasing instead of stabilizing.
We have also seen similar issues on a Windows machine with Cesium for Unreal.

Are there any recommended settings or debugging approaches for resolving this issue?