Image Layer Caching with Time-Dependent UrlTemplateImageryProviders

Hi all,
We are developing an application with time-dependent imagery layers. In short, we have a collection of UrlTemplateImageryProviders, each corresponding to a different moment in time, and a timeline widget on our frontend that the user can scroll through. As the user scrolls, we display the correct ImageryLayer for the selected time.

In researching ways to implement this, we came across this post in which someone accomplished something similar based on @Kevin_Ring’s suggestion. We have implemented it almost exactly: in our JavaScript, we keep a data structure containing an ImageryLayer for every possible time. By default, every layer has show set to false so it is not rendered. As the selected time moves forward, we set show to true for the next N layers, and alpha to 0 for all but the current layer. This works as suggested in the post, and we do not observe flickering. However, the approach has some drawbacks that I wanted to ask the community about.
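For concreteness, here is a minimal sketch of the windowing logic described above. The helper is a pure function so the decision is easy to test; `layerStateFor`, `PRELOAD_COUNT`, and `currentIndex` are our own names (assumptions for the example), not Cesium API.

```javascript
// Decide show/alpha for one layer, given the index of the layer for the
// currently selected time and how many upcoming layers ("N") to keep warm.
function layerStateFor(index, currentIndex, preloadCount) {
  const inWindow = index >= currentIndex && index < currentIndex + preloadCount;
  return {
    show: inWindow,                            // hidden (and, as noted below, evicted) outside the window
    alpha: index === currentIndex ? 1.0 : 0.0, // only the current layer is visible
  };
}

// Applied to a Cesium ImageryLayerCollection it would look roughly like:
// const PRELOAD_COUNT = 3;
// const layers = viewer.imageryLayers;
// for (let i = 0; i < layers.length; i++) {
//   const { show, alpha } = layerStateFor(i, currentIndex, PRELOAD_COUNT);
//   const layer = layers.get(i);
//   layer.show = show;
//   layer.alpha = alpha;
// }
```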

Our issue:
If we have a looping set of ImageryLayers (think a weather radar), the URL template providers fetch the needed imagery as we play through and keep it cached as long as show is true. However, once we set show to false because a layer falls outside the next N, Cesium appears to purge the imagery that layer had cached. This is similar to this post. It is bad for performance when looping through more than the N preloaded frames, as it effectively evicts the imagery on every loop, causing a full re-fetch and extra network traffic.
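One wrinkle specific to a looping sequence: a simple `currentIndex … currentIndex + N` window never preloads across the loop boundary, so the first frames of each pass are always cold. A wrap-aware window keeps the N frames ahead warm even across the wrap. This is a sketch of the modular-index idea only; it smooths the loop boundary but does not prevent the eviction itself, which is the core of our question.

```javascript
// True if `index` is within the next `preloadCount` frames of `currentIndex`,
// treating the sequence of `total` frames as circular (e.g. a radar loop).
function inLoopWindow(index, currentIndex, preloadCount, total) {
  const framesAhead = (index - currentIndex + total) % total;
  return framesAhead < preloadCount;
}
```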

One fix:
We could set show to true for every imagery layer, but then all of them render, even with alpha set to 0. That means that as the user pans from one area of the map to another, we issue (number of imagery layers × number of visible tiles) requests all at once. This clearly doesn't scale, and there is also the rendering cost of drawing that many sources at once.
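As a back-of-the-envelope illustration of why this blows up, with hypothetical numbers (a 60-frame loop and roughly 40 tiles covering the viewport; both values are made up for the example):

```javascript
const layerCount = 60;   // hypothetical number of time steps, all with show = true
const tilesInView = 40;  // hypothetical tiles covering the current view
// Every visible tile is requested once per layer on a camera move.
const requestsPerPan = layerCount * tilesInView; // 2400 tile requests for one pan
```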

Another fix:
We could apply the first fix but limit the range of time the user can view, and therefore the number of imagery layers needed. We have to do this to some degree no matter what, since the client has a finite amount of cache anyway, but it would significantly restrict functionality.

Our question:
Is there any tooling in the library that allows more granular control over cached imagery? Or is there a more performant alternative to what we have described here? If not, is there a clear path to a feature improvement in Cesium that we could contribute toward?

Thanks in advance.