How to Monitor Resource Consumption

Hi there, I was wondering if I could get some help with monitoring resource consumption in Cesium.

In my application, I’ll be using one CustomDataSource populated with several hundred entities. I’ll also have numerous BillboardCollections that contain billboard primitives, each with a canvas element as an image.
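For reference, here is a simplified sketch of the kind of setup I mean (the positions and canvas contents are just placeholders):

```javascript
// Simplified sketch of my setup; positions and canvas contents are placeholders.
const viewer = new Cesium.Viewer('cesiumContainer');

// One CustomDataSource populated with entities
const dataSource = new Cesium.CustomDataSource('myEntities');
viewer.dataSources.add(dataSource);
dataSource.entities.add({
  position: Cesium.Cartesian3.fromDegrees(-75.0, 40.0),
  point: { pixelSize: 8 }
});

// A BillboardCollection whose billboards use a canvas element as the image
const billboards = viewer.scene.primitives.add(new Cesium.BillboardCollection());
const canvas = document.createElement('canvas');
canvas.width = canvas.height = 32;
const ctx = canvas.getContext('2d');
ctx.fillStyle = '#ff0000';
ctx.fillRect(0, 0, 32, 32);
billboards.add({
  position: Cesium.Cartesian3.fromDegrees(-75.1, 40.1),
  image: canvas
});
```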

I plan to cap the use of each of these objects so that the underlying WebGL resources never run out and a WebGL context loss does not occur. I will have complete control over how the browser is used by my customers, and I will constrain that usage so that no unknown WebGL application consumes resources.
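As a minimal safety net, I also plan to at least detect a context loss if one does happen, using the standard WebGL canvas events (this is plain WebGL, not Cesium-specific, and it only reports the loss after the fact):

```javascript
// Standard WebGL context-loss events on Cesium's canvas.
// This detects a loss after the fact; it does not prevent one.
const glCanvas = viewer.scene.canvas;
glCanvas.addEventListener('webglcontextlost', function (event) {
  event.preventDefault(); // allows the context to be restored later
  console.error('WebGL context lost');
}, false);
glCanvas.addEventListener('webglcontextrestored', function () {
  console.log('WebGL context restored');
}, false);
```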

So on startup, I’ll configure limits such as MAX_NUM_ENTITIES, MAX_NUM_BILLBOARD_PRIMITIVES, and so on. I’ll then implement application-tier mechanisms, roughly like the sketch below, to avoid exceeding those maximums.
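Something along these lines (the names and values here are my own, not Cesium configuration):

```javascript
// Illustrative application-tier caps; names and values are my own,
// not Cesium API or configuration.
const MAX_NUM_ENTITIES = 500;
const MAX_NUM_BILLBOARD_PRIMITIVES = 2000;

function tryAddEntity(dataSource, entityOptions) {
  if (dataSource.entities.values.length >= MAX_NUM_ENTITIES) {
    return undefined; // at the cap: refuse (or evict the oldest, etc.)
  }
  return dataSource.entities.add(entityOptions);
}

function tryAddBillboard(billboardCollection, billboardOptions) {
  if (billboardCollection.length >= MAX_NUM_BILLBOARD_PRIMITIVES) {
    return undefined;
  }
  return billboardCollection.add(billboardOptions);
}
```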

Something I’m struggling with is measuring resource consumption over time, so that I can tell whether resources are being consumed steadily and never released. This, of course, could happen due to a leak in my code, or possibly a leak in Cesium. Alternatively, it could be caused by a bug in the code that is supposed to limit resource usage to the configured maximums.

Ultimately, the ability to continually measure resource consumption will let me detect such leaks or bugs at design time, or during product integration and test.
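What I have so far is just a crude periodic sampler along these lines (note that performance.memory is non-standard, Chrome-only, and covers only the JS heap, not GPU memory):

```javascript
// Crude periodic sampler: log object counts plus the JS heap size so
// that steady growth shows up in the logs over time.
setInterval(function () {
  const sample = {
    time: Date.now(),
    entities: dataSource.entities.values.length,
    billboards: billboards.length,
    // performance.memory is non-standard (Chrome-only); JS heap only
    jsHeapBytes: performance.memory
      ? performance.memory.usedJSHeapSize
      : undefined
  };
  console.log(JSON.stringify(sample));
}, 10000);
```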

So my question is: what techniques can I use to monitor resource consumption? I understand that I must monitor GPU resource consumption as well as memory consumption, and I’d like to monitor Cesium’s own resource consumption too. Does Cesium provide methods for querying the critical resources associated with the objects I listed above?

Thanks!

This sounds like it could be a pretty useful feature for Cesium to have! Unfortunately, it doesn’t exist today, and I think it would be quite an involved task to set up something like that. A large part of the resources consumed by any particular entity or primitive depends on the browser/GPU and what’s happening behind the scenes. For example, modern JavaScript engines can optimize away or pool many resources, so as far as I’m aware it isn’t really possible for a JavaScript application to get concrete numbers on this.
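About the only things you can query directly from JavaScript are static WebGL capability limits and, optionally, the driver string, rather than live usage. For example, with plain WebGL (not a Cesium API):

```javascript
// Static WebGL limits and driver info are queryable; live GPU memory
// usage is not exposed to JavaScript.
const gl = document.createElement('canvas').getContext('webgl');
console.log('MAX_TEXTURE_SIZE:', gl.getParameter(gl.MAX_TEXTURE_SIZE));
console.log('MAX_VERTEX_ATTRIBS:', gl.getParameter(gl.MAX_VERTEX_ATTRIBS));

// Optional extension exposing the underlying GPU/driver string
const dbgInfo = gl.getExtension('WEBGL_debug_renderer_info');
if (dbgInfo) {
  console.log('Renderer:', gl.getParameter(dbgInfo.UNMASKED_RENDERER_WEBGL));
}
```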

Here’s a discussion on the WebGL mailing list about the challenges involved in doing this, along with some suggestions:

https://groups.google.com/d/msg/webgl-dev-list/TrPjvxVk5rc/dWgEb7eIAAAJ

If you find some good solutions, please do share! Cesium’s approach to this has been to use as few resources as needed to render any particular scene. This is largely the ideology behind 3D Tiles, which does have a “maximumMemoryUsage” parameter:

https://cesiumjs.org/Cesium/Build/Documentation/Cesium3DTileset.html#maximumMemoryUsage

Note that the documentation also says this is just an approximation based on the known size of the images and geometry buffers.
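For example, if you use 3D Tiles you can set that cap and watch Cesium’s estimate over time. This is just a sketch; the URL is a placeholder, and I believe the read-only totalMemoryUsageInBytes property reflects the same approximation:

```javascript
// maximumMemoryUsage is in MB; Cesium unloads tiles to stay near it.
// totalMemoryUsageInBytes is Cesium's approximation based on the known
// sizes of images and geometry buffers, not a true GPU measurement.
const tileset = viewer.scene.primitives.add(new Cesium.Cesium3DTileset({
  url: 'path/to/tileset.json', // placeholder URL
  maximumMemoryUsage: 256
}));

setInterval(function () {
  console.log('Tileset memory estimate (bytes):', tileset.totalMemoryUsageInBytes);
}, 10000);
```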