Memory leak

1. A concise explanation of the problem you’re experiencing.

We are loading and unloading GeoJSON data sources using dataSources.load() and dataSources.removeAll(). After a few iterations, Chrome crashes and pops up the message “Aw, Snap! Something went wrong while displaying this webpage.”

I suspect that removeAll() leaves something behind, so the memory footprint keeps growing.

2. A minimal code example. If you’ve found a bug, this helps us reproduce and repair it.

We have tried to create a Sandcastle to recreate the problem. It reloads the same dataset a few times. In my browser it crashes after around 5 or 6 iterations, though this will vary depending on your browser.

3. Context. Why do you need to do this? We might know a better way to accomplish your goal.

We are creating geometry in the browser. We do not want to initialise a new viewer each time, since this is disruptive for the user. Instead we want to just remove the old geometry and display the new geometry.

4. The Cesium version you’re using, your operating system and browser.

Win 7, latest Chrome, Sandcastle


I made your callback function asynchronous and added an “await viewer.dataSources” line and now it doesn’t crash. I’m not sure why this works, but here is the edited Sandcastle.

Thanks Jane, that is interesting. I also do not see why that works now.

The Sandcastle we created was just to try to recreate the problem. In our real application there is no loop; a user selects a new model to display, and after a few models the system crashes.

Perhaps a related question: is there a way to see how much memory the Cesium viewer is using in Chrome?

(We tried using the Chrome dev tools, but it is quite hard to get this information; there are all sorts of issues with the dev tools.)

What about this destroy option?



Is this a better way to make sure the memory actually gets released? Here is the API

Also, what is the difference between:






The reason the destroy flag is there is just to give you greater control over the lifetime of the objects, in situations where you might be caching them, re-ordering them, etc. When destroy is set to false, it’s up to the developer to ensure those resources are destroyed.
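To make that concrete, here is a toy model of the destroy-flag semantics (plain JavaScript stubs, not the real Cesium classes; in Cesium the collection is viewer.dataSources and the flag is the optional destroy parameter of remove/removeAll):

```javascript
// Toy stand-in for a data source collection, only to illustrate the
// destroy-flag lifetime rules. Not the real Cesium API.
function ToyDataSource(name) {
  this.name = name;
  this.destroyed = false;
}
ToyDataSource.prototype.destroy = function () { this.destroyed = true; };

function ToyCollection() { this.items = []; }
ToyCollection.prototype.add = function (ds) { this.items.push(ds); };
ToyCollection.prototype.removeAll = function (destroy) {
  if (destroy) {
    // destroy === true: the collection releases each source's resources.
    this.items.forEach(function (ds) { ds.destroy(); });
  }
  // destroy === false: the sources are only detached; the caller must
  // keep references and destroy them later, or they leak.
  this.items = [];
};

var cached = new ToyDataSource("a");
var collection = new ToyCollection();
collection.add(cached);
collection.removeAll(false); // detached but still alive, re-usable
// cached.destroyed is still false here
collection.add(cached);
collection.removeAll(true);  // now its resources are released
// cached.destroyed is true here
```

The takeaway: with destroy set to false you can re-add the same source later, but you own the eventual destroy() call.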

I think there is a chance there might be a memory leak. If you try this code in a Sandcastle:

var viewer = new Cesium.Viewer('cesiumContainer');

var data;

var handler = new Cesium.ScreenSpaceEventHandler();

handler.setInputAction(function() {

    console.log("Removing all and re-adding");
    viewer.dataSources.removeAll();

    var ds = Cesium.GeoJsonDataSource.load(data);
    viewer.dataSources.add(ds);

}, Cesium.ScreenSpaceEventType.LEFT_CLICK);

fetch('').then(function(response) {

    return response.json();

}).then(function(myJson) {

    data = myJson;

});

Every time you click, even though it’s supposed to remove everything, you can see memory usage going up in the Chrome dev tools. I would open a GitHub issue for this, and until someone can take a closer look, you might be able to dive into the source code yourself to trace the lifetime of the loaded data, where it gets stored and where it might not be getting completely removed.

Thanks Omar for the Sandcastle.

I have been testing it out, but the memory in Chrome seems to stay more or less constant. It goes up for a few seconds, but when the GC kicks in it seems to settle back to a stable state.

Oh, nice catch!

I wonder if the problem in your application, then, is that it is just trying to reload the new data source too fast? That sounds like a slightly different problem. If you add some arbitrary delay, does it still crash?
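If rapid reloads are the culprit, a re-entrancy guard is one way to test that theory. Here is a sketch, with a tiny stub standing in for viewer.dataSources since I have not run this against the real API:

```javascript
// Stub standing in for viewer.dataSources, only so the guard logic runs
// on its own; the real collection would be Cesium's DataSourceCollection.
const dataSources = {
  items: [],
  removeAll() { this.items.length = 0; },
  async add(promiseToDataSource) { this.items.push(await promiseToDataSource); },
};

let loading = false;

async function reload(data) {
  if (loading) return false; // drop clicks that arrive while a load is in flight
  loading = true;
  try {
    dataSources.removeAll();
    // Real code would await viewer.dataSources.add(Cesium.GeoJsonDataSource.load(data));
    await dataSources.add(Promise.resolve(data));
  } finally {
    loading = false;
  }
  return true;
}
```

Two overlapping calls then result in only one load; the second click is simply ignored instead of stacking another load on top of the first.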

I think we have figured out the problem. In our system, we were creating a new event listener each time we loaded new data. The event listener was for the zoom button, to zoom in on the model.

Now we have changed our code so that we only create the event listener once, and it does not crash any more…

(That does not explain our crash in the Sandcastle. But as you said, that may be due to loading data too fast. Never mind, it is working, we are happy :slight_smile:)

Thanks for all the help.