Major memory leak with EntityCollection and Entity removal

Firstly, see here:

It may be worth considering changing:

this._hash[key] = undefined;

to:

delete this._hash[key];

as the current code leaves a large number of 'key: undefined' entries behind when you delete a large number of entities.
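For anyone unfamiliar with the distinction: assigning undefined clears the value but keeps the key in the object, while delete removes the key itself. A minimal standalone sketch (plain JavaScript, no Cesium required):

```javascript
var hash = {};
hash.a = 1;
hash.b = 2;

// Assigning undefined clears the value, but the key survives
hash.a = undefined;
console.log('a' in hash);               // true
console.log(Object.keys(hash).length);  // 2

// delete removes the key itself
delete hash.b;
console.log('b' in hash);               // false
console.log(Object.keys(hash).length);  // 1 ('a' is still there)
```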

However, this isn't the memory leak I am talking about.

Currently, if you add and then remove a large number of entities (as you may have to in live data displays, for example), the memory usage builds up.

It appears that a reference to the entities is still held by a GeometryVisualizer, so they are not reclaimed by the GC.

For example, after creating and deleting 5000 entities I could still see 5000 entities in:

Running remove() on the EntityCollection does not appear to fully remove the data from the primitives, and there appeared to be references to the entities in most of the visualizers.

This breaks my usage of Cesium as I am using it for live data, so I need to create and delete entities regularly. Is there any workaround for this? Is this something that you guys are already aware of?

I have included Sandcastle code below for replicating the issue. It prints the collection to the console after each change, so if you are using Chrome you should be able to inspect the data source in the console and notice how the primitives under the visualizers are holding on to references to the entities.

Creating and deleting 20k entities and then running a JavaScript heap profile in Chrome will show a large amount of memory retained in Entity objects.

If you wish to replicate it, paste the following HTML into Sandcastle:

<style>
    @import url(../templates/bucket.css);
</style>
<div id="cesiumContainer" class="fullSize"></div>
<div id="loadingOverlay"><h1>Loading...</h1></div>
<div id="toolbar">
    <div id="addEntities"></div>
    <div id="removeAllEntities"></div>
</div>

And paste the following JavaScript into Sandcastle:

var viewer = new Cesium.Viewer('cesiumContainer');
var testDataSource = new Cesium.CustomDataSource('test');
viewer.dataSources.add(testDataSource);

var createQuantity = 5000;
var created = 0;
var deleted = 0;

// Add entities
function addEntities() {
    for (var i = created; i < (created + createQuantity); i++) {
        var entity = testDataSource.entities.getById(i);
        var lng = -80.12;
        var lat = 25.46;
        var name = 'test' + i;
        var type = 'type' + i;
        var description = ' test description words ' + i +
            ' test description words ' + i +
            ' test description words ' + i +
            ' test description words ' + i +
            ' test description words ' + i;

        // If no entity with this id exists yet, create it
        if (typeof entity === 'undefined') {
            entity = testDataSource.entities.add({
                id: i,
                name: name,
                entityType: type,
                description: description,
                rectangle: {
                    coordinates: Cesium.Rectangle.fromDegrees(lng - 0.0001, lat - 0.0001,
                        lng + 0.0001, lat + 0.0001),
                    material: Cesium.Color.WHITE.withAlpha(0.5),
                    extrudedHeight: 100
                }
            });
        }
    }
    created = created + createQuantity;
    console.log(testDataSource);
}

// Remove all (iterate backwards so remove() does not skip entries)
function removeAllEntities() {
    for (var i = testDataSource.entities.values.length - 1; i >= 0; i--) {
        testDataSource.entities.remove(testDataSource.entities.values[i]);
        deleted += 1;
    }
    console.log(testDataSource);
}

Sandcastle.addToolbarButton('Add ' + createQuantity + ' Entities', function() {
    addEntities();
    console.log('Created: ' + created);
}, 'addEntities');

Sandcastle.addToolbarButton('Remove All Entities', function() {
    removeAllEntities();
    console.log('Deleted: ' + deleted);
}, 'removeAllEntities');

Thanks for the thorough explanation and sample code.

You are absolutely correct that AssociativeArray should delete the key rather than setting it to undefined. I think that’s actually a much bigger problem than you indicate, since it means we have an object growing with an unbounded number of keys (which is something that causes Chrome to eat up memory and kills performance). I’m surprised we haven’t noticed it before. We will definitely fix this soon, if not before the next release.
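To illustrate the unbounded-growth point: simulating the old remove() behaviour on a plain object shows the keys piling up even though every value is gone (a standalone sketch, no Cesium involved; the function name is just for illustration):

```javascript
function simulateRemovals(clearWithDelete) {
    var hash = {};
    for (var i = 0; i < 100000; i++) {
        hash[i] = { id: i };       // add an "entity"
        if (clearWithDelete) {
            delete hash[i];        // proposed behaviour: key is removed
        } else {
            hash[i] = undefined;   // old behaviour: key lingers forever
        }
    }
    return Object.keys(hash).length;
}

console.log(simulateRemovals(false)); // 100000 keys retained
console.log(simulateRemovals(true));  // 0
```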

For the second case, I think there may be a problem here, but I don’t think it’s exactly what you describe. Using your example code, I could create and destroy 5000 Entities multiple times, but the heap dump only ever showed 5000 entities total. I think what you are seeing is some temporary caching, but I need to look more into it to be sure (I would love to be wrong because that means we can easily improve memory usage by fixing the problem). The object you are referencing: testDataSource._visualizers[1]._primitives._primitives[0]._pickIds is actually just scene.primitives[0], which is why it looks like all of the visualizers have it (they all have a handle to the main scene primitive collection). Once I remove the entities, this Primitive gets destroyed.

Keep in mind that each type of Entity visualization has a different code path, so it’s also possible that rectangle has a leak (since that’s what you are using here) while other visualizations do not.

I’ll hopefully take a closer look at this tonight and open a pull request for anything I find.



Thank you for your swift reply!

It looks like adding the following monkey patch does fix part of the ever-growing memory usage caused by the hilarious number of hash keys that I end up with:

Cesium.AssociativeArray.prototype.remove = function(key) {
    //>>includeStart('debug', pragmas.debug);
    if (Cesium.defined(key) && typeof key !== 'string' && typeof key !== 'number') {
        throw new Cesium.DeveloperError('key is required to be a string or number.');
    }
    //>>includeEnd('debug');

    var value = this._hash[key];
    var hasValue = Cesium.defined(value);
    if (hasValue) {
        var array = this._array;
        array.splice(array.indexOf(value), 1);
        delete this._hash[key];
    }
    return hasValue;
};

You are correct that, when adding and deleting a bunch of entities, the number of Entity objects in the heap profile does go down at some point, so it doesn't look like a memory leak as such. I was just thrown by the fact that there were still Entity objects around at all after removing them.

It still sometimes causes an issue if a large number of entities is created. The Entity objects hang around for a long time, and memory usage sits hundreds of MB higher than you would expect for the number of entities actually in the collection.

Try clicking the add-entities button a few times to create 20-30k entities and then removing them. The memory usage hangs around and isn't always released when you then create and delete another 5k.

Thanks for your help!


I just opened a pull request to take care of the potential issues I uncovered: your test case no longer shows any Entities or Primitives in the heap. I’m not sure how much the heap fix will affect real-world use, since the memory would be reclaimed when you removed the data source or re-added any rectangles; but maybe there’s a situation I’m not thinking of (and there’s no reason to lazily keep the data around if we don’t have to).

The AssociativeArray delete change will definitely improve real world use cases (as you just mentioned in your reply).

Thanks again for the report.

Thank you, I really appreciate it!

Well, I suffer from the same problem in my application. It seems like the entities aren't really freed from the heap, even after I remove them from the entity collection.

My entities are points on the map.
Is there any solution?