Large KML file sizes not working

Hi Matthew,

What is the biggest KML file size that CesiumJS can handle?

I have KML files between 10 MB and 50 MB. Will those ever work?
I reported this issue a few weeks ago.

I need to know: should I wait for a fix or give up?

Thanks

om

I have had problems with large KMLs as well. I have attempted to load 50k points, only to have Cesium crash.

Hey, I’m having the same issue as well. I’m working on 3D city visualization in Cesium and the browser keeps crashing. I guess the solution would be to break the KML file down into tiles, which, by the way, AGI is currently working on (3D Tiles). I referred to this link (https://developers.google.com/kml/documentation/mapsSupport), which suggests that the maximum sizes allowed on Google Earth for KML and KMZ files are 10 MB and 3 MB respectively, so I think similar constraints must exist in Cesium as well. If someone could help in this regard, it would be very appreciated. I really can’t wait until next spring for 3D Tiles to be ready.

Thanks.

Sorry for taking a while to reply, but the answer here is a bit complicated and nuanced and I wanted to make sure I provided everyone with a good explanation.

The truth is that the size of a KML file is not a good metric for whether or not Cesium will be able to load the data; it’s what is in the KML that matters. Also keep in mind that the size of a KML file means something very different from the size of a KMZ file. Since KMZ files are compressed and XML is so repetitive/compressible, a 1 MB KMZ file could easily be 20 MB of KML once decompressed. You can see this yourself by renaming .kmz to .zip and unzipping it.

For a real world example, Michael posted a 3.5 megabyte KMZ file that once decompressed is actually 59 megabytes of KML (which is about 1.3 million lines) containing around 154,000 placemarks. Cesium currently runs out of memory while trying to process this data.

So what does affect Cesium’s ability to load a KML/KMZ file? There are several things.

The most important factor right now is probably the number of placemarks. This is partly due to inefficiencies in Cesium and partly due to the nature of browsers (which I’ll discuss more in a minute). For example, I’ve loaded KML files as large as 70 MB but with only ~20,000 highly detailed polygons total. Cesium can actually load this data without a problem (though it does take a few minutes). This is from memory; I’ll see if I can dig up an actual file for a better comparison.
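Since placemark count is the dominant factor, one cheap pre-flight check (a sketch of my own, not part of Cesium’s API) is to count placemarks in the raw KML text before handing the file to Cesium. A crude regex count is enough for a rough feasibility estimate; it is not a full XML parse:

```javascript
// Hypothetical pre-flight check: count <Placemark> elements in a KML string.
// Matches "<Placemark>" and "<Placemark id=...>" but not longer tag names.
function countPlacemarks(kmlText) {
  const matches = kmlText.match(/<Placemark[\s>]/g);
  return matches ? matches.length : 0;
}

// Example usage with a tiny inline document.
const sample =
  '<kml><Document>' +
  '<Placemark><name>a</name></Placemark>' +
  '<Placemark><name>b</name></Placemark>' +
  '</Document></kml>';

const n = countPlacemarks(sample);
console.log(n + ' placemarks'); // 2 placemarks
if (n > 100000) {
  console.warn('This file will likely exhaust browser memory.');
}
```

The 100,000 threshold is a guess based on the numbers in this thread (Michael’s ~154,000-placemark file fails, ~20,000 polygons load fine); tune it for your own data.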

The type of placemark has an effect as well. For example, polygons probably take longer to load, but you can load a lot more of them compared to point geometry. You could probably get away with 30,000 polygons but choke on 30,000 markers.

The nature of JavaScript and the way browsers work is also a large part of the problem. First, JavaScript is simply slower than native languages; processing KML in the browser is never going to be as fast as processing it in Google Earth, which is written in C++, though in time we may get close. Second, browsers enforce artificial memory limits on individual web pages. While Google Earth is free to use as much memory as the operating system allows, web pages are usually only allowed between 800 and 1600 MB of RAM (it depends on the browser). Third, JavaScript itself uses more memory to store the same amount of data than C++ does, so memory usage in a web app is naturally higher.

Another important thing to keep in mind is that there are a lot of crazy KML files out there that go out of their way to express data in a way Google Earth can understand. For example, Michael’s 59 MB of KML appears to create a simple outline of a tunnel. Cesium’s PolylineVolume can probably do the same thing instantaneously in a couple of lines of code. Since Google Earth doesn’t support polyline volumes, whoever created that KML file had no choice but to shoehorn the data into a form Google Earth could understand (and even Google Earth starts to chug with that particular file). This is an important point: just because Cesium may have problems with a KML file doesn’t mean Cesium can’t easily perform the same visualization the KML represents. It may just require a different (and usually simpler) approach. Obviously this doesn’t help people with a library of KML files they didn’t create themselves, but it’s still an important distinction.
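As a sketch of what that simpler approach could look like, here is a hypothetical tunnel drawn as a single PolylineVolume entity instead of thousands of placemarks. The coordinates and radius are invented for illustration, and the Cesium call is guarded so the snippet also runs outside a browser:

```javascript
// Cross-section of the volume: a circle of 2D points (radius in meters).
function computeCircle(radius) {
  const positions = [];
  for (let i = 0; i < 360; i += 10) {
    const radians = (i * Math.PI) / 180;
    positions.push({
      x: radius * Math.cos(radians),
      y: radius * Math.sin(radians),
    });
  }
  return positions;
}

// The Cesium entity itself (only runs when Cesium is loaded in a page).
if (typeof Cesium !== 'undefined') {
  const viewer = new Cesium.Viewer('cesiumContainer');
  viewer.entities.add({
    polylineVolume: {
      // Hypothetical tunnel path: two points along a meridian.
      positions: Cesium.Cartesian3.fromDegreesArrayHeights([
        -85.0, 32.0, 0.0,
        -85.0, 36.0, 0.0,
      ]),
      shape: computeCircle(6.0).map((p) => new Cesium.Cartesian2(p.x, p.y)),
      material: Cesium.Color.GRAY,
    },
  });
} else {
  console.log('cross-section points:', computeCircle(6.0).length); // 36
}
```

One entity with a swept cross-section replaces the thousands of individual geometries the equivalent KML would need.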

My overall point is that there is no simple answer to “can Cesium handle this KML file?”

It’s not all doom and gloom though. As browsers and JavaScript continue to improve, I think we’ll see RAM usage come down and performance go up. There are also a lot of things we can do on the Cesium side to make Entity visualization more efficient, but these are fairly large changes and some of them may be breaking, and they require a bit of experimentation and research. They will most likely be part of a much larger Cesium 2.0 effort, which we currently don’t have a timeframe for. These changes will eventually happen, because I want this as badly as you do, but it’s a matter of time and priority. In the meantime, we will continue to improve KML compatibility and performance when we can.

As an aside, I strongly recommend that anyone who can use 64-bit Chrome; it will give you the best performance and allow the most memory usage of any browser (as far as I am aware). A lot of people are using 32-bit Chrome and selling themselves short.

Finally, as someone already alluded to, the real solution is 3D Tiles. No matter how much we improve KML or Entity support, browsers and browser applications are simply a different beast than native ones, and you need a different approach for visualizing massive datasets at scale. A lot of things that Google Earth can get away with by brute force are not possible in the browser because of JavaScript performance. Take the NYC demo: it currently includes 1,140,378 models. If you were doing this in Google Earth, that would take up about 10.3 GB of data on disk (the COLLADA files), which is impossible to stream down for the web. Once we processed that into 3D Tiles, we were down to 345 MB on disk, broken up across 4,199 tiles. That’s smaller than most imagery sets. Normally when you batch files like this you end up with reduced interactivity; with 3D Tiles, however, every one of these 1.1 million buildings is still selectable and has its own metadata embedded with it. That level of selectability is something even Google Earth’s native building implementation struggles with. 3D Tiles support is in heavy development, but we are actively working on it, and ultimately it will be able to handle everything KML can and more (such as point clouds).

Hopefully that answers any and all questions people have about KML loading performance. If not, please let me know and I’ll try to clarify further. Thanks.

Thank you very much for such a comprehensive answer, Matthew. I really appreciate you sharing this information. But I still have a small query. What if we tiled the large KML/KMZ files and tried to load the tiled KMZ files in Cesium; wouldn’t that be a solution? And if so, what would be a proper way to tile huge KML/KMZ files? I know one way is to use 3D CityDB’s Importer/Exporter tool, which has a KML/COLLADA export option, but I’m facing some problems of my own using it. Can such an approach work?

Great writeup, Matt. I’ve got a lot of legacy apps and KML to look at, and porting everything to new formats will take time. What kinds of optimizations could be done in Cesium to support processing a large number of entities, or even KML-specific optimizations to help parse large KML files?

I am working with BIM in infrastructure projects like pipes, roads, streets, and highways. Now I have a model of a village whose KMZ is 7 MB and whose KML is 150 MB.

Is it impossible for this to work with Cesium? It runs very well in Google Earth, but I would like to use Cesium. Is there any news about this problem?