loadJson: limit on the number of features?

Hi,

I'm loading parcel data from a county database that can hold upwards of hundreds of thousands of features. I'm using a query to return parcels that meet certain criteria. Depending on how broad the query is, it can return tens of thousands of features. Here's what happens for a query that returns roughly 50,000 features in GeoJSON format:

I put a breakpoint on the first line of code in the promise returned by Cesium.loadJson. In Chrome, the browser crashes (Aw, Snap!) before the first line of code is reached. In Firefox, the browser doesn't crash, but the first line of code is never reached, though the app remains functional.
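For reference, here's roughly the pattern I'm using; the server URL and query below are just placeholders, not my actual endpoint:

```javascript
// Placeholder URL/query -- the real one hits our county GeoServer with a filter.
var url = 'http://example.com/geoserver/wfs?service=WFS&version=1.0.0&request=GetFeature' +
          '&typeName=parcels&outputFormat=application/json';

Cesium.loadJson(url).then(function(geojson) {
    // Breakpoint on this line -- it is never hit for the ~50,000-feature query.
    viewer.dataSources.add(Cesium.GeoJsonDataSource.load(geojson));
}).otherwise(function(error) {
    console.log(error);
});
```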

Is there an upper limit on the number of features (or on the size of the GeoJSON in general)? If so, is there a way to determine what that upper limit is? Is it a browser limitation, or a limitation of the local computer? Is there any way to tweak that limit?

Ideally I'd like to design the application so that it will manage this upper limit on queries. Any enlightenment would be appreciated.

Thanks!
Joe

Unfortunately, this question has no single answer because of the number of factors in play.

  1. It goes without saying that more powerful hardware will be able to handle larger data sets. Both the CPU and the GPU are factors here. Right now, Models/Billboards/Points/Paths/Labels are CPU-limited, while everything else (Polygons/Polylines/etc.) is GPU-limited (once loaded). If all you have are polygons and lines, you can load more data than if it's all points.

  2. The browser absolutely plays a factor here. I can’t say exactly what your 50,000-feature issue is, but it’s possible that the browser itself is choking on the size of the JSON file you are trying to parse. The overall number of features isn’t the issue by itself, because it also depends on how much metadata is in each feature; it’s really the overall size of the GeoJSON data you’re returning that matters. For huge data sets, a 64-bit browser will also work much better because you’ll have more memory to play with. In our experience, 64-bit Chrome is the best platform for Cesium in general. That being said, Chrome seems to have issues with JavaScript objects that have a massive number of properties (which Cesium currently uses to mimic hash-set functionality; see the sketch after this list). I’ve been meaning to look into submitting a bug for this. Ultimately, GeoJSON isn’t an ideal choice for large data sets.

  3. We are always looking at ways to better optimize Cesium, in particular the Entity API, which has recently become our primary top-level API (and is what is used to load GeoJSON data). I recently ran some tests and uncovered some inefficiencies that I want to address, which should dramatically improve load times for large data sets; I just need to carve out time to work on it.
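To clarify the hash-set point in item 2, this is the general pattern I mean (purely illustrative, not Cesium's actual code): an ordinary object used as a set, which some browsers handle poorly once the number of keys gets huge.

```javascript
// Illustrative only: using a plain object's properties as a hash set.
// With hundreds of thousands of keys, insertion/lookup can degrade in some browsers.
var seen = {};

function add(id) {
    seen[id] = true;
}

function contains(id) {
    return seen[id] === true;
}

for (var i = 0; i < 500000; ++i) {
    add('feature-' + i);
}
console.log(contains('feature-123456')); // true
```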

Hopefully this sheds some light on the subject for you. I can’t give you a hard number for your app without knowing more about it. I would look into the size of the GeoJSON you are returning and benchmark how long a call to JSON.parse takes on it (and whether it’s choking there or not). If you want to share a sample of one of the large files that’s failing (or you can email me privately), I can take a look at it and also use it as a test case for optimizing Cesium in the future.
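For example, something along these lines would let you time the parse in isolation; the URL is a placeholder for whatever your query returns, and Cesium.loadText is just one way to fetch the raw text:

```javascript
// Fetch the raw response as text, then time JSON.parse by itself.
// The URL is a placeholder for your actual WFS query.
Cesium.loadText('http://example.com/geoserver/wfs?...').then(function(text) {
    console.log('Response size: ' + (text.length / (1024 * 1024)).toFixed(1) + ' MB');

    var start = performance.now();
    var geojson = JSON.parse(text);
    var elapsed = performance.now() - start;

    console.log('JSON.parse took ' + elapsed.toFixed(0) + ' ms for ' +
                geojson.features.length + ' features');
});
```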

Thanks Matt,

I assumed it was probably the overall size of the dataset, but I thought you might be able to provide some insight as to where it might be choking. I'll be more than happy to send you a sample file. I do use 64-bit Chrome, though I've had to hold back on updating my version because the newer versions insist on using the software rasterizer even though hardware acceleration performs far better, and I'm unable to force newer versions to use hardware acceleration. I also simply need to upgrade my computer. :)

I use GeoServer as my GIS server. Do you have a recommendation for a format other than GeoJSON that will perform better?

I'll take your suggestion and take a closer look at JSON.parse. And keep an eye out for my email with a sample file.

Thanks!
Joe

Joe,

Assuming that your goal is to display as many parcels at once as possible, you could characterize the system’s performance when the app starts. That could involve creating increasing numbers of representative geometries until you’re no longer happy with the performance (e.g. it becomes too slow or crashes). There are a lot of variables in play here, so this could be quite an exercise.
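A very rough sketch of what I mean; the counts and geometry here are arbitrary, and this only measures entity creation time, not rendering:

```javascript
// Create `count` throwaway rectangles and time how long creation takes.
// This measures entity creation only, not how the scene renders them afterwards.
function benchmarkEntityCreation(viewer, count) {
    var start = performance.now();
    for (var i = 0; i < count; ++i) {
        var west = -100.0 + (i % 100) * 0.01;
        var south = 40.0 + Math.floor(i / 100) * 0.01;
        viewer.entities.add({
            rectangle : {
                coordinates : Cesium.Rectangle.fromDegrees(west, south, west + 0.009, south + 0.009),
                material : Cesium.Color.YELLOW.withAlpha(0.5)
            }
        });
    }
    var elapsed = performance.now() - start;
    viewer.entities.removeAll();
    return elapsed;
}

// Step up the count until the timings (or the user) say "enough".
[1000, 5000, 10000, 25000].forEach(function(count) {
    console.log(count + ' entities: ' + benchmarkEntityCreation(viewer, count).toFixed(0) + ' ms');
});
```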

Alternatively, you could change the app’s strategy to require the user to narrow the query so it returns a more reasonable number of items (e.g. apply a spatial element to the query, such as only returning parcels inside a polygon the user draws).
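Since you mentioned GeoServer, a spatially bounded WFS request could look something like the sketch below; the server URL, layer name, and 5,000-feature cap are all placeholders you'd replace with your own:

```javascript
// Build a WFS GetFeature URL limited to a geographic rectangle.
// Server URL, layer name, and the feature cap are placeholders.
function buildParcelUrl(rectangle) {
    var bbox = [
        Cesium.Math.toDegrees(rectangle.west),
        Cesium.Math.toDegrees(rectangle.south),
        Cesium.Math.toDegrees(rectangle.east),
        Cesium.Math.toDegrees(rectangle.north)
    ].join(',');

    return 'http://example.com/geoserver/wfs' +
           '?service=WFS&version=1.0.0&request=GetFeature' +
           '&typeName=county:parcels' +
           '&outputFormat=application/json' +
           '&maxFeatures=5000' +   // server-side cap on how many parcels come back
           '&bbox=' + bbox;        // only parcels intersecting this extent
}
```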

Hope that helps,

Scott

Scott,

Yeah, the problem is that it will run on any random user's system, so I can't optimize it for any particular machine. I would need some way to measure it relative to the performance characteristics of the system it's running on, or do as you say and put more control in the user's hands with warnings/guidance. I'll probably end up doing the latter, but I'll also design the application to try to keep the scope within a range where it won't download unreasonable amounts of data, such as limiting certain queries based on zoom level.
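For the zoom-level idea, I'm picturing something like this; the height threshold is arbitrary, and buildParcelUrl stands in for a hypothetical helper like the one Scott sketched above:

```javascript
// Only query when the camera is low enough, and scope the query to the visible extent.
// MAX_QUERY_HEIGHT is arbitrary; buildParcelUrl is a hypothetical helper like the sketch above.
var MAX_QUERY_HEIGHT = 20000.0; // meters

function maybeQueryParcels(viewer) {
    var height = viewer.camera.positionCartographic.height;
    if (height > MAX_QUERY_HEIGHT) {
        console.log('Zoom in further before loading parcels.');
        return;
    }

    var rectangle = viewer.camera.computeViewRectangle();
    if (Cesium.defined(rectangle)) {
        viewer.dataSources.add(Cesium.GeoJsonDataSource.load(buildParcelUrl(rectangle)));
    }
}
```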

Thanks!
Joe