I am attempting to display weather data in Cesium via some cubes in the sky. I’ve found many examples achieving this using 3D Tiles, the closest being these:
I am creating a GLTF file (using pygltflib) from a large weather file. A GLTF of 1 million primitives (at some lat/long/alt) can be uploaded to Cesium Ion without issue, but 2 million (~984 MB) seems to be too large and displays an error in the Cesium preview:
Assuming the above issue is due to filesize, it makes me think I’m not optimizing my GLTF, and makes me question if 3D Tiles is even the correct solution for my goal.
I may need to change the displayed colors based on weather at a specific time. Does a temporal component mean I should instead be considering a CZML solution (I don’t think so)? (Cesium Time Animation using CZML – Cesium)
Will a GLTF 3D Tile solution allow me to access and update properties of various primitives (efficiently) based on the current Cesium timestamp?
Cesium Ion doesn’t display the filesize of “Google Photorealistic 3D Tiles”, but I have to assume it’s massive; significantly bigger than the 2 million points I am trying to upload (and I’ll want to create even larger files in the future).
It makes me think I must be structuring my GLTF incorrectly and that I need to create either a tileset.json (not sure how yet) or some other file format based on clues from GitHub - CesiumGS/3d-tiles-tools.
If I understand your goal correctly, you would like to display 4D (3D + time) weather data. While CZML does support time-dynamic data, it requires the whole file to be loaded before the data is displayed. If you have a large amount of data, this doesn’t scale well. 3D Tiles would be the best format for a large amount of graphics, though it will require some additional work on the viewer to manage which 3D Tileset should be visible at what time. This is an area that Cesium is interested in as well.
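For reference, a time-dynamic CZML document is just JSON with time-tagged property samples. A minimal sketch of one colored box (the id, coordinates, and times here are made-up placeholder values, not from your data):

```python
import json

# Minimal CZML sketch: one box whose color is sampled over time via
# epoch-relative rgba samples: [secondsSinceEpoch, r, g, b, a, ...].
czml = [
    {"id": "document", "name": "weather", "version": "1.0"},
    {
        "id": "weather-cube-0",
        "position": {"cartographicDegrees": [-105.0, 40.0, 5000.0]},
        "box": {
            "dimensions": {"cartesian": [1000.0, 1000.0, 1000.0]},
            "material": {"solidColor": {"color": {
                "epoch": "2024-01-01T00:00:00Z",
                "rgba": [0, 0, 0, 255, 255,      # t=0s: blue
                         3600, 255, 0, 0, 255],  # t=1h: red
            }}},
        },
    },
]
print(json.dumps(czml, indent=2))
```

The catch, as noted above, is that the whole document has to be loaded before anything is shown, so this approach does not scale to millions of primitives.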
It sounds like the problem you ran into above is not displaying the data as a 3D Tileset, but breaking the data up into tiles. 3D Tiles can handle massive amounts of data; Google Photorealistic 3D Tiles is a 3D Tileset that covers the whole world.
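For reference, the tiler’s output is organized by a tileset.json that arranges the content into a spatial hierarchy. A hand-written minimal sketch (all numbers and the file name are placeholders; 3D Tiles 1.1 allows glTF/GLB content directly):

```python
import json

# Minimal 3D Tiles 1.1 tileset sketch: a single root tile whose content
# is one glTF binary file. Placeholder values throughout.
tileset = {
    "asset": {"version": "1.1"},
    "geometricError": 4096,
    "root": {
        # region: [west, south, east, north, minHeight, maxHeight]
        # (radians for angles, meters for heights)
        "boundingVolume": {"region": [-1.84, 0.69, -1.83, 0.70, 0.0, 20000.0]},
        "geometricError": 0,
        "refine": "ADD",
        "content": {"uri": "weather_points.glb"},
    },
}
print(json.dumps(tileset, indent=2))
```

A real tileset would subdivide the root into child tiles so the viewer only streams what is visible at the current view; that subdivision is what makes 3D Tiles scale, and it is what the Cesium ion tiling pipeline does for you.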
I have opened up an issue for our team to investigate why this particular asset did not tile successfully.
In theory, there is no upper limit for a 3D Tiles data set. And in theory, there also is no upper limit for the size of a glTF file. But this is somewhat unrelated to Cesium ion: Depending on the exact contents and structure of the glTF, the Cesium ion tiling process might not be able to process certain files and convert them into a “reasonable” representation of 3D Tiles. It’s hard to be more specific here, but when in doubt, imagine a file that ‘does not make sense’ - like one that contains a single mesh with hundreds of millions of triangles where all vertices are at (0,0,0)…
However, I’m going out on a limb here and making a guess at what might be the reason for the issue:
Apparently, when calling gltf.save("example.gltf") in pygltflib, it writes the glTF as an embedded glTF. In this representation, the binary data (for geometry and textures) is stored as a base64-encoded data URI string. This is wasteful (it increases the file size by roughly one third!), and can easily cause trouble for certain libraries when this string simply becomes too long.
You could instead save to a .glb file name - for example, gltf.save("example.glb") - and upload the resulting .glb (glTF binary) file. This is more likely to work.
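The overhead of the embedded (data-URI) representation is easy to verify with a few lines of Python:

```python
import base64

# base64 maps every 3 bytes of binary data to 4 ASCII characters,
# so an embedded buffer is ~33% larger than the same bytes in a .glb/.bin.
raw = bytes(3_000_000)          # pretend this is 3 MB of vertex data
encoded = base64.b64encode(raw)
print(len(encoded) / len(raw))  # → 1.3333...
```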
But note: there is a strict upper limit for GLB files. Based on the specification, a GLB file cannot be larger than 4GB, and most tools will have trouble when the file is larger than 2GB. So 2GB should be considered the practical upper limit for the size of a GLB file. If you want to put “more data into one glTF”, there is an intermediate option: write out the .gltf (JSON) file, but do not use the ‘embedded’ (data-URI based) representation of the binary data - store the binary data in dedicated (.bin) files instead. I’d have to look at some details of pygltflib to see how this could be accomplished there.
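Independent of pygltflib, the “external buffer” layout itself is simple: the .gltf JSON just references a .bin file by relative URI instead of embedding the bytes. A minimal sketch of a one-primitive point cloud (file names and data are made up for illustration):

```python
import json
import struct

# Three float32 vertex positions, written to a standalone binary file.
positions = [0.0, 0.0, 0.0,   1.0, 0.0, 0.0,   0.0, 1.0, 0.0]
bin_data = struct.pack("<9f", *positions)
with open("buffer0.bin", "wb") as f:
    f.write(bin_data)

# The .gltf JSON stays small: it points at the .bin file by URI.
gltf = {
    "asset": {"version": "2.0"},
    "buffers": [{"uri": "buffer0.bin", "byteLength": len(bin_data)}],
    "bufferViews": [{"buffer": 0, "byteOffset": 0, "byteLength": len(bin_data)}],
    "accessors": [{"bufferView": 0, "componentType": 5126,  # FLOAT
                   "count": 3, "type": "VEC3",
                   "min": [0.0, 0.0, 0.0], "max": [1.0, 1.0, 1.0]}],
    "meshes": [{"primitives": [{"attributes": {"POSITION": 0},
                                "mode": 0}]}],  # mode 0 = POINTS
    "nodes": [{"mesh": 0}],
    "scenes": [{"nodes": [0]}],
    "scene": 0,
}
with open("points.gltf", "w") as f:
    json.dump(gltf, f)
```

This keeps the JSON part of the asset small, while the heavy geometry lives in one or more .bin files.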
A small update: There are some hints that certain parsing libraries may have problems with JSON files that are larger than 512MB in general. If this is the case, then … converting the data to a GLB would not help, and other approaches may have to be considered. This is currently being investigated further.
Thanks for the effort and info on this Marco.
To attempt to assist in debugging, I’ve uploaded two new GLTF files as separate 3D Tiles.
assetId: 2996301 has 10k points and is 2.42 MB
assetId: 2996303 has 4M points and is 991.18 MB
Both were created using the same script using pygltflib.
If there’s theoretically no upper limit to filesize, it makes me think my GLTFs are structured incorrectly. Are you able to make use of these uploads while investigating? They should have identical formats, just more data.
Also, we are not strictly locked to pygltflib, it was just the tool that looked easiest to use for structuring data from a CSV into GLTF format. If this ends up being a weird issue with pygltflib, we’d happily accept a suggestion for an alternative tool.
Let me know if there’s more info I can provide.
Thank you.
If there’s theoretically no upper limit to filesize, it makes me think my GLTFs are structured incorrectly.
There is no limit in theory, but there are limits in practice. And this does not refer to the limits of the file system (which may be 8 petabytes or so), but to limitations of common libraries for JSON processing, which are far lower than I would have expected. Specifically, it appears that there are libraries involved that have a hard limit of 512MB for JSON files.
So the structure of the file should be valid. (Maybe with some room for optimizations, but that still wouldn’t scale indefinitely.) And you mentioned
assetId: 2996301 has 10k points and is 2.42mb
and I assume that this should run through properly. So you can still use pygltflib to create these files. But if possible, you could consider breaking these files into smaller chunks. Something like <500MB should work, but from your description, it sounds like there is some leeway in how many elements you put into a single file.
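A minimal sketch of that chunking step (the points-per-file knob is a made-up value that you would tune until each output file stays comfortably under ~500 MB):

```python
# Split a large point list into fixed-size chunks, one output file each.
def chunk_points(points, max_points_per_file):
    for i in range(0, len(points), max_points_per_file):
        yield points[i:i + max_points_per_file]

points = list(range(2_500_000))  # stand-in for weather data points
chunks = list(chunk_points(points, 1_000_000))
print([len(c) for c in chunks])  # → [1000000, 1000000, 500000]
```

Each chunk would then be written out as its own glTF with the same script you already have.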
A step beyond that: In your first post, you mentioned some possible requirements regarding time-based modifications of the colors of individual primitives. It may not be entirely trivial to achieve this based on the glTF files, and it might be necessary to gather more information about that (for example, where the information about the colors is supposed to come from, or how to identify the primitives that should receive these colors).
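As a starting point for that discussion: if the color is derived from a scalar like temperature, the mapping itself is cheap; the open question is how to apply it to the right primitives at runtime. A hypothetical sketch (the ramp endpoints are arbitrary example values, not a real color scheme):

```python
def temperature_to_rgba(t_celsius):
    """Map a temperature to an RGBA color on a simple blue-to-red ramp.

    The -30..+40 °C range is an arbitrary example, not a real scheme.
    """
    f = max(0.0, min(1.0, (t_celsius + 30.0) / 70.0))
    return (f, 0.0, 1.0 - f, 1.0)

print(temperature_to_rgba(-30.0))  # → (0.0, 0.0, 1.0, 1.0)  cold = blue
print(temperature_to_rgba(40.0))   # → (1.0, 0.0, 0.0, 1.0)  hot = red
```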