Hi Team,
I have some drone data that is quite heavy in size, more than 50 MB, and it renders very slowly on the globe.
I want to reduce the size of the data. Is there any way to do that?
For instance, Cesium recently added Google data (see the Cesium Sandcastle example), which is very lightweight and renders very quickly on the globe. I want to make my data behave similarly.
There are many possible reasons why rendering a certain model or data set can be slow. And there may be two goals that can be tackled more or less independently, namely
Improving (rendering) performance
Reducing the data size
I said "more or less independently", because reducing the data size can be beneficial for the "overall" or "perceived" performance - namely, when the data is downloaded more quickly. It may also have an effect on the pure rendering performance, but for that, one would have to identify the actual bottleneck here.
You mentioned "glb, b3dm, tilesets etc" - but the exact recommendations here will strongly depend on what type of model/data this actually is. Roughly:
When you want to reduce the size of a B3DM file, the actual reduction of the data size will be achieved by reducing the size of the contained GLB data. The CesiumGS/3d-tiles-tools library on GitHub offers some options here, but the details depend on the exact input and desired output (a small scripting sketch follows below).
When you want to reduce the size of a whole tileset, this will also boil down to reducing the size of the GLB data. The 3d-tiles-tools may be helpful here, but there are even more degrees of freedom for how to apply it to achieve the desired goals.
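Just as an illustration of how such a conversion could be scripted: the command names may differ between 3d-tiles-tools versions, so please check `npx 3d-tiles-tools --help`, and the file names are placeholders.

```javascript
// Sketch: extract the GLB payload from a B3DM, compress it with a glTF tool of
// your choice, and wrap it into a B3DM again. Command names may differ between
// 3d-tiles-tools versions; file names are placeholders.
import { execSync } from "node:child_process";

// A B3DM is essentially a small header plus an embedded GLB
execSync("npx 3d-tiles-tools b3dmToGlb -i tile.b3dm -o tile.glb", { stdio: "inherit" });

// ... compress tile.glb here (e.g. with gltfpack or glTF-Transform) ...

// Wrap the compressed GLB into a B3DM again, so the tileset can still reference it
execSync("npx 3d-tiles-tools glbToB3dm -i tile_compressed.glb -o tile.b3dm", { stdio: "inherit" });
```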
Hello @Shashi,
You didn't specify how you load the data and what format it is in. Is it downloaded from your server as polygonal primitives or entities?
Or is it uploaded as 3D Tiles or a 3D model via the Cesium ion streaming service?
Believe me - the results will be very different, especially for large models.
I hope this helps.
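For reference, loading the data as a 3D Tiles tileset (rather than as individual entities) looks roughly like this in CesiumJS. This is only a minimal sketch: the URL is a placeholder, and older CesiumJS versions use `new Cesium.Cesium3DTileset({ url })` instead of the async `fromUrl`.

```javascript
// Minimal sketch: loading a tileset that is hosted on your own server.
// The URL is a placeholder.
const viewer = new Cesium.Viewer("cesiumContainer");

const tileset = await Cesium.Cesium3DTileset.fromUrl(
  "https://your-server.example.com/drone-data/tileset.json"
);
viewer.scene.primitives.add(tileset);
await viewer.zoomTo(tileset);
```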
@dihand
I'm not adding data through Cesium ion. We have data on our own server and add it through entities.
We have b3dm data converted into a tileset.
@Marco13
I've applied the tools you provided and they helped reduce the size, but the problem is that once the data is compressed, we can't compress it again. Is there any way to recompress the data (glb, gltf, etc.) again and again?
@Shashi - I'm glad it helped you a little.
I'm not sure a glb can be compressed much more - they are already highly optimized. However, the format supports instancing, so if you have many repeating details, that can really help. I worked mainly with mechanisms (glb from CAD) and buildings on the globe - these are sizes of around 100 MB. BUT we also had a couple of examples of point clouds of a few gigabytes - through the ion service they work quite correctly and quickly.
You can find more glb information from Khronos (the maintainers of the glTF specification).
@Shashi - I think you need to take a closer look at your file download scenario. Cesium ion uses streaming, similar to how you watch a long YouTube video, so everything is loaded only as needed. If the download happens in the usual way (fetch() or XMLHttpRequest), then most likely you will wait a long time for the full download of a big file.
It was like this for us: a large mechanism of 250 megabytes began to load much faster when we loaded it in parts, and only the parts that were visible at the moment, excluding internal parts. After that we tested 50 similar models in one scene, and it all worked quite well.
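If the data is already a 3D Tiles tileset, the "load only what is visible right now" behavior can additionally be tuned with a few tileset options. A rough sketch, with purely illustrative values and a placeholder URL:

```javascript
// Sketch: tuning how aggressively a tileset streams detail.
const viewer = new Cesium.Viewer("cesiumContainer");

const tileset = await Cesium.Cesium3DTileset.fromUrl(
  "https://your-server.example.com/drone-data/tileset.json",
  {
    // Higher values load coarser detail, so fewer tiles are requested and rendered.
    maximumScreenSpaceError: 24,
    // Skip intermediate levels of detail when zooming in quickly.
    skipLevelOfDetail: true,
  }
);
viewer.scene.primitives.add(tileset);
```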
I think that it is important to be clear about the goals here.
Do you want to improve the rendering performance?
Do you want to reduce the size of the models?
If it is the latter, then there are further details to zoom in (pun intended):
Do you want to make a single GLB file as small as possible?
Do you want to make sure that a user can quickly render the model in CesiumJS, without having to wait for one, large download, but still zoom into details where necessary?
If the goal is to make a single model as small as possible, then there are many options. One "overarching" question is: Is the model large because it contains lots of geometry, or is it large because it contains large textures?
There are different compression methods for geometry (lossy or lossless), related to simplification, quantization, or sorts of "run-length encoding". And there are different methods for texture compression (again lossy or lossless), and they may focus on the file size or on the GPU memory size.
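For the geometry side, one common option is Draco compression of the GLB, which CesiumJS can load (KHR_draco_mesh_compression). A minimal sketch with glTF-Transform's Node API - assuming a reasonably recent glTF-Transform version, with placeholder file names; check the glTF-Transform documentation for the exact setup of your version:

```javascript
// Sketch: Draco-compressing the geometry of a GLB with glTF-Transform.
// npm packages: @gltf-transform/core, @gltf-transform/extensions,
// @gltf-transform/functions, draco3dgltf. File names are placeholders.
import { NodeIO } from "@gltf-transform/core";
import { ALL_EXTENSIONS } from "@gltf-transform/extensions";
import { draco } from "@gltf-transform/functions";
import draco3d from "draco3dgltf";

const io = new NodeIO()
  .registerExtensions(ALL_EXTENSIONS)
  .registerDependencies({
    "draco3d.encoder": await draco3d.createEncoderModule(),
    "draco3d.decoder": await draco3d.createDecoderModule(),
  });

const document = await io.read("model.glb");   // read the (large) input GLB
await document.transform(draco());             // quantize + compress the geometry
await io.write("model.draco.glb", document);   // write the smaller output GLB
```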
Regarding the question
I've applied the tools you provided and they helped reduce the size, but the problem is that once the data is compressed, we can't compress it again. Is there any way to recompress the data (glb, gltf, etc.) again and again?
The answer is that all compression approaches involve trade-offs. One could dive into theoretical details here (deeeep details). But from the user-oriented perspective, the question is always the same:
What are you willing to sacrifice?
When you have a large model with detailed geometry, then you can simplify the geometry. The file will be smaller, but details will be lost. The limit case is that you represent your large, complex model as a single box, consisting only of a few bytes.
Similarly for textures: You can have a 4000x4000 pixel PNG file as a texture, or replace it with a 500x500 pixel JPG file that is stored with a "quality" of 0.5. The file will be smaller, but the textures will appear more blurry. The limit case is that you don't really have a "texture" at all, but rather a "1x1 pixel texture" that just determines the color of the whole model.
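Just to make that texture example concrete, this is what such a downscaling step could look like with an image library (here the npm package sharp, as one arbitrary choice; file names are placeholders):

```javascript
// Sketch: shrinking a 4000x4000 PNG texture to a 500x500 JPG at ~50% quality,
// as described above. File names are placeholders.
import sharp from "sharp";

await sharp("texture_4000x4000.png")
  .resize(500, 500)        // fewer pixels: smaller file, less memory after decoding
  .jpeg({ quality: 50 })   // lossy encoding, corresponding to a "quality" of 0.5
  .toFile("texture_500x500.jpg");
```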
More information about the kind of data and the goals will be required here, in order to give focused recommendations.
It seems to me that all this leads to the question of LOD (Level of Detail) - it often saves the situation in games, and a partially similar algorithm is used in 3D Tiles. This will reduce the file size, BUT it will just be a reduced copy of the original - if I understand the essence of the questions and answers correctly, of course. )
That's one core part of the question (although it was not explicitly addressed in the options that I mentioned). When the input is a 500MB GLB file with 50 million vertices and many high-res textures, then it's simply not possible to reduce its size to 1 MB with some automagic compression method like while (file.size() > 1000000) compress(file)
and expect any sensible result. So it's important to know more about the data. Because it might very well be possible to upload this GLB file to Cesium ion, and receive a 3D Tiles tileset that basically has a "built-in" concept of LOD, and provide this (originally 500MB) file for a client in a way that loads and shows details on demand.
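If that route is taken, the client side then only needs a few lines - a sketch where the asset ID is a placeholder and the access token has to be your own ion token:

```javascript
// Sketch: streaming a tileset that ion created from the uploaded GLB.
// The asset ID is a placeholder; use your own ion access token.
Cesium.Ion.defaultAccessToken = "<your ion access token>";
const viewer = new Cesium.Viewer("cesiumContainer");

const tileset = await Cesium.Cesium3DTileset.fromIonAssetId(123456);
viewer.scene.primitives.add(tileset);
await viewer.zoomTo(tileset);
```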
There are still options for compressing the data locally, for example with gltfpack or glTF-Transform (a small sketch follows at the end of this post). If this does not bring the desired results, you may have to describe in more detail what you meant by
once the data is compressed, we can't compress it again. Is there any way to recompress the data (glb, gltf, etc.) again and again?
As I said, there's always the question of which degree of simplification is acceptable before the (visual) quality of the data becomes too low…
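For completeness, a small sketch of the gltfpack route mentioned above, run from a Node script. The flags are the ones I would expect (-si simplifies the meshes to a fraction of the triangles, -cc enables the stronger meshopt compression), but please double-check them with gltfpack -h, and adjust the simplification ratio to whatever quality loss is acceptable for your data:

```javascript
// Sketch: invoking gltfpack with mesh simplification from Node.
// File names are placeholders; verify the flags with `gltfpack -h`.
import { execSync } from "node:child_process";

execSync("gltfpack -i model.glb -o model.packed.glb -si 0.5 -cc", {
  stdio: "inherit",
});
```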