I have a shapefile containing about 50,000 building footprints. The aim is to view them as 3D Tiles in Cesium.
The steps I took were:
1. Extrude the building footprints based on their height attribute and export to COLLADA (a single file). This was done in ArcGIS Pro. Viewed in CloudCompare; looks good.
2. Convert using collada2gltf to get a glb file. This results in a single file, say building.glb.
3. Convert the glb file to 3D Tiles via glbToB3dm. This results in a single file, say building.b3dm.
The b3dm file size is the same as the glb file. I guess I'm missing something here. Looking at the ContextCapture Marseille sample, the b3dm files are small and many.
My questions are: do I export Step 1 as a single COLLADA file for every building? If yes, after reading on the list about tilesets, how do I go about creating the tileset to bind the b3dm files together?
Yes, you will need a different approach, since storing all the buildings in a single tile is equivalent to just loading one model of all the buildings. Another approach you probably want to avoid is storing a single building per tile, as this will result in poor performance overall.
To build an efficient tileset you will likely need to "batch" buildings together into the same tile. If you want the buildings to be individually selectable, the glb needs to contain a batch ID vertex attribute (`_BATCHID`) as described in the spec. Finally, you will need to build a tileset.json file that references these tiles and provides bounding volumes.
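As a rough illustration of that last step, here is a minimal sketch of a tileset.json that binds a few batched b3dm files together. The file names (`batch0.b3dm`, etc.), the region values, and the geometric errors are all hypothetical placeholders; a real generator would compute the bounding regions from the actual building extents.

```python
import json

# Hypothetical helper: one external tile per batch of buildings.
# A region bounding volume is [west, south, east, north, minHeight, maxHeight],
# with angles in radians and heights in meters (per the 3D Tiles spec).
def make_tile(uri, region, geometric_error):
    return {
        "boundingVolume": {"region": region},
        "geometricError": geometric_error,
        "content": {"uri": uri},
    }

tileset = {
    "asset": {"version": "1.0"},
    "geometricError": 500,  # screen-space error before the tileset renders at all
    "root": {
        "boundingVolume": {"region": [0.115, 0.738, 0.120, 0.742, 0, 120]},
        "geometricError": 100,  # refine to children below this error
        "refine": "ADD",
        "children": [
            # Placeholder batches covering sub-regions of the root volume.
            make_tile("batch0.b3dm", [0.115, 0.738, 0.1175, 0.742, 0, 120], 0),
            make_tile("batch1.b3dm", [0.1175, 0.738, 0.120, 0.740, 0, 120], 0),
            make_tile("batch2.b3dm", [0.1175, 0.740, 0.120, 0.742, 0, 120], 0),
        ],
    },
}

with open("tileset.json", "w") as f:
    json.dump(tileset, f, indent=2)
```

Note that older pre-1.0 tilesets used `"url"` instead of `"uri"` in the content object; Cesium accepts both, but `"uri"` matches the final 1.0 spec.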
In general terms, how do you determine the number of batches in a dataset? Is it based on the number of objects or on b3dm file size?
Further, is this "splitting" of the dataset done before converting to glTF/glb?
The number of batches is highly specific to the tileset generator, since there is a tradeoff between reducing the number of draw calls (each batch is usually one draw call in Cesium), keeping the tiles a reasonable physical dimension, and not letting the b3dm file size get too large.
The splitting can be done before or after conversion, depending on the generator.
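To make the "split before converting" option concrete, here is one simple way it is often done: assign each footprint to a cell of a regular grid by its centroid, then batch everything in a cell into one tile. The function name, the centroid input format, and the 500 m cell size are my own assumptions for illustration; a real generator would typically also cap each batch by triangle count or output file size.

```python
import math
from collections import defaultdict

def split_into_batches(buildings, cell_size=500.0):
    """buildings: list of (id, x, y) centroids in a projected CRS (meters).

    Returns {(col, row): [building ids]} -- one entry per non-empty grid
    cell; each entry becomes one batched tile (one COLLADA/glb export).
    """
    batches = defaultdict(list)
    for bid, x, y in buildings:
        cell = (math.floor(x / cell_size), math.floor(y / cell_size))
        batches[cell].append(bid)
    return dict(batches)

# Fifty synthetic footprints on a 10 x 5 grid, 120 m apart.
buildings = [(i, (i % 10) * 120.0, (i // 10) * 120.0) for i in range(50)]
batches = split_into_batches(buildings)
print(len(batches), "batches for", len(buildings), "buildings")
# -> 3 batches for 50 buildings
```

Each batch can then be extruded and exported as its own COLLADA file in Step 1, run through the same glb/b3dm pipeline, and referenced from tileset.json with a bounding volume derived from the cell extent.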