I wanted to share an update on a new feature that’s nearing completion in CesiumJS and invite early feedback from the developer community.
We’ve been working on support for displaying panoramic images directly in a Cesium scene, covering both equirectangular and cubemap image formats. The goal is to make it straightforward to place immersive pano imagery into a 3D context and integrate it naturally with the rest of the scene.
The implementation is close to feature-complete, and we’d love for interested developers to:
Try it out
Kick the tires
Tell us what works well (and what doesn’t)
You can find the work-in-progress on GitHub here:
At this stage, we’re especially interested in hearing:
What would you be looking for in a panoramic imagery feature like this?
Are there workflows or use cases (e.g., inspection, visualization, storytelling, analysis) that you’d want to be well supported?
Any expectations around performance, API design, or integration with existing Cesium concepts?
The initial feature supports loading panoramas from the Google Street View APIs via a GoogleStreetviewProvider class. Are there other sources of panorama imagery we should also have “provider” classes for?
Anything missing that would make this more useful in real-world applications?
Even high-level thoughts or “it would be great if…” ideas are very welcome — this is a good moment to help shape how the feature lands.
Thanks in advance for taking a look, and we really appreciate any feedback you’re willing to share!
Very interested in this initiative! We’re planning to integrate ground-level panoramic imagery into our CesiumJS-based digital twin platform.
Our context: We provide territorial development tools for 50+ municipalities across Switzerland. Our platform delivers aerial photogrammetry (photomeshes, true orthos, LiDAR processing) and we’re looking to complement this with street-level imagery.
Key use case: Metric measurements from imagery
What would be particularly valuable: the ability to take measurements directly from 360° panoramas by projecting onto underlying 3D metric data (photomesh or LiDAR point clouds).
This would enable:
Measuring street furniture, facades, road widths by clicking in the panorama
Getting true metric values by raycasting against the 3D geometry
Cross-referencing ground perspective with aerial/ground survey data
Providing survey-grade measurements from simple photographic captures
Creating annotations combining visual context and precise dimensions
Technical approach: Instead of trying to derive metric information solely from the 360° image, use it as a visual interface while the actual measurements come from projecting rays onto existing 3D assets (photomesh tiles, LiDAR, etc.). This gives metric value to the imagery without requiring stereo rigs or complex photogrammetric processing of the panoramas themselves.
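As a sketch of the ray-construction half of this approach: converting an equirectangular panorama pixel into a unit direction is just spherical trigonometry, and that direction, rotated into the panorama’s world orientation, is what would be intersected against the photomesh or point cloud. The axis convention below (y up, z forward) is an assumption; the eventual API would define its own frame.

```javascript
// Convert an equirectangular panorama pixel into a unit direction vector
// in the panorama's local frame (assumed here: y up, z forward, x right).
// The resulting ray, transformed to world space, is what you would
// intersect against photomesh tiles or a lidar point cloud.
function pixelToDirection(px, py, width, height) {
  const lon = (px / width) * 2.0 * Math.PI - Math.PI;  // -π (left edge) .. π (right edge)
  const lat = Math.PI / 2.0 - (py / height) * Math.PI; // π/2 (top) .. -π/2 (bottom)
  return {
    x: Math.cos(lat) * Math.sin(lon),
    y: Math.sin(lat),
    z: Math.cos(lat) * Math.cos(lon),
  };
}
```

The intersection step itself could presumably reuse CesiumJS’s existing picking against 3D Tiles rather than custom ray casting.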
Requirements:
Ray projection from panorama pixels to 3D tileset geometry
Coordinate precision
Annotation tools with measurement display
Seamless switching between aerial view and street-level perspective
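For the measurement step itself, once two panorama clicks have been projected onto the 3D geometry (in CesiumJS today, scene.pickPosition returns such a point for a screen click on a tileset), the metric value is a plain Euclidean distance between the two picked points. A minimal sketch:

```javascript
// Straight-line distance in meters between two points picked from the
// 3D geometry (e.g. ECEF Cartesian coordinates). CesiumJS's
// Cesium.Cartesian3.distance computes the same thing.
function measureDistance(a, b) {
  const dx = b.x - a.x;
  const dy = b.y - a.y;
  const dz = b.z - a.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}
```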
Are you considering this type of hybrid approach where panoramas serve as a visual interface for measurements on 3D metric data? This would be a game-changer for municipal workflows. Happy to discuss implementation details or beta test.
Excited for this feature to land in CesiumJS! I saw a few presentations at the developer conference last year using panoramas, and this will make it easier for others to integrate that data.
Cesium ion has a use case for displaying normal photographs (not panoramas) in a 3D context. We know the camera position and orientation (and probably some other information about the image geometry). I’m wondering if we could use this feature to move the view to the camera position and display the photo floating in space so it appears coincident with the 3D graphics.
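One way this might be sized (a hypothetical helper, not an existing API): place the image on a plane a chosen distance in front of the recovered camera position, oriented with the camera, and scale it from the camera’s field of view so the photo exactly fills the original frustum at that distance:

```javascript
// Compute the physical width/height (meters) for a photo plane placed
// d meters in front of the source camera, so the photo subtends exactly
// the camera's horizontal field of view (hfovRad, in radians) while
// preserving the image aspect ratio (width / height).
// Hypothetical helper for illustration only.
function photoPlaneSize(hfovRad, aspect, d) {
  const width = 2.0 * d * Math.tan(hfovRad / 2.0);
  return { width: width, height: width / aspect };
}
```

In CesiumJS the plane itself could perhaps be an entity with PlaneGraphics and an image material, positioned and oriented from the known camera pose.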
Thanks @Theo_Benazzi for your interest and sharing your use case!
Our initial approach is to render the panoramic imagery behind other elements in the scene. Your suggestion to project the image onto the 3D assets in the scene is interesting and something we’d like to experiment with. We think it’s currently possible to project the panorama (the same way we project textures onto terrain from above, for example), but the process isn’t straightforward to get working, so the first step is to prove it out (if anybody wants to take a crack at it, by all means).
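For anyone who wants to experiment with that projection idea, the core per-fragment math is small: take the direction from the panorama’s capture point to the surface point being shaded and map it to equirectangular texture coordinates. A sketch in JavaScript (in practice this would live in a shader, and the axis convention of y up, z forward is an assumption):

```javascript
// Map a world-space direction (from the panorama center toward a surface
// point) to equirectangular texture coordinates in [0,1]^2.
// Assumed frame: y up, z forward; a real implementation would rotate the
// direction into whatever frame the panorama is oriented in first.
function directionToEquirectUV(d) {
  const len = Math.hypot(d.x, d.y, d.z);
  const x = d.x / len;
  const y = d.y / len;
  const z = d.z / len;
  const lon = Math.atan2(x, z); // -π .. π
  const lat = Math.asin(y);     // -π/2 .. π/2
  return { u: lon / (2.0 * Math.PI) + 0.5, v: 0.5 - lat / Math.PI };
}
```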
For measurements, as long as the scene has 3D data, such as a lidar point cloud gathered alongside the image, we anticipate measurements will work the same. We’re eager to hear feedback on this point, though, since it sounds like measurements and (location-based) annotations are a top use case that having panoramas in scenes will enable.
Please let me know if there are any technical details I missed, or if this raises further questions or feedback!