Oculus Stereo implementation

I’ve got a couple of stereo devices that use an iPad, iPhone, Android tablet, and/or Android phone. I suspect that these devices will not return the correct information from navigator.getVRDevices; for one thing, they are not VR devices, just VR capable. I really don’t think the browser API is designed with this kind of VR in mind. This is not a Cesium issue at all, since the API is part of navigator. I guess my point is that VR/stereo is more of an application-developer concern than a browser concern (although one could argue that the browser would need to implement different user event listeners depending on the hardware). Anyway, I was very happy to see the Oculus support, because it gives me a head start on supporting our VR devices.

Question: if I wanted to create a group with two camera children separated by the IPD, and have each camera render to half of the canvas, would that be feasible? Or should I create two canvases that split the container, with each camera rendering to its own canvas? Or are the scene and the canvas somehow joined at the hip? I was a little surprised to see how the cameras were being rendered, with the left eye being copied. Regardless, it seems like the solution should be fairly straightforward given this example. I’ve done most of the VWF-Cesium integration, so I’m pretty familiar with Cesium. I was just hoping for some pointers on how to get this working.

Scott

Scott,

The ultimate goal will be to use one canvas (and one Cesium scene) and render two different cameras/projections into two different viewports in the canvas. This will allow, for example, sharing WebGL resources between the eyes, and driving culling and LOD with one large frustum instead of one per eye, which currently leads to inconsistencies while loading. However, this is going to require some internal Cesium changes that I don’t think anyone is working on right now. So the best short-term solution is to use two separate canvases, each with its own Cesium scene.
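A minimal sketch of that two-canvas approach, assuming a recent Cesium release with camera.setView and the scene.preRender event; the container ids and the IPD value are placeholders, not anything Cesium provides:

```js
// Two viewers, one per eye; the right eye's camera is slaved to the left
// eye's every frame, offset along the camera's right vector by the IPD.
var IPD = 0.064; // typical interpupillary distance in meters (assumed value)

var left = new Cesium.Viewer('leftEyeContainer');   // hypothetical container ids
var right = new Cesium.Viewer('rightEyeContainer');

left.scene.preRender.addEventListener(function () {
    var cam = left.camera;
    // Offset the right eye's position along the left camera's right vector.
    var offset = Cesium.Cartesian3.multiplyByScalar(
        cam.right, IPD, new Cesium.Cartesian3());
    right.camera.setView({
        destination: Cesium.Cartesian3.add(
            cam.position, offset, new Cesium.Cartesian3()),
        orientation: {
            heading: cam.heading,
            pitch: cam.pitch,
            roll: cam.roll
        }
    });
});
```

With only a ~6 cm offset, the heading/pitch/roll of the left camera carry over to the right camera essentially unchanged, so copying them directly is fine for a sketch like this.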

By the way, is the Cesium demo still up on the VWF page? I couldn’t find it here: https://virtual.wf/demos.html

Patrick

Gotcha, thanks for the reply.

The demo is still on the server, but there’s no link on the demos page. There is a timeout issue when loading over HTTPS that needs to be addressed; I was able to load the demo over plain HTTP, however.

http://demo.virtual.wf/agi/cesium/

I’ve been trying to keep up to date with AGI’s releases. I’ve kind of had to spearhead this effort on my own due to budget constraints. We’re trying to release version 1.0 of VWF, and there are several outstanding issues. :frowning:

Here’s the latest Cesium update branch that I’ve submitted. I’m not sure whether it will fix the HTTPS issue, but I did recently update our Node.js server to address a timeout issue while loading. I would have expected that fix to be on the main site by now, but I’m not sure if it lives in the Ruby or the Node.js server. I’m going to catch up with the team this afternoon and check on all of the above.

Scott

Thanks for the update. This demo is working for me.

Patrick

Perhaps that ultimate goal could be extended to surround view as well. Can one web-browser window be stretched across multiple displays? While stereoscopic displays would use one canvas and scene to show two parallel viewports, perhaps a surround cube could use one 4x-wide canvas spanning four screens, each screen with its own viewport, with viewing directions separated by 90 degrees.
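As a rough sketch of the idea, assuming one Cesium instance per screen (rather than a single 4x-wide canvas) and made-up container ids: if each screen covers a 90-degree field of view, the four views tile exactly when each camera is just the shared camera rotated by a multiple of 90 degrees:

```js
// Four viewers, one per face of the surround cube, each rotated 90 degrees
// from its neighbor and given a 90-degree FOV so the slices tile seamlessly.
var screens = ['front', 'right', 'back', 'left'].map(function (id, i) {
    var viewer = new Cesium.Viewer(id); // hypothetical container ids
    viewer.camera.frustum.fov = Cesium.Math.PI_OVER_TWO;
    return { viewer: viewer, headingOffset: i * Cesium.Math.PI_OVER_TWO };
});

// master is a plain {position, heading, pitch, roll} object, shared however
// you like (e.g. broadcast from one controlling page).
function syncScreens(master) {
    screens.forEach(function (s) {
        s.viewer.camera.setView({
            destination: master.position,
            orientation: {
                heading: master.heading + s.headingOffset,
                pitch: master.pitch,
                roll: master.roll
            }
        });
    });
}
```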

Here’s a VisionaiR360 presentation showing surround view

And a silent video

(this uses projectors, but the same thing could be done with large monitors enclosing a much smaller room)

Similarly for half-surround displays:

180deg: 5 vertical displays creating a partial octagon

180deg: 7 vertical displays creating a partial dodecagon

https://www.youtube.com/watch?v=WAvfPkWtPV0 Liquid Galaxy, Google Earth, Leap Motion

https://www.youtube.com/watch?v=5EHtqZxc2Uc Liquid Galaxy, Google Earth

https://www.youtube.com/watch?v=eSRCS8rV3pI Liquid Galaxy, Google Earth, Space Navigator

Queensland University of Technology have been using Cesium on their Cube wall for G20. I believe they were using code derived from the frustum offsetting in the postprocess hook branch (but I could be wrong).

http://www.thecube.qut.edu.au/
http://geospatialworld.net/Interview/ViewInterview.aspx?id=31221
https://www.youtube.com/watch?v=SwoFJS_oz3E

Chris

Thanks for sharing, Chris. Do you have contacts there? I’d love to showcase this on the Cesium website.

Patrick

While that interview you linked to does mention Cesium, I can’t find any other link connecting G20 with Cesium. G20 seems to be a dataset; you can download a KML to view it in Google Earth (the standalone version): https://www.dnrm.qld.gov.au/mapping-data/queensland-globe/globe-for-g20/install-globe
It seems to be a network link. Perhaps Cesium could also access the G20 dataset, but I’m not sure.

Hi All,

Yes, last year our visualisation team (ViseR) here at QUT developed the Cube Globe in collaboration with the QLD government, using Cesium to display the QLD government’s open data, which is referred to as Queensland Globe (https://www.business.qld.gov.au/business/support-tools-grants/services/mapping-data-imagery/queensland-globe). The Cube Globe was designed for display in the QUT Cube environment (http://www.thecube.qut.edu.au).

For the large "wedge" projection space we did use the offset frustum branch to implement the continuous view across the wedge (110deg). The two halves of the wedge run on separate PCs with camera updates synced over a WebSocket.
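In rough terms the sync looks something like this; the ws:// URL and message shape here are simplified placeholders rather than our production code, and viewer is each half’s own Cesium.Viewer:

```js
var socket = new WebSocket('ws://sync-host:8080'); // hypothetical sync server

// Master half: broadcast the camera state each frame.
viewer.scene.preRender.addEventListener(function () {
    if (socket.readyState === WebSocket.OPEN) {
        var cam = viewer.camera;
        socket.send(JSON.stringify({
            position: cam.position, // Cartesian3 serializes as {x, y, z}
            heading: cam.heading,
            pitch: cam.pitch,
            roll: cam.roll
        }));
    }
});

// Slave half: apply whatever the master sends.
socket.onmessage = function (event) {
    var s = JSON.parse(event.data);
    viewer.camera.setView({
        destination: new Cesium.Cartesian3(
            s.position.x, s.position.y, s.position.z),
        orientation: { heading: s.heading, pitch: s.pitch, roll: s.roll }
    });
};
```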

Queensland Globe is essentially a collection of open datasets currently delivered as KML for use within Google Earth. For the Cube Globe, however, we accessed the underlying datasets (shp/csv, etc.), which we then served from GeoServer for Cesium to consume.
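On the Cesium side, a GeoServer-published layer can be consumed with the stock WebMapServiceImageryProvider; a minimal sketch (the endpoint URL and layer name below are placeholders) looks like:

```js
// Add a WMS layer served by GeoServer to an existing Cesium.Viewer.
viewer.imageryLayers.addImageryProvider(new Cesium.WebMapServiceImageryProvider({
    url: 'https://geoserver.example.com/geoserver/wms', // placeholder endpoint
    layers: 'qld:some-open-data-layer',                 // placeholder layer name
    parameters: {
        transparent: true,
        format: 'image/png'
    }
}));
```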

Patrick, we would be honored to have Cube Globe showcased on the Cesium site. Video here: https://vimeo.com/112011647

cheers,

Allan

Wow, very cool. Will contact you for the showcase.

Patrick