I’ve been slowly working away over the last few months on porting Cesium to iOS. Things have been a bit slow as I’ve been getting to grips with Swift and working on a few other projects.
I’m pleased to report that large parts of Core and Renderer are now complete, and the engine is now able to render an untextured globe with Ellipsoid geometry at 60fps.
I’m planning on implementing textures and the Bing maps provider next, then expanding the camera controls with a view to adding gesture support.
Thanks so much for the update. Excellent progress. Is this using the ellipsoid geometry in Cesium? Or is this the ellipsoid terrain provider that is tessellated based on the view?
Once we have an apples-to-apples base globe, I am very curious to see performance/memory/power-usage between Cesium/WebGL and Cesium/iOS.
I’ve implemented the Ellipsoid terrain provider (and the underlying HeightmapTerrainData), so it should be reasonably similar to the default Cesium view without terrain. I just need to finish off the LOD/upsampling algorithms, as it’s currently only rendering tile geometry at level zero.
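For anyone curious, the upsampling step is essentially a bilinear resample of the parent tile’s heights into one child quadrant. A minimal sketch in JavaScript (illustrative names and buffer layout; not Cesium’s actual HeightmapTerrainData code):

```javascript
// Upsample a parent heightmap tile into one child quadrant by bilinear
// interpolation. `childX`/`childY` (each 0 or 1) select the quadrant.
// Hypothetical helper, not Cesium's real implementation.
function upsampleHeightmap(parent, width, height, childX, childY) {
  const child = new Float32Array(width * height);
  for (let j = 0; j < height; j++) {
    for (let i = 0; i < width; i++) {
      // Map the child pixel back into parent coordinates (half the extent).
      const u = (childX + i / (width - 1)) * 0.5 * (width - 1);
      const v = (childY + j / (height - 1)) * 0.5 * (height - 1);
      const x0 = Math.floor(u), y0 = Math.floor(v);
      const x1 = Math.min(x0 + 1, width - 1);
      const y1 = Math.min(y0 + 1, height - 1);
      const fx = u - x0, fy = v - y0;
      const top = parent[y0 * width + x0] * (1 - fx) + parent[y0 * width + x1] * fx;
      const bot = parent[y1 * width + x0] * (1 - fx) + parent[y1 * width + x1] * fx;
      child[j * width + i] = top * (1 - fy) + bot * fy;
    }
  }
  return child;
}
```

Cesium’s real implementation also has to deal with encoded heights, skirts and the tiling scheme’s projection, so this only shows the core interpolation.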
Performance at this stage is certainly good: 60fps with approximately 25-30% CPU use on an optimised build.
Just to give an update on the iOS port I’ve been working on. I’ve just finished an initial implementation of the Bing Maps ImageryProvider and the necessary Renderer and Scene to enable rendering to the globe. Looking pretty good I think… Performance isn’t great in the simulator as it uses a software renderer, so run on device for best results (although you’ll need an iOS developer account to run on device).
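As background, the Bing Maps tile scheme addresses tiles by “quadkey” rather than x/y/level, so an ImageryProvider has to convert between the two. This is the standard published conversion (a sketch, not necessarily the port’s exact code):

```javascript
// Convert tile x/y coordinates at a given level to a Bing Maps quadkey:
// one base-4 digit per level, interleaving the x and y bits.
function tileToQuadKey(x, y, level) {
  let quadKey = '';
  for (let i = level; i > 0; i--) {
    let digit = 0;
    const mask = 1 << (i - 1);
    if ((x & mask) !== 0) digit += 1;
    if ((y & mask) !== 0) digit += 2;
    quadKey += digit;
  }
  return quadKey;
}
```

For example, `tileToQuadKey(3, 5, 3)` gives `'213'`, which is the worked example in Bing’s tile-system documentation.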
Do you know what the FPS is for different views (zoomed out global view, zoomed in looking straight down, and zoomed in looking towards the horizon) on actual hardware?
Are you interested in working with a student on this over the summer? If you are able to mentor and provide a project description, we could include it in our Ideas List for Google Summer of Code.
No worries. Performance with a release build on my iPad Air 2 is 60fps at 2048x1536 with about 70% CPU utilisation (~11.4ms CPU time per frame). I feel like this could be better as a lot of time is spent setting and calculating uniforms, particularly when these aren’t changing between frames.
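One common fix for redundant uniform traffic is to cache the last value written to each location and skip the GL call when it hasn’t changed. A hedged sketch (hypothetical wrapper, not Cesium’s or the port’s actual renderer code):

```javascript
// Skip redundant glUniform* calls by remembering the last value set for
// each uniform location. Illustrative wrapper only.
class UniformCache {
  constructor(setter) {
    this.setter = setter;   // e.g. (location, value) => gl.uniform1f(location, value)
    this.cache = new Map(); // location -> last value uploaded
    this.calls = 0;         // count of GL calls actually issued
  }
  set(location, value) {
    if (this.cache.get(location) === value) return; // unchanged: skip the upload
    this.cache.set(location, value);
    this.setter(location, value);
    this.calls++;
  }
}
```

This only works as-is for scalar values; vectors and matrices would need a cheap equality check or a dirty flag on the source object instead.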
I haven’t implemented Camera.lookAt yet, but hopefully I can get that done this morning and let you know. Given there’s some renderer overhead I’m hoping it will stay at 60fps though.
I would be interested in mentoring, yes. What priorities were you thinking of for a project? A lot of the work is fairly uninspiring translation from JavaScript to Swift, but there is a bit of a challenge in going from a dynamically-typed to a statically-typed language. It’s also a good opportunity to learn more about coding for iOS and for OpenGL. I know I’ve learnt a lot!
I’ve now tried looking straight down via Camera.lookAt and get 60fps at an altitude of 10,000m. Looking at the horizon with Camera.lookUp is a bit slower (~38fps), but again it’s CPU bound with 26ms CPU time and 5ms GPU time per frame. I’ve also noticed an odd rendering artefact when looking at the horizon; presumably it’s something to do with LOD.
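For reference, LOD selection in this kind of engine is usually driven by screen-space error: a tile is refined when its geometric error, projected into pixels, exceeds a threshold. A rough sketch (the formula and constants here are illustrative, not necessarily what the port uses):

```javascript
// Project a tile's geometric error (metres) into screen pixels for a
// perspective camera with vertical field of view `fovy` (radians).
function screenSpaceError(geometricError, distance, screenHeightPx, fovy) {
  return (geometricError * screenHeightPx) / (2 * distance * Math.tan(fovy / 2));
}

// Refine (split into children) when the projected error is too visible.
function shouldRefine(geometricError, distance, screenHeightPx, fovy, maxErrorPx) {
  return screenSpaceError(geometricError, distance, screenHeightPx, fovy) > maxErrorPx;
}
```

Horizon views are exactly where this gets hard: distances vary enormously across the view, so neighbouring tiles can land at very different levels, which is one plausible source of the artefact.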
I feel like this could be better as a lot of time is spent setting and calculating uniforms, particularly when these aren’t changing between frames.
What version of Cesium are you using? We’ve added some optimizations to uniforms in the past few months.
We will also significantly improve the performance of setting uniforms this summer, by separating static/dynamic ones and by using uniform buffers when available (WebGL 2, ES 3.1).
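The static/dynamic split can be sketched roughly like this (hypothetical structure; the real renderer work will differ): static uniforms are uploaded once when the program is bound, and only the per-frame ones are recomputed on each draw.

```javascript
// Illustrative sketch of separating static and per-frame uniforms.
class ShaderUniforms {
  constructor() {
    this.staticUniforms = new Map();  // uploaded once, when the program is bound
    this.dynamicUniforms = new Map(); // recomputed and uploaded every frame
    this.glCalls = 0;                 // count of simulated uniform uploads
  }
  addStatic(name, value) { this.staticUniforms.set(name, value); }
  addDynamic(name, compute) { this.dynamicUniforms.set(name, compute); }
  bind() {
    // One-time uploads for values that never change between frames.
    for (const value of this.staticUniforms.values()) this.glCalls++;
  }
  drawFrame(frameState) {
    // Per-frame uploads only for the dynamic set.
    for (const compute of this.dynamicUniforms.values()) {
      compute(frameState);
      this.glCalls++;
    }
  }
}
```

With three static and one dynamic uniform over ten frames, this issues 13 uploads instead of the 40 a naive set-everything-per-frame loop would.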
I would be interested in mentoring yes, what priorities were you thinking for a project?
Totally up to you. Take a look at the projects already on the Ideas List and send me (pjcozzi@siggraph.org) the info for your project, including a screenshot like the others, and I will post it. Student applications start in about two weeks and students are already looking at organizations so the sooner we can add your project, the better the chances of us finding an interested student.
Looking at the horizon with Camera.lookUp is a bit slower (~38fps)
This is expected since the horizon view generally loads the most tiles. In the future, we will use fog for distance culling.
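The fog idea can be sketched as an exponential falloff with distance, culling tiles once the fog factor drops below a visibility threshold (the constants and the squared-exponential form here are illustrative, not Cesium’s final implementation):

```javascript
// Fog factor in [0, 1]: 1 = fully visible, 0 = fully fogged out.
function fogFactor(distance, density) {
  const d = density * distance;
  const f = Math.exp(-(d * d));
  return Math.max(0, Math.min(1, f));
}

// Drop tiles the fog would hide anyway, so they are never loaded or drawn.
function cullByFog(tiles, density, threshold) {
  return tiles.filter(t => fogFactor(t.distance, density) > threshold);
}
```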
What version of Cesium are you using? We’ve added some optimizations to uniforms in the past few months.
Embarrassingly, a checkout of the master branch from just before 1.1 was released. I’ve been putting off upgrading my copy while I was getting the renderer and core up and running, but I’m well overdue to upgrade, so I’ll focus on upgrading to parity with 1.7 before I plunge into anything else.
We will also significantly improve the performance of setting uniforms this summer, by separating static/dynamic ones and by using uniform buffers when available (WebGL 2, ES 3.1).
Sounds good, hopefully Apple will include ES 3.1 in iOS 9. I’ve also got some fairly basic optimisations to make, such as partially applying RenderStates.
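“Partially applying” a RenderState amounts to diffing the requested state against the currently bound one and issuing GL state changes only for the fields that differ. A minimal sketch (field names are illustrative):

```javascript
// Apply only the fields of `next` that differ from the currently bound
// state, mutating `current` to track what is now bound. Returns the
// number of state changes actually issued.
function applyRenderState(current, next, applyField) {
  let changes = 0;
  for (const key of Object.keys(next)) {
    if (current[key] !== next[key]) {
      applyField(key, next[key]); // e.g. gl.enable/disable, gl.depthFunc, ...
      current[key] = next[key];
      changes++;
    }
  }
  return changes;
}
```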