I’m trying to get a handle on using multiple simultaneous views with Cesium, and the culling/LOD seems to be working against me here. We have a setup with Cesium and nDisplay and multiple users viewing a common scene. However, the view-based culling of tiles appears to be based on the first camera that is initialized. If we take over with a debug camera, the culling at least follows the movement of that camera, but it still has the FOV of the first camera (which is much narrower). As a result, you get some very strange culling because the cameras aren’t quite matched up, but it makes sense once one lines up with the orientation and FOV of that first camera. Likewise, if I edit that “first” camera while running in the editor to line it up with, say, the main player camera, sure enough that fixes everything for the main player camera: no tiles are needlessly culled and the LOD adjustments look just fine.
The problem with this is that the reason for all these cameras is that our project, for one, targets HMDs, so we already have left-eye and right-eye views to synthesize. But it also needs to support multiple independent users, who may each be looking in completely different directions. So supporting all of them simultaneously means either running multiple instances, changing cameras for every render pass, or setting up a single encapsulating camera that captures all the view frusta.
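To sketch what I mean by an encapsulating camera: given each view’s forward direction and its own half-FOV, the enclosing camera needs a half-FOV at least as large as the biggest angle between its forward axis and any view direction, plus that view’s half-FOV. A toy standalone version (plain C++, no engine types; averaging the forward vectors is just a simple heuristic, not anything Unreal or Cesium actually does):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

static Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

static double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

struct View {
    Vec3 forward;          // unit view direction
    double halfFovRadians; // half of this view's field of view
};

// Forward axis of the enclosing camera: the normalized average of all view
// directions. A simple heuristic; it is not optimal for extreme spreads.
Vec3 enclosingForward(const std::vector<View>& views) {
    Vec3 sum{0.0, 0.0, 0.0};
    for (const View& v : views) {
        sum.x += v.forward.x; sum.y += v.forward.y; sum.z += v.forward.z;
    }
    return normalize(sum);
}

// Half-FOV the enclosing camera needs so every view's cone fits inside it.
double enclosingHalfFov(const std::vector<View>& views, const Vec3& forward) {
    double maxAngle = 0.0;
    for (const View& v : views) {
        double c = std::max(-1.0, std::min(1.0, dot(forward, normalize(v.forward))));
        maxAngle = std::max(maxAngle, std::acos(c) + v.halfFovRadians);
    }
    return maxAngle;
}
```

So two views yawed 30° apart from each other’s centerline, each with a 45° half-FOV, need a 75° half-FOV (150° total) on the combined camera, which is exactly why the encapsulating-camera route gets ugly fast for divergent viewers.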
For the record, I have tried this with Unreal 5.3.0, 5.3.1, and 5.3.2. With 5.3.2, everything looks fine in the editor, but I get a straight black screen in the main play window. This is further complicated by the fact that if I click on the camera used to render the main play window, the preview of that camera is actually correct and the scene is visible, just not in the main window. At least with 5.3.0 and 5.3.1, the preview and the main window match.
Possibly related to this (or not?) is that the Cesium SunSky objects are never visible no matter how they are set up. In fact, the Unreal built-in BP_Skysphere is visible in the editor, but is never visible in play mode for me as long as there is a Cesium terrain in the scene. The Cesium SunSky, though, isn’t visible in the editor or at runtime. I wonder if anyone else has run into this same issue.
This is a tricky case. As far as I know, Unreal doesn’t have great tools to control which objects are shown in which view. So selecting a completely independent set of tiles per viewer isn’t really viable. Or at least, it didn’t seem to be last time I looked into this.
Selecting tiles using the union of all the views should mostly work, though. Cesium for Unreal, by default, uses all of the cameras associated with all of the player controllers. It should also account for stereoscopic views. The code is here, and if it’s not producing the right result in your case, it’s worth stepping through there to see what is going wrong.
There’s currently no way to disable this automatic camera selection logic and only use custom, programmatic cameras. That would be a nice addition, and we’d welcome a pull request that adds it.
I’m not really sure what’s going on with the SunSky problems you mentioned. Unreal’s SunSky has several limitations that make it not quite perfect for use on a globe. For one, it’s always a perfect sphere, when the Earth most definitely is not. The CesiumSunSky has some logic that adjusts the atmosphere parameters based on the viewer position in order to hide this and make the atmosphere look decent from all views. But there’s no way to make it look good in every view if you have multiple views simultaneously. It’s also currently quite limited in that it always uses the position of Player 0’s Pawn to set the parameters.
I see. Looking through it, it looks like it would be fine if we could encapsulate each potential camera view in a Player actor. I’m actually a little uncertain about the stereo, because it looks like it’s using the built-in Unreal stereo mode with CalculateStereoViewOffset and so on. We actually set up independent cameras because the HMD in question is designed to support dual IPD and even amblyopic users and so on, so the two cameras are not guaranteed to align to a perfect ideal symmetry. That is also why a lot of this is set up through custom Blueprint classes that use stored calibration data to reconstruct the stereo cameras and position them relative to each other according to various other configuration data. And of course, this is why we use nDisplay with a node for each possible viewer.
In our use cases, there is common shared motion between all the viewers, which is done by having the actors as children of the same ULocalPlayer instance. While each viewer is wrapped in an Actor underneath a Player Pawn, there isn’t really a PlayerController associated with each of those Actors. We might have to consider spawning mock PlayerControllers in the Blueprint where we initialize the viewers’ cameras, which sounds like an ugly hack to me. That’s aside from the fact that I am, tbh, a bit of a luddite when it comes to Blueprint…
Out of curiosity, I wonder if it would make sense to detect the presence of nDisplay Config assets in the UWorld? Yes, we’d have to link to the nDisplay SDK (with #ifdef guards for whether nDisplay is enabled in the project, of course), but if you get that one nDisplay Config and iterate through the Nodes in it, that should cover all the camera views set up in it. I’m spitballing here, since I haven’t looked into the nDisplay SDK closely enough to be sure this would work, but I believe we should be able to add a few hooks to get state from the IDisplayCluster and thereby get the individual Camera references.
Regarding the SunSky thing, I’m at a loss for that one, too. The Unreal BP_SkySphere is not my preference anyway, but the fact that even that doesn’t show up in play mode, even though it shows in the editor, is perplexing. Why the Cesium SunSky would not show up at all is a bit strange to me. I will say that things like the atmospheric effects on the terrain are present: the terrain shows atmospheric fog, a sunlight, and so on. Likewise, if I move or scale the object, it has an actual effect on the view. In spite of that, an actual skybox/dome is not visible at all.
Addendum – I tried to see if there was something wrong with how I set it up, but I got the same results even when I add a CesiumSunSky to an empty scene or follow the tutorials to the letter. Again, the lighting and atmospheric effect on geometry is fine, but no sky shows in play mode or in the editor.
We actually set up independent cameras because the HMD in question is designed to support dual IPD and even amblyopic users and so on, so the two cameras are not guaranteed to align to a perfect ideal symmetry.
I have to be honest: I don’t really know what you just said. You’re clearly at a level of camera sophistication here that we, unfortunately, are not.
But I think “an option to disable automatic use of player controller cameras for tile selection” would be useful to you, because then you could take control of the whole thing yourself. I don’t think I can justify making that a high priority for my team in the short term, but I think it’s probably not too difficult if you want to take it on. We’re happy to help if you run into trouble.
There are probably some bigger improvements to be made here as well. You mentioned integration with nDisplay, for example. I’m a little wary of that due to the dependency. I don’t think a simple #ifdef will work with Unreal’s compilation model, unfortunately. But perhaps a separate plugin or blueprint library could form the bridge between nDisplay and Cesium custom cameras?
Well, to simplify: the dual IPD thing just accounts for the fact that most human faces (and by extension, eye positions) are not perfectly symmetric, but most commercial HMDs adjust lens positions symmetrically, and the built-in stereo eye cameras in most game engines assume you will apply equal and opposite offsets to the left and right eyes. Dual IPD just means we don’t make that assumption and measure the left and right eye offsets independently, which means they effectively act like two independent cameras. In any case, it’s not super important to modify Cesium or Unreal specifically for this unless that class of HMD optics becomes mainstream (which I don’t think it will any time soon). nDisplay already supports it, and lets us do this for multiple users at the same time. Perhaps a safer idea would be to supply camera positions through, I suppose, a custom PlayerController class that gets camera positions from the nDisplay Config Node for a given user/eye. Does that sound reasonable?
Based on how things are looking right now, I do think that the custom Blueprint/plugin classes may well be the least disruptive way to go about it. We already have custom plugins as it is, and the code in the project already links to nDisplay and uses it as-is. Moreover, I think we’ll have to do some manipulation through our plugins/BP classes anyway, e.g., taking settings from config files to override the origin point in the GeoReference.
Also a little update on my completely non-visible SunSky issue – something that came to mind, and that I actually verified today, is that there is no such problem on Windows. The SunSky shows up just fine there. Granted, I can’t really try it at runtime with our full project on Windows, because our target platform is Linux and a number of our plugins’ dependencies will only build on Linux. But on Linux, I couldn’t even get it to show in the editor. That’s where I’d been working almost the entire time up until now, so it never crossed my mind to check other platforms since we aren’t able to run on them anyway. Not sure if that is a Cesium problem, an Unreal on Linux problem, an nDisplay problem, an Alma Linux problem, or some unholy permutation of these variables.
Perhaps a safer idea would be to supply camera positions through, I suppose, a custom PlayerController class that gets camera positions from the nDisplay Config Node for a given user/eye. Does that sound reasonable?
Yeah, if that’s workable in your project, it has the advantage that it doesn’t require any changes to the Cesium for Unreal plugin.
Not sure if that is a Cesium problem, an Unreal on Linux problem, an nDisplay problem, an Alma Linux problem, or some unholy permutation of these variables.
I’m not sure either. You’d probably need to do some iteration with the built-in SunSky in a simple project on different platforms to try to narrow down where it occurs so that it can be reported to the right place.
So, regarding the SunSky thing, I’ve been running a series of experiments and managed to narrow it down to the presence of our own module (the one that interfaces with nDisplay and several other devices). The reason it was still working on Windows was simply that several components failed to initialize there (which is sort of expected on Windows anyway), so it fell back to the default built-in game mode. For the time being, I’ve been trying to methodically single out individual causes one by one: reconstructing the game module bit by bit in an empty project (with Cesium), and building a custom version of the Cesium plugin with a boatload of extra logging thrown in, to look for discrepancies between a working and a non-working project. I don’t believe it is affected by things like the configs that set our camera FOVs, or the Blueprints that affect view position/orientation, because those should only matter in play mode and not in the editor view.
Just curious whether you have any clues as to what I should look for that could completely prevent the SunSky from being visible. Even when I create an empty level within the project, I can see the widget and the bounding box, but the sky itself is completely black. And it definitely functions in the sense that it lights objects in the scene.