So I’m having a strange issue with tilesets (they load just fine in web Cesium). In Unreal, parts of the tilesets are missing seemingly at random (though more so on transparent components). Come to think of it, tileset transparency doesn’t seem to work in the Unreal plugin at all.
Maximum screen-space error, culling, and the other settings don’t seem to affect this at all.
So I guess my question is: are there any known glTF features that work on the web but are unsupported in Unreal? The GitHub repo has tickets for BLEND support, but those were apparently implemented long ago.
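For context, blending in glTF is opted into per material. A hypothetical material along the lines of what my tilesets contain (the name and values here are made up, but `alphaMode` and `baseColorFactor` are standard glTF 2.0):

```json
{
  "materials": [
    {
      "name": "glass_facade",
      "alphaMode": "BLEND",
      "pbrMetallicRoughness": {
        "baseColorFactor": [0.1, 0.3, 0.8, 0.4]
      }
    }
  ]
}
```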
Oh yes, I do get a bunch of “LogPhysics: Warning: Failed to cook TriMesh: CesiumGltfComponent.” warnings, if that’s somehow related to the graphics.
So my question really is whether there’s an overview of which features are not yet in the UE plugin, so I could potentially adjust my tilesets accordingly. Also, which features are in development right now? Maybe I could help speed them along.
I’d also be grateful if you could share any workarounds for the missing features.
Yeah, alpha-blended models won’t work out of the box. The issue you mentioned was marked as a duplicate of another one, and that other one is still open.
Unreal Engine requires completely different materials for translucent versus opaque rendering. We didn’t want to create an explosion of materials, so we only supply opaque (by far the most common case for 3D Tiles). If you need translucency, you can create a custom material instance:
1. Right-click in the Content Browser and choose Materials and Textures → Material Instance. Give it whatever name you want.
2. Double-click to open the Material Instance.
3. On the Details tab, change the Parent property to M_Cesium_BaseMaterial.
4. Also on the Details tab, open the “Material Property Overrides” section.
5. Check the box next to “Blend Mode” and choose “Translucent” in the dropdown.
6. Save and close the material editor.
7. Select your Cesium3DTileset and, on the Details tab, find Cesium → Rendering → Material. Set it to the new material instance you just created.
I haven’t tested this extensively, but I believe it should work. Let me know how you go!
Also, do you have any hunches as to what could be causing this in a tileset? It’s the exact same tileset, yet polygons are seemingly missing. I played with the material settings, and it doesn’t look like the issue is normals, handedness, opacity, or anything else. The polygons are just not there.
I have googled around, and it seems Unreal doesn’t really have any support for true blending? There are per-material sort priorities, but we want translucency inside a tileset, so I don’t imagine that’s going to help. Do you know whether it’s possible to disable deferred rendering for a material? I wouldn’t even mind an incorrect blend order if I could get z-culling. Or perhaps you have some other approach in mind. Any tips would be welcome.
I’m not quite following what you’re doing here. What do you mean by “enabled alpha on baseColorFactor”? If you follow my steps above to create a translucent material, it should not immediately break depth testing (I just tried it). I don’t know if Unreal is doing the necessary sorting for correct blending of the translucent parts of your model, but I also don’t understand what’s going on in your screenshot, where the z-ordering of opaque elements looks broken. Can you elaborate on exactly what you’ve tried?
The issue with z-ordering occurs because, as I understand it, we’re applying the translucent material to the entire tileset. If a tileset was batched in such a way that certain buildings are batched together, Unreal can’t correctly sort them geometrically. In that case, since Unreal uses deferred rendering, and since we’ve told it that the tileset’s material must be blended, it draws opaque polygons without respecting depth. And if two components aren’t sortable geometrically because their bounding boxes intersect, you get the artefacts from my screenshot.
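To illustrate what I mean (a toy sketch, not Unreal code, and the component names are made up): if the renderer can only sort whole components by distance to the camera, two components with interpenetrating bounds have no single correct order, so the sort flips with the viewpoint and neither order is right for every pixel in the overlap:

```python
# Toy illustration: back-to-front sorting of whole translucent components
# by bounding-box centroid distance, as a renderer must do when it cannot
# split batched geometry. Components A and B interpenetrate along x.

def centroid(aabb):
    (minx, miny, minz), (maxx, maxy, maxz) = aabb
    return ((minx + maxx) / 2, (miny + maxy) / 2, (minz + maxz) / 2)

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Two hypothetical batched building groups with intersecting bounds.
comps = [
    ("A", ((0, 0, 0), (10, 10, 10))),
    ("B", ((5, 0, 0), (15, 10, 10))),
]

def back_to_front(camera):
    # Farthest centroid first, which is all a per-component sort can do.
    ordered = sorted(comps, key=lambda c: -dist2(centroid(c[1]), camera))
    return [name for name, _ in ordered]

order_1 = back_to_front((-20, 5, 5))  # camera on the -x side
order_2 = back_to_front((35, 5, 5))   # camera on the +x side
# The order flips with the camera, but in the overlap region (x in [5, 10])
# neither order is correct for every pixel -- hence the artefacts.
```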
AFAIK, difficulties rendering translucent elements are a general problem, not unique to deferred shading. But I still don’t understand why opaque elements would not be depth-tested (even with a translucent material), because in my experiments it seems that they are.
In any case, it sounds like you may know more about this than I do, so I’m probably not going to be much help.
@Kevin_Ring How would it know which elements are opaque? Does the material system trace the alpha value? While it makes sense to look at our baseColorFactor and enable depth testing per component, I wouldn’t expect that from the material system. I tried setting an opacity mask too, but it doesn’t seem to impact depth testing.
I was testing with the Melbourne tileset in the examples project (which is entirely opaque) and giving it a translucent material. I didn’t see any obvious artifacts, which I took to mean it was doing correct depth testing. But when I looked closer just now, I did see some. I think they weren’t obvious at first because Melbourne has very small tiles.
So, digging into this a bit more (but only a little), I think the proper solution is to separate translucent primitives from opaque ones, and use the appropriate type of material for each. That’s not too hard, but will contribute to the shader explosion I mentioned we’re trying to avoid. Perhaps there’s a more clever solution I’m not aware of.
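For what it’s worth, the partitioning step itself is straightforward. A rough sketch of the idea (this operates on a glTF parsed into a plain dict; the function name is mine, not plugin code), keyed off each material’s `alphaMode` from the glTF 2.0 spec:

```python
# Sketch: split a glTF's mesh primitives into opaque and blended groups,
# so each group could be rendered with the appropriate material type.

def partition_primitives(gltf):
    opaque, blended = [], []
    materials = gltf.get("materials", [])
    for mesh_i, mesh in enumerate(gltf.get("meshes", [])):
        for prim_i, prim in enumerate(mesh.get("primitives", [])):
            mat_i = prim.get("material")
            # Per the glTF spec, alphaMode defaults to OPAQUE, and a
            # primitive with no material uses the default (opaque) material.
            mode = "OPAQUE" if mat_i is None else \
                materials[mat_i].get("alphaMode", "OPAQUE")
            (blended if mode == "BLEND" else opaque).append((mesh_i, prim_i))
    return opaque, blended

# Minimal example asset: one opaque, one blended, one material-less primitive.
gltf = {
    "materials": [
        {"pbrMetallicRoughness": {"baseColorFactor": [1, 1, 1, 1]}},
        {"alphaMode": "BLEND",
         "pbrMetallicRoughness": {"baseColorFactor": [0, 0.5, 1, 0.4]}},
    ],
    "meshes": [{"primitives": [{"material": 0}, {"material": 1}, {}]}],
}

opaque, blended = partition_primitives(gltf)
```

The hard part, of course, is on the Unreal side: each group then needs its own component and material, which is exactly the material multiplication we were trying to avoid.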
In any case, I’ve updated the issue to mention this problem and link to this thread:
I don’t think we’re going to be able to fix this right away, but would welcome a pull request if you have an idea how to implement it!
This adds a new entry for a material on the tileset’s Rendering tab. But then again, anything that does blending has to render front to back and then back to front anyway; there’s no way around that.
I left the default masked material as the default for transparency; if people want true transparency, they can clone it and set the proper blend mode themselves. There’s no sense in multiplying bundled assets needlessly.