Forgive me, this question is a bit long and requires a little background.
I am attempting to write a manual animation system so that I can treat animations independently of the Cesium timeline. Essentially my API will be called like AnimationPlayer.update(time), where 0 <= time <= animation_duration. To accomplish this I have written some custom glTF parsing code to get at the necessary animation data: I store each node's original translation/rotation/scale, as well as the animation keys for each animated node, and I update the nodes through entity.model.nodeTransformations. The pseudocode for AnimationPlayer.update goes like this:
original_trans = glTF.nodes[animated_node].translation;
start, end = getKeysForTime(time); // get the keyframes bracketing the current time
start_trans = start.translation - original_trans; // offsets relative to the rest pose
end_trans = end.translation - original_trans;
t = (time - start.time) / (end.time - start.time);
result = lerp(start_trans, end_trans, t);
entity.model.nodeTransformations[animated_node] = result;
The reason I have to calculate start_trans and end_trans is that Cesium treats nodeTransformations as offsets relative to the node's initial position in model space (and relative to the parent). For instance, if a node's model-space position is (25, 0, 25), setting the node transform to that same value actually places the node at (50, 0, 50).
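The update step above can be sketched as self-contained, runnable code. This is a minimal illustration only: vectors are plain arrays, the keyframe lookup is folded into the function, and the keyframe layout shown is an assumption, not Cesium's or glTF's actual data structure.

```javascript
// Subtracting the rest-pose translation from each keyframe, then
// interpolating the resulting offsets, since Cesium applies
// nodeTransformations on top of the node's original transform.
function vsub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }
function vlerp(a, b, t) {
  return [a[0] + (b[0] - a[0]) * t,
          a[1] + (b[1] - a[1]) * t,
          a[2] + (b[2] - a[2]) * t];
}

// keys: sorted array of { time, translation } (hypothetical layout).
// Returns the relative translation to assign to nodeTransformations.
function relativeTranslationAt(time, originalTrans, keys) {
  // find the pair of keyframes bracketing `time`
  let i = 0;
  while (i < keys.length - 2 && keys[i + 1].time <= time) i++;
  const start = keys[i], end = keys[i + 1];
  const startTrans = vsub(start.translation, originalTrans);
  const endTrans = vsub(end.translation, originalTrans);
  const t = (time - start.time) / (end.time - start.time);
  return vlerp(startTrans, endTrans, t);
}
```

For example, with a rest translation of [25, 0, 25] and keyframes at [25, 0, 25] (t=0) and [30, 0, 25] (t=1), `relativeTranslationAt(0.5, [25, 0, 25], keys)` yields [2.5, 0, 0] rather than an absolute position, matching the offset semantics described above.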
So far everything works, but I am running into issues with nodes that are rotated. I perform the same logic as for translation, but instead of using a difference to get a relative rotation, I compute the inverse quaternion of the original rotation and multiply the start and end rotations by it to make them relative to the original rotation. In this case the start and end values are the same for the entire duration of the animation. However, things do not animate correctly. I took screenshots of the issue: in BabylonJS the animation is correct (the planar polygon is contracted and then extended in the same plane; it is skinned to nodes that extend outwards in the x direction), but in Cesium it acts as if the x-axis points upwards, and the planar polygon deforms incorrectly. Is there anything in my assumptions about how these values should be calculated that would cause this?
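The relative-rotation step described above can be sketched with hand-rolled quaternion math so it runs without Cesium. Quaternions are plain [x, y, z, w] arrays (the component order both glTF and Cesium's Quaternion use); the function names are mine, and this assumes unit quaternions, for which the inverse is the conjugate.

```javascript
// Hamilton product of two [x, y, z, w] quaternions.
function qMul(a, b) {
  const [ax, ay, az, aw] = a, [bx, by, bz, bw] = b;
  return [
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
    aw * bw - ax * bx - ay * by - az * bz,
  ];
}

// For unit quaternions the inverse is simply the conjugate.
function qInverse(q) { return [-q[0], -q[1], -q[2], q[3]]; }

// Keyframe rotation expressed relative to the node's rest rotation:
// applying the result on top of the rest pose reproduces the keyframe.
function relativeRotation(originalRot, keyRot) {
  return qMul(qInverse(originalRot), keyRot);
}
```

A sanity check on this formulation: when the keyframe equals the rest rotation, the relative rotation comes out as the identity quaternion [0, 0, 0, 1], which is what a "no change" node transformation should be.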
A few extra notes: I know it is odd that the mesh "hooks" animate correctly (the hooks' rotations are all zeroed out). That is the only difference from the nodes that affect the planar polygons (those nodes have a rotation in model space that was not zeroed out). The model plays correctly in the Babylon sandbox, and it plays correctly when using the standard Cesium API. Again, though, the point is that I want manual control of the animations; if there is another way to accomplish this with the Cesium API, I am open to that as a solution, but it does not seem so. Thanks in advance.
One small clarification: I realize it may be confusing when I say a rotation is "zeroed out". What I mean is that the local transform for a given node is aligned to the world axes.
After performing several more tests, it definitely seems there is odd behavior in how Cesium handles nodes (entity.model.nodeTransformations). When setting a translation on a node that has any rotation keys, the node is translated along its own axes rather than the parent node's axes. Setting a rotation, however, still performs that rotation about the parent axes. If there are no rotation keys, the node behaves as one would expect (at least according to the glTF spec for animation): the node translations and rotations are applied relative to the parent node's axes / local coordinate system. My questions: why is this, and is there any way to perform the translation relative to the parent coordinate system when rotation keys are defined on the child node?
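If the translation really is being applied along the node's own rotated axes, one conceivable compensation would be to pre-rotate the desired parent-space translation by the inverse of the node's rotation, so the node's rotation cancels out. This is only a sketch of that idea under that assumption (plain-array math, [x, y, z, w] unit quaternions, no Cesium dependency, names hypothetical), not a verified fix:

```javascript
// Rotate a vector by a unit quaternion: v' = q * (v, 0) * q^-1,
// expanded via t = 2 * cross(q.xyz, v); v' = v + w*t + cross(q.xyz, t).
function rotateVec(q, v) {
  const [qx, qy, qz, qw] = q, [vx, vy, vz] = v;
  const tx = 2 * (qy * vz - qz * vy);
  const ty = 2 * (qz * vx - qx * vz);
  const tz = 2 * (qx * vy - qy * vx);
  return [
    vx + qw * tx + (qy * tz - qz * ty),
    vy + qw * ty + (qz * tx - qx * tz),
    vz + qw * tz + (qx * ty - qy * tx),
  ];
}

// For unit quaternions the inverse is the conjugate.
function qConjugate(q) { return [-q[0], -q[1], -q[2], q[3]]; }

// Desired translation in the parent's frame -> value to hand to Cesium,
// assuming Cesium will rotate it back by the node's rotation.
function toNodeLocal(nodeRotation, parentSpaceTranslation) {
  return rotateVec(qConjugate(nodeRotation), parentSpaceTranslation);
}
```

For example, with a node rotated 90 degrees about z, a desired parent-space translation of [1, 0, 0] maps to a node-local [0, -1, 0]; rotating that back by the node's rotation recovers [1, 0, 0].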
I need to look at this more carefully to give you a good answer, but in the meantime I wanted to let you know that it might make your job easier to directly create a Model primitive instead of using the Entity API.
Since it exposes the nodes directly:
And there is a private _runtimeNodes that has the matrix in addition to the rotation/translation/scale:
Sample code for creating a Model primitive is here:
If you’re curious, here’s where entity node transformations are mapped to the underlying primitive:
And where the Model computes the nodes’ transformations:
This is really helpful information to have, thank you. I will take a look at using the Model primitive when I get a moment and see if the issue still exists. I would prefer to use the Entity API in the long run, just so I don't have to re-wrap functionality like easily setting position or rotation, or using callback properties, though I realize it is not the end of the world. The project I am working on would ideally make use of CZML for playing back some historical data, and there doesn't seem to be a clear connection between Model and Entity, or even Model and ModelGraphics, to enable that capability.
I know this is a bit outside the scope of this discussion, but I have always been curious why the Entity system in Cesium is so limited in some ways (for instance, one can't assign multiple models to an entity or create a hierarchy of entities). Is there any reason the Entity system was designed this way, rather than taking cues from game engines like Unity and Unreal for entity management? Anyway, thanks again for your help. I look forward to your findings on this issue!
I am completely with you on this, Richard. In fact, I recently ran into a very similar issue, where I needed to use the Entity API's time-dynamic properties but also gain access to the underlying model primitive. As a workaround, I modified the source here:
entity.modelPrimitive = model
To access the underlying primitive from the Entity API without having to rewrite everything.
As for the reason why it’s this way, I think I’ll defer that answer to someone who’s been on the team longer than I have that can speak to its history. From what I can gather, the Entity API was created for very specific use cases long before 3D models were implemented in the engine. This is one issue discussing whether models should be exposed:
Also, thanks for opening the GitHub issue! If it is indeed a bug it’s good to document it.
I have attached a test model to verify the behavior I am seeing. You can view the default animation in the Babylon.js sandbox app to get an idea of how it should look and to see the orientation of the axes for the various nodes and meshes. To see the issue I am describing, try moving the node named "Bone_Left" along the x-axis (via ModelGraphics.nodeTransformations in Cesium). It should extend outwards from the model (and pull the planar polygon with it) as you see in the animation, but instead it moves along its own x-axis. You will see it move along the length of the model instead of extending outwards as in the animation.
Basic_Mesh_Test_v4.glb (138 KB)
I have decided to mark this as resolved for now. We have found workarounds based on how we rig the model for use within Cesium, and my animation system now works. I have left the issue open on the Cesium GitHub page, as I feel it can still be addressed with better documentation and some guidelines on rigging models for use within Cesium. Cheers!