Issue with Rotation in Cesium for Unity

Hello everyone,

I’m encountering an issue with drawing fences in Cesium for Unity and would appreciate any advice.

Context:

  • I have two sets of fences drawn based on the direction of lines.
  • One set is close to the initial scene center point, and the other is far from it.

Problem:

  • When I first draw the fences, the scene center point is near the camera, and everything appears normal (see Screenshot 1).
  • However, when I rerun the program, the scene center point reverts to its initial position, causing the distant fences to appear with incorrect rotations (see Screenshot 2).

Current Approach:

  • The rotation of each fence segment is set using the direction vector of the line segments.
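Roughly like this (a simplified sketch; `start` and `end` are the Unity positions of a segment's two nodes, and `fenceSegment` is the segment object):

    using UnityEngine;

    // Simplified: align the fence segment's Z axis with the line direction.
    Vector3 direction = (end - start).normalized;
    fenceSegment.transform.rotation = Quaternion.LookRotation(direction);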

Question: Do you have any suggestions on how to maintain the correct rotation for the distant fences when the scene is rerun?

Thank you for your assistance!
(Attachments: screen recordings 录制_2024_07_25_14_35_41 and 录制_2024_07_25_14_36_51)

Hi @jh007,
Can you walk us through how you’re creating these fences? It sounds like the rotation is going haywire as a result of origin rebasing when the camera moves over long distances, but I can’t tell why that’s happening. Are you using CesiumGlobeAnchors? Is there a separate one for each fence section? Are you setting the position of the fence sections via the globe anchors, or via the regular Unity transform?

Thank you for your answer. My segment nodes and each of the fence prefabs have CesiumGlobeAnchors added, but at initialization time the orientation of each fence prefab is set based on the direction of the segment (the Unity position of the first node minus the Unity position of the second node). When re-running the scene, the 3D coordinates of the distant fences are very large. Could this be caused by precision problems?

I don’t think that would be a precision problem, though you should make sure that you’re doing the subtraction in double-precision (even if the values you’re subtracting are single-precision).
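For example (a sketch, assuming your node positions are available as Unity.Mathematics double3 values):

    using Unity.Mathematics;
    using UnityEngine;

    // Sketch: subtract and normalize in double precision (double3),
    // dropping to single precision only at the very end.
    public static Vector3 SegmentDirection(double3 nodeA, double3 nodeB)
    {
        double3 direction = math.normalize(nodeA - nodeB);
        return new Vector3((float)direction.x, (float)direction.y, (float)direction.z);
    }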

My best guess is that it has to do with the timing of setting the orientation. If you set the orientation via the Unity transform just before the CesiumGeoreference origin changes (which will happen automatically as you move if you’re using a DynamicCamera or another object with a CesiumOriginShift component), then that orientation will be accepted as truth even though it is no longer correct due to the new orientation of the Unity world on the globe.

You might be able to solve this by setting the orientation on the CesiumGlobeAnchor instead. You might even be able to set rotationEastUpNorth to a fixed value and not have to compute anything at all.

The quaternion.EulerYZX function may be helpful for constructing the quaternion from normal Unity Euler angles.
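Something like this, for example (a rough, untested sketch; `headingDegrees` stands in for whatever heading you derive from your line data):

    using Unity.Mathematics;
    using CesiumForUnity;

    // Orient the fence segment through its globe anchor. rotationEastUpNorth
    // is expressed relative to the local East-Up-North frame at the anchor's
    // position, so it stays meaningful when the CesiumGeoreference origin moves.
    CesiumGlobeAnchor anchor = fenceSegment.GetComponent<CesiumGlobeAnchor>();
    anchor.rotationEastUpNorth = quaternion.EulerYZX(
        0.0f,                         // pitch (radians)
        math.radians(headingDegrees), // yaw / heading
        0.0f);                        // roll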

Thank you for your answer. For a single object, adding the CesiumGlobeAnchor script certainly works. Here's an example: given a set of latitude and longitude points, convert them into 3D coordinates to serve as nodes for a LineRenderer. Based on the direction of the LineRenderer, generate closely adjacent cubes along the line at 1-meter intervals. The Z axis of each cube should align with the direction of the line, and each cube has the CesiumGlobeAnchor script attached. When the cubes are generated while the camera is very close to them, their rotation is correct. However, if the camera is far from them when they are generated, and I then move the camera closer, the cubes' rotation is incorrect, as shown in the picture.

Can you share the code you’re using to generate those cubes? If you’re setting an orientation via the CesiumGlobeAnchor, the effective orientation relative to the globe should be quite independent of the camera position. If you’re seeing otherwise, it could be a bug in the globe anchor, but it could also be an error in your script.

LineSpawn.zip (1.7 KB)

You have this function:

    public Vector3 GetLatLonToUnityPos(double3 lonLat)
    {
        // Longitude/latitude/height -> ECEF -> Unity world coordinates,
        // relative to the current CesiumGeoreference origin.
        double3 ecef = CesiumWgs84Ellipsoid.LongitudeLatitudeHeightToEarthCenteredEarthFixed(lonLat);
        return D3ToV3(cesiumGeoreference.TransformEarthCenteredEarthFixedPositionToUnity(ecef));
    }

But that function doesn’t convert Longitude/Latitude/Height to Unity coordinates, it converts to ECEF coordinates. Those two are not even close to similar. For one thing, Unity is left-handed and ECEF is right-handed. For another, Unity coordinates are centered at the CesiumGeoreference’s origin, while ECEF is centered at the center of the Earth.

This conversion method is correct. Don't you see this code? `return D3ToV3(cesiumGeoreference.TransformEarthCenteredEarthFixedPositionToUnity(ecef));`

Oh you’re right, I missed that! Sorry! Let me look closer at this code.

Ok, if I’m reading this correctly (my track record is not great! :laughing:) then it seems you’re adding a CesiumGlobeAnchor to each cube, but then you’re setting its position and orientation using the Unity transform rather than the globe anchor. That still leaves open the possibility that the CesiumGeoreference origin changes in between when you compute/set the position/orientation and when these get converted to ECEF by the globe anchor.

Unity does not provide a notification when the Transform changes. Yet when the Transform changes, we need to update the ECEF-relative transformation stored by the CesiumGlobeAnchor. The way we do this is with a coroutine that polls for a change in the Transform. But like all coroutines, this runs as part of the update cycle, not immediately when you change the Transform.

So, long story short, I think you may be able to solve your problem - and it should be less work, too! - by setting the longitudeLatitudeHeight, rotationEastUpNorth, and scaleEastUpNorth properties on the CesiumGlobeAnchor after you add it, rather than converting to Unity coordinates and setting properties on the Transform. This will also allow you to disable detectTransformChanges on the CesiumGlobeAnchor, which stops that coroutine and will make your application more performant.
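For example, inside whatever method spawns each cube (an untested sketch; `georeference`, `lonLatHeight`, and `headingDegrees` stand in for your own values):

    using Unity.Mathematics;
    using UnityEngine;
    using CesiumForUnity;

    // Spawn a cube and drive its position/orientation entirely through the
    // CesiumGlobeAnchor, never through the Unity Transform.
    GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
    cube.transform.parent = georeference.transform; // keep it under the CesiumGeoreference

    CesiumGlobeAnchor anchor = cube.AddComponent<CesiumGlobeAnchor>();
    anchor.detectTransformChanges = false; // no Transform-polling coroutine needed

    anchor.longitudeLatitudeHeight = lonLatHeight; // double3(longitude, latitude, height)
    anchor.rotationEastUpNorth = quaternion.EulerYZX(0.0f, math.radians(headingDegrees), 0.0f);
    anchor.scaleEastUpNorth = new float3(1.0f, 1.0f, 1.0f);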


Thank you for your answer. My script can be used directly in any Cesium project. Have you tested it following the method shown in my diagram, with different execution sequences, to observe the results? I think the core problem is this: when my camera was in New York, I converted a set of longitude/latitude points in Shanghai into Unity coordinates (because Unity's origin was in New York at that time, the 3D coordinates of the Shanghai points are very large), and the direction vector that determines each object's rotation is computed by subtracting those 3D coordinates, which is what causes the rotation deviation.

I inspected your code, but haven’t run it at all. It takes a lot of time to do that sort of thing; time I’d rather spend making Cesium for Unity better.

I don’t think doing the subtraction of points in Shanghai while the georeference origin is in New York should be inherently problematic, but you have to be very clear about what that vector means to you. I was under the impression that your orientations were initially correct, but then went haywire after a CesiumGeoreference change. If that’s right, then that points to a problem with the interpretation of the orientation later, rather than the orientation being inherently wrong to begin with (e.g., because the georeference origin is far away). I explained above one way that might come about (the coroutine that eventually propagates Transform changes to the CesiumGlobeAnchor).

Working with ECEF and the CesiumGlobeAnchor (rather than Unity coordinates and the Unity Transform) can help here because their meaning doesn’t change when the CesiumGeoreference changes. That’s why I keep recommending that you use it.

Thank you for your response. I think you’re right; this issue is not directly related to Cesium, but rather to the geographic origin being too far away when the linear objects are generated. It can be addressed with a refresh strategy, or by ensuring the geographic origin is nearby before generating these objects. Essentially, this boils down to Unity’s limited support for large-scale scene coordinates.
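For reference, one way to implement the "move the geographic origin nearby first" workaround (a sketch; `SpawnFenceCubes` is a hypothetical stand-in for the existing generation routine):

    using CesiumForUnity;

    // Re-center the CesiumGeoreference near the first point before spawning,
    // so the Unity coordinates of the new objects stay small and
    // single-precision math on them stays accurate.
    georeference.SetOriginLongitudeLatitudeHeight(firstLongitude, firstLatitude, firstHeight);
    SpawnFenceCubes(); // hypothetical: your existing generation routine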