Rendering Issue

Hello, I have a problem with a video we want to create. The globe looks normal in the editor, but when I try to create the tracking shot I get this black texture bleeding. I'm on UE 5.0. Does anyone have experience with this error? Best regards

Hi @3rne5t0,

The Unreal SkyAtmosphere is a sphere, while the Earth is not. The radius of the Earth is about 21km greater at the equator than at the poles. That means that with an accurately-rendered globe, any given SkyAtmosphere will be incorrect either up-close (e.g. too dense or a band of missing atmosphere at the bottom near the horizon) or when zoomed out (the artifact you see here, caused by the surface poking out of the atmosphere).

So first of all, make sure you’re using CesiumSunSky. It automatically adjusts the atmosphere at runtime to account for this. And make sure you haven’t disabled “Update Atmosphere at Runtime”.

However, even if you've done both of these things, the adjustment is inherently view-dependent: it uses the Player 0 pawn's position to drive it, specifically code like this:

  // Get the player's current globe location.
  APawn* pPawn = UGameplayStatics::GetPlayerPawn(pWorld, 0);
  if (!IsValid(pPawn))
    return FVector::ZeroVector;
  return pPawn->GetActorLocation();

So if your actual view is driven by some other camera, the atmosphere will not be adjusted correctly.

The easiest solution for your video is to move the player pawn, at least temporarily, to the location from which you’re capturing the shot.
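If you'd rather do this from C++ than by hand, a minimal sketch of the workaround might look like the following. The function name and the `CineCamera` parameter are hypothetical placeholders for whatever actor is actually driving your shot; only `UGameplayStatics::GetPlayerPawn` and `SetActorLocation` are real engine APIs.

```cpp
// Sketch: teleport the Player 0 pawn to the capture camera's location so
// CesiumSunSky's runtime atmosphere adjustment uses the right viewpoint.
#include "Kismet/GameplayStatics.h"
#include "GameFramework/Pawn.h"

void MovePawnToShotLocation(UWorld* World, AActor* CineCamera)
{
  // `CineCamera` is a hypothetical pointer to the camera driving your shot.
  APawn* Pawn = UGameplayStatics::GetPlayerPawn(World, 0);
  if (IsValid(Pawn) && IsValid(CineCamera))
  {
    // bSweep=false plus TeleportPhysics avoids collision on the way there.
    Pawn->SetActorLocation(
        CineCamera->GetActorLocation(),
        /*bSweep=*/false,
        /*OutSweepHitResult=*/nullptr,
        ETeleportType::TeleportPhysics);
  }
}
```

Call this once before the shot begins (and move the pawn back afterwards if gameplay depends on its position).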

If that’s not possible, the next best thing is to adjust the InscribedGroundThreshold and CircumscribedGroundThreshold properties on the CesiumSunSky from Blueprints, followed by a call to UpdateAtmosphereRadius. If you know you’re taking a zoomed-out shot, make sure the player’s height (even if the shot isn’t being taken from the player’s perspective) is greater than CircumscribedGroundThreshold (in kilometers), and that InscribedGroundThreshold is less than CircumscribedGroundThreshold.
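The same adjustment can be sketched in C++. The property and function names below come from cesium-unreal's ACesiumSunSky; the threshold values are example numbers, not recommendations, and `SunSky` is assumed to point at the CesiumSunSky actor in your level.

```cpp
// Sketch: force the zoomed-out atmosphere configuration on a CesiumSunSky.
#include "CesiumSunSky.h"

void ConfigureForZoomedOutShot(ACesiumSunSky* SunSky)
{
  if (!IsValid(SunSky))
    return;

  // Heights (in kilometers) at which the atmosphere switches between the
  // up-close (inscribed) and zoomed-out (circumscribed) sphere radii.
  // Keep Inscribed < Circumscribed, and make sure the Player 0 pawn sits
  // above CircumscribedGroundThreshold for a zoomed-out shot.
  SunSky->InscribedGroundThreshold = 30.0;    // example value
  SunSky->CircumscribedGroundThreshold = 100.0; // example value
  SunSky->UpdateAtmosphereRadius();
}
```

The equivalent Blueprint graph is just setting the two properties on the CesiumSunSky reference and then calling its UpdateAtmosphereRadius node.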