Is there a way to calculate what the screen position of the center of the horizon is? That is, its position in pixels? I’d imagine it’s some function of altitude, pitch, and perhaps the camera’s vertical FOV, but I’m not exactly sure of the calculation. You guys are very familiar with this type of math - anything come to mind?
I whipped up a simple, complete example that does this: https://gist.github.com/mramato/a5e403379c4989313545
It subscribes to the preRender event and then picks each pixel down the center of the screen with pickEllipsoid; the first Earth pixel it finds is treated as the horizon point, and a label is placed there. If the Earth does not intersect the center column of the screen, the label is hidden. It's also hidden if the very first pixel is Earth, since that means the horizon itself is not in view. Let me know if you have any other questions about the example, or if you had something else in mind.
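The scan described above can be sketched as a plain function, with the picking step abstracted out so the logic is testable without a running Cesium scene. This is a sketch of the technique, not the gist's actual code: in a real app, `pickFn` would wrap `scene.camera.pickEllipsoid` and the function would run inside a `scene.preRender` listener.

```javascript
/**
 * Scan down the center column of the screen and return the screen-space
 * position of the horizon, or null if no horizon is visible.
 *
 * @param pickFn  (x, y) => truthy if that pixel hits the ellipsoid
 * @param width   canvas width in pixels
 * @param height  canvas height in pixels
 */
function findHorizonScreenPosition(pickFn, width, height) {
  const centerX = width / 2;
  for (let y = 0; y < height; y++) {
    if (pickFn(centerX, y)) {
      // If the very first pixel is already on the ellipsoid, the globe
      // fills the top of the screen and no horizon is in view.
      return y === 0 ? null : { x: centerX, y: y };
    }
  }
  // The ellipsoid never intersects the center column of the screen.
  return null;
}
```

With Cesium, `pickFn` would look something like `(x, y) => scene.camera.pickEllipsoid(new Cesium.Cartesian2(x, y))`, and the label's position would be updated from the returned coordinates each frame.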
Wow, that works great! I’m really impressed and humbled that you took the time to create that example. Thank you.