Folks,
Check out the view from the International Space Station using the newest Cesium showcase, Windows on Earth:
http://cesiumjs.org/demos/woe.html
This was developed for NASA by Michael Lodge-Paolini and is also part of the exhibit at the Museum of Flight in Seattle. The previous version was developed with Google Earth.
Patrick
Thanks, Pat. I hope some people around the Seattle area have a chance to check out the exhibit or explore the images taken by the astronauts.
Cool stuff! It would be nice, though, if it offered the option to roll at least the non-nadir views so that the horizon is at the top of the screen rather than the bottom; being weightless, an astronaut could presumably roll into any picture-taking roll angle. A single overall view option would be nice as well: a wide-FOV shot with all windows in view. From the overall view one could easily make sense of which direction each window looks toward.
The motion appears to be a series of line segments with sudden heading and direction shifts roughly every second. I wonder if interpolation could be added to smooth it out, at least for heading. Then again, it's subtle enough that you have to watch a landmass at the top of the left elongated viewport carefully to even notice it.
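If the once-per-second samples were kept and blended, something like this could smooth the heading jumps; a minimal sketch, assuming an orientationSamples array of { time, quat } pairs (not how WOE actually stores its data):

    // Spherical linear interpolation between the bracketing orientation
    // samples removes the once-per-second jumps.
    function smoothedOrientation(currentTime) {
        for (var i = 0; i < orientationSamples.length - 1; ++i) {
            var a = orientationSamples[i];
            var b = orientationSamples[i + 1];
            if (Cesium.JulianDate.greaterThanOrEquals(currentTime, a.time) &&
                Cesium.JulianDate.lessThan(currentTime, b.time)) {
                var span = Cesium.JulianDate.secondsDifference(b.time, a.time);
                var t = Cesium.JulianDate.secondsDifference(currentTime, a.time) / span;
                return Cesium.Quaternion.slerp(a.quat, b.quat, t, new Cesium.Quaternion());
            }
        }
        return orientationSamples[orientationSamples.length - 1].quat;
    }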
Good job Cesium! Keep up the good work. It’s fantastic.
Watching landmasses at the top of the left elongated viewport scroll down along a straight line (or geodesic) made me realize that of course the ISS isn't changing course. The once-per-second heading changes are just Earth-relative orientation changes, not orbit-path changes. The ISS isn't an airplane; it doesn't have to face the direction it's traveling. Perhaps the ISS is also changing orientation relative to its travel direction, I'm not sure.
The red line on the right elongated map is quite useful, showing what the ISS will pass over not only on this orbit but on the next one as well, shifted slightly westward due to Earth's rotation.
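For scale, that shift works out to roughly 23 degrees of longitude per orbit; a back-of-the-envelope figure, assuming a ~92.9-minute orbital period:

    // Earth turns ~360 degrees per sidereal day (~1436 minutes), so each
    // ~92.9-minute orbit the ground track slides westward by about:
    var shiftDeg = 92.9 * (360.0 / 1436.0);   // ~23 degrees of longitude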
Instead of swapping 2D overlays for window frames it would be neat to show the entire Cupola as a 3D model. Using a 3D mouse one could simulate full 6-DOF movement just like a weightless astronaut and line up any kind of shot through this model.
A single ISS 4x4 rotation/translation matrix could hold the ISS position and orientation; the Cupola's modelMatrix and the camera's transform would both be derived from that matrix, as sketched below.
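A minimal sketch of that wiring, assuming the ISS position and heading come from telemetry and a Cupola glTF asset exists (the variable names and model path are made up; newer Cesium versions load models with Model.fromGltfAsync instead):

    // One Earth-fixed frame for the ISS: east-north-up at its position,
    // rotated by its heading (attitude simplified to a single yaw angle here).
    var issMatrix = Cesium.Transforms.eastNorthUpToFixedFrame(issPosition);
    Cesium.Matrix4.multiplyByMatrix3(
        issMatrix, Cesium.Matrix3.fromRotationZ(-issHeading), issMatrix);

    // The Cupola model rides on that frame...
    var cupola = viewer.scene.primitives.add(Cesium.Model.fromGltf({
        url: 'models/cupola.gltf',   // hypothetical asset
        modelMatrix: issMatrix
    }));

    // ...and the camera is locked to the same frame, so all of its motion
    // (rotate/move with a 3D mouse) is Cupola-relative.
    viewer.camera.lookAtTransform(issMatrix, new Cesium.Cartesian3(0.0, 0.0, 2.0));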
The mm photo frames could remain as overlays fixed relative to the screen; maybe allow altering the FOV so a given mm rectangle can be matched to the edge of the screen.
Perhaps a programmable camera could be placed in the Cupola for when the ISS crew is busy or sleeping, and people could schedule shots using WOE. Each shot would consist of the camera's yaw/pitch/roll/FOV and a UTC date-time (yaw/pitch/roll being Cupola-relative); a rough sketch of such a shot record is below. Perhaps the camera could be put on a robotic arm as well, to be placed in front of any window for wide-FOV shots without the Cupola frame being in the way.
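Something like this is all one scheduled shot would need to carry (the field names are made up; only the contents come from the description above):

    // One scheduled shot request: orientation is Cupola-relative.
    var shotRequest = {
        window: 'cupola-nadir',        // hypothetical window identifier
        yaw: 12.5,                     // degrees
        pitch: -35.0,                  // degrees
        roll: 0.0,                     // degrees
        fov: 46.8,                     // degrees (diagonal FOV of a 50 mm full-frame lens)
        time: '2015-06-01T14:32:00Z'   // UTC date/time of the exposure
    };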
We’ve actually been experimenting with a 3D camera simulator for “recreating” a camera orientation from the photos taken by the astronauts. It loads the location of the image as a baseline and adjusts the FoV to match the lens’s properties; from there you can change the orientation of the camera to match the photo.
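The FoV match is essentially the standard angle-of-view relation; a minimal sketch, assuming a full-frame 36 mm sensor width (adjust for the actual camera body):

    // Horizontal angle of view from the lens metadata, in radians.
    function fovFromFocalLength(focalLengthMm, sensorWidthMm) {
        return 2.0 * Math.atan(sensorWidthMm / (2.0 * focalLengthMm));
    }
    // e.g. a 50 mm lens: ~0.69 rad (about 39.6 degrees)
    viewer.camera.frustum.fov = fovFromFocalLength(50.0, 36.0);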

Seems you’re recommending we run it in the opposite direction, feeding a camera that is programmed to orient itself to those settings. It’s an interesting idea, but unfortunately the internal cameras the astronauts use are handheld, and they don’t have a rig that would accommodate that feature.
What I was envisioning would be on the target-selection side, so the astronauts and CEO could put together a set of not only desired locations, but specific shots they want as well.
So the GPS data for the photo’s location/time is known, at least roughly since the ISS moves fast, and then the user ‘jimmies’ the yaw/pitch/roll/FOV around to make a match in WOE? Perhaps even the translation could be jimmied a bit in case that data is slightly off. That sounds like a fun activity!
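That matching step is basically nudging four numbers; a minimal sketch of such a nudge in current Cesium (the step sizes and the clamp value are arbitrary):

    // Nudge the simulated camera until it lines up with the photo.
    function nudge(dHeading, dPitch, dRoll, dFov) {
        var cam = viewer.camera;
        cam.setView({
            orientation: {
                heading: cam.heading + dHeading,
                pitch: cam.pitch + dPitch,
                roll: cam.roll + dRoll
            }
        });
        cam.frustum.fov = Math.max(0.05, cam.frustum.fov + dFov);
    }
    nudge(Cesium.Math.toRadians(0.5), 0.0, 0.0, 0.0);   // yaw right half a degree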
Maybe a 3-DOF camera swivel like this could be placed in the Cupola center when it’s not being used by the astronauts.
Maybe it could be placed on a robot arm, like a mini Canadarm, for unobstructed wide-FOV shots. Continuous video would be nice as well.
To realize your idea of visualizing desired locations/shots: how far out can WOE accurately extrapolate future ISS locations/orientations (assuming no thruster activity)? It would be nice to fast-forward in WOE and create a virtual shot that an astronaut or automated camera could take when that moment becomes live.
We could project out about 3-4 weeks with reasonable accuracy, though we would have to update those projections daily until the day before the flyover occurs. Generally, short of a mission-related re-boost or an orbit change to meet up with a capsule, the projections are pretty accurate, with only a few +/- seconds being adjusted.
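For a sense of what that kind of propagation looks like in code, here is a minimal sketch using the open-source satellite.js SGP4 library and an ISS TLE refreshed daily (WOE’s actual prediction pipeline may well differ):

    var satellite = require('satellite.js');
    // Two-line element set for the ISS (placeholder lines; fetch a fresh set daily).
    var tleLine1 = '1 25544U 98067A   ...';
    var tleLine2 = '2 25544  51.64 ...';
    var satrec = satellite.twoline2satrec(tleLine1, tleLine2);

    // Propagate a few weeks out; accuracy degrades as the elements go stale.
    var future = new Date(Date.now() + 21 * 24 * 3600 * 1000);
    var pv = satellite.propagate(satrec, future);          // ECI position/velocity, km
    var gmst = satellite.gstime(future);
    var geo = satellite.eciToGeodetic(pv.position, gmst);  // lat/lon in radians
    console.log(satellite.degreesLat(geo.latitude),
                satellite.degreesLong(geo.longitude));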
It would definitely be a fun project.