So the GPS location/time of the photo is known, at least roughly since the ISS moves fast, and then the user ‘jimmies’ the yaw/pitch/roll/FOV around until it matches in WOE? Perhaps even the translation could be jimmied a bit in case that data is slightly off. That sounds like a fun activity!
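Just to sketch what that matching game looks like under the hood (this is my own toy illustration, not how WOE actually does it; the function names and axis conventions are assumptions): the trial yaw/pitch/roll defines a camera boresight direction, and a candidate ground feature "matches" when its direction falls inside the FOV cone.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Intrinsic Z-Y-X rotation built from yaw/pitch/roll angles (radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def in_fov(boresight, target_dir, fov_deg):
    """True when the angle between boresight and target is within half the FOV."""
    cosang = np.clip(np.dot(boresight, target_dir), -1.0, 1.0)
    return np.degrees(np.arccos(cosang)) <= fov_deg / 2

# Trial guess: yaw 30 deg, pitch -10 deg, roll 0, applied to a +X boresight
R = rotation_matrix(np.radians(30), np.radians(-10), 0.0)
boresight = R @ np.array([1.0, 0.0, 0.0])

# Hypothetical direction toward a ground feature, normalized to a unit vector
target = np.array([0.85, 0.45, -0.15])
target /= np.linalg.norm(target)

print(in_fov(boresight, target, fov_deg=60))
```

"Jimmying" the parameters is then just nudging yaw/pitch/roll/FOV until the landmarks in the photo all pass this kind of test against the rendered view.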
Maybe a 3DOF camera swivel like this could be placed in the center of the Cupola when it's not being used by the astronauts.
Maybe it could be placed on a robot arm, like a mini Canadarm, for unobstructed wide-FOV shots. Continuous video would be nice as well.
To realize your idea of visualizing desired locations/shots: how far ahead can WOE accurately extrapolate future ISS positions/orientations (assuming no thruster activity)? It would be nice to fast-forward in WOE and create a virtual shot that an astronaut or automated camera could take when that moment goes live.