For funsies I tried out AITrack with OpenTrack to use my webcam as a head tracker to control the cockpit camera in MSFS. It was fun, but on the default settings I started to feel motion sickness: the camera panned aggressively all over the place as I rotated my head, and having to turn my head one way while rotating my eyes the other way to keep looking at the monitor felt really unnatural.
I found it felt better when I disabled the yaw/pitch/roll axes entirely and set the x/y/z axes to 1:1 scaling; the camera then seemed to respond naturally to my head movements, letting me look around obstacles like the yoke or a vertical strut beside a window.
However, it still isn’t right: the camera view is rendered as if the monitor were translating through physical 3D space along with my head. That’s how VR goggles work, but not how desktop monitors work, so it wasn’t a natural projection of what I’d see “through” the monitor as a window into the simulated world.
If I understand correctly, what I in particular want out of a head tracker for use with a desktop monitor is to translate the camera’s eye point but not move the projection rectangle (the frustum’s near-plane window), which should stay anchored to the monitor, since the monitor doesn’t move in real or virtual space. That yields an off-axis (asymmetric) frustum, which skews, rather than translates, the camera view.
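To illustrate what I mean (this is a generic off-axis projection sketch, not anything MSFS or SimConnect actually exposes): treat the monitor as a fixed window in the virtual world, and derive asymmetric `glFrustum`-style bounds from the eye position relative to the screen center. The function name and coordinate conventions here are my own made-up example.

```python
def off_axis_frustum(eye, half_w, half_h, near):
    """Compute asymmetric frustum bounds (left, right, bottom, top) at the
    near plane, for an eye at `eye` relative to the screen center.

    eye    -- (ex, ey, ez): eye offset from screen center; ez is the
              perpendicular distance from the eye to the screen plane.
    half_w -- half the physical screen width (same units as eye).
    half_h -- half the physical screen height.
    near   -- near clipping plane distance.

    The screen edges, projected onto the near plane, shrink by the ratio
    near / ez; shifting the eye shifts the window the opposite way,
    skewing the view instead of translating it.
    """
    ex, ey, ez = eye
    left   = (-half_w - ex) * near / ez
    right  = ( half_w - ex) * near / ez
    bottom = (-half_h - ey) * near / ez
    top    = ( half_h - ey) * near / ez
    return left, right, bottom, top


# Eye centered 0.6 m from a 0.6 m x 0.4 m screen: symmetric frustum.
print(off_axis_frustum((0.0, 0.0, 0.6), 0.3, 0.2, 0.1))

# Eye moved 0.1 m to the right: the frustum window shifts left
# (asymmetric), so the view skews toward what's beyond the left bezel.
print(off_axis_frustum((0.1, 0.0, 0.6), 0.3, 0.2, 0.1))
```

The eye translation also goes into the view transform as usual; the point is that the projection window stays glued to the (virtual) monitor rather than riding along with the head.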
I don’t think this can be expressed correctly via the FreeTrack protocol’s traditional 6DOF.
Does anyone know if it’s possible to control the MSFS camera view more precisely via SimConnect or other means?
I’m mainly curious if anyone’s already been down this road; I doubt I’d have the time to dedicate to hacking this into shape by myself, and if it’s not possible to adjust the camera view correctly I’m just going to have to live without.