I have this vision that in VR I can use my HOTAS for manual control, then take my hands off the HOTAS and reach out to adjust the autopilot, fuel mixture, etc. without having to fumble for a real-world controller I can't see. Essentially, use my real hands in a virtual environment.
The way I'd like to see that work is hand tracking. Full disclosure: I work for Ultraleap, owners of the Leap Motion hand tracking technology. We'd rather like our technology to be used in a game like MSFS2020, which is already pushing the boundaries of technology, and we think our tracking offers a vendor-neutral and affordable (compared to buying a new HMD) way to interact with the cockpit. There are nearly 1 million Leap Motion Controllers out in the world, and they could all be used for this. And no, there's no big licence fee attached; this isn't a sales pitch.
So, devs, if you're listening, drop me a line at firstname.lastname@example.org if you want to chat about it, where we're going with our tech, and how we can help you design interactions that work (avoiding annoying gotchas like knobs/dials that sit close together, and getting people to interact with them naturally and reliably).
Players, let us know if you like the sound of that. As an MSFS fan myself with a couple hundred hours logged on FSX/FS2020, I know I do!