The UX is simply built around dual encoders and buttons on a system that doesn't have a touchscreen, which is wise in an aircraft cockpit: it's more precise in turbulence and generally whenever things are moving, and it gives tactile feedback, so you know for sure when you've actually pressed something. It's extremely awkward with a mouse as a result, but it's not built for that.

So I built an encoder control box for just a few bucks. The UX makes much more sense when you operate it with outer/inner encoders and a few buttons. Entering letters is still not ideal, but it's far less tedious with an encoder, and it's actually a mini-game in itself. The encoders have different shapes (like the real ones), so they're easy to find by feel. Since I only fly in VR (and on a motion rig), a keyboard is not an option for me. Some aircraft do have a small extra keyboard built in for when you need to enter letters, but I think those don't work in MSFS.

The only thing I'm not emulating with hardware is the 12 lower buttons. That wouldn't be feasible in VR, because you can't find and differentiate those buttons blindly. So a VR controller or mouse is needed for those, but once they're set up for a specific aircraft in a preferred way, they rarely need to be touched. And naturally, things like heading, altitude, course, etc. are much easier to set with encoders than with mouse clicks. You can vary the rotation speed, and on a very fast rotation you can program, say, 1000-foot jumps in altitude.

Anyway, I don't want to go off-topic, but in terms of UI/UX design we have to remember that this is emulating an encoder- and button-based interface.
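If anyone is curious about the rotation-speed trick, here's a rough idea of how that kind of acceleration can be done. This is just a minimal sketch, not my actual firmware; the detent-timing threshold and the step values (100 ft vs. 1000 ft) are made-up examples.

```cpp
#include <cstdint>
#include <cstdio>

// Map rotation speed to step size: dt_ms is the time since the previous
// encoder detent. Short gaps mean the knob is being spun fast, so we take
// bigger jumps. Threshold and step sizes are illustrative only.
int altitudeStepFeet(uint32_t dt_ms) {
    if (dt_ms < 30) return 1000;   // spinning fast: 1000 ft per detent
    return 100;                    // normal clicks: 100 ft per detent
}

int main() {
    int altitude = 5000;
    // Simulated detent intervals (ms): a few slow clicks, then a fast spin.
    uint32_t detents[] = {150, 120, 25, 20, 22, 140};
    for (uint32_t dt : detents) {
        altitude += altitudeStepFeet(dt);
        std::printf("dt=%3u ms -> altitude %d ft\n", dt, altitude);
    }
    return 0;
}
```

The same idea works for heading, course, baro and so on, just with different step sizes per knob.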