OpenXR Toolkit (upscaling, world scale, hand tracking...) - Release thread

Yes, BUT (!!!) - there’s been an exciting new development with Virtual Desktop software: it now relays Quest 3 hand-tracking data back to PC, emulating VR controllers. I tested it very briefly in MSFS, and it does work! Basically, it’s very similar to what the toolkit was doing, only it works better and more reliably.

The downside, of course, is that it only works as well as the VR controllers work in MSFS, which as we all know are far from perfect. I.e. it's possible to grab and rotate a course or radio knob, but the precision is poor - though that's true with VR controllers too. That's why I built a physical dual-encoder box for my motion VR cockpit setup. One other issue is that when I tested, there were no gestures for the A/B/X/Y buttons, so it wasn't possible to switch between laser and direct modes in MSFS. But they were working on that. Also, any hand tracking is less precise than physical controllers, so it's a bit more wonky, but the convenience and immersion may be worth it. It needs more testing once the laser/direct switch is possible.

Update: Matt informed me that in the latest VD beta the gestures are updated: "tap left hand palm is Y, tap right hand palm is B". So the laser/direct mode switch should now be possible. From what I remember when I tested: the menu gesture works for the MSFS menu, pinch works as the trigger, and grab is a grab. The Oculus menu gesture (palm towards your face + pinch) works as usual, and since the recent firmware v62 a long pinch does a view reset.
