Hand Tracking in MSFS - Leap Motion / UltraLeap native support for WMR/OpenXR

Now that MSFS has VR controller support, we need hand tracking!

There is a large segment of the MSFS community that would love to get their hands on hand tracking (pun intended), now that Microsoft Flight Simulator has VR controller support. To a degree it’s already possible when using SteamVR with Leap Motion (UltraLeap) controller emulation. However, arguably the best mainstream VR HMD for flight simulation is the Reverb G2, and it only works reasonably well in MSFS when set to the WMR OpenXR runtime (the default); the SteamVR OpenXR runtime doesn’t work well at all. Leap Motion, on the other hand, only works in SteamVR, and its controller emulation is not perfect. What’s needed is native hand tracking support that works in both WMR OpenXR and SteamVR, i.e. being able to see our hands and fingers in VR and interact with all the switches and knobs.

The tracking device is very affordable, under $100. And most of us use hardware yokes and throttles, so even though VR controllers are much better than the mouse, they are still not very convenient. Just using our hands would be so much better for real immersion and convenience.

Asobo, please develop hand tracking support with UltraLeap!

Here is a video of LeapMotion already working in MSFS. Imagine this with direct interaction, hands visible in MSFS:

(video reposted from here)

And here’s the new generation of UltraLeap tracking - imagine this working in MSFS - that would be a game changer!

Some people even managed to get Leap Motion working natively in FSX! It should be no problem in MSFS, it just needs a bit of work from Asobo.

Voted. The implementation demonstrated in the video is a good example of how game changing this could be for simulation if done well by a proper title…

It seems like a lot of the work needed is already done… maybe Asobo just needs to connect to this on their OpenXR implementation? There’s already a beta release download available.

As the first video shows, it works in SteamVR OpenXR, but not in WMR OpenXR - confirmed by an UltraLeap person I’ve communicated with.

WE NEED SOMEBODY WHO IS WILLING TO CODE A DRIVER
I’m sure there are capable programmers among MSFS fans, somebody needs to step up to the challenge :slight_smile: If we wait for Asobo to do native support, it could take too long… We should still push for it, but having an alternative is not a bad idea.

Here’s a quote from an email exchange with an UltraLeap manager. He basically says that someone just needs to write a driver for WMR/OpenXR:

“I believe you are correct and that it’s simply a case of someone actually doing it, but we haven’t investigated this ourselves. We haven’t invested resource into this kind of emulation because as a company we’d prefer to find a more rounded solution than simply emulating controllers (we think that interacting with hands is fundamentally different to controllers). That doesn’t mean a third party couldn’t pick it up if there was interest though.”

That was in reply to my email:
“…WMR does support OpenXR, but has its own runtime (as opposed to the SteamVR runtime), so what is the problem then? Is it just a matter of someone writing a driver or a middleware for LeapMotion, or is there a more fundamental problem with MS OpenXR implementation and there is no hope for a driver? Some other projects (Lucid VR gloves, virtual kneeboards like VRK etc.) also are not ported to WMR, so I wonder if there is a problem with WMR.”

So UltraLeap already supports OpenXR, just not the WMR flavor for some reason. Maybe some talented developer can get this working; it would give us controller emulation in MSFS. Ultimately native support, with visible hands, is the ideal solution, but that needs to be done by Asobo and could take a very long time… We waited a whole year to get VR controllers, and the feature still feels more like an alpha release, because many aspects are not working correctly…

I have limited knowledge of this subject, but from what I understand of the video, they use a SteamVR driver to inject Leap Motion input into SteamVR itself. This means that applications using the SteamVR APIs see the Leap Motion as a controller. I believe that when you are using the “OpenXR runtime for SteamVR”, the OpenXR API calls are converted to SteamVR API calls. So when the application queries controller input from OpenXR, it receives the input from the SteamVR API - i.e. the Leap Motion input.
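
For context, this is roughly what the application-side controller query looks like in OpenXR (a hedged sketch, not MSFS’s actual code; `setupControllerInput` and `locateLeftHand` are made-up names, and a valid `instance`, `session`, `appSpace` and per-frame `predictedDisplayTime` are assumed). The point is that the application only asks for “the left grip pose” and has no idea whether a real controller or the Leap driver sits behind it:

```cpp
#include <cstring>
#include <openxr/openxr.h>

XrActionSet actionSet      = XR_NULL_HANDLE;
XrAction    gripPoseAction = XR_NULL_HANDLE;
XrSpace     gripSpace      = XR_NULL_HANDLE;

void setupControllerInput(XrInstance instance, XrSession session) {
    // One action set with a single pose action for the left hand.
    XrActionSetCreateInfo setInfo{XR_TYPE_ACTION_SET_CREATE_INFO};
    std::strcpy(setInfo.actionSetName, "cockpit");
    std::strcpy(setInfo.localizedActionSetName, "Cockpit");
    xrCreateActionSet(instance, &setInfo, &actionSet);

    XrActionCreateInfo actionInfo{XR_TYPE_ACTION_CREATE_INFO};
    actionInfo.actionType = XR_ACTION_TYPE_POSE_INPUT;
    std::strcpy(actionInfo.actionName, "left_grip");
    std::strcpy(actionInfo.localizedActionName, "Left hand grip");
    xrCreateAction(actionSet, &actionInfo, &gripPoseAction);

    // Suggest a binding for the WMR motion controller profile.
    XrPath gripPath, profilePath;
    xrStringToPath(instance, "/user/hand/left/input/grip/pose", &gripPath);
    xrStringToPath(instance, "/interaction_profiles/microsoft/motion_controller", &profilePath);
    XrActionSuggestedBinding binding{gripPoseAction, gripPath};
    XrInteractionProfileSuggestedBinding suggested{XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
    suggested.interactionProfile = profilePath;
    suggested.countSuggestedBindings = 1;
    suggested.suggestedBindings = &binding;
    xrSuggestInteractionProfileBindings(instance, &suggested);

    XrSessionActionSetsAttachInfo attachInfo{XR_TYPE_SESSION_ACTION_SETS_ATTACH_INFO};
    attachInfo.countActionSets = 1;
    attachInfo.actionSets = &actionSet;
    xrAttachSessionActionSets(session, &attachInfo);

    // An action space gives us a locatable pose for the controller.
    XrActionSpaceCreateInfo spaceInfo{XR_TYPE_ACTION_SPACE_CREATE_INFO};
    spaceInfo.action = gripPoseAction;
    spaceInfo.poseInActionSpace.orientation.w = 1.0f; // identity pose
    xrCreateActionSpace(session, &spaceInfo, &gripSpace);
}

// Called every frame: where is the left "controller" right now?
XrPosef locateLeftHand(XrSession session, XrSpace appSpace, XrTime predictedDisplayTime) {
    XrActiveActionSet active{actionSet, XR_NULL_PATH};
    XrActionsSyncInfo syncInfo{XR_TYPE_ACTIONS_SYNC_INFO};
    syncInfo.countActiveActionSets = 1;
    syncInfo.activeActionSets = &active;
    xrSyncActions(session, &syncInfo);

    XrSpaceLocation location{XR_TYPE_SPACE_LOCATION};
    xrLocateSpace(gripSpace, appSpace, predictedDisplayTime, &location);
    return location.pose; // real controller or Leap emulation - the app can't tell
}
```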

As described above (hypothetically), the OpenXR runtime is not really involved with any hand tracking; it just passes the information through from SteamVR to the application. Because the Leap driver injects the data directly into SteamVR, it just works.

This is fundamentally different from what the UltraLeap API layer that you posted is doing. An OpenXR API layer is inserted between the application and the OpenXR runtime, in this case adding hand tracking functionality through the XR_EXT_hand_tracking API. The underlying OpenXR runtime is not involved at all and never even sees the API layer.
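
For anyone curious what “intercepting” means in practice, here is a rough sketch of the dispatch mechanism an OpenXR API layer uses (the loader negotiation boilerplate is omitted, and everything except the standard OpenXR names is made up for illustration):

```cpp
#include <cstring>
#include <openxr/openxr.h>

// Pointer to the next layer's (or the runtime's) dispatcher, handed to the
// layer by the OpenXR loader when the instance is created.
static PFN_xrGetInstanceProcAddr nextGetInstanceProcAddr = nullptr;

// The layer's own implementation of the intercepted call. In the UltraLeap
// layer this would be backed by the Leap tracking service, not the runtime.
static XrResult XRAPI_CALL myLocateHandJointsEXT(
    XrHandTrackerEXT handTracker,
    const XrHandJointsLocateInfoEXT* locateInfo,
    XrHandJointLocationsEXT* locations) {
    locations->isActive = XR_TRUE;
    // ... fill locations->jointLocations from the hand tracking device ...
    return XR_SUCCESS;
}

// The layer's dispatcher: hand out our hook for the functions we implement,
// and forward everything else untouched to the next layer / the runtime.
// This is why the runtime never "sees" the layer.
XrResult XRAPI_CALL layerGetInstanceProcAddr(XrInstance instance, const char* name,
                                             PFN_xrVoidFunction* function) {
    if (std::strcmp(name, "xrLocateHandJointsEXT") == 0) {
        *function = reinterpret_cast<PFN_xrVoidFunction>(myLocateHandJointsEXT);
        return XR_SUCCESS;
    }
    return nextGetInstanceProcAddr(instance, name, function);
}
```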

When an application implements hand tracking via the XR_EXT_hand_tracking API, the UltraLeap API layer intercepts those calls and responds to them itself.
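
On the application side, “implementing hand tracking via XR_EXT_hand_tracking” boils down to calls like the following hypothetical sketch (it assumes the extension was enabled at instance creation and that `instance`, `session`, `appSpace` and `predictedDisplayTime` already exist); these are exactly the calls the UltraLeap layer answers:

```cpp
#include <openxr/openxr.h>

XrHandTrackerEXT leftHandTracker = XR_NULL_HANDLE;

void createHandTracker(XrInstance instance, XrSession session) {
    // Extension functions must be loaded through xrGetInstanceProcAddr.
    PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = nullptr;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateHandTrackerEXT));

    XrHandTrackerCreateInfoEXT createInfo{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    createInfo.hand = XR_HAND_LEFT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    xrCreateHandTrackerEXT(session, &createInfo, &leftHandTracker);
}

void locateHandJoints(XrInstance instance, XrSpace appSpace, XrTime predictedDisplayTime) {
    PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT = nullptr;
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrLocateHandJointsEXT));

    XrHandJointLocationEXT jointLocations[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = jointLocations;

    XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = appSpace;
    locateInfo.time = predictedDisplayTime;

    // 26 joints per hand (palm, wrist, and every finger bone) come back here.
    xrLocateHandJointsEXT(leftHandTracker, &locateInfo, &locations);
}
```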

But if the application does not make use of the XR_EXT_hand_tracking API, then the layer does nothing. The traditional controller APIs are still handled by the OpenXR runtime, which does not see the API layer anyway.

As the UltraLeap rep said, “we think that interacting with hands is fundamentally different to controllers”, and this is why the XR_EXT_hand_tracking API (exposing joint positions for all the bones in your hand) is completely separate and different from the controller input APIs in OpenXR (which only expose one position/orientation per controller).
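
To make that difference concrete, here is the shape of the data each path deals in (types are from the OpenXR headers; the variable names are just for illustration):

```cpp
#include <openxr/openxr.h>

// XR_EXT_hand_tracking: a full skeleton, 26 joint poses per hand per frame
// (palm, wrist, and every finger bone), each with its own position/orientation.
XrHandJointLocationEXT handJoints[XR_HAND_JOINT_COUNT_EXT];

// Controller input: a single grip or aim pose per hand (plus separate
// button/axis actions) - that is all the existing controller path can carry.
XrPosef leftControllerGripPose;
```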

When using an API layer, the underlying OpenXR runtime does not see any of it, so it is not possible for the runtime to access the hand tracking data. For applications that do not support input via the XR_EXT_hand_tracking API, as pointed out by the UltraLeap manager, a third-party “emulation” driver needs to be implemented. This could take the form of another API layer inserted “in front of” the UltraLeap API layer: it would retrieve the hand tracking data from the UltraLeap layer via the XR_EXT_hand_tracking API, process that data, and expose it to the application through the controller input API.
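
To sketch what the core job of that emulation shim might look like (all names below are hypothetical; a real layer would also have to register itself with the OpenXR loader and expose a full controller interaction profile):

```cpp
#include <cmath>
#include <openxr/openxr.h>

struct EmulatedController {
    XrPosef gripPose;     // what the shim would report for .../input/grip/pose
    float   triggerValue; // what it would report for .../input/trigger/value
};

static float distance(const XrVector3f& a, const XrVector3f& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// `joints` is the XR_HAND_JOINT_COUNT_EXT-sized array filled by calling
// xrLocateHandJointsEXT on the UltraLeap API layer underneath.
EmulatedController emulateFromHandJoints(const XrHandJointLocationEXT* joints) {
    EmulatedController out{};

    // Use the palm joint as the emulated controller's grip pose.
    out.gripPose = joints[XR_HAND_JOINT_PALM_EXT].pose;

    // Map the thumb-to-index "pinch" distance onto the trigger axis:
    // fully pinched (~1 cm) => trigger pressed, open hand => released.
    const float pinch = distance(joints[XR_HAND_JOINT_THUMB_TIP_EXT].pose.position,
                                 joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose.position);
    const float pressed = 0.01f, released = 0.07f; // metres, tuning values
    float t = (released - pinch) / (released - pressed);
    out.triggerValue = t < 0.f ? 0.f : (t > 1.f ? 1.f : t);

    return out;
}
```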

Here again, the OpenXR runtime is not involved at all with this process.

I am not 100% sure that everything I described above is 100% accurate (esp. the SteamVR and Leap bit), but I think it should be close. I hope it helps understand the reason why integrating the UltraLeap API layer cannot be done in the OpenXR runtime. I also hope it helps any developer out there to get started on investigating OpenXR API layers and how to create this emulation shim.


Thanks a lot for the explanation. I hope it helps someone understand it better. Barring native support from Asobo, writing a controller emulation that works with the Reverb G2 and other WMR headsets is our best bet for natural and convenient cockpit manipulation. It would solve all of the problems with the mouse and with having to grab a controller all the time. Once Asobo irons out the bugs in the controller implementation, it should be very practical and convenient. We just need somebody knowledgeable to invest some time into coding it. Not an easy task.


Best implementation of VR and hand-tracking/controllers.

Definitely looks great. Not for the yoke of course, but everything else.

I should have stressed, of course, that this was 2018. I agree with your view regarding the yoke. I tried it, but with no real tactile feedback other than haptics it feels weird.

Let’s hope Asobo will really take VR seriously. With Meta and all - it’s here to stay. The problems and missing features are really obvious to anyone who loves VR.
