Hand Tracking in MSFS - Leap Motion / UltraLeap native support for WMR/OpenXR

Moderator Note: Some other topics regarding hand tracking were merged into this master. Continuity may be lost.
https://forums.flightsimulator.com/t/please-add-support-for-leap-motion-controller-free-hand-tracking-in-vr/359150
https://forums.flightsimulator.com/t/hand-tracking-support/387903
https://forums.flightsimulator.com/t/support-for-hand-tracking-with-leap-motion/498685

Now that MSFS has VR controller support, we need hand tracking!

Update: There is a tool being developed that brings Leap Motion hand tracking to OpenXR through VR controller emulation; please read the posts below. An alpha is available for testing and is showing promising results. There is hope we can get it working in that mode until Asobo can implement proper native hand tracking.

There is a large segment of the MSFS community that would love to get their hands on hand tracking (pun intended), now that Microsoft Flight Simulator has VR controller support. To a degree it's already possible when using SteamVR with Leap Motion (UltraLeap) controller emulation. However, arguably the best mainstream VR HMD for flight simulation is the Reverb G2, and it only works reasonably well in MSFS when set to the WMR OpenXR runtime (the default); SteamVR's OpenXR runtime doesn't work well at all. Leap Motion only works in SteamVR, and besides, controller emulation is not perfect. What's needed is native hand tracking support that works in both WMR OpenXR and SteamVR, i.e. we would be able to see our hands and fingers in VR and interact with all switches and knobs.

The tracking device is very affordable, under $100. And most of us use hardware yokes and throttles, so even though VR controllers are much better than the mouse, they are still not very convenient. Just using our hands would be so much better for real immersion and convenience.

Asobo, please develop hand tracking support with UltraLeap!

Here is a video of LeapMotion already working in MSFS. Imagine this with direct interaction, hands visible in MSFS:

(video reposted from here)

And here’s the new generation of UltraLeap tracking - imagine this working in MSFS - that would be a game changer!

https://www.reddit.com/r/TinkerPilot/comments/qrs8hd/the_new_gemini_version_of_ultraleap_hand_tracking/?utm_source=share&utm_medium=web2x&context=3

Some people managed to get Leap Motion working in native mode in FSX! It should be no problem in MSFS; it just needs a bit of work from Asobo.

Voted. The implementation demonstrated in the video is a good example of how game changing this could be for simulation if done well by a proper title…

It seems like a lot of the work needed is already done… maybe Asobo just needs to connect to this on their OpenXR implementation? There’s already a beta release download available.

As the first video shows, it works in SteamVR OpenXR, but not in WMR OpenXR - confirmed by an UltraLeap person I’ve communicated with.

WE NEED SOMEBODY WHO IS WILLING TO CODE A DRIVER
I’m sure there are capable programmers among MSFS fans - somebody needs to step up to the challenge :slight_smile: If we’re going to wait for Asobo to do native support, it could take too long… We should still push for it, but having an alternative is not a bad idea.

Here’s a quote from an email exchange with an UltraLeap manager. He basically says that someone just needs to write a driver for WMR/OpenXR:

“I believe you are correct and that it’s simply a case of someone actually doing it, but we haven’t investigated this ourselves. We haven’t invested resource into this kind of emulation because as a company we’d prefer to find a more rounded solution than simply emulating controllers (we think that interacting with hands is fundamentally different to controllers). That doesn’t mean a third party couldn’t pick it up if there was interest though.”

That was in reply to my email:
“…WMR does support OpenXR, but has its own runtime (as opposed to the SteamVR runtime), so what is the problem then? Is it just a matter of someone writing a driver or a middleware for LeapMotion, or is there a more fundamental problem with the MS OpenXR implementation and there is no hope for a driver? Some other projects (Lucid VR gloves, virtual kneeboards like VRK etc.) also are not ported to WMR, so I wonder if there is a problem with WMR.”

So UltraLeap already supports OpenXR, just not the WMR flavor for some reason. Maybe some talented developer can get this working; it would give us controller emulation in MSFS. Ultimately native support, with visible hands, is the ideal solution, but that needs to be done by Asobo and could take a very long time… We waited a whole year to get VR controllers, and the implementation still feels more like an alpha release, because many aspects are not working correctly…

I have limited knowledge on this subject, but from what I understand of the video, they use a SteamVR driver to inject Leap Motion input into SteamVR itself. This means that applications using the SteamVR APIs see the Leap Motion as a controller. I believe that when you are using the “OpenXR runtime for SteamVR”, the OpenXR API calls are converted to SteamVR API calls. So when the application queries the controller input from OpenXR, it receives the input from the SteamVR API - i.e. the Leap Motion input.

In this (hypothetical) picture, the OpenXR runtime is not really involved with any hand tracking: it just passes the information through from SteamVR to the application. It just turns out that because the Leap driver injected the data directly into SteamVR, it simply works.
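To make the “queries the controller input from OpenXR” part concrete, here is roughly what that path looks like from the application’s side, using the standard OpenXR action system (a simplified sketch for illustration only - error handling is omitted, and the action set/trigger action would be created and bound elsewhere; this is obviously not MSFS’s actual code):

```cpp
// Simplified sketch of how an OpenXR application reads controller input.
// The action set and trigger action are assumed to have been created and
// bound elsewhere (e.g. to .../input/trigger/value).
#include <openxr/openxr.h>

XrActionStateBoolean ReadTrigger(XrSession session, XrActionSet actionSet,
                                 XrAction triggerAction)
{
    // Make the action set active for this frame.
    XrActiveActionSet activeSet{actionSet, XR_NULL_PATH};
    XrActionsSyncInfo syncInfo{XR_TYPE_ACTIONS_SYNC_INFO};
    syncInfo.countActiveActionSets = 1;
    syncInfo.activeActionSets = &activeSet;
    xrSyncActions(session, &syncInfo);

    // Query the boolean state bound to the trigger.
    XrActionStateGetInfo getInfo{XR_TYPE_ACTION_STATE_GET_INFO};
    getInfo.action = triggerAction;
    XrActionStateBoolean state{XR_TYPE_ACTION_STATE_BOOLEAN};
    xrGetActionStateBoolean(session, &getInfo, &state);

    // With the "OpenXR runtime for SteamVR", this call bottoms out in the
    // SteamVR input system - which is exactly where the Leap Motion SteamVR
    // driver injected its data, so it "just works" there.
    return state;
}
```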

That SteamVR path is fundamentally different from what the UltraLeap API layer that you posted is doing. An OpenXR API layer is inserted between the application and the OpenXR runtime, in this case adding the hand tracking functionality through the XR_EXT_hand_tracking API. The underlying OpenXR runtime is not involved at all - it does not even see the API layer.

When an application implements hand tracking via the XR_EXT_hand_tracking API, the UltraLeap API layer intercepts the calls to that API and responds to them itself.

But if the application does not make use of the XR_EXT_hand_tracking API, then the layer does nothing. The traditional controller APIs are still handled by the OpenXR runtime, which does not see the API layer anyway.

As the UltraLeap rep said, “we think that interacting with hands is fundamentally different to controllers”, and this is why the XR_EXT_hand_tracking API (which exposes joint positions for all the bones in your hand) is completely separate and different from the controller input APIs in OpenXR (which only expose one position/orientation per controller).
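To illustrate that difference, here is roughly what the hand tracking side looks like from an application’s point of view - a simplified sketch based on the published XR_EXT_hand_tracking extension, not the UltraLeap layer’s actual code. Instead of one pose per controller, you get 26 joints per hand:

```cpp
// Simplified sketch of an application consuming XR_EXT_hand_tracking.
// The extension functions must be loaded with xrGetInstanceProcAddr.
#include <openxr/openxr.h>

PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT_ = nullptr;
PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT_ = nullptr;

void LoadHandTrackingFunctions(XrInstance instance)
{
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrCreateHandTrackerEXT_));
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          reinterpret_cast<PFN_xrVoidFunction*>(&xrLocateHandJointsEXT_));
}

XrHandTrackerEXT CreateRightHandTracker(XrSession session)
{
    XrHandTrackerCreateInfoEXT createInfo{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    createInfo.hand = XR_HAND_RIGHT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    XrHandTrackerEXT handTracker = XR_NULL_HANDLE;
    xrCreateHandTrackerEXT_(session, &createInfo, &handTracker);
    return handTracker;
}

// Fills in all 26 joints of the hand (palm, wrist, and 24 finger joints).
bool LocateJoints(XrHandTrackerEXT handTracker, XrSpace baseSpace, XrTime time,
                  XrHandJointLocationEXT (&joints)[XR_HAND_JOINT_COUNT_EXT])
{
    XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = time;

    XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    xrLocateHandJointsEXT_(handTracker, &locateInfo, &locations);
    return locations.isActive == XR_TRUE;
}
```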

When using an API layer, the underlying OpenXR implementation does not see anything, so it is not possible for the OpenXR runtime to access the hand tracking data. For applications that do not support input via the XR_EXT_hand_tracking API, and as pointed out by the UltraLeap manager, a third-party “emulation” driver needs to be implemented. This could take the form of another API layer inserted “in front” of the UltraLeap API layer, which would retrieve the hand tracking data from the UltraLeap layer via the XR_EXT_hand_tracking API, then process this data and expose it to the application through the controller input API.

Here again, the OpenXR runtime is not involved at all with this process.

I am not 100% sure that everything I described above is 100% accurate (esp. the SteamVR and Leap bit), but I think it should be close. I hope it helps explain why integrating the UltraLeap API layer cannot be done in the OpenXR runtime, and I hope it helps any developer out there get started on investigating OpenXR API layers and how to create this emulation shim.
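To give an idea of what the core of such a shim could look like, here is a very rough, hypothetical sketch: the shim overrides the application’s controller-input call and answers it from hand joint data obtained through XR_EXT_hand_tracking. The layer-chaining boilerplate, the global state, and the 2 cm pinch threshold are all made up for illustration; LocateJoints is the helper from the previous snippet.

```cpp
// Hypothetical core of a "hand tracking to controller" shim layer.
// next_xrGetActionStateBoolean points to the next layer/runtime in the chain
// (the API layer loader provides this chaining); g_rightHandTracker is a
// tracker created against the UltraLeap layer underneath us.
#include <cmath>
#include <openxr/openxr.h>

bool LocateJoints(XrHandTrackerEXT handTracker, XrSpace baseSpace, XrTime time,
                  XrHandJointLocationEXT (&joints)[XR_HAND_JOINT_COUNT_EXT]);

extern PFN_xrGetActionStateBoolean next_xrGetActionStateBoolean;
extern XrHandTrackerEXT g_rightHandTracker;
extern XrSpace g_referenceSpace;
extern XrTime g_predictedDisplayTime;
extern XrAction g_triggerAction; // the action the game binds to the trigger

static float Distance(const XrVector3f& a, const XrVector3f& b)
{
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Our override of xrGetActionStateBoolean, installed by the layer.
XrResult shim_xrGetActionStateBoolean(XrSession session,
                                      const XrActionStateGetInfo* getInfo,
                                      XrActionStateBoolean* state)
{
    if (getInfo->action != g_triggerAction) {
        // Not an action we emulate: forward to the real runtime.
        return next_xrGetActionStateBoolean(session, getInfo, state);
    }

    // Get the hand joints from the UltraLeap API layer underneath us.
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    const bool active =
        LocateJoints(g_rightHandTracker, g_referenceSpace, g_predictedDisplayTime, joints);

    // Synthesize a "trigger pressed" state from a simple pinch gesture:
    // thumb tip close to index tip. The 2 cm threshold is arbitrary.
    bool pressed = false;
    if (active) {
        const auto& thumbTip = joints[XR_HAND_JOINT_THUMB_TIP_EXT].pose.position;
        const auto& indexTip = joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose.position;
        pressed = Distance(thumbTip, indexTip) < 0.02f;
    }

    state->isActive = active ? XR_TRUE : XR_FALSE;
    state->currentState = pressed ? XR_TRUE : XR_FALSE;
    state->changedSinceLastSync = XR_FALSE; // real code would track this
    state->lastChangeTime = g_predictedDisplayTime;
    return XR_SUCCESS;
}
```

A real layer would of course also have to expose a grip/aim pose for the emulated controller and properly track state changes between xrSyncActions calls, but this is the general shape of the idea.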

3 Likes

Thanks a lot for the explanation. I hope it helps someone to understand it better. Barring native support from Asobo, writing a controller emulation that works with the Reverb G2 and other WMR headsets is our best bet for natural and convenient cockpit manipulation. It would solve all of the problems with the mouse and having to grab the controller often. Once Asobo irons out the bugs in the controller implementation, it should be very practical and convenient. It just needs somebody knowledgeable to invest some time into coding it - not an easy task.

1 Like

Best implementation of VR and hand-tracking/controllers.

Definitely looks great. Not for the yoke of course, but everything else.

I should have stressed, of course, that this was 2018. Agree with your view regarding the yoke. Tried it, but without real tactile feedback (other than haptics) it feels weird.

Let’s hope Asobo will really take VR seriously. With Meta and all - it’s here to stay. The problems and missing features are really obvious to anyone who loves VR.

1 Like

@RomanDesign, since I posted my explanation above, I’ve been prototyping the “hand tracking to controller” emulation OpenXR API layer, and after a couple of Saturday afternoons, I’ve reached some pretty encouraging results. Using a Leap Motion Controller and a Windows Mixed Reality headset (Acer in my case, but it will work with Reverb too), I am able to use my hands to grab the yoke and navigate the buttons on the control panel (like the video on the thread).

There’s a lot of tweaking/configuration to do to make it a pleasant experience (hand offset and gesture calibration), and also since I am not an MSFS aficionado, I’m not sure what a pleasant experience should be like :slight_smile:

I’ll try to make a video soon, and after a bit more testing, I’ll release the project on GitHub (it’s under 1,000 lines of C++ so far). I’ll be looking for people to test it too.

Note that this is a personal project and not affiliated with Ultraleap nor Microsoft.

9 Likes

I’ve done a bit more tweaking today, and it’s now minimally usable. Still more work to do on the config file for MSFS 2020, but since I don’t really play the game, I’m not sure what people want. CC @RomanDesign.

Here’s a quick video of it in action:

MSFS 2020 with Windows Mixed Reality and Leap Motion hand tracking - YouTube.

(You’ll probably notice that I have no idea what the buttons that I am touching do :smiley:)

The code is open source here:

There are instructions for those adventurous-enough to try it out. The Alpha release #1 includes a basic configuration file for MSFS 2020 (as used in the video). Please submit feedback on GitHub directly:
Issues · mbucchia/XR_APILAYER_NOVENDOR_hand_to_controller · GitHub

If any developer is interested in contributing, I’m happy to help them get started with the project!

4 Likes

This is very encouraging news! I wish I could test it, but I don’t have UltraLeap yet. But I’m eager to see other people’s results, especially with Reverb G2, and your video when it comes out.

@mbucchia Kudos to you for taking this on! You deserve a lot of respect for taking your time and working on that. If you like some feedback on what features would be optimal, I can share my thoughts on this, based on your video when you release it.

I can tell you that if this gets to a really workable stage, this will be great news for MSFS VR community!

2 Likes

ALRIGHT!!! So many of us have been waiting for this to happen and it’s exciting to think that it may be sooooo close! Thanks for your time and talents in making this a reality, even if it’s in its infant stages.

1 Like

I’ve read the readme file and I’m very excited about the possibility… As soon as you have the video, I will watch it and will provide feedback. For now, just a couple of thoughts:

  • Keep in mind that the next update will hopefully fix the rough rotational controls on WMR, so don’t try to fix those - they are supposed to be bad in SU7. Does it actually work if you pinch your fingers and rotate?
  • Once everything works, one potential issue would be the lack of vibration feedback when you touch a control zone, which you get with a controller. I suppose that could be beyond the scope of an API layer, but it would be nice if controller vibration feedback could be translated into playing an audio file on a specific device (default or other). That way you could hear a “click” or some kind of “touch” sound when your hand contacts a control area, for example. As an option, the sound could be routed to the audio device used for transducers. Feeling a “click” in your bottom when you touch a control may not be super realistic, but it’s better than no feedback at all.

I agree with you on some of the difficult controller interaction.

What I found out during my testing is that the “aim” form of control (you have to aim at a button/knob) limits the type of gestures you can do. For example, I would like to do true pinching (where you use thumb and index to pinch), but I noticed that this gesture affects the aim too much (when you pinch, several of the bones in the hand move slightly, deflecting the aim ray), and you end up “missing” the button or knob.

So if you look at the video I posted above, I used the “index bend” instead of pinching to solve that: the base of the index finger (the proximal phalange) is used to aim, and that bone barely moves when doing an index bend. It does feel a bit less realistic :slight_smile:

But yes, once I do an index bend to grab a knob, I can rotate it (for the comms controls) or push/pull it (like the throttle).
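For the curious, the difference between the two gestures comes down to a bit of joint geometry. A rough sketch of the two detectors (the thresholds are arbitrary, and this is not the exact code from the layer):

```cpp
// Rough sketch of pinch vs. index-bend detection from XR_EXT_hand_tracking
// joints. Thresholds are arbitrary; not the actual code from the API layer.
#include <cmath>
#include <openxr/openxr.h>

static float Distance(const XrVector3f& a, const XrVector3f& b)
{
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Pinch: thumb tip close to index tip. Moving the thumb also shifts the other
// bones slightly, which deflects an aim ray attached to the hand.
bool IsPinching(const XrHandJointLocationEXT (&j)[XR_HAND_JOINT_COUNT_EXT])
{
    return Distance(j[XR_HAND_JOINT_THUMB_TIP_EXT].pose.position,
                    j[XR_HAND_JOINT_INDEX_TIP_EXT].pose.position) < 0.02f;
}

// Index bend: the index tip curls back toward the proximal joint, while the
// proximal phalange (used for aiming) stays put.
bool IsIndexBent(const XrHandJointLocationEXT (&j)[XR_HAND_JOINT_COUNT_EXT])
{
    const float tipToProximal =
        Distance(j[XR_HAND_JOINT_INDEX_TIP_EXT].pose.position,
                 j[XR_HAND_JOINT_INDEX_PROXIMAL_EXT].pose.position);
    // When the finger is straight this distance is roughly the finger length;
    // when bent, the tip comes much closer to the proximal joint.
    return tipToProximal < 0.045f;
}
```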

I’ve just read somewhere else on the forum that there is a way to change how the controller can be used (by pressing the B button or something like that) and go from “aim” to “grab”, so I’m going to look at that. It might be a better experience for hand tracking.

I haven’t played the game with the VR controller much, but what do you need besides a trigger? Any other action? You know how the HP controller has 3 buttons, a joystick, the trigger and the grab trigger. How many of these are actually useful in game?

It’s hard to configure several gestures at once because they tend to interfere with each other. I was thinking that:

  • Pinch or finger bend would act as a trigger to push the buttons or select a knob to turn;
  • Palm tap (using the index of one hand to tap the palm of the opposite hand) could be one of the buttons, for example to enter the menu.

Do you need anything else for cockpit interactions?
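For illustration, the kind of gesture-to-button mapping discussed above could look something like this inside the layer. This is just a sketch of the idea, not the project’s actual configuration format: the gesture names are made up, and the input paths are the standard OpenXR paths from the HP Reverb G2 controller profile (XR_EXT_hp_mixed_reality_controller).

```cpp
// Hypothetical gesture-to-input mapping, purely to illustrate the idea.
#include <map>
#include <string>

enum class Gesture {
    Pinch,        // thumb tip to index tip
    IndexBend,    // curl the index finger
    Squeeze,      // full fist / grab
    PalmTap,      // index of one hand taps the palm of the other
};

// Which emulated controller input each gesture drives.
const std::map<Gesture, std::string> kGestureBindings = {
    {Gesture::Pinch,     "/user/hand/right/input/trigger/value"},
    {Gesture::IndexBend, "/user/hand/right/input/trigger/value"}, // alternative trigger
    {Gesture::Squeeze,   "/user/hand/right/input/squeeze/value"}, // grab
    {Gesture::PalmTap,   "/user/hand/right/input/menu/click"},    // e.g. open the menu
};
```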

For haptics, it is possible to intercept haptic feedback by adding a little bit of code in the API layer, and playing a sound should be possible without too much effort.
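For example, the layer could override xrApplyHapticFeedback and play a short click instead of (or in addition to) forwarding the vibration. A rough sketch of the idea, assuming a Windows build and a placeholder click.wav next to the layer (none of this is the project’s actual code):

```cpp
// Rough sketch: turn haptic pulses into an audible click.
// Link against winmm.lib for PlaySound; "click.wav" is a placeholder path.
#include <windows.h>
#include <mmsystem.h>
#include <openxr/openxr.h>
#pragma comment(lib, "winmm.lib")

extern PFN_xrApplyHapticFeedback next_xrApplyHapticFeedback; // next in the layer chain

XrResult shim_xrApplyHapticFeedback(XrSession session,
                                    const XrHapticActionInfo* hapticActionInfo,
                                    const XrHapticBaseHeader* hapticFeedback)
{
    if (hapticFeedback->type == XR_TYPE_HAPTIC_VIBRATION) {
        // Play the click asynchronously so we don't block the render thread.
        PlaySoundW(L"click.wav", nullptr, SND_FILENAME | SND_ASYNC);
    }
    // Still forward the call, in case a real controller is also connected.
    return next_xrApplyHapticFeedback(session, hapticActionInfo, hapticFeedback);
}
```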

1 Like

I watched the video - sweet! Looks very promising. You just need to implement direct interaction (the B button). Which Leap Motion device are you using? I’m confused about the hardware options they offer and where to buy the right one…

Yes, absolutely - the “laser pointer” mode is (or should be) actually a secondary mode, and it’s only useful for interacting with switches that are blocked by your physical hardware and aren’t directly reachable. It’s also not that easy to aim with the controllers. The primary and most useful mode is actually the direct interaction mode - I think it’s the one you get after pressing the B button. Then you reach for the switch, which lights up, and interact with it. That’s what is needed the most.

Actually, that is the purpose of the hand tracking - reaching and interacting, not laser pointing…

So, you need to map that B button to some gesture, to be able to switch between laser and direct mode in an effortless manner. Ideally without having to use the other hand, which should stay on the yoke. Like maybe lifting the thumb and stretching the index finger in a “pew-pew” gesture, if that’s an option. Not sure what the options are.

Maybe having an option (a gesture, or a config file setting) to disable the left hand entirely would be a good idea. If the left hand is on the hardware yoke, we don’t want the UltraLeap to interpret it as grabbing the virtual yoke.

Also, test the buttons on GPS units etc. - the closer it can get to “pressing the button”, the better. So maybe there could be an alternative gesture for the trigger - for example, both pinching index and thumb, and bending the index finger, could be interpreted as a trigger press.

When I tested the VR controllers they often went to sleep in MSFS and disappeared, and I had to activate them with the grab trigger, then use the normal trigger from there on to interact. So maybe it’s a good idea to implement a grab gesture as well. That would wake up the ghost controllers in the game without accidentally triggering any switch.

Oh, one more thing: the “Windows” key on the WMR controller is a must, because many of us bring desktop windows into the sim - I use it for PDF charts, making notes with a pen tablet directly on chart PDFs, and seeing any programs running alongside MSFS. It’s crucial to have the ability to bring it in. “Windows menu” could be one of the complicated gestures, because it’s only needed when setting up the flight, so a two-handed gesture would be OK - something that’s impossible to trigger accidentally. While not critical, this would save us from having to pick up a controller at all, even when setting up a flight.

So, based on what I wrote so far, support is needed for the following, as far as I can imagine without actually trying it:

  1. Trigger - two gestures mapped to the same trigger button: pinch, and bending the index finger.
  2. Grab trigger - for activating the virtual controllers, and possibly operating door handles etc. via grabbing / fist gesture.
  3. B button - for switching between laser and direct mode.
  4. Windows menu - for bringing in Desktop window - via two-handed gesture.

For the sake of future compatibility, I would suggest also mapping the A button to one of the unused gestures. At the moment, the thumbstick can only be used as a shortcut in menus; for example, as a workaround for the badly behaving rotation controls, one can map the thumbstick to value+ and value- to dial in values. But I think that would be too complicated for gesture control. So as long as both triggers and the B button are supported, it’s usable - and with the Windows menu and the A button we’re golden!

1 Like

Just wanted to pop in and show my support for the work being done here. Firstly, thank you!
I don’t have a Leap controller but it’s nice to finally see WMR native OpenXR getting some actual development and forward motion (no pun intended :grinning:) instead of having to rely on SteamVR.

Maybe it can form the basis for many a hand-tracking implementation to come!

Bravo and good luck!

2 Likes

PS. Next thing is to swap out those pesky controller models for actual representations of hands in 3D space :star_struck: