Hand Tracking in MSFS - Leap Motion / UltraLeap native support for WMR/OpenXR

Guess what, I just bought a Leap Motion controller! I couldn’t resist - I saw a local listing from a guy in my neighborhood and got one in pristine, like-new condition, in the box etc., for $80 CAD (about $63 USD). So let me know when your next version is out, with B button support for direct mode and whatever else you fit in there, and I will be happy to test it and provide feedback. Direct mode is essential, so there isn’t much sense in testing the version without it. I’ll need a few days to set it up, print a 3D mount etc., but then I’ll happily be an alpha/beta tester for you.

That would be great, but I’m afraid that part has to be done by Asobo. I don’t think it’s possible to replace controller models in MSFS from the outside. But who knows, maybe it could be modded somehow… Even if the models are replaced, the fingers won’t be moving. I’d love to see the pilot’s whole hands in VR, not disembodied blue hands - the whole body, actually; it would be more immersive. We probably need some Asobo effort for that, though…

I know you won’t like this, because I know you like your hardware controls :slight_smile: BUT, you could disable the hardware yoke instead, but leave it in place on your desk. Then, when you grab the physical yoke in the real world, MSFS/hand-tracking would register the grabbing of the virtual yoke in the virtual world and the two would not conflict! You would get the feel of the physical yoke, but MSFS would be using its virtual yoke… :laughing: :laughing: :laughing:

I don’t think that would work. That would require the real yoke to be perfectly aligned with the virtual one - same position, same height, same throws. That’s not going to happen :slight_smile: Also, Leap Motion will lose tracking from time to time, certainly when you turn your head to look left or right, so you would only be able to control your yoke while you are looking at it. That’s not an option. I’m not sure disabling the left hand is needed - that has to be tested, but it may be useful. Maybe Leap Motion won’t even track your left hand when you’re grabbing the yoke, as it can’t see your fingers then. But controlling the virtual yoke while grabbing a hardware yoke is not likely to work. You can test it, of course.

Thank you both for all the input, this is exactly what I needed!

The device I’m using is the base Leap Motion Controller (the one that is $89 USD MSRP, though I got mine off eBay for $60 USD). Don’t buy from Amazon - they are mega overpriced there for some reason! I’m using a simple “Command hook” sticky strip to mount it on my headset (though I am using an Acer headset for convenience, which has a nice flat surface for sticking; the Reverb might be a different story…).

It does appear to lose tracking once in a while when doing certain specific gestures, but it’s still acceptable. And this might actually help with the yoke issue! As you mentioned, with the fingers getting obstructed, this might be all we need to avoid virtual yoke interference. I already have an option to fully disable one hand or the other, but that’s obviously not great for playability.

The concept described looks reasonable. As for the 4 actions, I think they are doable; the Windows key might be challenging, however, because it’s technically not handled by OpenXR but by the system. I might find a way to intercept the window handle and simulate it another way.

I’ve tested the direct mode (button B) and it wasn’t as great as I hoped, but I think that’s because it needs more calibration of the hand ray position (which is very difficult to do “blindly”). For that reason, I am currently (well, at night) investigating drawing the fully articulated hands in the application. I managed to do this in the API layer for an unmodified OpenXR sample, but it’s not working properly with MSFS yet. If I can get that done, it will make tweaking the config file much, much easier (and it might solve 2 issues at once by giving nicer visuals).
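
For anyone curious what getting the hand data looks like: below is a minimal sketch of querying joints through the XR_EXT_hand_tracking extension. This is just the shape of the standard extension API, not the actual layer code:

```cpp
// Minimal sketch of querying XR_EXT_hand_tracking joints each frame.
// Illustrative only - not the actual API layer code.
#include <openxr/openxr.h>

// Extension functions must be loaded via xrGetInstanceProcAddr.
PFN_xrCreateHandTrackerEXT pfnCreateHandTracker = nullptr;
PFN_xrLocateHandJointsEXT pfnLocateHandJoints = nullptr;
XrHandTrackerEXT rightHandTracker = XR_NULL_HANDLE;

void createTracker(XrInstance instance, XrSession session) {
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnCreateHandTracker));
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          reinterpret_cast<PFN_xrVoidFunction*>(&pfnLocateHandJoints));

    XrHandTrackerCreateInfoEXT createInfo{XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    createInfo.hand = XR_HAND_RIGHT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    pfnCreateHandTracker(session, &createInfo, &rightHandTracker);
}

// Called once per frame: returns 26 joint poses (plus radii) in baseSpace.
void locateJoints(XrSpace baseSpace, XrTime displayTime) {
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = displayTime;
    pfnLocateHandJoints(rightHandTracker, &locateInfo, &locations);

    if (locations.isActive) {
        // joints[i].pose is where each debug cube (or mesh bone) gets drawn.
    }
}
```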

If I can get that to work, I’d love for someone to look into whether there’s a way (even a hack) to disable the existing controller visuals so that we only see the hands drawn by the OpenXR API layer.

There’s one aspect I totally forgot to mention, and I’ve done zero work on it: performance. I have no clue how the hand tracking (both the additional processing done by Leap itself and the overhead of the API layer) affects game performance. It’s definitely a risk to keep an eye on.

@RomanDesign let’s start a DM thread so we can discuss beta testing without spamming the forum!

Yeah, it was kinda meant tongue-in-cheek - was just thinking outside the box about how to solve the dual inputs problem!

That’s fair enough, but don’t keep the rest of us in the dark for too long! :slightly_smiling_face: This is interesting stuff. I might even have to get myself a Leap!

Please continue the discussion here (even though it’s beta) as I’m also very interested in your progress.

I’m therefore planning to buy a Leap Motion controller as soon as I can get hold of a cheap one and will then help with testing.

OK, I’ll keep posting updates.

Though, to be clear to everyone - @wheeliemonsta and @RPthreenine in particular - I still consider this project “risky”, i.e. there’s a chance it just won’t be satisfactory (tracking stability insufficient to make it a good experience, too much performance overhead, or getting all of this right taking far more personal time than I can invest).

In other words, there’s still a chance we won’t get this to work perfectly. I just want to be sure I say that before people start buying devices here and there and then hold me accountable for their expenses if the project fails!

Glad to help! I’m really looking forward to making this work, for the VR MSFS community. You’re doing a great job!

Great, that’s what I bought. I checked it today and it’s working fine. I will 3D print a mount. It seems that the best position would be on top of the HMD tilted 45 degrees down - camera should get the best view that way. So I will print a GoPro style mount that can pivot.

You may want to look it up, but in a way, this key takes the place of the menu button, of sorts. Also, pressing the keyboard Win key opens the VR menu just the same as the Win key on the controller does. So if you can just send a Win keystroke via the keyboard rather than the VR controller, I think it should work just fine.
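
For what it’s worth, the Win32 side of synthesizing a Win-key tap is simple; here’s a minimal sketch using the standard SendInput API. Whether WMR/MSFS actually reacts to an injected Win key is exactly the part that would need testing:

```cpp
// Minimal sketch: synthesize a Windows-key tap with the Win32 SendInput API.
// Whether WMR/MSFS reacts to an injected Win key still needs testing.
#include <windows.h>

void tapWindowsKey() {
    INPUT inputs[2] = {};

    inputs[0].type = INPUT_KEYBOARD;
    inputs[0].ki.wVk = VK_LWIN;             // left Win key down

    inputs[1].type = INPUT_KEYBOARD;
    inputs[1].ki.wVk = VK_LWIN;
    inputs[1].ki.dwFlags = KEYEVENTF_KEYUP; // ...and back up

    SendInput(2, inputs, sizeof(INPUT));
}
```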

Wow, that would be absolutely fantastic, I didn’t think it was possible.

And I could update the topic, or create a new one, to ask Asobo to make the controller model tweakable and moddable (a transparency setting and model replacement). It would be useful for everyone: in X-Plane, a custom “ghost” controller that was almost transparent worked much better, and that’s what many people used. Many would prefer a fully transparent model, or just a pointer instead of the default controllers - or a model of their actual controller instead of the generic one.

Needs testing, but I don’t expect much of an FPS hit. We can also launch apps with CPU affinity set to the last couple of cores, via the command line or the Process Lasso app. With modern CPUs there are plenty of cores to take care of MSFS - it only uses a few. I have 12 physical / 24 logical cores, so it shouldn’t get in the way of MSFS if you keep it off cores 0/1 and the 2 preferred cores that the system uses a lot.
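
As an illustration of the idea: from a command prompt, `start /affinity <hexmask>` launches a program restricted to specific cores, and in code the same restriction is one Win32 call. A minimal sketch, where the mask (cores 22/23 of a 24-thread CPU) is just an example for this particular core count:

```cpp
// Minimal sketch: restrict the current process to logical cores 22 and 23
// of a 12-core/24-thread CPU, keeping it off cores 0/1 and the "preferred"
// cores. The mask is only an example for this core count.
#include <windows.h>

void pinToLastTwoCores() {
    const DWORD_PTR mask = (1ull << 22) | (1ull << 23);  // bits 22 and 23
    SetProcessAffinityMask(GetCurrentProcess(), mask);
}
```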

You can see that people want to read about the progress here, so I think we should continue here. This will also keep the topic on top, so more people could read about your API layer and maybe Asobo will take notice and make things easier with time.

It’s always prudent to make such a disclaimer, but I think that’s very clear to everyone involved. I bought the device knowing the risk, as I’m sure others would too. I’m very eager to see it work and I really hope it will be a solid and smooth experience, but obviously you can’t be held responsible if you’re short on time or hit a technical brick wall in making it work well. I’m keeping my fingers crossed, though. You’ve already made amazing progress, and it’s a very promising proof of concept. As far as I understand, the minimal acceptable functionality is almost there, basically just needing more adjustment of the offsets. I’m sure you can do it. If it works, everything else, like working hands, is a bonus.

I’ll help with testing wherever that’s useful… I should be ready after the weekend, once I get the mount printed and set up.

Totally understood. We all do this as a hobby and it’s fun to play with stuff but I certainly won’t be getting angry if it doesn’t work :+1:

Not a problem, Matthieu! I am very excited that you have joined our community to further improve our VR experience, and I definitely want to help you wherever I can. I see this as a very interesting project, and I am very aware that every project has a possibility of failure.

Being an IT guy myself, it won’t hurt me (or my wallet) that much to invest a couple of bucks to experiment with new gadgets :slight_smile:

Thanks all for the kind words!

Here’s a screenshot showing my latest progress from tonight:

This shows:

  • Articulated hand skeleton rendered inside the game. Not sure yet WHY there are two right hands :smiley: I have to debug! You can see the controller model hidden behind.
    It’s not super aesthetic yet (it’s just a bunch of RGB cubes), but it shows the joint locations, which really helps for debugging and coming up with the calibration. As a bonus, we can add a nicer hand mesh later, but beware that this “post-processed” rendering is unaware of lighting, so it will always look a little weird. Also, this only works with DX11 for now, but I think I know an easy way to make it work with DX12 later on.

  • I’ve started a configuration UI so that parameters can be updated in real time instead of having to Ctrl-TAB out of and back into VR. Super helpful for tweaking that “ray” orientation. I’m finally sort of able to use the direct interaction mode, and it feels more natural now that the orientation is more in line with the hand, but it still needs some adjustments. (See the sketch right after this list for one way live updates can work.)
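
For context on the real-time updates: even without a full UI, one common lightweight approach is to hot-reload the config file whenever its timestamp changes. A purely illustrative sketch - the parameter names are hypothetical, and this is not necessarily the layer’s actual mechanism:

```cpp
// Illustrative sketch: re-read a config file whenever its modification time
// changes, so parameters update live without leaving VR. Parameter names
// are hypothetical - not the layer's actual mechanism.
#include <filesystem>
#include <fstream>
#include <string>

namespace fs = std::filesystem;

struct HandConfig {
    float rayPitchDeg = 0.0f;  // hypothetical "ray" orientation offsets
    float rayYawDeg = 0.0f;
};

HandConfig g_config;
fs::file_time_type g_lastWrite{};

// Call once per frame; cheap when the file has not changed.
void reloadIfChanged(const fs::path& path) {
    std::error_code ec;
    const auto stamp = fs::last_write_time(path, ec);
    if (ec || stamp == g_lastWrite) {
        return;
    }
    g_lastWrite = stamp;

    std::ifstream file(path);
    std::string key;
    float value;
    while (file >> key >> value) {
        if (key == "ray_pitch") g_config.rayPitchDeg = value;
        else if (key == "ray_yaw") g_config.rayYawDeg = value;
    }
}
```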

Also, one major issue I’m having now is that I somehow broke Ctrl-TAB - it now crashes the game :confused: I’ll have to investigate this ASAP. I just added a lot of code at once for the hand rendering, and I must’ve broken something.

That looks great! I’m wondering: how do you manage to show only one eye of the actual VR view within the WMR window? That is, by the way, also functionality that would be more than great to have :wink:

Could this have to do with some of the former “bugs” where Ctrl-TAB was giving a CTD in the 107 runtime?

On a side note:
I’ve just pulled the trigger on an eBay Leap Motion Controller for €40. It might take some days before it arrives, but I’ll soon be ready to do some testing :slight_smile:

That is absolutely amazing. Thank you very much for your efforts!

This is a feature of the WMR portal:

  • There’s a little “Play” button that should appear at the bottom-middle of the portal window.
  • However, it is only available if your PC qualifies as “Ultra” (and if you are playing MSFS, you likely qualify).

But this view has 2 issues:

  • It may affect performance (extra copies on the GPU)
  • It does not play well with the MSFS keyboard/mouse focus model (you still need the other MSFS window to have focus to get keyboard/mouse inputs)

Thanks for the suggestion, but it’s not the same symptoms (a hard crash, not a failure to reinitialize VR), and the way we (the OpenXR team) fixed the other (less common) hard crash would make it truly impossible to happen. I had a very similar issue when I first started the API layer project, and it had to do with the ordering of the calls during cleanup - I suspect something similar here too.
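
To illustrate the kind of ordering bug meant here (an assumed shape, not the actual layer code): an API layer that creates its own resources from a session must release them before forwarding the destroy call down the chain, or it may later touch dead handles:

```cpp
// Assumed shape of the bug, not the actual layer code: release everything
// the layer created from a session BEFORE forwarding xrDestroySession,
// otherwise later code may touch dead handles and crash.
#include <openxr/openxr.h>

extern PFN_xrDestroySession nextXrDestroySession;  // next layer/runtime
extern PFN_xrDestroyHandTrackerEXT pfnDestroyHandTracker;
extern XrHandTrackerEXT g_handTracker;

XrResult hookedXrDestroySession(XrSession session) {
    // 1) Tear down the layer's own session-scoped resources first
    //    (hand tracker, D3D buffers used for hand rendering, ...).
    if (g_handTracker != XR_NULL_HANDLE) {
        pfnDestroyHandTracker(g_handTracker);
        g_handTracker = XR_NULL_HANDLE;
    }

    // 2) Only then let the runtime destroy the session itself.
    return nextXrDestroySession(session);
}
```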

This is fantastic! You’ve made great progress.

I think you should make the mesh dark (black or dark grey) and much more opaque than the default ghost controllers. A very dark texture would not look so weird, because the lack of shadows etc. won’t be as noticeable. And more opacity will make it dominate the view and make the hands look more solid than the ghost controllers.

This is surpassing my expectations - the fact that you can actually inject 3D graphics inside MSFS is amazing. I thought it wasn’t possible for some reason. Actually, this means it must be possible to do for MSFS what has long worked great in X-Plane: creating an application with an actual 3D model of an iPad that shows PDF charts and has a scratchpad, instead of the unrealistic 2D window implementations that exist now. Or maybe “skin” a WMR desktop-inside-VR app interface to look like a tablet. But that’s another story…

I’m 3D printing a modified Leap Motion holder now:


From initial tests with the UltraLeap Visualizer, it seems that tracking is sensitive to the central “hotspot” of its IR camera, and the best tracking happens in the center of the image. So the best coverage would come from an HMD-mounted controller - but not flat, as most people seem to have it, rather at about 45 degrees down. You don’t move your hands in front of your face when in the cockpit, yet everyone mounts the Leap Motion flat for some reason. There is a “tilt angle” adjustment in the Ultraleap OpenXR Hand Tracking API Layer, so that should work. I rigged a kind of GoPro holder to be able to mount it on the top edge of the HMD and tilt it down as needed.

Small update: the 45-degree angle didn’t work at all, at least not with the GitHub - SDraw/driver_leap: Self-sustainable fork of SteamVR driver for Leap Motion controller with updated vendor libraries driver that I’m experimenting with in SteamVR (easy to tune) - there’s too much distortion when moving the head left-right. The controllers seem to rotate when turning the head while the Leap Motion camera is at an angle. It looks like it works much better the traditional way (mounted flush on the HMD), so I’m printing a Reverb G2-specific holder.

GitHub - ultraleap/OpenXRHandTracking: OpenXR API layer enabling XR_EXT_hand_tracking support using Ultraleap tracking does have a tilt adjustment setting, but even if it works well, I still want the Leap Motion to work with other SteamVR games, not just OpenXR.

I printed and attached the holder. I tested the visualizer and the OpenVR runtime - it works. I installed the Ultraleap OpenXR Hand Tracking API Layer, and I’m ready to install the next release of @mbucchia’s layer and test it in MSFS.

General Leap Motion hand tracking feedback:

It’s an amazing piece of technology, but it’s not ideal, and pressing buttons can be a bit glitchy - or maybe I’m just not used to it. Hands are tracked for most of my field of view, but they disappear close to the border of my vision. That’s not too bad, as I usually look where I touch, but reliable tracking still covers less than the full FOV of my modded Reverb G2 with a custom-printed thin gasket. I don’t expect it to work 100% reliably in MSFS; it’s sure to have some hit-and-miss issues. But the benefit would be amazing, so some occasional glitches should be tolerable. When it’s tracking, it’s tracking quite reliably - I see no glitches in movement until a hand gets close to the sides and disappears. I think there’s still some camera FOV left; it’s just getting dark, so maybe if some additional IR LEDs were used to light up the cockpit area, the tracking would improve even further. That’s worth exploring in the future, once everything else works.

The most precise gesture for the trigger seems to be touching the middle of the index finger with the thumb. The hand stays steady - much steadier than with other gestures - so that should definitely be used for triggering switches and buttons. Its detection also seems to be the most precise, with almost no false or missed triggers. It almost feels like a button, it’s that precise. It’s also similar to physically flipping switches, so it’s more appropriate than a “trigger pull” gesture.
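
For the curious, detecting that gesture from XR_EXT_hand_tracking data boils down to a distance check between two joints. A minimal sketch - the 2 cm threshold is a guess that would need tuning, and this is not necessarily how the layer implements its trigger:

```cpp
// Minimal sketch: detect "thumb tip touching the middle of the index finger"
// from XR_EXT_hand_tracking joints. The 2 cm threshold is a guess; this is
// not necessarily the layer's actual trigger logic.
#include <openxr/openxr.h>
#include <cmath>

bool isIndexPinch(const XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT]) {
    const XrVector3f& thumbTip =
        joints[XR_HAND_JOINT_THUMB_TIP_EXT].pose.position;
    const XrVector3f& indexMid =
        joints[XR_HAND_JOINT_INDEX_INTERMEDIATE_EXT].pose.position;

    const float dx = thumbTip.x - indexMid.x;
    const float dy = thumbTip.y - indexMid.y;
    const float dz = thumbTip.z - indexMid.z;

    // Joint positions are in meters; within ~2 cm counts as a "press".
    return std::sqrt(dx * dx + dy * dy + dz * dz) < 0.02f;
}
```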