Hand Tracking in MSFS - Leap Motion / UltraLeap native support for WMR/OpenXR

To begin with, a big THANK YOU to @mbucchia for implementing that feature and to @RomanDesign for starting the thread and pushing it forward.

I use a Reverb 1 and got a Leap Motion controller as a Christmas gift, and now I am taking my first steps. The controller and support programs (most recent versions) are working fine and I can see the skeleton hands in the cockpit - certainly a good start.

Now comes calibration, and this is where I’m stumbling. There may be solutions hidden in this long thread, but perhaps someone can give me a hand at this point.

  • The config file is still under C:\Program Files\OpenXR-Hand-To-Controller and write-protected. As I learned in this thread, the proper procedure would be to save under /MyDocuments and then copy it over manually - after every change I make…?

  • I would be grateful for a recipe for handling the config program and the simulator in parallel. Right now it’s a continuous Alt+Tab, then headset off, as I need my glasses to read the values on screen, and on again (I use lens inserts within the headset). As far as I understand, configuration isn’t possible from within the sim cockpit right now, but how are you clever guys doing that?

Overall I think this has a lot of potential and I am already excited to see the hands moving in a proper way just as in @mbucchia’s video above (just not where I want them).

Thank you!

To overcome the write-protected file issue, you can run the configuration application as administrator (right-click the application icon and select Run as administrator - it may be under the More submenu).
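If you end up relaunching it a lot, you could also script the elevated start. A minimal sketch, assuming the tool is installed under C:\Program Files\OpenXR-Hand-To-Controller - the executable name below is a placeholder, so check your install folder for the real one:

```python
# Launch the hand-to-controller configuration tool elevated (Windows only).
import ctypes

# Placeholder executable name - replace with the actual one from your install folder.
CONFIG_TOOL = r"C:\Program Files\OpenXR-Hand-To-Controller\ConfigUI.exe"

# "runas" triggers the same UAC elevation prompt as "Run as administrator".
result = ctypes.windll.shell32.ShellExecuteW(None, "runas", CONFIG_TOOL, None, None, 1)
if result <= 32:  # ShellExecute reports failure with a value of 32 or less
    print(f"Launch failed, ShellExecute error code {result}")
```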


Thanks, I’ll try this, sounds like a good idea.

The end goal is to create and deliver with the tool a config that works for everyone. In that case there’s no need to use the config tool - unless you wish to adjust something very specific. For now you do indeed have to go back and forth to the desktop.

One thing you can do if you are adjusting things like the offsets and/or sensitivity is make sure the config tool is focused (and not FS2020), then, with the headset on, use Tab/Shift+Tab to move from one setting to another and the Left/Right keys to adjust values. This “blind” interaction is how I got most of my default settings done. Not great, but still faster than going in and out.


@mbucchia Any idea how/when you can fix the two most show-stopping bugs I listed in my long post above? Do you have an idea about what’s going on there and why?

I can’t wait to continue testing this. I have to tell you, that combined with a cockpit motion simulator with vibration transducers and custom large-throw pendular yoke and other hardware, this hand-tracking miracle of yours creates the greatest VR immersion I have ever experienced. I don’t want to use the clunky VR controllers anymore, they now seem hopelessly obsolete…

BTW, to all concerned, a 30-foot (10 meter) active USB extension cable from Amazon works fine for the Leap Motion Controller, if anyone is interested. I needed it because the included cable is very short, and I have VR on a long extended cable setup (an extra 16 feet of DP and USB 3 extender cables in addition to the new 2.0 HP Reverb G2 cable).

I forgot to reply but I saw the post. Thanks again for all the feedback! I agree with you that we should try some sort of keepalive so that the controllers don’t go to sleep. Hopefully that will work and resolve both issues.

As for when, I’m hoping to make a release of the brand new toolkit with the same level of hand tracking support (meaning none of the newer improvements like haptics etc., but I will try this keepalive fix and the index finger tap fix) by the end of the month.

Once all the code is in the toolkit, extending the features will be much quicker!


Perfect! With that issue fixed, I’m positive it’s going to be a fully usable solution for everyday flying. All future improvements would be a bonus. There are some MSFS VR implementation issues, like the somewhat rough rotational controls, but for me and many others those are taken care of by hardware encoders, throttles etc. The worst experience was all the secondary controls and switches not covered by hardware, which felt clumsy, unrealistic and immersion-breaking with controllers (and just dreadfully bad with a mouse or trackball). And this solves it. When I unlocked and opened the TBM door with my left hand, I couldn’t help laughing out loud - it felt so good, and like the real spirit of VR innovation.

Hi, just installed the Beta and it is great. I have a novel question. Is there a way to use the software to just display the skeleton hands but without any cockpit interaction and without the controllers popping up? What I want to do is extract the 3d model for a particular cockpit and then replicate it IRL using encoders/switches etc, and use the skeleton hands in simulation so as to locate the switches (which will be 1 to 1 with the virtual cockpit).

Any help appreciated.

Fun use case 🙂

I think if you literally unbind all the actions (in the binding tab) and start the game, you will have the hands show up, but since no action will ever be triggered, the game will never show the virtual controllers.

Perfect. Works great. Time to fire up the 3d printer.


Very interesting idea. Before you fire up the 3D printer though, you need to test how this would work. There are a couple of potential problems with that approach:

  1. You have to set the HMD position (spacebar) and, most importantly, the HMD angle very precisely. A slight difference from flight to flight and your hardware will not match VR. That’s difficult. I find that the most precise zero position is three notches forward and one up (I map the actions to VoiceAttack, so I say “move seat forward” 3 times and “move seat up” once, or use a macro I set up for that). But the angle is more difficult to set the same every time. A very small difference and the radios will be several centimeters off (see the quick arithmetic after this list).
  2. LeapMotion tracking seems to be slightly distorted in a non-linear way. I guess that’s from the IR camera lens distortion, so the motions may not map exactly to the physical space around you. It’s close enough not to matter in VR, but mapping to the real world may give you a significant margin of error.
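To put a rough number on how sensitive point 1 is: at arm’s reach, the offset grows with the tangent of the angular error. A minimal back-of-the-envelope sketch, where the 60 cm reach and 2° error are just assumed example figures:

```python
# How far a panel control ends up from its real-world twin for a small error in HMD angle.
# Example figures only: 60 cm from head to the radio stack, 2 degrees of residual angular error.
import math

panel_distance_m = 0.60   # assumed reach from head to the panel
angle_error_deg = 2.0     # assumed error left after recentering

offset_cm = panel_distance_m * math.tan(math.radians(angle_error_deg)) * 100
print(f"{offset_cm:.1f} cm offset at the panel")  # roughly 2 cm
```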

I also have some hardware, but it’s not at the real positions. I also fly different aircraft so it’s moot anyway. But my yoke matched exactly to a TBM yoke, which I didn’t expect. My left “virtual” hand hugged the left yoke handle in the plane. That was neat!

Position calibration will probably be needed every time you start MSFS, but you can adjust the VR head position with the keyboard. E.g. put your real hand/finger on some control, then adjust the VR head position until the virtual hand is on the same virtual control as your real hand.

In the upcoming OpenXR Toolkit, there will be an on-screen menu (displayed in the headset), something basic like the on-screen menus of DVRs and monitors.

This is not meant to replace the current hand tracking config tool, because there are SOooooooo many settings for hand tracking that this on-screen menu is not practical. However I can fit 2-3 of the most important settings on there. There will be a “hand tracking on/off” already. What other simple choice/slider options would make sense?

For the advanced settings like the gesture binding and near/far sensitivity, the same config tool will be available.

  • enabling/disabling left hand (or toggle left/both)
  • controller reset of some kind (when/if implemented - for when a ghost controller is stuck floating in your view)
  • laser/direct mode toggle

Actually it would be nice to have a keyboard shortcut for all of those as well, because then they could be assigned to VoiceAttack and be voice operated.

Got this running on my Pimax 8KX with the hand tracking module under SteamVR!

It’s not quite useable yet but the potential is there!!!

I have to figure out how to “orient” the camera in the Ultraleap software. In the visualizer, objects are higher than in real life, resulting in my virtual hands being in a higher position. Is there an Ultraleap camera tilt offset in a config file anywhere? I think it’s off by about 15 degrees.
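For reference, what I’m after is basically rotating the tracked positions around the device’s side-to-side axis. A rough sketch of the geometry, purely for illustration - the 15 degrees is just my estimate, and the axis convention, sign and sample numbers are assumptions, not anything from the Ultraleap software:

```python
# Illustrative only: applying an estimated pitch correction to a tracked hand position.
# Assumed convention: x = right, y = up, z = forward from the sensor; sign depends on tilt direction.
import math

def rotate_pitch(point, angle_deg):
    """Rotate an (x, y, z) point about the x (side-to-side) axis by angle_deg."""
    t = math.radians(angle_deg)
    x, y, z = point
    return (x,
            y * math.cos(t) - z * math.sin(t),
            y * math.sin(t) + z * math.cos(t))

hand = (0.0, 0.35, 0.40)          # made-up raw position in metres
print(rotate_pitch(hand, -15.0))  # position after removing the estimated 15-degree tilt
```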

Also, the ‘laser pointers’ in MSFS are terribly annoying: when they are on screen, I can’t use the mouse, and it sees my hand on the mouse, so it activates the pointer. That, combined with the natural curl of my hands, means all kinds of things get clicked when I don’t want them to!! I was able to adjust the orientations to something that makes sense using your config UI; wrist aiming seems to be the best.

Also, I can’t seem to get the laser/direct B gesture going.

Looking forward to future updates!!

Which software? Sorry, there are a few things discussed in this thread. I’m certainly interested at some point in finding out whether my OpenXR hand tracking software works with Pimax, but it sounds like you are talking about a different SteamVR mod here?

I’m talking about OpenXR hand_to_controller inside MSFS (although I also use your NIS software), and yes, it works fine with Pimax.

I had previously tried that SteamVR Leap mod by SDraw, but yours is much better so far since it draws the hands and the controller positions are easily configured, and although I have not tinkered much with gestures yet, those look simpler to tweak as well.

Oh cool!! Thanks for letting me know that it works. This is the official Pimax hand tracking module (by Ultraleap) right, and not just the first generation “Leap Motion” device?

I’ve been lagging behind a little bit on this software since the initial release, but I’m finally picking back up and going to work on some of the issues and improvements.

Yep it’s this one that bolts on underneath the unit

Great! And all you did was follow the instructions already in my Readme, so install Gemini + the Ultraleap OpenXR layer, and that’s all?