Hand Tracking in MSFS - Leap Motion / UltraLeap native support for WMR/OpenXR

Great! I think I got all of my issues solved:

  • CTD on exit VR should be fixed, but needs more testing
  • Hand display is fixed (no more double vision)
  • Adjusted the hand rendering. It's still pretty crude, but at least it looks better (no more RGB coloring).
  • Added a skin tone option, to be inclusive and also to let people select brighter/darker based on preference
  • The configuration tool is fully functional: all options can be changed in real time and saved to a file to make them permanent
  • Support for binding a gesture to the Windows menu button (note: this only works one way right now; once the Windows menu is up, hand tracking no longer works until the app is resumed using keyboard/mouse)

Still to do:

  • Tweak default values and produce a configuration file for MSFS. Hoping you’ll be able to help with that!
  • Support squeeze gesture
  • Support hand opacity setting (currently fully opaque)
  • Support DX12
  • Optimizations! Mainly caching hand poses to avoid duplicate requests between the hand pose code, the gesture detection code and the rendering code.

I’ll see if I can make a video tonight and release a new version, probably in a good enough shape to be upgraded to Beta stage!

1 Like

Great news! I'm looking forward to the video and the beta release. I will test, including offset values etc., and will provide feedback. I will happily help create and tweak the MSFS config file.

I think Asobo just opened the public beta for the upcoming update to Steam users. It has VR controller fixes and updates, including button mappings and rotating WMR controls. I will install the beta and test on it.

I saw a picture somewhere of the Leap mounted on the chest, i.e. below the headset. I know it's not perfect in terms of where to run the cable, but perhaps it would give you better tracking of hands in the cockpit?
I guess overhead switches would be the biggest issue here?
I don't know because I don't have one, of course, but would it work mounted facing you, like a webcam? Perhaps that would provide a good tracking area? Just ideas.


Would you be willing to share the STL for the updated G2 Holder?

Strange, I don't have that "Play" button. I consider my rig quite powerful (11900K, 3090 OC, 32GB CL16 3600, NVMe). Any settings I might have forgotten?

Here’s a video showing my latest progress (sorry the quality/lighting isn’t the best):

Progress update on MSFS 2020 with Windows Mixed Reality and Leap Motion hand tracking - YouTube

You can see the hands being displayed in the cockpit, using “left index to right index tap” to emulate the B button (switching between ray mode and direct interaction), “left wrist tap” to go to the menu (but could be replaced with Windows button too in the config).

Then I go and grab the altimeter calibration knob by "pinching" then turning, which feels pretty natural; I also use the dashboard dimmer in the same way. It's a little bit hard to see in the video, but you can see the altimeter needle move and the lights dimming. I use the same pinching to grab and pull the throttle.

You can then see me use "squeeze" to switch hand focus (switch which hand is active) and release the brakes (which did take me a couple of tries; I think the squeeze detection needs a bit of tweaking). I obviously have no clue what I'm doing in the cockpit, so I start swerving to the side of the runway, then I visibly panic and pause to the menu :smiley: :smiley: :smiley:

Here are also some screenshots of the configuration tool showing what can be tweaked:




I've implemented a "thumb press" action, which is what you suggested in your latest post, @RomanDesign, but I haven't tried it yet. Definitely something to experiment with and compare with the pinching!

Fairly happy with the handling so far! I did find a quite noticeable bug that will delay the beta release: in the headset, there is an error in projecting the hands, so the way they are drawn is a bit blurry. But I think I'll be able to fix this quickly. I'm hoping to release the beta tomorrow or Saturday.

2 Likes

Fantastic news! The config menu looks much more extensive and customizable than I expected. Great job! Better than both the OpenVR and OpenXR layers I've installed - no need to mess with text config files and restart every time.

I tried to upload this remix to Thingiverse, but it's so buggy it won't let me - I get 404 errors when trying to upload anything there. This site doesn't allow ZIP or STL files, so I renamed it to a PLN file - rename it back to STL for printing.

I modded a file I found on Thingiverse to retain the Reverb G2 curve on the back, but I cut away all the bulk so it doesn't get in the way of the cameras and allows placing it a bit lower on the HMD than the unmodded file. Hopefully that reduces the chances of interfering with HMD or original controller tracking. It's easy to snap the Leap Motion controller off when not needed. I guess this is the best mount for G2 owners now - I couldn't find anything better, so I modified the bulky one I found in Blender and exported it back to STL. It printed fine. The bottom corners are at about 45 degrees and I printed without supports, so they came out a bit uneven on my printer. The part that fits the Leap Motion itself is a standard unmodified developer mount STL found on Thingiverse.

MotionLeapHolder_ReverbG2.pln (19.4 KB)

1 Like

I just learned something myself - this feature is not available when you select “Optimize for performance” in the “Headset display” → “Experience” settings. I am guessing this must be the setting you are using? Maybe try the other settings too?

As mentioned, this feature does incur a small performance penalty, so it would make sense to force-disable it in performance mode.

Note that with what I've learned in the last 2 weeks about writing OpenXR API layers, I think it would be quite possible to write a small API layer that creates a preview window and blits only one of the two eyes onto it. However, this would (still) lower performance (though I don't know by how much).

In short: take some D3D sample code showing how to render a triangle onto a window (there are tons of those examples out there), then look at the code I wrote for rendering the hands in my hand tracking code, and instead of rendering the hands just do a CopyResource of the left or right eye texture to the window from the D3D sample. It's probably a day of work at most for a D3D programmer with minimal knowledge.
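
To make that concrete, here is roughly what the copy step could look like - just a sketch under those assumptions, not working code from the layer:

```cpp
// A sketch of the copy step, assuming the layer already has:
//  - eyeTexture:      the ID3D11Texture2D the game submitted for one eye
//                     (grabbed from the swapchain image in an xrEndFrame hook)
//  - windowSwapchain: an IDXGISwapChain created for the preview window
// Both names are hypothetical. CopyResource requires matching sizes and
// compatible formats; otherwise you'd render a textured quad instead.
#include <d3d11.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void blitEyeToPreviewWindow(ID3D11DeviceContext* context,
                            ID3D11Texture2D* eyeTexture,
                            IDXGISwapChain* windowSwapchain) {
    // Fetch the preview window's back buffer.
    ComPtr<ID3D11Texture2D> backBuffer;
    if (FAILED(windowSwapchain->GetBuffer(0, IID_PPV_ARGS(&backBuffer)))) {
        return;
    }

    // Copy one eye's image straight into the window's back buffer.
    context->CopyResource(backBuffer.Get(), eyeTexture);

    // Present without vsync so we don't throttle the headset's frame loop.
    windowSwapchain->Present(0, 0);
}
```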

This "inject 3D graphics" approach does have quite a few limitations.

This is how I am doing it today: I intercept the rendered 2D image and the depth buffer from the application, and perform some extra rendering onto it before passing it along to the OpenXR runtime for display.
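
To give an idea of what "intercepting" means here, below is a simplified sketch of an xrEndFrame hook in an API layer - not my actual code, and drawHandsOnto() is a made-up placeholder for the D3D11 rendering work:

```cpp
// A simplified sketch of the interception point - not the actual layer code.
// An OpenXR API layer exports its own xrEndFrame; the loader routes the app's
// call through it before the frame reaches the runtime.
#include <openxr/openxr.h>

// Hypothetical placeholder for the D3D11 work of drawing the hand mesh
// onto the swapchain image the app just rendered for this view.
static void drawHandsOnto(const XrCompositionLayerProjectionView& view) {
    (void)view; // rendering omitted in this sketch
}

// Next function in the call chain, captured at layer initialization.
static PFN_xrEndFrame nextXrEndFrame = nullptr;

static XRAPI_ATTR XrResult XRAPI_CALL hookedXrEndFrame(
        XrSession session, const XrFrameEndInfo* frameEndInfo) {
    for (uint32_t i = 0; i < frameEndInfo->layerCount; i++) {
        const XrCompositionLayerBaseHeader* layer = frameEndInfo->layers[i];
        if (layer->type == XR_TYPE_COMPOSITION_LAYER_PROJECTION) {
            auto projection =
                reinterpret_cast<const XrCompositionLayerProjection*>(layer);
            // Draw the hands on top of the image submitted for each eye.
            for (uint32_t eye = 0; eye < projection->viewCount; eye++) {
                drawHandsOnto(projection->views[eye]);
            }
        }
    }
    // Pass the (now modified) frame along to the real runtime.
    return nextXrEndFrame(session, frameEndInfo);
}
```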

So think of it as drawing in 3D on top of a 2D background. If I can get the depth buffer, then at least I can do occlusion. Somehow this isn't working for MSFS right now (one of my to-dos), so you can literally stick your hand inside the dashboard and it's still displayed. It's a bit confusing the first time it happens. I'm able to do occlusion with other test apps, so hopefully it's just a bug to fix on my end to make it work with MSFS.

But as I also mentioned, the way this extra rendering is done, it has no knowledge of the actual scene and rendering in the game. So lighting/reflections won't work. Don't even think about physics (collisions). And any interaction has to be implemented yourself (for example, if you wanted to use the hands to grab and move an object rendered by the layer, that is all on the developer).

This technique does not allow you to use a real engine to do rendering and gameplay. The code for rendering the hands I have today is literally drawing triangles one at a time :smiley: which for complex things can get quite tedious. Maybe something like DirectXTK could be used without too much effort, but making an entire scene with an object (iPad) rendering external content (slates) and having interactions like scrolling the content or moving the object around - you're talking about developing a whole game on top of the game!

This would be a project of several weeks or months.


Baby steps :slight_smile: Let's just make the hand tracking work - you're almost there. That will be a huge deal for many of us. I can live with the WMR Display window I bring into MSFS. Not very realistic, but no worse than other 2D windows in MSFS. The benefit is that I can bring in a separate monitor/desktop that is mapped to my Wacom pen tablet, so I can make notes on the charts with a pen, and see and manipulate any Windows app like Pilot2ATC, a VATSIM client, etc.

I’ve been a little bit busy with another project that y’all might appreciate:

Back to hand tracking, I have fixed the visual bug with the hands, but it’s late and I’m not going to release the beta tonight. Tomorrow (Saturday) I promise!!

1 Like

I just saw that other post and it blew my mind. Those two projects together may just make this the best experience possible! Right now, with my high-end 5900X / RTX 3080 / Reverb G2 system, I could barely get a fluid experience with GA in medium-density areas like LOWI. NY with custom scenery is a stuttery mess close to the airports. There's hope that NIS can help push the few extra FPS that are needed for MR to work well with airliners.

1 Like

Here it is, the first beta release!

Looking forward to the feedback!

Top 5 known issues and/or limitations

  • The software was only tested with the Windows Mixed Reality OpenXR runtime.

  • The installer script is known to be flaky. If it doesn’t work, follow the manual steps from the README to update the registry as needed.

  • Hand display is only supported with DirectX 11 applications.

  • The “Load” option in the configuration tool is not implemented.

  • Hand display opacity is not implemented (always 100% opaque).

A note about tweaking the configuration

So this is a little embarrassing. The configuration tool - while able to directly modify the configuration of a running application in real time, and capable of saving the configuration to a file - is currently not capable of loading a configuration file. So if you plan on tweaking across multiple runs, be sure to write down all the values so you can "re-create" the configuration before you continue tweaking it.

See my screenshots from this post for the current values shipped in the FS2020.cfg config:

A tip for tweaking the configuration in real time: if you make sure the configuration tool window keeps keyboard focus, you can put your VR headset on and use the keyboard to adjust the configuration while using your hands in VR. Use Tab to move between the different settings (you'll need to remember their order), Left/Right arrows to move sliders, and Up/Down arrows to change option values.

What about my NIS scaler layer

This morning I posted another OpenXR project. With it, we're entering the (complex) realm of multiple OpenXR API layers. This can work, but it requires special precautions: the order in which the layers are loaded is important. I have confirmed that the two layers work together, but the installation script does not handle this case, so for now my recommendation is to not use both the hand tracking layer and the NIS layer at the same time. Until I make a real installer, disable the NIS layer while using the hand tracking layer, and vice versa.

2 Likes

Documentation

This area is a bit lacking at the moment, as I’ve been focusing on developing the proof-of-concept code.

Here are some useful things:

Grip and aim

Some applications use the grip pose and some use the aim pose. FS2020 uses the aim pose to calculate the "arrow" or "ray" used for pointing. As far as I can tell, the grip pose is not used.

The offset can be modified with a translation and a rotation to adjust how the pose of the selected hand joint (for aim or grip) translates to the controller's pose. You'll have to experiment with this, but the default values I've put in so far work pretty well in a "pinch action" model, where pinching is the way to grab knobs, press buttons, etc.
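
For the curious, here is a sketch of the math - composing the configured offset with the hand joint pose to produce the controller pose. This illustrates the idea only; it is not necessarily how the layer implements it:

```cpp
// A minimal sketch of applying a configurable local offset (translation +
// rotation) to a hand joint pose to derive the controller's aim/grip pose.
#include <openxr/openxr.h>

static XrQuaternionf quatMultiply(const XrQuaternionf& a, const XrQuaternionf& b) {
    return {a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
            a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
            a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
            a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z};
}

static XrVector3f quatRotate(const XrQuaternionf& q, const XrVector3f& v) {
    // v' = v + 2w(u x v) + 2(u x (u x v)), with u = (q.x, q.y, q.z)
    XrVector3f u{q.x, q.y, q.z};
    XrVector3f uxv{u.y * v.z - u.z * v.y, u.z * v.x - u.x * v.z, u.x * v.y - u.y * v.x};
    XrVector3f uxuxv{u.y * uxv.z - u.z * uxv.y, u.z * uxv.x - u.x * uxv.z, u.x * uxv.y - u.y * uxv.x};
    return {v.x + 2.0f * (q.w * uxv.x + uxuxv.x),
            v.y + 2.0f * (q.w * uxv.y + uxuxv.y),
            v.z + 2.0f * (q.w * uxv.z + uxuxv.z)};
}

// controllerPose = jointPose composed with the configured local offset.
static XrPosef applyOffset(const XrPosef& jointPose, const XrPosef& offset) {
    XrPosef result;
    result.orientation = quatMultiply(jointPose.orientation, offset.orientation);
    XrVector3f rotated = quatRotate(jointPose.orientation, offset.position);
    result.position = {jointPose.position.x + rotated.x,
                       jointPose.position.y + rotated.y,
                       jointPose.position.z + rotated.z};
    return result;
}
```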


Gestures

  • Pinch: bringing the tips of your thumb and index finger together.

  • Thumb press: using the thumb to “press” onto the index finger. The target is the “intermediate” joint of the index finger (the 2nd joint from the tip).

  • Index bend: bending your index finger in a trigger-like motion.

  • Squeeze: bending the middle, ring and little fingers in a trigger-like motion.

  • Wrist tap: using the tip of the opposite hand’s index finger to press on the wrist.

  • Palm tap: using the tip of the opposite hand’s index finger to press on the center of the palm.

  • Index tip tap: bringing the tips of both index fingers together.

As mentioned in the configuration tool, some gestures may interfere with others! For example, between a wrist tap and a palm tap there isn't a lot of margin. You can use the sensitivity adjustments, but ideally you do not want both of those gestures bound to different actions.

Sensitivity

The gestures described above typically produce an “output value” between 0 and 1, based on the distance between the joints involved.

The “far distance” in the configuration corresponds to the distance at which the value maps to 0. Any distance larger than the far distance will output 0.

The “near distance” in the configuration corresponds to the distance at which the value maps to 1. Any distance smaller than the near distance will output 1.

So with a near distance of 10mm and a far distance of 60mm for pinching: when the tips of your thumb and index finger are 60mm or more apart, the control value reads 0 (equivalent to the controller's trigger being at rest); when they are 10mm or less apart, it reads 1 (equivalent to the trigger being fully pressed); and when they are 35mm apart, it reads 0.5 (because 35 is halfway between 10 and 60, equivalent to the trigger being pressed halfway).
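
In code, the mapping is essentially a clamped linear interpolation - something like this sketch (the function name is made up):

```cpp
// A minimal sketch of the near/far mapping described above.
#include <algorithm>

// Maps a joint-to-joint distance (meters) to a 0..1 control value:
// farDist or more -> 0, nearDist or less -> 1, linear in between.
float gestureValue(float distance, float nearDist, float farDist) {
    float value = (farDist - distance) / (farDist - nearDist);
    return std::clamp(value, 0.0f, 1.0f);
}

// Pinch example from above: near = 10mm, far = 60mm.
// gestureValue(0.060f, 0.010f, 0.060f) -> 0.0
// gestureValue(0.035f, 0.010f, 0.060f) -> 0.5
// gestureValue(0.010f, 0.010f, 0.060f) -> 1.0
```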

Binding actions

The action names are currently shown with their OpenXR standard paths (e.g. /input/trigger/value), but they should still be easily identifiable.

Paths ending with /value typically mean a continuous value between 0 and 1, while paths ending with /click mean a boolean action (pressed or not pressed).

The click threshold option lets you modify the sensitivity of the /click actions.
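
In other words, the conversion is roughly this (a sketch, not the actual code):

```cpp
// A sketch of how a /click action could be derived from the continuous
// value using the threshold (the actual implementation may differ).
bool toClick(float value, float clickThreshold) {
    // "Pressed" once the 0..1 value crosses the threshold,
    // e.g. 0.75 = you must pinch 75% of the way to register a click.
    return value >= clickThreshold;
}
```

A real implementation would probably also add some hysteresis so the click doesn't flicker when the value hovers right at the threshold.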

Hands display and skin tone

I’m currently rendering a very basic articulated hand skeleton in game.

Note that MSFS currently does not submit its depth buffer to OpenXR, which means that the hands will still be visible even if they are behind another surface. This is beyond my control. Hopefully the game engine will submit depth in the future.

I am also proud to support diversity and inclusion by letting you choose anywhere from a brighter skin tone to a darker skin tone. I'm happy to add any other tone (maybe someone wants a blue Smurf tone).

3 Likes

Awesome! I will install and test it. If I want to try and run both of your layers together, as you did, what steps should I take? Any manual registry edits, installation order etc.?

I think uninstalling nis_scaler and reinstalling it at the very end will do (no need to reinstall everything, just run the two nis_scaler scripts). It's all based on the order in which the registry keys are created.

In our case I think we want:

hand_to_controller
Ultraleap
nis_scaler

The first two are properly handled by the hand_to_controller installer (it deletes Ultraleap before adding hand_to_controller, then re-adds Ultraleap).
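
For reference, this is roughly what the scripts do under the hood - a sketch using the standard OpenXR implicit layer registry location; the manifest path is illustrative, not the actual install location:

```cpp
// A sketch of the delete-then-re-add trick (C++/Win32 for illustration -
// the real install scripts are PowerShell). Run as administrator.
#include <windows.h>
#include <stdio.h>

int main() {
    HKEY key;
    if (RegCreateKeyExW(HKEY_LOCAL_MACHINE,
                        L"SOFTWARE\\Khronos\\OpenXR\\1\\ApiLayers\\Implicit",
                        0, nullptr, 0, KEY_WRITE, nullptr, &key, nullptr)
            != ERROR_SUCCESS) {
        fprintf(stderr, "Could not open the implicit layers key.\n");
        return 1;
    }

    const wchar_t* manifest =
        L"C:\\OpenXR-Layers\\XR_APILAYER_NOVENDOR_nis_scaler.json";

    // Each value under this key registers one implicit layer: the value
    // name is the full path to its JSON manifest, and DWORD 0 means
    // enabled. Deleting and re-adding a value moves it to the end, which
    // is what the scripts rely on to control the load order.
    RegDeleteValueW(key, manifest);
    DWORD enabled = 0;
    RegSetValueExW(key, manifest, 0, REG_DWORD,
                   reinterpret_cast<const BYTE*>(&enabled), sizeof(enabled));

    RegCloseKey(key);
    return 0;
}
```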

Thanks, I will try to test both today, or tomorrow at the latest, and will report back. It's amazing what you have been able to accomplish in just a few days. Looks like the NIS layer is a hit as well.

2 Likes

I tried to run the install script. Just double-clicking it, or right-clicking and selecting "Install", simply opened it in Notepad. If I chose "Run with PowerShell" it ran, and the registry setting was created. But then when I ran the OpenXR Developer Tools it took a while and I got an ERROR_LIMIT_REACHED error:
(screenshot of the error)
I didn't try to install the NIS layer yet.

Then if I go to the Developer Settings tab and go back to System Status, I get this screen, as if no error was there. But if I close and re-launch the Developer Tools, I get the error again.

I rebooted and I still get this error every time I launch the OpenXR tools.

When I install NIS on top of that, the Developer Tools doesn't open at all; it just flashes and disappears. I copied the files to the same directory, C:\Program Files\Open-API-Layers. The FS2020.cfg files have the same name, so I just added the two text lines from one to the other. As soon as I uninstalled NIS via the script, I got the same error I had before, but the Developer Tools launches fine again.

Let's try one thing at a time, and remove and ignore NIS for now.

The ERROR_LIMIT_REACHED thing is always a bit finicky… the real question is: can you start MSFS and get the hands working?