To follow up from @Devrij, I just wanted to say hello: I'm the main developer of the Ultraleap OpenXR API layer.
It's great to see all the interest in it in this thread, and all the great work that's been built on top of it and OpenXR!
I'll admit that, given the length of this thread, I've not had the chance to fully read everything that's been said. Please do reach out with any issues, questions, or anything we can clarify or help with; I'll try to keep an eye on this thread and answer questions along with Devrij.
@RobPPL59: We do release testing against the tethered OpenXR runtime on the Oculus Quest 2. If you're running into issues, it would be great to get some more details of your setup and the problem, either to support@ultraleap.com or, probably best, as a ticket on the GitHub issue tracker, where I can pick it up directly.
I also ordered a Leap Motion Controller just for FS2020, but not the VR mount. Instead I made my own mount from a simple suction-cup car camera holder and a simple tripod phone clamp. It doesn't damage my Reverb G2, and it's easy to remove and re-aim the Leap Motion as I like.
For now I'm still waiting for the Leap Motion to be delivered, but here is my concept (the blue plastic represents the Leap Motion Controller).
Maybe someone will find it useful or even improve it! I just wanted to share my idea.
It pushes the LM way forward. The tracking FOV is not very wide and barely covers the visible space in the G2 as it is; with the LM that far forward, the effective FOV narrows further and tracking may be distorted. The Ultraleap software is designed for the LM to be used in two ways: either lying on a table looking up, or mounted flush on the front face of the HMD. Not much adjustment beyond that is possible.
It seems to be off center. That will distort the tracking.
It blocks some of the HMD camera views, which may worsen HMD head tracking.
For those reasons, 3D printing the mount, or my version of it for the G2, is a better option. Failing that, simply attaching it with Velcro or sturdy double-sided tape is most likely still better.
Oh, I see. Thank you for the information. I don't have any experience with the Leap Motion, so I wasn't aware of these issues, but I have some questions just out of curiosity.
Yes, the FOV will be affected, but does the Leap Motion lose tracking if I angle it roughly 45 degrees downwards? The black piece on top of the suction cup can tilt, and I think that might also help avoid blocking the Reverb G2's cameras.
In two-seat cockpits, could it help tracking a bit, since the flight instruments are not centered relative to the seating position? (I can move the suction cup, by the way, so it can also be centered.)
I thought the same and 3D printed a 45-degree mount first. No, it doesn't work: there are huge distortions, and the virtual hands rotate as you move them around. The LM has to be flush with the HMD and exactly parallel to it, or it won't work well.
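To give a feel for why a tilted mount distorts things, here's a rough sketch (my own illustration, not Ultraleap's actual correction math) of what happens when the tracker is physically tilted 45 degrees but the software assumes it is flush with the HMD: every reported hand position carries the full uncorrected rotation.

```python
import math

def rotate_x(point, degrees):
    """Rotate a point (x, y, z) about the X axis; a stand-in for the
    uncorrected orientation error of a tilted tracker."""
    a = math.radians(degrees)
    x, y, z = point
    return (x,
            y * math.cos(a) - z * math.sin(a),
            y * math.sin(a) + z * math.cos(a))

# A real hand held 30 cm straight ahead of the tracker:
hand = (0.0, 0.0, 0.30)

# If the LM is tilted 45 degrees but the software assumes it is flush
# with the HMD, the reported position includes that rotation error:
reported = rotate_x(hand, 45)
print(reported)  # roughly (0.0, -0.212, 0.212): off by ~21 cm on two axes
```

Because the error is a rotation about the tracker, it grows with distance and changes direction as the hand moves, which matches the "virtual hands rotate as you move them around" behaviour described above.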
You can try, but I doubt it: the virtual hands have to match your real hands closely to feel natural (that's the point), and if the tracking is skewed it will feel weird.
Thanks @XenoPhoenix and @mbucchia!
I'm posting a ticket on the GitHub issue tracker with the details of my config.
Just a little update for the folks on this forum: I tried switching the OpenXR runtime from Oculus to Steam.
With the SteamVR OpenXR runtime, the OpenXR Explorer report looks the same (the only difference, obviously, is the "runtimeName" entry), but in this case the OpenXR-Hand-To-Controller log shows no errors:
```text
2022-01-31 19:26:14 +0100: dllHome is "C:\Program Files\OpenXR-Hand-To-Controller"
2022-01-31 19:26:14 +0100: XR_APILAYER_NOVENDOR_hand_to_controller layer (Beta-1) is active
2022-01-31 19:26:14 +0100: Using OpenXR runtime SteamVR/OpenXR, version 0.1.0
2022-01-31 19:26:14 +0100: Loading config for "FS2020"
2022-01-31 19:26:14 +0100: Emulating interaction profile: /interaction_profiles/hp/mixed_reality_controller
2022-01-31 19:26:14 +0100: Hands display is enabled in projection layer 0 with app (if available) depth buffer
2022-01-31 19:26:14 +0100: Using medium skin tone and 1.000 opacity
2022-01-31 19:26:14 +0100: Left transform: (0.028, -0.054, -0.020) (-0.470, 0.027, 0.102, 0.876)
2022-01-31 19:26:14 +0100: Right transform: (-0.028, -0.054, -0.020) (-0.470, -0.027, -0.102, 0.876)
2022-01-31 19:26:14 +0100: Grip pose uses joint: 0
2022-01-31 19:26:14 +0100: Aim pose uses joint: 0
2022-01-31 19:26:14 +0100: Click threshold: 0.750
2022-01-31 19:26:14 +0100: Left hand "pinch" translates to: /input/trigger/value (near: 0.000, far: 0.050)
2022-01-31 19:26:14 +0100: Left hand "squeeze" translates to: /input/squeeze/value (near: 0.035, far: 0.070)
2022-01-31 19:26:14 +0100: Left hand "wrist tap" translates to: /input/y/click (near: 0.040, far: 0.060)
2022-01-31 19:26:14 +0100: Left hand "index tip tap" translates to: /input/b/click (near: 0.000, far: 0.070)
2022-01-31 19:26:14 +0100: Right hand "pinch" translates to: /input/trigger/value (near: 0.000, far: 0.050)
2022-01-31 19:26:14 +0100: Right hand "squeeze" translates to: /input/squeeze/value (near: 0.035, far: 0.070)
```
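As an aside for anyone puzzling over the near/far numbers in that log: a plausible reading (my assumption, not necessarily how the layer actually implements it) is that each gesture's measured distance is mapped linearly onto a 0..1 action value between the far and near thresholds, with a click firing above the logged 0.750 click threshold:

```python
def gesture_value(distance, near, far):
    """Map a measured gesture distance (metres) to a 0..1 action value.

    At or below `near` the gesture is fully engaged (1.0); at or beyond
    `far` it is fully released (0.0); in between, interpolate linearly.
    """
    t = (far - distance) / (far - near)
    return max(0.0, min(1.0, t))

CLICK_THRESHOLD = 0.750  # "Click threshold" from the log

# "pinch" -> /input/trigger/value with near=0.000, far=0.050:
for d in (0.050, 0.030, 0.010):
    v = gesture_value(d, 0.000, 0.050)
    print(f"pinch distance {d:.3f} m -> value {v:.2f}, click={v > CLICK_THRESHOLD}")
```

Under this reading, fingers 5 cm apart give a trigger value of 0.00, 3 cm gives 0.40, and 1 cm gives 0.80, so only the last one crosses the click threshold.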
Unfortunately, with the Steam runtime I'm unable to play FS2020 due to severe stuttering, but this confirms that the problem lies somewhere in the interaction with the Oculus XR runtime, even if I'm not able to say in which software layer it resides…
I think that in order to triage issues effectively, we need to be able to differentiate issues due to my OpenXR layer from ones that could be related to Ultraleap. I don't want to start filing issues against Ultraleap if they are actually due to my project.
@XenoPhoenix, do you have a go-to app for testing that the Ultraleap layer is functional? Ideally a Windows app that comes prebuilt and uses XR_EXT_hand_tracking? If not, I'll need to build something (perhaps with StereoKit, which works great with your layer).
That way we'd have a good test showing that the Ultraleap layer and stack are working, which would mean bugs should be redirected to my project instead.
We typically use StereoKit for this purpose, but as you point out, it doesn't come pre-built, unfortunately. If there is a pre-built version of StereoKitTest somewhere, it's a good tool for this purpose.
Our Stereo IR 170 Evaluation Kit has greater range and field of view, which makes it the best camera for VR. HOWEVER, it is not certified for consumer use and was never intended to be a consumer device (the USB connector is not very robust for long-term use, and the lenses are not protected), so I have to recommend the Leap Motion Controller. We also have a VR mount kit that comes with a long USB cable and lets you mount the LMC on the front of the Odyssey easily.
Thanks for that answer. If the 170 is the best solution for VR, I can fabricate a VR mount, make the USB connection more robust, and fabricate a lens cover if need be. Plastic or glass for the lens cover?
This app will not use the hand tracking from OpenXR Toolkit; instead, it will use the hand tracking from the Ultraleap OpenXR layer. You should be able to see/use your hands. If not, can you please capture the output of the console (text) window?
I confirm that with the StereoKitCTest.exe application everything is working fine on my Oculus Quest as well: I can see/use my hands.
I also attach the log, just in case it could be useful.
Thank you for your great work!
Thank you @RobPPL59, I think this rules out an issue with the Ultraleap layer (cc @XenoPhoenix), and means that I have to work with you to investigate what is happening in the OpenXR Toolkit.
We actually have files to print a mount that fits into our standard VR mount bracket on our developer site (scroll down a bit on this page) if you are happy to proceed given the info I provided. There are also quite a few community designed mounts out in the wild for different headsets. But yes, ensuring that there is no possibility of cable tension on the USB port is a very wise step as well.
On the lens cover, it gets a bit trickier, as we have a very specific spec for what materials can be used, their distance from the lenses and IR LEDs, etc. My recommendation would be to replace the lens caps when not in use and do your best to avoid dust, fingerprints, and scratches. It might sound obvious, but try to avoid bending the camera PCB, as it will throw the lenses out of alignment and result in poor or non-existent tracking performance. We have a stiffening layer in the PCB, and the enclosure is designed to support it further, but the enclosure is plastic, so I'm just throwing it out there that you shouldn't sit on it, in the same way that Tide Pods have to say you shouldn't eat them.
Hi, how does the hand tracking behave when using a physical yoke outside the headset? Let's assume the position of your yoke in VR matches your yoke in real life: does the aircraft take control inputs from your real yoke, or from your hands in VR?
I use a hardware yoke, and it works. You need to grab and hold the yoke in VR to activate the virtual yoke. Normally, unless you are looking down, you will lose hand tracking in the yoke area anyway. I did once have an issue with the trim wheel: it accidentally activated, then hand tracking was lost, it stuck, and it plowed me into the ground. But a reset button for tracking loss is coming very soon, so that should not be an issue.