I am using an Oculus CV1. Ultraleap 5.0 (Gemini) is installed and works fine (in other programs). Ultraleap OpenXR Hand Tracking is also installed. I have rebooted a couple of times now, but I keep getting the same error message as fuzeft!
I can also confirm that the OpenXR Hand Tracking software does not generate any log information…
Cool. Depends on the price though. This is pure haptics, no tracking. The trailer says it basically needs a Leap Motion to track your hands, and the glove only triggers finger vibrations. It's a cool thing, but if it costs more than $50, I'd probably skip it. It still needs full native Leap Motion support from MSFS in order to offer a full experience.
I'll give a quick development update regarding my project.
I've been very busy with my other (NIS) project but also learned quite a lot of things that will help with the hand tracking support.
At this point, the NIS project (which is actually about to be renamed and undergo a transformation) is taking priority over the hand tracking project, since I am using it to create a generic framework for injecting behavior into OpenXR applications. This means I am using the NIS project to build some very solid foundations that will help me implement features like the hand tracking (and hand display) much more easily. The current code for the hand tracking has reached a point where its layout does not allow me to make quick progress, and I am accumulating a lot of technical debt (which is bad this early in a project, and also means bugs).
Also - the NIS project has been quite popular, so it also makes sense for me to give it a push now and try to reach even more users.
That does not mean that I am abandoning the hand tracking - far from it! With the foundations I am creating for NIS Beta 2, I will be able to come back to the hand tracking on a more solid base and make progress much faster. Things like improving the hand display or reacting to haptics, for example, will be much easier to do.
But in the meantime, I will make less progress. It looks like the top priority issues to make the current beta slightly better are:
Fixing the index tap gesture that I broke in beta 2 (to switch to direct mode)
Binding multiple gestures to the same action
What else?
I can try to address a small number of these quickly, before pausing a few weeks for the next-generation foundations described above.
I think if those two issues can be solved, the beta will be in quite a practically useful state and can last us all comfortably until you integrate your experience from the NIS project into the hand tracking. You developed quite a useful interface, so if those two bugs are solved, what remains is mostly visual issues and haptic workarounds, which are secondary quality-of-life improvements. Functionally it will be good enough for practical use. But it's important to resolve those two issues, because until they are resolved, they are a roadblock to using the API layer.
Once more, thanks for your work on this. I'm keeping my fingers crossed that those two bugs get resolved.
Sorry for the stupid question: what would be the minimum hardware requirement to experiment with hand tracking?
I currently own a Reverb G2 only.
I guess I will need some device for the hand tracking itself - maybe not needed if I had a Quest 2?
Thanks in advance!
I am very interested in trying hand tracking with the Leap Motion. Even though I don't have the Leap Motion yet, I will buy it very soon.
I have a simple question: where should I place the Leap Motion device? In front of me, or attached to my headset (HP Reverb G1)? In the latter case, the HP Reverb G1 has a thick cable, and I guess the Leap Motion has its own USB cable to work… another cable to manage… how do you fix it?
Thanks for any explanation you can give me to remove any doubts about interfacing with the Leap Motion.
I don't know if hand tracking works with the Quest 2.
I have it strapped to the front of my headset (I use an "old" Acer one for development) using a Command hook. Some people 3D print a bracket. There's also an "official" mount you can purchase from Ultraleap, but I think it's not advertised for the Reverb.
Yes, you will have to "bundle" another USB cable with the rest of your cables… I used twist ties for that, put one every 1-2 feet, and also twisted both cables together.
I have the original one (cheap). I have not tested either my software or the SteamVR software with the new one, so I cannot tell. I hear it is better, but again I don't know if it will work with FS2020.
I got the original one in like-new condition second-hand for about $62 US ($80 CAD). To my surprise, someone was selling it right in my suburban neighborhood. So check your local listings. Many people bought it, only to find out that it's not very useful for most games and applications. It is shaping up to be extremely useful in MSFS, so their loss - our gain!
I think the original device is available new for about $100. It works well, but it can lose tracking close to the edges of your vision. I'm not sure if the new generation is worth the cost; I never tried one.
I got the new Ultraleap Stereo IR 170 and did some testing with it.
Tracking is good when hands are in front of the camera, but when hands are close to other surfaces (especially bright or reflective ones), it has difficulty tracking fingers and palm orientation.
Gestures are a bit twitchy and don't always trigger; I need to play with them more.
I did not like the default controller orientation and changed it so the controllers look like I am holding them in my hands, and I use a custom Thumb Tip-Middle Finger Tip gesture for the trigger binding.
There is lots of room for improvement, but even in its current state, I think that with practice, using hand tracking should be faster than my current trackball.
@mbucchia Happy New Year to you and everyone here! Any idea when you expect to release the version with those bug fixes?
BTW I got the needed USB extension for my Leap Motion, and I'm setting up the Leap Motion for actual flights (as opposed to cockpit interaction testing) on my motion rig, together with the NIS layer. Here's to the new year being the most exciting year for flight simming, with your help! I aim to finally get to the performance/visual quality sweet spot and complete the ultimate quest for full immersion (on a budget) with the Leap Motion. Once again, this is the ultimate solution for the VR + hardware throttle quadrant scenario that so many of us have. Even imperfect, the ability to just use your hands is invaluable! Hopefully with the two bugs fixed soon, this beta will already be fully usable for practical flights, which I will begin testing today.
So I'm now thinking that the best path forward is actually to merge the hand tracking into the upcoming OpenXR Toolkit, since that will avoid me spending time cross-porting bug fixes and features between the 2 projects (saving time), and the toolkit has things that make it much, much easier to debug issues. I'm hoping this merging effort will only take me one weekend, either the next one or the one after.
Hopefully I can make a joint release of the two projects (well, the toolkit, but including everything from my 2 projects) sometime around the 3rd week of January.
Sure, makes perfect sense, I was proposing that before…
So, I have now taken my first full flight using exclusively your hand tracking layer, with hardware yoke and throttles. There are a couple of issues that became very apparent and that are basically show-stoppers:
While hands are tracking great and are always visible, the MSFS "ghost controllers" disappear and reappear randomly. Sometimes they just stay for a few seconds and then they're gone. It seems totally random. I have to clench my fists or "finger gun" several times to make them appear again. Initially I thought it was the same short timeout MSFS controllers have, but no, it's much shorter and random.
Even worse, when the ghost controller appears again after randomly disappearing, nothing can be highlighted (turned blue or yellow). Hands and controllers are tracking, but they can't interact with anything. Switching to laser and back, or using the "finger gun" gesture several times, makes it work again, but after working for a little while, the ghost controller randomly disappears and the issue is back - it did so every time today. It seems that the code connecting the Leap Motion and your layer works perfectly. It's the part that feeds it to OpenXR/MSFS as emulated controllers that seems to be quirky. MSFS loses the controllers randomly. Maybe you should use some kind of "keep alive" code to prevent the ghost controller from disappearing?
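To illustrate what I mean by "keep alive" - just a rough sketch, not based on your actual code, assuming the layer hooks xrGetActionStatePose for the emulated controllers (next_xrGetActionStatePose and isEmulatedControllerAction are invented placeholders for the layer plumbing):

```cpp
#include <chrono>
#include <openxr/openxr.h>

// Invented placeholders: the real layer would get these from its plumbing.
extern PFN_xrGetActionStatePose next_xrGetActionStatePose;
bool isEmulatedControllerAction(XrAction action);

static std::chrono::steady_clock::time_point g_lastActive;

XrResult hooked_xrGetActionStatePose(XrSession session,
                                     const XrActionStateGetInfo* getInfo,
                                     XrActionStatePose* state) {
    const XrResult result = next_xrGetActionStatePose(session, getInfo, state);
    if (XR_SUCCEEDED(result) && isEmulatedControllerAction(getInfo->action)) {
        const auto now = std::chrono::steady_clock::now();
        if (state->isActive) {
            g_lastActive = now;  // remember the last time the controller was active
        } else if (now - g_lastActive < std::chrono::seconds(2)) {
            state->isActive = XR_TRUE;  // grace period: keep the ghost controller alive
        }
    }
    return result;
}
```

No idea if that is actually where MSFS decides whether a controller is present, but a grace period like that would at least rule out short dropouts.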
Those two issues together make normal flight practically impossible. They should be fixed, even if other issues, which have workarounds, remain. These issues could be happening only on my system, but I suspect they are universal. It feels oh so close! You're almost there at fully usable software; this just needs to be ironed out. The rest is a bonus.
It's important to note the positive facts:
Hands are tracking great. I can see the "cube hands" the whole time my actual hands are in view. They appear instantly when a hand comes back into view, and they track in a very stable manner. The graphics are smooth and there are no jumps or stutters. It so happens that in my favorite seat position in my motion rig, my left hand ends up exactly on the yoke; it lined up perfectly with my hardware yoke without even trying. That was unexpected and very realistic in a way. Even with the rudimentary blocky hands, it "connected" me with the virtual cockpit more, increasing immersion. So the Leap Motion controller and your code that feeds on it are very solid.
Interaction (when it's working) works well, for the most part. Any problems, like rough rotation, seem to be related to MSFS glitches, so once Asobo fixes them, they should work equally well with your layer.
The experience of using my hands to operate the cockpit without blindly feeling for a trackball or a controller is AMAZING (during the periods that it works)! It's a dream come true.
The "finger gun" gesture works fairly reliably. The whole experience takes a bit of getting used to, but it looks like after a short while, using the index finger as a controller "attachment" and the finger gun as a trigger becomes very intuitive. I could press switches and adjust things fairly well. For very fine adjustments I could use my hardware (like the GPS dual encoders etc.). But the immersion factor is out of this world!
The NIS layer at 80% with 40% sharpening (100% OXR / 100% TAA), and "sharpen" disabled in the MSFS config file, gave me very smooth performance, with the exterior and interior looking noticeably sharper than before, at the cost of a very slight increase in shimmering of buildings and straight lines. I think it's definitely worth it. Overall, both graphics and performance are significantly better. If FSR is implemented, it will be interesting to see how it looks. Marrying both layers is a very good idea.
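In case anyone wants to replicate the "sharpen" part: it's the Sharpen entry in UserCfg.opt, which from memory sits inside the PostProcess block - double-check your own file, the surrounding entries may differ:

```
{PostProcess
    ...
    Sharpen 0
    ...
}
```

Set it to 0 so the sim's own sharpening doesn't stack on top of the NIS sharpening.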
The GUI works well and is intuitive. Very nice to not have to edit config files, and to see all changes in flight.
Other, less critical issues and suggestions:
When hands go out of view and the Leap Motion loses tracking (expected behavior), the ghost controller often stays stuck in the last known position (like in front of my face). It should move to a zero position behind/below me instead, or disappear - whatever is easier to code (see the sketch after the workaround below). It shouldn't remain in view.
Workaround: grab/trigger several times for the ghost controller to appear again, and it jumps to wherever the hand is.
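A rough sketch of the "disappear" option, assuming the layer hooks xrLocateSpace for the emulated controller spaces (the helper names are made up):

```cpp
#include <openxr/openxr.h>

// Invented placeholders for the layer plumbing.
extern PFN_xrLocateSpace next_xrLocateSpace;
bool isEmulatedControllerSpace(XrSpace space);
bool handCurrentlyTracked(XrSpace space);

XrResult hooked_xrLocateSpace(XrSpace space, XrSpace baseSpace,
                              XrTime time, XrSpaceLocation* location) {
    const XrResult result = next_xrLocateSpace(space, baseSpace, time, location);
    if (XR_SUCCEEDED(result) && isEmulatedControllerSpace(space) &&
        !handCurrentlyTracked(space)) {
        // Report the pose as neither valid nor tracked, so the app hides the
        // ghost controller instead of freezing it at its last known position.
        location->locationFlags = 0;
    }
    return result;
}
```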
Index finger touching for laser/direct switching doesn't work, as reported before.
Workaround: assign the wrist touch or another double-handed gesture to that function. It works OK that way.
Perhaps an even darker hand color, like dark gray or even black, should be an option. It may be less distracting. Together with opacity, when it's implemented, we could dial it in to something informative enough to see your hands, but without looking too out of place.
Translate vibration feedback into a soft "click" sound - replacing haptic feedback with aural feedback. Using a custom sound and a custom sound device would allow feeding the "Buttkicker" and having true haptic feedback of sorts. I suggested it before; I'm listing some things again just to have it all in a single place for you and others to discuss.
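Something like this, purely as a sketch - assuming the layer can intercept xrApplyHapticFeedback (the hook plumbing and the .wav name are invented; routing to a specific output device like a Buttkicker would need a real audio API rather than PlaySound):

```cpp
#include <Windows.h>
#include <mmsystem.h>  // PlaySoundW; link with winmm.lib
#include <openxr/openxr.h>

// Invented placeholder for the layer plumbing.
extern PFN_xrApplyHapticFeedback next_xrApplyHapticFeedback;

XrResult hooked_xrApplyHapticFeedback(XrSession session,
                                      const XrHapticActionInfo* hapticActionInfo,
                                      const XrHapticBaseHeader* hapticFeedback) {
    if (hapticFeedback->type == XR_TYPE_HAPTIC_VIBRATION) {
        // Play a short click asynchronously in place of the vibration.
        // A configurable .wav would be the "custom sound" part of the idea.
        PlaySoundW(L"click.wav", nullptr, SND_FILENAME | SND_ASYNC);
    }
    return next_xrApplyHapticFeedback(session, hapticActionInfo, hapticFeedback);
}
```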
An option to convert a haptic "click" into a trigger press when the hand is already in a "finger gun pressed" gesture and approaches a control. That would work as a natural button press.
An idea for a new feature, related to 5 above: have 2 modes of aligning the controller. The ghost controller would then jump to the desired finger, depending on the position of your hand. For example, mode A: "finger gun pressed" - align the controller with the index finger and press buttons that way, as above; or mode B: align it with the thumb or palm while not in a "finger gun" position. That would allow natural grabs or rotations, approaching those controls with a "fist clenched, thumb up" position (trigger on thumb down), or something similar. It needs to be tested for convenience, but I think it's a good idea. It would require 2 sets (2 tabs) of controller alignment settings, and setting a gesture to bind to mode A (default) and mode B. That way we would have almost 2 fingers/points active instead of a single point, making controller emulation a step closer to hand emulation. A sketch of the idea follows below.
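To make the two-mode idea concrete - a sketch with entirely invented names and types (composePoses stands in for the usual pose math):

```cpp
#include <openxr/openxr.h>

// Everything below is invented for illustration.
struct HandPose {
    XrPosef indexTipPose;    // where the index fingertip is
    XrPosef thumbTipPose;    // where the thumb tip is
    bool fingerGunDetected;  // mode A gesture currently held?
};
struct AlignConfig {
    XrPosef indexOffset;     // mode A offsets (one "tab" of settings)
    XrPosef thumbOffset;     // mode B offsets (the other "tab")
};

// Stand-in for the usual quaternion/pose composition math.
XrPosef composePoses(const XrPosef& base, const XrPosef& offset);

// Snap the emulated controller to a different hand joint depending on the
// current gesture, each mode with its own user-configurable offset.
XrPosef alignedControllerPose(const HandPose& hand, const AlignConfig& cfg) {
    if (hand.fingerGunDetected) {
        return composePoses(hand.indexTipPose, cfg.indexOffset);  // mode A
    }
    return composePoses(hand.thumbTipPose, cfg.thumbOffset);      // mode B
}
```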
Some may prefer disabling the left hand, so there should be a way to switch between laser/direct mode without a 2-handed gesture - or even better, still use the 2-handed gesture but just not feed/show the left hand to MSFS. Maybe it already works that way, I'm not sure.
Sometimes it's difficult to keep your hand on a small control while rotating or moving it. So if possible, maybe some kind of position-locking while the trigger is pressed can be implemented, so slight hand movements wouldn't lose the highlighted point. It may be an MSFS issue that's not fixable in your layer, but I'm listing it anyway because you have surprised me many times already, making things that I thought impossible work.
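What I picture is something simple like latching the pose for the duration of the press - sketch only, nothing here is from the actual layer:

```cpp
#include <openxr/openxr.h>

// Hypothetical per-frame logic: latch the aim pose while the emulated
// trigger is held, so small hand tremors don't slide off the control.
XrPosef stabilizedAimPose(const XrPosef& currentPose, bool triggerPressed,
                          bool& latched, XrPosef& latchedPose) {
    if (triggerPressed) {
        if (!latched) {
            latchedPose = currentPose;  // remember where the press started
            latched = true;
        }
        return latchedPose;             // hold this pose for the whole press
    }
    latched = false;                    // released: follow the hand again
    return currentPose;
}
```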
I have to clench my fists or "finger gun" several times to make the ghost controllers appear again after they disappear, whether due to the issue above or a natural timeout. They should appear after a single "grab" or trigger gesture; it takes 3-5 attempts for them to appear.
I'm very excited about the future of this tool. I will be using it on 100% of my flights, even in its current form, but the disappearing controllers / no interaction issue needs to be solved for this to become a reality. Other things can wait or can be bypassed with workarounds, and will be an added bonus.
Most switches and buttons already have a clicking sound when clicked, so this is not important for me.
I also have a wish for a future feature: the ability to bind a gesture to enable/disable each hand, so hand-to-controller mapping and hand rendering can be temporarily disabled when I don't use that hand for interacting with the cockpit.
No, that's not what I meant. I mean that the moment your hand reaches the "contact zone" (the blue highlight appears), there should be a haptic "click" that you feel in the controller, which can be translated into a sound. That happens before you actually trigger the control, and it serves as a haptic confirmation.
Yes, that's a good one. I actually suggested it before, I think, but forgot to include it in my last post. For example, you may only need the left hand for opening/closing the door and operating a fuel valve - so only a few times per flight. It would be nice to turn on the hand with a gesture just for those occasions. It would also prevent the slightly annoying issue of the active controller (white arrow) occasionally jumping to the wrong hand. I hadn't listed it because it's related to MSFS, not the layer. But this feature should help.