I have this vision that in VR I can use my HOTAS for manual control, and then take my hands off the HOTAS and reach out to adjust the autopilot, fuel mixture, etc. without having to fumble for a real-world controller I can’t see. Essentially, use my real hands in a virtual environment.
The way I’d like to see that work is using hand tracking. Full disclosure: I work for Ultraleap, owners of the Leap Motion hand tracking technology. We’d rather like our technology to be used in a game like MSFS2020, which is already pushing the boundaries of technology, and we think it offers a non-partisan and affordable (from an HMD perspective) method of interaction, with the world’s best hand tracking. There are nearly 1 million Leap Motion Controllers out in the world, and they could all be used for this. And no, there’s no big licence fee attached; this isn’t a sales pitch.
So, devs, if you’re listening, drop me a line at dev.purdon@ultraleap.com if you want to chat about it, where we’re going with our tech, and how we can help you design interactions that work (avoiding annoying gotchas like knobs/dials that sit close together, and figuring out how to get people to interact with them naturally and reliably).
Players, let us know if you like the sound of that. As an MSFS fan myself with a couple of hundred hours logged on FSX/FS2020, I know I do!
Yes I want this. I want this right now. I really want this RIGHT NOW!!!
Do you really think this could work at the level of detail required to manipulate individual buttons and knobs in a complex cockpit environment such as you would find in FS2020?
This is where our expertise kicks in. We are already working with a number of companies who are designing enterprise-level cockpit simulations, so this isn’t the first time we’ve had to deal with lots of tiny switches close together and wobbly, shaky hands (when there isn’t a real thing to grab, hands are fairly wobbly, as anyone who’s played a VR FPS can attest).
An example of a really simple solution is having controls “snap to” your hand as you move close to them. Once you activate the gesture to interact with one, you can essentially ignore the accuracy of the hand at that point, because you lock the interaction to that control. There are a few ways to go about it; a rough sketch of one follows below.
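To make that concrete, here’s a minimal sketch of the snap-to idea in Python. The names (Control, SnapInteraction, SNAP_RADIUS) are purely illustrative, not from our SDK, and a real implementation would live in your engine’s update loop:

```python
import math
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Control:
    """A cockpit control at a fixed position (metres, cockpit space)."""
    name: str
    position: Tuple[float, float, float]

SNAP_RADIUS = 0.05  # illustrative capture radius: 5 cm around each control

def nearest_control(hand_pos, controls) -> Optional[Control]:
    """Return the closest control within SNAP_RADIUS of the hand, or None."""
    best, best_dist = None, SNAP_RADIUS
    for control in controls:
        dist = math.dist(hand_pos, control.position)
        if dist < best_dist:
            best, best_dist = control, dist
    return best

class SnapInteraction:
    """Lock the interaction to one control for the lifetime of a grab/pinch
    gesture, so hand jitter while the gesture is held no longer matters."""

    def __init__(self, controls):
        self.controls = list(controls)
        self.locked: Optional[Control] = None

    def update(self, hand_pos, grabbing: bool) -> Optional[Control]:
        """Call once per tracking frame with the latest hand position."""
        if grabbing and self.locked is None:
            # Gesture just started: snap to whatever is in range right now.
            self.locked = nearest_control(hand_pos, self.controls)
        elif not grabbing:
            # Gesture released: let go of the control.
            self.locked = None
        return self.locked
```

The key point is that once `locked` is set, the hand’s exact position stops mattering until the gesture ends, so jitter can’t make you slip off a tiny switch onto its neighbour.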
I’ve seen your impressive Gemini videos and details, and this looks promising.
However, I just hope Asobo and Microsoft will give you support in the simulator. It appears the focus is heavily on the Xbox version, and I believe that if Leap Motion works on an Xbox it would certainly work on a PC, but not the reverse.
I was going to write about this and then I saw your post. I don’t have a VR headset yet, for two reasons. One is that I wear prescription glasses, so I don’t know if it will work for me; I would love to try it first, but with COVID that is out of the question right now. The other is that I have a full Honeycomb setup, and trying to make a change on the screen will not work if the action is programmed to my yoke/quadrant buttons, and I don’t think I can do it in the dark. Yes, having something like the Leap Motion device would be awesome. I am using TrackIR for the time being.
A few years back I used to fly with FlyInside, which supported hand tracking using Leap (guessing that’s now Leap Motion?). Far from perfect, but it was the best overall experience I’ve had to date!
Being able to reach out, flick a switch, or turn a dial with your virtual hand really nailed it.
Shame it hasn’t been developed further; it was so close!
An update screwed it up and it seemed to just get abandoned.
Make it happen! You guys have been SO CLOSE for years now. My impression from watching Leap from a distance is that you are not investing enough in software development, but of course I am just speculating.
I sincerely look forward to the day Leap becomes useful to me, and I love the product. The problem is that individuals using things like 3D-printed rings with a blinking locator LED on them end up with something more functional than Leap, simply because they put the hard work into making game-specific software that actually works.
The Leap plugin for X-Plane, for instance, needed continual development, not to sit as a three-year-old abandoned tech demo!
Coming from X-Plane, I miss what I can do with my Index controllers and my HP Reverb G2.
I have a physical yoke and rudder pedals, and everything else is done with my controllers, which are on my hands at all times, no mouse needed. To me that is immersion, and I can move from plane to plane with no changes to any hardware cockpit.
These are early days for VR support in MSFS, but I hope we will not have to wait many, many months before the need for such support is recognized. Not having a timetable is a little concerning to me.
In X-Plane I use an HP Reverb G2 with Valve Index controllers and rudder pedals.
Because the Valve Index controllers are strapped to my hands, I can control the yoke with my left hand. I also use my left hand to start the engine and flip switches that are easier to reach with it than with my right hand.
My right hand is free to control everything else, like throttle, mixture, flaps, trim, etc.
You can watch some of my videos on Twitch or YouTube, where you can see me doing what I am describing.
I did a test flight from Edinburgh Airport and back in MSFS, also using my HP Reverb G2 with my Index controllers, and it went quite well, so I will be doing more testing before I stream it.