VR Controller support is missing. Please add it in addition to the 3D mouse

What’s also needed is a better SteamVR OpenXR implementation for WMR (Reverb G2 etc.). There was talk about implementing a more direct way than copying the video buffer, but I’m not sure if the latest updates did anything with that. There is an amazing DIY finger-tracking/haptic gloves project that may solve the whole controller trouble, leaving your hands free for the yoke when you need it. Beats strapping the controller to my hand for sure. $21 worth of parts, some 3D printing, and there is full finger tracking. A full haptic version is coming. Of course the open-source driver is for SteamVR… just like VRK and tons of other software… We need a good SteamVR implementation that doesn’t cost frames.

Looks nice. I guess it could even provide some force-feedback effect, and you could still use a joystick with that glove. I hope a more advanced, consumer-friendly version will be released someday though :smiley:

You also mentioned that the SteamVR integration does cost some frames. I also bought this game from Steam. How many frames does it cost compared to the version from the Microsoft Store?

He is working on prototype 4 with haptic feedback, but the way it’s done won’t replace a joystick etc. The small servos (the same kind I have on my RC airplane models) stop the wire extension, limiting the finger bend and simulating grabbing something. So you could grab an apple in VR and feel that you’ve grabbed something round, but your hand will still pass through a wall because nothing stops it from doing so. So we’d still need a joystick/yoke.
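
To illustrate the mechanism described above (this is only a sketch of the idea, not the actual LucidVR firmware or driver code): each finger has a servo that can block the wire at a chosen angle, so the glove essentially converts “at what bend would this finger hit the virtual object” into a blocking angle, and leaves the servo free when the finger touches nothing. The servo range and the 0..1 bend scale below are assumptions for the example.

```python
FULL_TRAVEL = 180  # hypothetical servo range: 180 lets the finger close completely

def servo_limits(contact_bends):
    """Return one blocking servo angle per finger.

    contact_bends: the bend (0.0 open .. 1.0 fist) at which each finger would
    meet the held object's surface, or None if that finger touches nothing.
    """
    limits = []
    for contact in contact_bends:
        if contact is None:
            limits.append(FULL_TRAVEL)          # nothing to stop against, wire stays free
        else:
            # Allow the finger to curl only up to the virtual surface.
            limits.append(int(contact * FULL_TRAVEL))
    return limits

# Grabbing something round: four fingers stop roughly halfway, the pinky misses it.
print(servo_limits([0.5, 0.45, 0.45, 0.5, None]))   # [90, 81, 81, 90, 180]
```

Nothing in this stops your whole hand from passing through geometry, which is exactly the limitation described above: the glove can only resist finger curl, not arm motion.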

As for FPS, there is no difference between the MS Store and Steam versions of MSFS. The Reverb G2 works in WMR with OpenXR, so when WMR is selected as the OpenXR runtime, SteamVR just plugs into it and lets it control things. But as it’s not native to SteamVR, nothing else works together with it, so software like desktop mode (bringing your screens into SteamVR), kneeboards like VRK, those VR gloves etc. won’t work. For them to work, we need to enable the SteamVR OpenXR implementation instead, and THAT has a disadvantage: as far as I remember, it was just copying the video buffer for every frame and passing it into SteamVR, instead of working directly with a driver. Because of that additional overhead it affected FPS/stuttering and was unworkable for me personally, while the WMR way was “meh” and is still borderline. For me (as you see in my video in the top post) VR simming is only viable when I can:

  • Fully interact with the cockpit with my hands (hence this topic) - controller support is a must, gloves would be even better.
  • Have all charts in the cockpit, available at all times, plus a notepad to write down live ATC instructions from VATSIM etc.: X-Plane had AviTab, but MSFS has nothing. SteamVR has VRK, which is an amazing, highly configurable tabbed notepad/chart holder where you can actually annotate your PDFs with a stylus (you need a cheap stylus to use that feature). But it’s for SteamVR, as are the glove drivers, so until MSFS can work natively with the SteamVR OpenXR implementation instead of WMR OpenXR, we can’t use all those goodies… (see the runtime-switch sketch just below this list).
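
For reference, “which OpenXR runtime is active” is just a Windows registry value pointing at a runtime manifest; that’s what SteamVR’s “set as OpenXR runtime” option and the WMR tooling flip when you make one of them the default. A minimal Python sketch that reads it; the example manifest paths in the comments are typical installs, not guaranteed on every machine:

```python
import winreg

# The OpenXR loader looks up the active runtime manifest under this key.
KEY_PATH = r"SOFTWARE\Khronos\OpenXR\1"

def active_openxr_runtime():
    """Return the path of the currently active OpenXR runtime manifest."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        value, _value_type = winreg.QueryValueEx(key, "ActiveRuntime")
        return value

print(active_openxr_runtime())
# Typically something like (paths vary by install):
#   ...\System32\MixedRealityRuntime.json            -> WMR OpenXR runtime
#   ...\Steam\steamapps\common\SteamVR\...json       -> SteamVR OpenXR runtime
```

Whichever manifest that value points at is the runtime MSFS renders through, which is why the WMR-vs-SteamVR choice above decides whether VRK, glove drivers and the rest can see the session.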

Yes, Flight Simulator 2020 in VR is the best, but…
Flight Simulator 2020 in VR without controllers loses all the appeal of VR immersion.
I didn’t think we could have VR without controllers in 2021??
Please, Asobo, add VR controller support to the control interface.


We hope they implement controllers in VR - they are essential - but also the use of multiple windows at the same time, as it was in FSX, as well as better flight physics. To date the only sim that’s somewhat faithful to reality is X-Plane; they should take it as an example.

I posted this a few days ago:

I’ve found traces of voice-recognition bits in the binaries… Maybe it’s just leftover experimental code, maybe bits of an actual future version, I don’t know.

Same for VR: there is no controller support yet, but there are already bits telling me they are experimenting with controlling the switches and knobs via a laser pointer or by direct contact. The latest update also includes the 3D model for hands:


I hoped even before release that voice recognition would be used. The years-old Pilot2ATC software uses it wonderfully for real communication with AI ATC controllers and it works great (for the most part) - such a liberating experience! I also used the free VoiceMacro software to assign all kinds of commands via key presses and TTS. So I could say “Gear Down” and the gear would come down, ask the FO to read a specific checklist, etc. It’s all demonstrated in my video in the top post. And it’s all done with really old voice recognition and TTS engines. Asobo has Azure, which would make it sound really natural. Why not have a built-in option for ATC control and assigning commands in the sim? That would be amazing not just for VR, but especially for VR.
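
What VoiceMacro does there is essentially map a recognized phrase to a keypress the sim is bound to. A rough Python sketch of the same idea, assuming the third-party SpeechRecognition and pyautogui packages; the phrase-to-key table is hypothetical and you’d have to bind those keys in the sim yourself:

```python
import speech_recognition as sr
import pyautogui

# Hypothetical phrase -> keybind table; bind these keys in the sim's controls menu.
COMMANDS = {
    "gear down": "g",
    "gear up": "g",
    "landing lights": "l",
}

recognizer = sr.Recognizer()

def listen_and_dispatch():
    """Listen for one utterance and fire the matching keybind, if any."""
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        phrase = recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return  # nothing intelligible was said
    for command, key in COMMANDS.items():
        if command in phrase:
            pyautogui.press(key)   # send the keypress the sim is bound to
            print(f"'{command}' -> pressed '{key}'")

while True:
    listen_and_dispatch()
```

This is the “old engines” approach the post mentions; a built-in, Azure-backed version could do the same mapping natively with far better recognition.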

Hand tracking needs hardware/driver support I guess, so it’s more difficult, but even with controllers it would be great to see your hand - with an arm, not a stump - I don’t know why everybody’s doing stumps. The whole-body experience in modded Skyrim is amazingly immersive. Seeing your arms and legs and a copilot in the cockpit would only add to the immersion.

But it’s a good thing they are working on it…

@RomanDesign
I’d recommend you watch this video from the latest Game Stack Event:

PS: I’ve asked Martial on the Game Stack chat system whether they’re considering this, both for voice recognition (ATC, for example) and also to supply a network-based, integrated voice comms system (which to me should recreate aviation comms, i.e. one “lobby” per tuned frequency, instead of one lobby per group of players). He couldn’t commit to anything, but said at least that they’re looking into it.

One neat advantage is that PlayFab Party also does translation and text-to-speech, which means you could speak in French and the automatic translation would impersonate you and speak in English to the other players.

It’s nice, but Asobo has all the Microsoft assets and is already using MS Azure for TTS. Azure already has both text-to-speech and speech-to-text, is cloud-based, and is supported by Windows, and it’s already partially integrated in MSFS. So it would make sense to go Microsoft all the way, to showcase the technology and avoid paying extra license fees, the way they went with Bing for maps etc. They already have the solution at their disposal, they just haven’t implemented it…
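
For what it’s worth, the Azure pieces being described are exposed through the Speech SDK. A minimal sketch of both directions with the azure-cognitiveservices-speech Python package, assuming you have your own Speech resource (the key and region below are placeholders):

```python
import azure.cognitiveservices.speech as speechsdk

# Placeholder credentials - use your own Azure Speech resource.
speech_config = speechsdk.SpeechConfig(subscription="YOUR_KEY", region="YOUR_REGION")

# Speech-to-text: recognize one utterance from the default microphone.
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)
result = recognizer.recognize_once()
if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    print("Heard:", result.text)

# Text-to-speech: speak an ATC-style phrase on the default speaker.
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)
synthesizer.speak_text_async("Cleared for takeoff runway two seven.").get()
```

That both halves already exist as a managed service is the point of the post above: the plumbing is there, it just isn’t wired into the sim’s ATC.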

I’ve been wondering the same about another technology Microsoft acquired 4 years ago:

https://forums.flightsimulator.com/t/is-there-a-roadmap-for-the-terrain-lod-morphing-issues/385559/5

It does not work for many, many languages. English speech I/O is taking shape now, but TTS as well as speech recognition is still an emerging technology for non-English languages. That is ongoing development and it is slow; I’ve been involved for years. Languages like Dutch and French have a far larger lexicon and different coarticulation. That makes TTS a completely separate project for each language, and recognition is also far more difficult. In the real world there are gadgets in Windows you can chat with. But avionics… you don’t want to start passenger boarding procedures at 20,000 feet, resulting in decompression and a crash, just because you’ve been misinterpreted…

For MSFS it should not matter. All ATC is in English all over the world, at least for IFR at large airports and control centers. Implementing that should be good enough, if you want realism and voice recognition operations. And if not - what’s already there should be good enough.

Even with English only you’ll see foreign accents. Recognition must take these into account if you want everyone to communicate in English. Certain things are possible, like ATC/ATIS (barometer) requests that have a fixed format… channel change requests… maybe they can try to do these first, but I doubt names could be supported with speech technology, or aircraft types, or exceptional situations… missed waypoints, change requests… confirmation of number sequences in a foreign accent… the recognition side of this will need a lot of work! Suppose you did all the procedures and the system misinterpreted your designation. You take off, your AP makes a turn to the south and you don’t know why… oops… Atrecht or Utrecht… small detail… opposite direction.
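
The “fixed format first” idea amounts to shrinking the grammar: a flight level request or a frequency change only comes in a few shapes, so even crude pattern matching on the recognizer’s text output goes a long way before full natural-language handling is needed. A toy Python sketch of that restriction; the phraseology patterns are simplified examples, not real ATC grammar:

```python
import re

# Spoken digits as a recognizer typically returns them.
DIGITS = {"zero": "0", "one": "1", "two": "2", "three": "3", "four": "4",
          "five": "5", "six": "6", "seven": "7", "eight": "8",
          "nine": "9", "niner": "9", "decimal": ".", "point": "."}

def words_to_number(words):
    """Turn 'one two one decimal five' into '121.5'."""
    return "".join(DIGITS.get(w, "") for w in words.split())

# A couple of fixed-format requests; real phraseology has many more variants.
PATTERNS = [
    (re.compile(r"request flight level (.+)"), "FL{}"),
    (re.compile(r"contact .* on (.+)"), "switch to {}"),
]

def parse(phrase):
    for pattern, template in PATTERNS:
        match = pattern.search(phrase.lower())
        if match:
            return template.format(words_to_number(match.group(1)))
    return None  # falls outside the fixed formats -> needs smarter handling

print(parse("Request flight level three five zero"))                  # FL350
print(parse("Contact Amsterdam Radar on one two one decimal five"))   # switch to 121.5
```

Anything that returns None here (names, unusual requests, heavy accents mangling the digits) is exactly the hard part described above.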

Well, I think all of what you said is already supported by the Pilot2ATC software. I did try it and it works, and I do have a foreign accent :slight_smile: And Pilot2ATC only uses the Windows built-in speech recognition and TTS engine, with none of the fancy AI cloud processing Azure has. I made fully voice-activated IFR flights with it in X-Plane. See my video - it works fine. I could interact with it, initiate procedures, request different flight levels, headings, IFR/SIDs/STARs, vectoring, ground and tower clearances, runway holds, taxi instructions, basically everything. It even sees traffic, alerts you, and commands a go-around if the runway is occupied by traffic, etc. I didn’t do readbacks because of the difficulty of writing down and reading back in VR (your “copilot” can handle readbacks and tune radios, it’s an option). But people are doing readbacks themselves apparently. And it’s easier when you see the ATC window so you know what to say back. And all this with old technology. So there is no problem whatsoever using the more advanced cloud-driven Azure to do an even better job.


Well, the world is already moving on apace, and Quest users can now fly in VR untethered thanks to Air Link. Personally I agree with everything the OP says, but I would urge Asobo to go one step further and implement hand tracking.

Currently I fly X-Plane VR with a Quest 2. I mainly fly gliders with a center stick and, like the OP, I can fly with one hand on the physical stick and one controller in the other hand for flicking switches and turning dials. Now think about it: with hand tracking, we could adjust the seat position (actually, can you in MSFS? I know I can in X-Plane using Plane Maker) so that when our hand reaches out and touches the virtual stick or yoke, it actually touches the real stick or yoke. We’d easily get the best of both worlds without anything dangling from our wrists: flying with a real stick whilst reaching out with our free hand to manipulate the knobs, levers and switches.

That’s true, but I think hand tracking needs a strong internal processor. That’s why it’s only available for developers on the Quest - it has enough CPU power to process its camera feeds and calculate hand tracking. On other headsets I don’t even think the camera feed is available to work with, and it may be too low-res for hand tracking. So I’m not optimistic about that on the G2, for example. The LucidVR DIY haptic glove is much more feasible, provided the G2 works well with the SteamVR OpenXR implementation, which it doesn’t right now. The controller would be strapped to the glove, and although cumbersome, your fingers would be free for the yoke or for operating cockpit stuff.

I find that hand tracking using the Valve Index cameras works quite well. Maybe not at the level of the Oculus yet, but for now this is more an AI problem than a camera problem.

That’s good to know, I didn’t know the Index had hand tracking. So there’s hope then…

Not Index tracking as such, but you can use the Vive hand tracking SDK with the Index cameras… :slight_smile:

Has there been any word or hints about an expected release date for controller support?