4000 cards a dud for MSFS in VR?

I was super hyped for the 4090 after watching Nvidia’s presentation yesterday, promising an fps increase of around 100 percent. But it seems on closer inspection to be a lot of snake oil.

I only fly in VR using a Reverb G2 (for now will upgrade to a better headset once they launch) and thought we’d get astonishing VR performance with DLSS 3.

Unfortunately I’ve seen on Wikipedia that DLSS 3 does not work with VR. Does anyone know more detail on this, is that a permanent limitation, or it’s coming later in a future update?

When you look at the benchmarks given by Nvidia themselves, the 4090 and both 4080 models score around twice the fps of the 3090 Ti, which seems impressive on the surface, but that’s because they have DLSS 3 enabled. Given there is next to no significant performance difference between the 4090 and the 12 GB 4080 in MSFS according to Nvidia’s graphs, it seems the sim is so CPU limited that it maxes out at the level of the 4080 12 GB.

If you look at benchmarks of the 4080 12 GB in non-DLSS games, it actually scores below the 3090 Ti, so I’m assuming that with DLSS 3 turned off the 4080 12 GB gets fewer fps than the 3090 Ti in MSFS, the 4080 16 GB is about the same, and the 4090 is only marginally better due to the CPU bottleneck.
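To illustrate why a faster GPU can stop mattering, here is a toy model of a CPU-bound renderer: each frame needs a CPU pass (fixed by the sim) and a GPU pass (scales with the card), and the frame rate is limited by whichever is slower. All the millisecond timings below are made-up illustrative numbers, not real MSFS measurements.

```python
# Toy model: frame rate is capped by the slower of the CPU and GPU pass,
# so past a certain point a faster GPU buys nothing.

def effective_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """FPS when each frame must wait for both the CPU and GPU pass."""
    bottleneck_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    return 1000.0 / bottleneck_ms

CPU_MS = 16.0  # hypothetical main-thread cost per frame

for card, gpu_ms in [("3090 Ti", 18.0), ("4080 12GB", 15.0), ("4090", 10.0)]:
    print(card, round(effective_fps(CPU_MS, gpu_ms), 1))
```

With these assumed numbers the 4080 12 GB and 4090 land on exactly the same fps, because both are already faster than the CPU, which matches the shape of Nvidia's own graphs described above.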

So basically, as it stands currently, the 4000 series cards are solid upgrade choices for monitor users but completely pointless for those of us using VR.

Does anybody else know more details on this and how it will likely develop in practice?

Sounds like the optical-flow technique DLSS 3 brings in will conflict with any motion reprojection, as they will both be adding synthetic frames. I can’t see any reason why DLSS 3 won’t work in VR, but it may not be effective. There’s a whole thread discussing this on Reddit.

Reddit: Will DLSS 3.0 work with VR in the future?


Thanks, I’ll check that out; I hadn’t seen that.


The 4090 looks very good, and at $1,500 it’s far out of my budget, but for a 90/Titan-class card that seems like a fair price when you consider the inflated prices we pay for high-end GPUs now.

The 16 GB 4080 seems slightly weak to me. It looks like they’re leaving a huge gap between it and the 4090 for a ‘Ti’ variant, possibly waiting to see what AMD are doing before deciding whether they need to release it (spoiler: they almost certainly will, because I can’t see the new Radeon cards being anything other than very, very good and a real, genuine alternative to GeForce, IF they will only deign to divert some of that Ryzen silicon to produce plenty of cards).

The 12 GB 4080 is borderline false advertising: it carries the 4080 moniker, yet it has 2,000 fewer CUDA cores than the 16 GB variant. I see it as essentially a 900 dollar 4070. They’re selling people a 4070 for the price of a 4080. Obviously no one would buy a 4070 for almost a grand, so they give it the 4080 moniker, tweak the clock speed, slap on some extra VRAM, and now people think it’s a 4080. It will be nothing close to the 16 GB variant, and no one should buy it at the price they will ask for it. I realise it will be cheaper than the 4080 proper, but it won’t be cheap enough.

Taking that at face value, Nvidia will now produce a 4070 in name only, which will be where the 4060 should have been in terms of performance, but without a doubt will be where the 4070 should be in terms of price.

And the 4060 will probably be just the equivalent of a 3070 for 350 dollars or something.

Then you factor in the massive power draw of these monsters and then consider the rising cost of electricity… either AMD will produce a GPU that I like, and in numbers great enough that I can get hold of one, or I’m skipping this generation altogether. I’m actually not unhappy with my 3070’s performance in VR all things considered, and DLSS, as well as apps like Oculus Tray Tool & OpenXR Scaler, have made it possible for me to run in VR quite well, actually. On top of that, the recent optimisations in the sim and a recent CPU upgrade I made all come together to mean I just don’t need to eat Nvidia’s garbage right now. I’m not sat here with a GTX 1060, an i5-6500, and a 7200 rpm hard drive housing both the sim and Windows anymore like I was when this sim was first released. I’m now up to 2021 standards and Nvidia’s nonsense doesn’t impress me.

EDIT: Regarding the “100% more performance”: Yeah, Nvidia tries that line every single time they bring a new line of graphics cards out. I don’t remember the last time it actually was true, and it will not be true this time either, although the 4090 is at least a card that I have very little complaint with, possibly because I’m not really concerned by it since it’s far out of my price range.


I am not using DLSS in MSFS as it makes the glass cockpits too blurry in VR. This is max resolution on Quest 2 with a 3090.

I heard that Asobo are investigating a way to make the glass cockpits render at normal resolution while the rest of the scene uses DLSS. I think this is very important: surely there is no need for us all to spend thousands upgrading to the 4000 series to render the whole screen/FOV in super high resolution when we only need those small panels in full res, not to mention the environmental impact of all the increased power draw and millions of new plastic components :slight_smile:


I think in some respects DLSS 3 not working in VR might be a good thing for me, for the reasons you mention above. It’s not really necessary to spend thousands of pounds and increase our electricity bills further for modest gains, so if the new cards don’t help VR much it might save me some serious cash.

If other people start buying them though, especially Youtubers, I might be swayed as I hate being a generation behind everyone else!

LOL. You must be new to flight sims. That is what we do on the PC side of flight simming. We’ve been doing that since the days of 5½ floppy disks.


You can bet anything that Youtubers will be the first to get them. After all, it’s important for them to have the latest and greatest so you can watch their videos and have serious hardware envy. This brings them many all-important clicks. Don’t buy into the hyperbole.

Sometimes it’s good to apply a bit of cold, hard logic to computing hardware with regard to whether or not you actually need a new generation when it comes out. Remember when the 3080 and 3090 launched: people with higher-end 2000 series cards were consistently reporting the same or sometimes even better performance with the sim, simply because the older generation of cards was better optimised and had a few years of drivers behind it.

My guess is the 4080 and 4090 will have better performance than the 3080 and 3090, but on average it will be fairly marginal, and the 100% increase in performance being reported won’t actually be seen in real-world use; or if it is, it’ll only be in very, very specific situations and will come at some cost elsewhere, such as with the issues we’re seeing with DLSS 2.

I guess one thing that could make a difference is when Asobo finally bring ray tracing, and possibly some other graphics enhancements, to the sim. The 3000 series cards may struggle far more with those, but we’ll need to see.

I’ve got a 3080 and I’m quite happy for it to do me for a while yet, probably even until the 5000 series launches. They do say upgrading every other hardware generation makes the most sense and offers the best value for money. I’d much sooner upgrade my 5600X than my GPU at this point.

I’ve got a 3080 Ti with a lot of miles left on it. My mobo will take the i9 chips, so my last upgrade will be from an i7 to an i9. I’ll ride that out for a while, then likely get a new mobo, RAM, CPU, and GPU when the 50xx cards come out in two years.


I’m riding my 1080 Ti/8700K combo until it croaks.
MS/Asobo have done a stunning job of performance optimisation for older hardware from SU6 onwards, creating more performance headroom. I’m happy with 100 TAA in VR for now.

From what I’ve read, “Frame Generation” and motion reprojection (MR) set out to accomplish the same thing…guessing data.

DLSS was/is guessing pixels that aren’t there in order to upsample: basically a fancy interpolation, but somehow using AI and being better, except for digital displays evidently. FG is kind of the same, but in time, and it’s extrapolation instead.

MR tries to do the same thing and is something we have in VR now. FG will be based on an AI algorithm and may be better; MR is based on something Microsoft wrote and has been tweaking ever since.

They will both “guess” a frame or two in between officially rendered frames from the game.
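The interpolation-vs-extrapolation distinction above can be sketched on toy 1-D "frames" (lists of pixel brightness). Real DLSS 3 and motion reprojection work on full images with AI- or headset-derived motion vectors; this is purely a conceptual illustration with made-up values.

```python
# Interpolation: synthesize a frame BETWEEN two rendered frames.
# You must wait for the later frame first, which adds latency.
def interpolate(frame_a, frame_b, t=0.5):
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Extrapolation: predict the NEXT frame from the last one plus
# estimated per-pixel motion. No waiting, but a wrong guess
# shows up as an artifact.
def extrapolate(frame, motion):
    return [p + m for p, m in zip(frame, motion)]

f0 = [0.0, 0.2, 0.4]
f1 = [0.2, 0.4, 0.6]
mid = interpolate(f0, f1)                       # ≈ [0.1, 0.3, 0.5]
motion = [b - a for a, b in zip(f0, f1)]        # estimated motion per pixel
guess = extrapolate(f1, motion)                 # a guess at the unseen f2
```

Motion reprojection is closer to the extrapolation case (it warps the last real frame forward using head motion), while classic frame interpolation needs both neighbours and so adds a frame of latency.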

Those 142 FPS shown with DLSS3? Likely half were guesses.

Having FG guess a frame, then MR guess one based on that other guess, all while waiting for the CPU to provide the correct answer for the next…is a recipe for disaster.

However, if FG can “generate” 2 or 3 intermediate frames such that the output is locked at 90, and the old MR is turned off, then DLSS3 FG can work in VR and basically replace MR.
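The arithmetic behind "locked at 90" and "half were guesses" is simple to write down: if frame generation inserts n synthetic frames after every real frame, the displayed rate is real_fps × (n + 1). The 90 Hz target (a Reverb G2 refresh rate) and the sample numbers are the ones from the discussion above, used as assumptions.

```python
# Back-of-envelope frame-generation math.

def displayed_fps(real_fps, generated_per_real):
    """Displayed rate when each real frame is followed by n guesses."""
    return real_fps * (generated_per_real + 1)

def real_fps_in(displayed, generated_per_real):
    """Recover the real render rate hiding inside a quoted fps figure."""
    return displayed / (generated_per_real + 1)

print(displayed_fps(30, 2))   # 30 real + 2 guesses each locks 90 output
print(real_fps_in(142, 1))    # a quoted 142 fps with 1 guess per frame is 71 real
```

So a sim that can only render 30 or 45 real fps could in principle hit a 90 Hz headset refresh with 2 or 1 generated frames respectively, which is exactly the MR-replacement scenario described above.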

Time will tell if DLSS3 will be coded to do all this, and if Asobo will support it for VR.

It’ll be interesting to see if DLSS3 frame generation does a better job with prop artifacts as seen from the cockpit, or if a mod is needed. :wink:


Aside from the Wikipedia page mention, has anyone found a statement anywhere from Nvidia that actually says DLSS 3 won’t work in VR? I haven’t been able to find one, and the Optical Flow SDK, which is at the heart of the frame generation process, specifically mentions VR as a prime use case. There’s a new version of that SDK launching in October as well.

DLSS 1 didn’t originally work with VR, but that was fairly quickly fixed, starting with No Man’s Sky I think.

Not sure where this statement came from.


Correction: since cassette tapes. I still remember the default low fps on the Atari 130XE in Fighter Pilot, F-15 Strike Eagle, Solo Flight, and Tomahawk, and even lower fps near complicated 3D structures like a mountain drawn with 4 lines.


That’s my understanding as well. It would be interesting to see a 4090 FPS comparison with DLSS 3 taken out of the equation, comparing “real” frames against “real” frames.

Personally, I can’t stand any form of pixel interpolation in VR, whether it be motion reprojection or what little I’ve seen of DLSS2 in the last 24 hours (granted, the updated Nvidia drivers still aren’t out and DX12 isn’t working with OpenXR Toolkit yet, so maybe some gains will still be had on that front). But full frame generation is intriguing. I’m hoping it works well without all of the visual artifacts.

Just one thing to mention (it’s a bit off topic, but related to your post): there is a new version (1.2.0) of OpenXR Toolkit out now that works with DX12. I haven’t personally tested it yet, but the release notes do say it works with DX12.


Solo Flight wow. That blew me away back then and all it had was a few lines on the screen.
Now we have the whole world to fly in, with almost real life visuals. Puts things in perspective when I think about it.
Sorry, o.t.


It’s the only place I’ve seen it so far. Fingers crossed it ports to VR at some point. If I can get 113 fps in VR with a 4090 I’m ordering instantly.

Looking at their graphs though, without DLSS 3 I’m probably not going to see any noticeable increase over a 3090 in VR. This is the only game I play where I’m not more than satisfied with performance so whether I buy depends entirely on how much it improves the sim.

Yep. Personally I’m a bit tired of Nvidia’s BS. Not only are the lower-tier cards basically a different core advertised as 40 series (they’ve done this before), but they’re also artificially limiting supply to keep prices up (reading between the lines of some of the CEO’s statements). I think I’m going to wait and see what AMD cooks up.

Also, EVGA just flat out gave Nvidia the finger and is dropping out of the video card game altogether. They always had some of the best cards IMO. Sad end of an era.

I never had a computer with a 5-1/2" floppy disk. Was that an upgrade?