I understand that VR puts a huge strain on the GPU, but my question is whether my i5 9600kf with 32GB RAM would need upgrading to avoid a CPU bottleneck or would it be able to support VR if I upgrade to a 4080 super?
Thanks in advance
I’m no expert on this, but I’ll share my understanding of the impact of VR.
First off, I’d say the CPU is critical to framerate regardless of whether you’re playing in VR or on a monitor, and MSFS is CPU intensive in general. However, VR is more sensitive to changes in frame rate, so it’s more important to maintain consistent frame rates in VR than on a monitor. For that reason, I’d say a good CPU is even more critical in VR.
Now, to manage the load on the CPU, you can adjust the Object and Terrain LOD settings, as these impact the CPU but not the GPU. So in the case of your 9600K, you will probably find you have to run lower LOD settings in VR to maintain a consistent 30 fps. However, as a 4080 will be mismatched with the 9600K (a lot more powerful), you could find that your CPU bottlenecks the GPU - unable to provide instructions quickly enough. In that case you may not be able to fully utilize the GPU, as it spends time waiting for instructions from the CPU.
So to answer your question - yes, it is likely that the 9600K will bottleneck the 4080, and to adjust for this you will have to lower your LOD even further. Even if you run all non-LOD graphics settings on high, you will likely experience a mediocre framerate, with your CPU at 100% utilization while your GPU is loaded much less.
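To see why a GPU upgrade alone can't fix a CPU limit, here's a toy model (the frame times below are hypothetical illustrations, not measured numbers): the slower of the two stages sets the pace, so the frame rate is bounded by whichever of CPU or GPU takes longer.

```python
def frame_rate(cpu_ms, gpu_ms):
    """In a simple pipeline the frame time is bounded by the slower
    stage, so a fast GPU can't outrun a slow CPU."""
    frame_ms = max(cpu_ms, gpu_ms)
    return 1000.0 / frame_ms

# Hypothetical numbers: a CPU needing ~33 ms per frame caps the sim
# near 30 fps even if a 4080-class GPU renders its part in ~12 ms.
print(round(frame_rate(33.0, 12.0)))  # ~30 fps, CPU-bound
print(round(frame_rate(33.0, 25.0)))  # still ~30 fps: extra GPU headroom wasted
```

Lowering the LOD settings shrinks `cpu_ms`, which is why it helps in exactly this situation.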
My suggestion is to upgrade the CPU at the same time, or soon after buying a new 4080.
First of all, thank you for the detailed answer. Can’t say I wasn’t half expecting it. However, to have a meaningful CPU upgrade I would have to change the motherboard too, which would also mean a new Windows license etc. VR will have to wait
But at least I get to keep the gpu upgrade money too…
I don’t know the details of your CPU performance, but I’m pretty convinced that in the majority of cases the GPU is the bottleneck in VR, due to the massive per-eye resolutions being processed.
I believe that with a 4080 and even a relatively weak CPU, you can take the risk and venture into VR.
My previous PC was an i7-7700K with a 4070 Ti running an HP Reverb G2. It wasn’t perfect, but good enough to never return to 2D again (I fly mostly VFR GA planes).
I run a 9700 at 5 GHz (water cooled by Corsair) with an RTX 3080 Ti successfully in VR.
I have a similar system - i9-9900K OC’ed to 5.0 GHz (with water cooling) and a 3080 Ti with a Quest 2. VR is great for me. Granted, I use some other tools like OpenXR Toolkit and some custom Oculus Debug settings… but it’s a great experience and I’m not in a huge rush to upgrade… well… OK, that’s a lie, I do want to upgrade, but need to find the time to do it
Upgrade (for me) = 14900K @ 6.0 GHz (using 8 P-cores) + RTX 4090 + 32 GB 5600 MHz RAM = $$$$$$
VR is very intensive on both CPU and GPU. There are a few reasons why this is so:
1 - Your PC needs to output to two displays, or one wide one (equal to two side by side).
2 - Resolution has to be high, so pixel density has to be pretty high for your eyes (and brain) to see a good image.
3 - VR reacts to your head movement. Our heads are constantly making little movements, and the displays need to be updated to reflect them.
Even the highest-end GPUs work hard in VR. Our eyes, especially in VR, need a good framerate to experience smooth motion - at least 70 fps to start to feel smooth, with 90 fps being ideal for our eyes and brains. There are a few techniques used to achieve smooth motion and good visual quality:
Foveated rendering is the main and most popular solution nowadays to help our PCs handle such intense demands from VR. You can look it up, but basically a high-end VR headset can track eye movements and then work with the chip on the headset and the GPU on the PC to render the area where the user is looking at high quality, while everything else is rendered at lower quality. This can still be done on headsets without eye tracking by always rendering the center (an adjustable area) at high quality.
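As a rough illustration of the idea (the radii and scale factors here are made up, not taken from any real headset), fixed foveated rendering just maps distance from the gaze point - or the lens center, if there's no eye tracking - to a resolution scale:

```python
import math

def foveation_scale(px, py, gaze_x, gaze_y,
                    inner_radius=0.25, outer_radius=0.6):
    """Return the render-resolution scale for a pixel.

    Coordinates are normalized to [0, 1]. Pixels near the gaze
    point render at full resolution (1.0); the periphery drops
    to quarter resolution. Radii and scales are illustrative.
    """
    d = math.hypot(px - gaze_x, py - gaze_y)
    if d <= inner_radius:
        return 1.0   # foveal region: full quality
    if d <= outer_radius:
        return 0.5   # mid ring: half resolution
    return 0.25      # periphery: quarter resolution

# Without eye tracking, the gaze point is just the lens center:
print(foveation_scale(0.5, 0.5, 0.5, 0.5))    # 1.0 at the center
print(foveation_scale(0.95, 0.95, 0.5, 0.5))  # 0.25 in the corner
```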
Other techniques, like only rendering what’s visible (in front of the user), should also be used to help.
Another technique, specifically for motion smoothness, is to synthesize a frame and insert it between two real frames. That allows us to achieve 90 fps where half of the frames are pre-rendered/fake.
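A naive sketch of the idea in Python (real motion smoothing like ASW or motion reprojection warps the previous frame using motion vectors and head pose rather than blending pixel values, so this is only illustrative of one render producing two displayed frames):

```python
def synth_frame(prev, nxt, t=0.5):
    """Naively blend two rendered frames to fake an in-between frame.

    Frames here are flat lists of pixel intensities; t is the
    position of the fake frame between the two real ones.
    """
    return [(1 - t) * a + t * b for a, b in zip(prev, nxt)]

frame_a = [0, 100, 200]
frame_b = [50, 100, 100]
print(synth_frame(frame_a, frame_b))  # [25.0, 100.0, 150.0]
```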
While high FPS (90Hz) is very desirable for room scale VR games (in which you move a lot), if you are on a budget anything above 35 FPS will be acceptable for seated experiences like flight simming, especially if you fly gently in civilian airplanes, without aggressive manoeuvres.
Dynamic Foveated Rendering adds a lot of FPS in DCS thanks to QuadViews support. This technology is not supported in MSFS, so for MSFS foveated rendering adds up to 10% FPS (at least in my case) - nice but not a game changer.
I have a 10700K OC’d to 5 GHz, and with my 3080 and a Quest 2 I’m pretty sure I’m CPU bottlenecked. The GPU is often running at a low temperature and working at a low rate when I’m getting stutters, while the CPU appears to be working quite hard.
In reply to the OP: if it were me, I wouldn’t do a 4080. I’d do a 4070 Ti Super, save some dollars, and put them toward a CPU. I think a 4080 would be more than enough for VR unless you have something with far higher resolution than a Quest 2, or upgrade to an extremely fast CPU. I’m not sure how my 10700 really stacks up against the new ones.
Hey, you don’t necessarily have to get a new Windows license if you change your motherboard - I’ve upgraded a couple of times now and been able to use the same license.
You can save yourself a bit of money if you get something like an i5 13600K or i7 13700K and a Z690 board that can also run DDR4 RAM, so you can keep your current RAM and upgrade it later on if need be.
The performance difference for good DDR4 vs DDR5 isn’t as big in gaming as the marketing would lead you to believe!
That is a common misconception. It’s actually much worse.
As you stated, the GPU needs to render more pixels - actually almost two screens (often around 2500 x 2000 pixels per eye).
However, since the left and the right eye differ slightly in perspective, the sim actually needs to render the scene 2 times, instead of 1 time with a bigger resolution.
There are, luckily, some optimizations done, so it’s not a full 2-times render pass. But still, the GPU computation is much more intensive than only rendering more pixels.
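A minimal sketch of why it’s two renders: each eye gets its own camera, offset by half the interpupillary distance (IPD) along the head’s right vector. The 64 mm default below is just a typical average, not any particular headset’s value:

```python
def eye_positions(head_pos, right_vec, ipd=0.064):
    """Offset the head position by half the IPD (~64 mm average)
    along the head's right vector to get the two camera positions
    the sim must render the scene from."""
    half = ipd / 2.0
    left  = [h - half * r for h, r in zip(head_pos, right_vec)]
    right = [h + half * r for h, r in zip(head_pos, right_vec)]
    return left, right

l, r = eye_positions([0.0, 1.7, 0.0], [1.0, 0.0, 0.0])
print(l, r)  # roughly [-0.032, 1.7, 0.0] and [0.032, 1.7, 0.0]
```

Because the two camera positions differ, each eye sees slightly different geometry and occlusion, which is why the views can’t simply be cropped from one wide render.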
“the sim actually needs to render the scene 2 times, instead of 1 time with a bigger resolution.”
Some headsets have a single display where the two images (one per eye) get rendered to, that’s what I was talking about.
In MSFS, in my experience and humble opinion, 35 fps does not cut it; you’ll need a minimum of 45 fps with motion smoothing turned on to get a somewhat decent experience visually (if you can put up with the odd image flicker). If you try takeoffs or landings at larger airports, you’ll experience visual oddities more commonly.
35 works for me too. Would 45 be better? Sure. Would it be worth a $2000 upgrade? Nah.
Some people prefer FPS and some visual clarity. I prefer the clarity so my settings are high, my resolution is very high and I accept the 35-50 FPS range.
I’m not using Smart Smoothing or Motion Reprojection, as I don’t like the “jelly bean” artifacts they create. What matters for me, besides resolution, is smoothness, and at 35 FPS my gently flying Piper Comanche appears very smooth.
Yep, let’s all remember movies are only 23.976 (I think?) FPS. They’re just really smooth and they look fine, except in certain fast motion shots on a large screen.
That’s a misconception. In movies, the slower frame rate means the sensor is active the entire time, creating natural motion blur that smooths things out. With computer rendering, you’re only getting a snapshot with no blurring. That makes a huge difference, and a 24fps game is not going to appear smooth at all. Turning motion blur on in the sim may help offset this, but that requires more processing power, and I didn’t see enough difference for it to be worth it.
Back to the OP: I only tried VR a couple of times and noticed something to be on the lookout for. When I first set it up, I set the refresh rate to the minimum for my Quest 2 because I didn’t think my GPU could handle much. When I checked the sim’s fps display, my CPU frame times spiked extremely high compared to 2D. Come to find out, it was the low refresh rate of the headset causing it. When I raised the refresh rate, my CPU frame times came down. What I think was happening is that my settings were so low that the Quest’s refresh rate was the bottleneck, not my PC, and the wait time before rendering the next frame was presenting itself as extra CPU frame time. I haven’t fully tweaked it because I rarely fly VR, but setting the refresh rate up a notch freed up the computer to render faster, and I even had a bit of extra headroom to increase graphics settings.
I understand, but the key words there are “two images” - those still need to be rendered independently.
For smoothness, frame timing is about as important as frame rate. If you have a fluctuating 35-45 fps versus a very constant 30 fps, the 30 fps will look a lot smoother. It is therefore advisable to enable vsync at, for example, 30 fps, which (if you tune your settings to do about 35 fps) results in a rock-solid, consistent 30 fps that appears perfectly smooth.
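To make the point concrete, compare the frame-time spread of a fluctuating 35-45 fps run against a locked 30 fps run (the fps samples are made up for illustration):

```python
def frame_times_ms(fps_samples):
    """Convert instantaneous fps readings to per-frame times in ms."""
    return [1000.0 / f for f in fps_samples]

def jitter(times):
    """Peak-to-peak frame-time variation; lower means smoother."""
    return max(times) - min(times)

fluctuating = frame_times_ms([35, 45, 38, 42, 36, 44])
locked = frame_times_ms([30] * 6)
print(round(jitter(fluctuating), 2))  # several ms of swing frame to frame
print(jitter(locked))                 # 0.0 - every frame paced identically
```

The fluctuating run averages a higher frame rate, but the varying frame-to-frame pacing is what the eye picks up as stutter; the locked 30 fps delivers every frame at the same interval.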
The more relevant distinction is that watching a movie or TV is a completely passive experience.
The rendered image of a video game must constantly respond to user input, which creates a measurable latency between the two. Watching a game running at 30fps is a very different experience to playing it, even though the image is identical.
Edit: At the risk of being pedantic, traditional movie camera shutters are only open for half the frame. The shutter has to close between frames so the film can be advanced. This convention has stuck for digital movie production even though sensors don’t have the same constraints.