Question to VR experts: does performance scale linearly with resolution, compared to flatscreen?

First-time VR user here. I am a flatscreen expert; I literally work on pixel quality for a living, among other things. I also have 20/20 vision, so that is my curse when going to VR. But it is what it is; I knew you cannot compare visual clarity to a 4K flatscreen.

I have one question though comparing flatscreen fps and VR fps:

Before I received the HP Reverb G2, I checked my framerates on flatscreen. I am playing all-ultra and 1920x1080 with render scaling 140% (so effectively the engine has to render ~ 2.9 million pixels). The image is crystal clear and not a flickering in sight with those settings. I get ~45-60fps depending on where I am, and my G-sync monitor does the rest to give me a smooth flatscreen experience.

I received the Reverb G2 today, plugged it in, OpenXR reprojection ON, all fine. It works like a charm in MSFS; setting it up could not be easier. Top Rudder Solo 103, first start, banked to the left and looked down, and I almost fell out of my chair, that’s how good the depth perception is.
The fps is baffling me though, as if it does not scale 1:1 with flatscreen in terms of resolution. I am running OpenXR at 70% and MSFS at 70% render scaling, so effectively I render ~4.5 million pixels (2160x2160x2x0.7x0.7). If I compare this to my flatscreen’s 2.9 million pixels, I would more or less expect fps to drop to ~25-30. That would be well above the OpenXR 22.5 fps mark, and at all-ultra settings. That is not what happens, though; fps drops a lot lower than this. In order to get above 22.5 fps, I have to lower settings to an unexpected mix of low-medium across the board. So now I am stuck with OpenXR/MSFS at 70/70 and settings sitting in the low-medium range, which is not exactly great. I am on an RTX 2080 Ti and an i7-4790k with 24GB RAM. Do you have any idea why there is such a difference in performance for the same resolution in VR and on the flattie? (more draw calls in VR, maybe?)
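For reference, this is the back-of-the-envelope math I am using, as a quick hypothetical snippet. I am honestly not sure whether the render-scale sliders scale the total pixel count (my assumption above) or each axis, so the snippet shows both readings:

```cpp
// Back-of-the-envelope pixel counts, nothing more. Both interpretations of the
// render-scale sliders are shown because I am not sure which one the sim uses.
#include <cstdio>

int main() {
    // Flatscreen: 1920x1080 at 140% render scaling
    double flatTotal = 1920.0 * 1080 * 1.40;               // ~2.9 MPix (scale on pixel count)
    double flatAxis  = 1920.0 * 1080 * 1.40 * 1.40;        // ~4.1 MPix (scale per axis)

    // Reverb G2: 2160x2160 per eye, OpenXR 70% and MSFS 70%
    double vrTotal = 2160.0 * 2160 * 2 * 0.70 * 0.70;      // ~4.6 MPix (scale on pixel count)
    double vrAxis  = 2160.0 * 2160 * 2 * 0.49 * 0.49;      // ~2.2 MPix (scale per axis)

    std::printf("flatscreen: %.1f or %.1f MPix\n", flatTotal / 1e6, flatAxis / 1e6);
    std::printf("VR (both eyes): %.1f or %.1f MPix\n", vrTotal / 1e6, vrAxis / 1e6);
}
```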

Short version:

I render 9 million pixels to flatscreen and get 40+ fps.
I render 9 million pixels to VR and get <20 fps.

Same settings each time. Why?

Replying to my own question, and I hope I am right:
Draw calls are the culprit!
I believe this is the reason why the same resolution/settings perform much worse in VR than on flatscreen. In VR, the GPU has to render two images per frame, each from a slightly different perspective. This is fine for the GPU, but the CPU has to send the draw calls over twice per frame.
So it is basically a CPU bottleneck that makes performance disproportionately worse in VR than on flatscreen.
DirectX12, you will be very welcome!

It is also the reason why games like Alyx and such scale so well in VR: they do not have the tons of draw calls that MSFS has.
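To make the idea concrete, here is roughly the picture I have in my head (a made-up mini renderer, not actual MSFS or D3D11 code):

```cpp
// Made-up mini renderer. The point: in VR the CPU walks the whole scene and
// issues every draw call once per eye, even though each eye only covers part
// of the total pixel count.
#include <initializer_list>
#include <vector>

struct Camera   { /* view + projection for one eye (or the single monitor view) */ };
struct DrawItem { /* mesh, material, per-draw constants */ };

// Stand-in for the per-draw CPU cost in the driver (think DrawIndexed + state setup).
void submitDraw(const DrawItem&, const Camera&) {}

void renderFrameFlat(const std::vector<DrawItem>& scene, const Camera& monitor)
{
    for (const DrawItem& item : scene)                  // one pass over the scene
        submitDraw(item, monitor);
}

void renderFrameVR(const std::vector<DrawItem>& scene,
                   const Camera& leftEye, const Camera& rightEye)
{
    for (const Camera* eye : { &leftEye, &rightEye })   // two passes over the scene
        for (const DrawItem& item : scene)
            submitDraw(item, *eye);                     // CPU cost ~ 2 x draw calls
}

int main() {
    std::vector<DrawItem> scene(10000);                 // MSFS-style scenery: lots of items
    renderFrameFlat(scene, Camera{});
    renderFrameVR(scene, Camera{}, Camera{});
}
```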


Rendering two viewpoints plus a much wider field of view in VR takes a lot more effort. That’s why most VR games are kept looking fairly simple, to keep the FPS high enough that it’s not a bad experience.

Well, a ‘lot more effort’ is exactly the technical analysis that helps in determining what to upgrade. :slight_smile:
It is the CPU that needs upgrading. Or the DirectX version. Or both, ideally.
I would not gain anything by upgrading my RTX 2080 Ti/11GB to a 3080/3090. The large drop in performance from 40+ fps on flatscreen to <20 fps in VR, for the same pixel count and settings, is due to the CPU.

Considering how CPU-bound the game is, upgrading that would probably give you the most improvement, but don’t expect any miracles; the game in general, and VR especially, needs a lot more optimizing.
I don’t think DX12 is going to be the miracle cure that everyone expects it to be. It might help a little, or only for some people.

From what I’ve seen others posting, it seems that even with top-of-the-line hardware most people are lucky to get 30-40 fps in VR.
You’ve got to live with the low FPS and tweak for smoothness.

Yes, going for the strongest possible CPU is not wrong. The game seems to be CPU-limited on multiple fronts. First the usual one, draw calls. Then everything ‘live’ and ‘AI’ in the options, all of which I have turned off.

For draw calls, however, DX12 will definitely help. I have first-hand development experience of how it reduced draw-call overhead considerably, so I am confident it will help there. Should people be CPU-limited by live traffic or other things that MSFS calculates, then of course they will not notice much difference.
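To be clear, DX12 does not remove draw calls; it makes recording and submitting them cheaper and lets that work spread across threads. Here is a toy timing model of what that buys (no real D3D12 calls, and the draw-call count is invented):

```cpp
// Toy timing model only, no real D3D12 calls. The point: in DX11 one thread
// records and submits every draw call; DX12 lets you record command lists on
// several threads, so the same number of draw calls costs less wall-clock time.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for the fixed CPU overhead of recording one draw call.
void recordDraw() { volatile int sink = 0; for (int i = 0; i < 2000; ++i) sink += i; }

void recordRange(int count) { for (int i = 0; i < count; ++i) recordDraw(); }

double msSince(std::chrono::steady_clock::time_point t0) {
    return std::chrono::duration<double, std::milli>(std::chrono::steady_clock::now() - t0).count();
}

int main() {
    const int drawCalls = 40000;   // two eyes' worth of a dense MSFS scene (made up)
    const int workers   = 4;

    auto t0 = std::chrono::steady_clock::now();
    recordRange(drawCalls);                         // "DX11": one thread does everything
    double serialMs = msSince(t0);

    t0 = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;                  // "DX12": one command list per worker
    for (int w = 0; w < workers; ++w)
        pool.emplace_back(recordRange, drawCalls / workers);
    for (auto& t : pool) t.join();
    double parallelMs = msSince(t0);

    std::printf("serial: %.1f ms/frame, %d threads: %.1f ms/frame\n",
                serialMs, workers, parallelMs);
}
```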

And to my knowledge, VR should double the draw calls because everything has to pass through the render pipeline twice. No biggie for the GPU, but the CPU will be in trouble: it has to issue twice the draw calls to the GPU. I actually have no other explanation as to why my framerate is more than halved compared to flatscreen for the same 9 million pixels to render and the same settings. But then I am also new to VR, because as a pixel perfectionist I always stayed away from it.
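My mental model for why the drop is so brutal, with all numbers made up, just to show the bottleneck flipping from the GPU to the CPU:

```cpp
// All numbers invented; a frame is paced by whichever of CPU and GPU finishes last.
#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical flatscreen frame: GPU-bound.
    double cpuFlat = 16.0, gpuFlat = 22.0;      // ms

    // Hypothetical VR frame at a comparable total pixel count:
    // GPU work stays in the same ballpark, but the CPU has to submit every
    // draw call twice, plus whatever the VR compositor / reprojection costs.
    double cpuVr = 2.0 * 16.0 + 12.0;           // the extra 12 ms is a pure guess
    double gpuVr = 24.0;

    double flatMs = std::max(cpuFlat, gpuFlat); // 22 ms -> ~45 fps
    double vrMs   = std::max(cpuVr, gpuVr);     // 44 ms -> ~23 fps
    std::printf("flatscreen: %.0f ms (%.0f fps), VR: %.0f ms (%.0f fps)\n",
                flatMs, 1000.0 / flatMs, vrMs, 1000.0 / vrMs);
}
```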

Yes, they’re doing the heavy lifting on the CPU and maxing out a single core in VR. I’m a Rift S owner, and the Oculus motion smoothing is being choked; it only gets a chance to breathe if I pause the sim.

The sim resolution setting has more effect than any other graphics setting, and you can lower it and increase supersampling on the GPU if you fancy a fun day of tweaking.

Well, if it is really draw calls that cut my framerate in half, then it does not matter at which end I downscale or upscale. The GPU will ask the CPU for the same number of draw calls regardless.

I have a Ryzen 3600X paired with a 6900 XT. When running on my Reverb G2, the GPU usage is maxed at 99-100%. The “max CPU thread” usage is never over 50%. So please explain why you think this software hammers a single core so heavily? It doesn’t seem to be limited by the CPU at all.

Note that with 70% OpenXR and 100% render scale in the game, I’m getting 37-43 fps in areas outside of major cities. I don’t use motion reprojection. When I turn motion reprojection on, my frames drop and the “smoothness” is worse. Maybe it’s motion reprojection hammering the CPU?


Draw calls are not parallel in DX11, so the CPU not showing 100% is perfectly fine. And especially if you have to render two images (left and right eye), they come one after another, meaning sequentially.
Your CPU is better than mine. If it manages to get all the draw calls over to the GPU, the GPU will be maxed. Lucky you; being GPU-bound, at least you know there is nothing left to aim for.
Until they go to DX12, I might just have to accept that my i7-4790k shows its age here when running a DX11 application. It is 7 years old after all, and usually this does not matter because the games I play have me GPU-bound most of the time. But with MSFS drawing tons of houses and stuff, draw calls are the limitation. I am not talking about calculations that the game itself does on the CPU; I mean draw calls from the render pipeline.
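And on the “CPU is nowhere near 100%” observation: overall CPU usage easily hides one saturated render thread. A tiny toy snippet (nothing MSFS-specific) to convince yourself:

```cpp
// Toy snippet: one fully busy thread on an 8-thread CPU shows up as only
// ~12-13% total CPU usage, even though it is the bottleneck.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    unsigned hwThreads = std::thread::hardware_concurrency();   // e.g. 8 on an i7-4790k
    if (hwThreads == 0) hwThreads = 1;
    std::printf("One maxed-out thread reads as ~%.1f%% total CPU usage here.\n",
                100.0 / hwThreads);

    // Spin for a few seconds and watch Task Manager / CapFrameX while it runs.
    auto end = std::chrono::steady_clock::now() + std::chrono::seconds(5);
    while (std::chrono::steady_clock::now() < end) { /* one core pegged at 100% */ }
}
```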

PS: just tried without reprojection. More stutters here, probably because the framerate is really low.

Thanks for the update. So you’re saying your CPU is 100% maxed out on a thread and your GPU is not.

Regarding your “render two images” comment: are you sure that’s correct? I’ve read things over the past year that state VR doesn’t work that way anymore. Something about improvements to rendering where they can render one large image and extract the individual left/right eye components from two different positions of the one image, essentially easing the load. But it’s been a while since I looked into it, so I could be mistaken.

Yeah, I am also not 100% sure of the latest VR render techniques.
I am very sure though that in DX11/OpenGL you need to render from two different cameras to get two different images corresponding to the left and right eye perspectives. Two different cameras means two different renders in rasterization. So while my 9 million pixels on flatscreen are rendered in one go, the 2x4.5 million for VR need to be rendered sequentially. My GPU should be able to knock this off and give me 40-50 fps; that is what I expected from my flatscreen experiments before I got the Reverb G2, and what I see now as well. But to my surprise, fps tanked on the G2 to below 20 fps. Maybe it is just something Asobo messed up, but I do not think so, because DX11 is not rocket science really. I believe it is a fundamental draw-call issue because my CPU is ‘outdated’. But I also know that DX12 will relax draw-call overhead considerably, because I have used it myself. But yes, I guess if they release DX12 and suddenly my VR frames are 40-ish, I was right. If not, I will sell my Reverb G2, which I did not even pay for myself :slight_smile:

PS: my fps also gets hammered more in areas with tons of houses compared to forests. Houses = draw calls.


OK, here is a semi-technical post about rendering in VR.

Probably too much for most here, but I thought I’d post it nonetheless. This technique does not seem to be used for MSFS, though. Let’s see: if my VR framerate gets a boost with the DX12 release, I was on the money about draw calls being doubled for VR and choking my CPU.

That was what I was remembering, thanks. What I didn’t remember was that it’s only useful in some specific cases and some headsets. It’s certainly not a cure-all technique.

And yes, fly high over cities and low over forests!

Just a short comment on that topic. I’m running FS with a G2 on an overclocked i9-11900K + 3090 system with 32GB of RAM tuned for low latency, and for me both 2D and 3D gameplay is almost 100% GPU-bound. I’m running 2D at 4K, all Ultra, and 3D at 100% render scaling (3168x3092) with a mix of Ultra/High/Medium settings. My FPS for a self-recorded 12-minute benchmark flight over urban and heavily forest-covered areas is 59/45/38 FPS (average, 1%ile, 0.2%ile) in 2D and 33/27/25 in 3D. Both the DevMode overlay and CapFrameX show GPU limitation for roughly 99-100% of the time.
