Resolution versus Render Scaling

If you wanna have some fun and you use an NVIDIA card, enable some extra resolutions under DSR - Factors in the NVIDIA Control Panel under Manage 3D Settings → Global Settings. DSR renders the scene at a higher-than-native resolution on the GPU and downsamples it to your screen, which lets you choose resolutions above your native screen resolution in game. From there you can adjust your render scaling in game. You might be surprised at your FPS at lower render scaling while the image looks better. It is fun to play with.
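As a back-of-the-envelope sketch of what those DSR factors mean (assuming, as NVIDIA describes, that a factor multiplies the native pixel *count*, so each dimension scales by the square root of the factor):

```python
import math

def dsr_resolution(native_w, native_h, factor):
    """Render resolution for a given DSR factor.

    DSR factors (1.20x ... 4.00x) multiply the total pixel count,
    so each dimension is scaled by sqrt(factor).
    """
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

# A 4.00x factor on a 1080p screen renders at 4K and downsamples:
print(dsr_resolution(1920, 1080, 4.00))   # (3840, 2160)
# 2.25x on a 1440p screen also lands on 4K:
print(dsr_resolution(2560, 1440, 2.25))   # (3840, 2160)
```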

Yeah, that’s about the limit on that GPU at that res. I seem to be averaging in the 30s at KLAS, but I consider the FPS hit worth it as long as it’s smooth. The extra-wide screen does add some immersion to the sim.

The instruments inside the cockpit look much better at 1440P.

3440x1440 is so VRAM hungry that I have to close all my browsers. Once I closed them the sim smoothed out and the VRAM usage dropped. I have yet to experiment with render scaling below 100, but it might make everything blurrier.

I think you could safely go to 90 and then 80 and see what you think. Glad you mentioned the KLAS payware. Very impressive work!!

I posted a few photos of final approach into RWY 19R.

In the first case you rely on the upscaling capabilities of your monitor; in the second, on the GPU, which usually provides better visual quality.

3440x1440 Samsung monitor here with an RTX 2060 Super. Changing the resolution to 2560 looks grim on my monitor, so I use render scaling set at 80% at 3440, and that works for me.

It is going to partly depend on your monitor: whether it upscales and, if it does, how good the built-in upscaling actually is.

Logically the PC sending the native resolution (the number of actual physical pixels) of your monitor will look better.

Without upscaling, sending 1080p to a 1440p screen means the screen is trying (unsuccessfully most of the time) to spread a single pixel over 1.333 pixels. On occasion one of the real physical 1440p pixels may even end up halfway between two of the 1080p pixels. With upscaling, an “averaging” algorithm (to use a loose, technically incorrect analogy) works out what value the real physical pixel needs to be depending on which 1080p pixels overlap it.
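That “averaging” idea can be sketched in a few lines. This is an illustrative box/area filter along one row of pixels, not the kernel any particular monitor or GPU actually uses: for each physical 1440p pixel, it works out which 1080p pixels overlap it and by how much.

```python
import math

def source_weights(dst_index, dst_size, src_size):
    """Overlap weights of the source pixels covered by one destination
    pixel (a simple box/area filter; real scalers use fancier kernels)."""
    ratio = src_size / dst_size          # e.g. 1080 / 1440 = 0.75
    start = dst_index * ratio            # source-space span of this pixel
    end = start + ratio
    weights = {}
    for s in range(int(start), math.ceil(end)):
        overlap = min(end, s + 1) - max(start, s)
        if overlap > 0:
            weights[s] = overlap / ratio
    return weights

# Physical pixel 1 on a 1440-pixel row straddles source pixels 0 and 1,
# taking roughly one third of its value from the first and two thirds
# from the second (the weights always sum to 1):
print(source_weights(1, 1440, 1080))
```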

In essence the question is: should you let your monitor work this out by sending at 1080p, or let the game work it out with render scaling and send the monitor its correct native resolution?

My experience with my own monitor is the game does it better to be honest — however if you have a monitor with amazingly good upscaling you may get different results.

Based on everyone’s recommendations I have set my resolution to 2560x1440 and played around with different Render Scaling settings.

I actually turned some other settings down a little and am currently rendering at 100%, and I’m really happy with the result. The instruments look much better!

Thanks to all who contributed and helped me make an educated decision.

It is worth noting that you can actually set the render scaling to values like 85%, 95%, or 105% in the config files, and it works fine even across restarts of the game. However, if you then fiddle with some other graphics setting in game, you will find on the next restart that it has reset your scaling to the nearest 10%.
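Presumably the reset snaps your hand-edited value to the nearest multiple of ten. A hypothetical sketch of that rounding rule (the exact rule the sim uses, including how it breaks ties at values like 85, is a guess):

```python
def snap_to_step(value, step=10):
    """Round a render-scaling percentage to the nearest multiple of
    `step`. Midpoints round up here; the sim's tie-breaking is unknown."""
    return int((value + step / 2) // step) * step

for v in (83, 87, 95, 105):
    print(v, "->", snap_to_step(v))
# 83 -> 80, 87 -> 90, 95 -> 100, 105 -> 110
```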

What would you think: does it make any sense to set the resolution to 1440p (rather than 1080p) on a 4K monitor? My GPU (AMD RX 5700) is way too slow for UHD. With Full HD (1080p) each rendered pixel covers 4 native pixels on screen. Would there be any visual benefit from 1440p? How about AMD’s ability to sharpen the input? I really can’t tell by looking at the output :wink:

As already recommended in this thread, you should set the resolution to the native resolution of the monitor and use render scaling to pick a render resolution appropriate to your GPU’s capabilities. At 4K (3840x2160), 70% scaling renders to a 2688x1512 buffer, which is the closest selectable option to 1440p (2560x1440) and will usually look better than setting the resolution to 2560x1440, 1920x1080, or any other value.
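The arithmetic behind that 2688x1512 figure is just the scaling percentage applied to each dimension; a small sketch:

```python
def render_buffer(width, height, scaling_pct):
    """3D render-buffer size for a given render-scaling percentage,
    applied to width and height independently."""
    return round(width * scaling_pct / 100), round(height * scaling_pct / 100)

# 70% scaling at 4K renders close to 1440p:
print(render_buffer(3840, 2160, 70))   # (2688, 1512)
# 50% scaling at 4K renders at exactly 1080p:
print(render_buffer(3840, 2160, 50))   # (1920, 1080)
```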

@Vibstronium: Thanx for the advice. I usually assume that native resolution is the best/sharpest.
BTW, what exactly does render scaling do? Is it a sort of up- (or, here, down-) scaling done by the CPU to let the GPU ‘breathe’ so that it can cope with such a big resolution (4K)? And how does VR fit into the picture? It seems as if resolution and fps are set by the headset’s capabilities then.

Render scaling causes the 3d scene rendering to happen at a lower resolution than the screen resolution. At the end of rendering, it’s scaled up to full screen resolution and composited with any additional screen-resolution overlays like the external view HUD, tooltips, menu, etc.

The value you select (e.g. “80%” or “50%”) is the percentage scaling applied to the width and height. This means that at 50% scaling on a 3840x2160 screen, the 3D scene will be rendered at 1920x1080 (50% of each dimension), for a total of 25% as many pixels (50% on each side, thus 0.5 * 0.5 = 0.25). Rendering fewer pixels reduces the GPU’s workload, and if (and only if) you are GPU-limited, it may result in higher frame rates than leaving it at 100%.
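That scale-squared relationship is worth making explicit, since it is why modest-looking scaling values cut GPU pixel work so sharply; a minimal sketch:

```python
def pixel_fraction(scaling_pct):
    """Fraction of the 100%-scaling pixel count actually rendered.

    The percentage applies to width AND height, so the rendered
    pixel count scales with its square.
    """
    s = scaling_pct / 100
    return s * s

for pct in (100, 80, 70, 50):
    print(f"{pct}% scaling -> {pixel_fraction(pct):.0%} of the pixels")
# 100% -> 100%, 80% -> 64%, 70% -> 49%, 50% -> 25%
```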

The difference from running at a lower resolution with 100% render scaling is that the upscaling is higher quality, and that non-3d portions of the user interface, such as menus, tooltips, and the external view HUD, are sharper.

I’m not very familiar with VR, but the exact same considerations apply as far as I know.

Changing my texture resolution to High made the difference that allowed me to keep the render scaling at 100 without the FPS drops.
3440x1440 at 100 is already not quite as sharp as 4K at 70-80 render scale.

@Vibstronium: Thanx again, I see much more clearly now.
I actually checked your example while flying in dev mode. In a given area, with Full HD / 100% rendering I got about 60 fps; with 4K / 50% I obtained 45 fps, and with 4K / 100% below 30 fps.
But the sharpness gain is sometimes hard to tell, at least in the scenery :wink:
It’s easier to notice looking at a small font (e.g. ‘VOL’ on the glass-cockpit frame of the SR22).
Well, does anybody have experience with AMD’s image-sharpening function in the driver?

As far as I know, any scaling will work roughly the same between NVIDIA and AMD. There are no GPU-vendor-specific techniques in use.

AMD claims that their sharpening function (inside the driver) works ‘autonomously’ and doesn’t have to be supported by the game (if I got that right), with practically no FPS cost! But I can’t see any effect (e.g. in this small-font scenario).

I’m not familiar with AMD’s autonomous sharpening filter; searching the internet turns up some very vague forum references but nothing clear. Do you have a basic article describing what it does and how it works?

Vagueness seems to be the main problem with the feature. When AMD launched it there were a few enthusiastic articles in some German gaming magazines (with promising screenshots). Maybe it works similarly to image sharpening in software (I guess), if it works… :wink:
If someone is interested (the article is in German):

(today DirectX 11 is supported as well, I suppose)