There is only one RTSS (“RivaTuner Statistics Server”); it can be used standalone and is also bundled with some tools, like MSI Afterburner.
If you only need an FPS counter, there are alternatives. You can e.g. use one of the overlays (NVIDIA or Xbox, if you like), and since SU10 there are these two new command-line parameters (as far as I know, these don’t have the issue where the logbook isn’t written):
The “-showFramerate” and “-showFramerateMini” command-line options now allow displaying the FPS panel without the need for Dev Mode.
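In case it helps, here is a minimal sketch of passing one of those flags when launching the sim from a script. This assumes an install where FlightSimulator.exe can be started directly (Steam users can simply put the flag in the game’s Launch Options instead); the path below is just a placeholder:

```python
import subprocess

# Placeholder path - point this at your own FlightSimulator.exe
SIM_EXE = r"D:\Games\MSFS\FlightSimulator.exe"

# -showFramerate shows the full FPS panel, -showFramerateMini the compact one.
subprocess.run([SIM_EXE, "-showFramerateMini"])
```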
So, long story short, I kept my previous graphics settings (TAA etc.)… and I see now that DLSS is smart and great for those who don’t throw money (like someone I know here) at 3090s and such GPUs.
I find DLSS on Quality to be the best compromise on my setup. TAA maxes me out around 45fps during cruise.
With DLSS on Quality I reach 60+ fps. Most of my displays are still clear: not as sharp as TAA, but sharper than TAA at 90% render scale.
RTX 2060 Super, WQHD
Find out whether, in the flight scenarios that matter to you, you are limited by the main thread or the GPU. Turn on Developer Mode in the options, then turn on “Display FPS” from the dev menu.
This will show an FPS meter with a lot of detail in the upper-right corner of the screen. Among other things it tells you the per-frame runtime in milliseconds of the main thread and of the GPU render, and which of these is the limiting factor on frame rate.
If the limiting factor is the main thread, then DLSS will do little to help.
If the limiting factor is the GPU, then it’s probably spending a lot of time pushing pixels and you may benefit from pushing fewer pixels by enabling DLSS upscaling.
Then, try it and find out! It’s a simple matter of flipping the switch and looking to see what happens.
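The logic behind that readout is simple: whichever of the two takes longer per frame sets your frame rate. A rough sketch of the arithmetic (the millisecond values are just examples; read your own numbers off the dev-mode panel):

```python
# Example frame times as shown on the dev-mode FPS panel (milliseconds per frame).
main_thread_ms = 22.0   # main thread time (example value)
gpu_ms = 14.0           # GPU render time (example value)

# The slower of the two is the limiter; fps is roughly 1000 / that time.
limiter_ms = max(main_thread_ms, gpu_ms)
limiter = "main thread" if main_thread_ms >= gpu_ms else "GPU"
print(f"Limited by the {limiter}: about {1000 / limiter_ms:.0f} fps")
```

With the example numbers above you would be main-thread limited at roughly 45 fps, so rendering fewer pixels via DLSS would change little; if the GPU number were the larger one, DLSS would be the lever to pull.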
… my hope was that DLAA would be much better. I’m not sure whether it’s because of the current NVIDIA driver, but at the moment DLAA is not what I expected.
I have an AMD 3700X and a 3060 Ti with 16 GB of RAM, and my Xbox Series X runs much, much better than the PC version after the latest update, even if I use DLSS Ultra Performance!
Try not to use 2K to describe 1440p (2560x1440, officially WQHD), as it may confuse people.
2K means approximately 2000 pixels horizontally and officially describes only DCI 2K at 2048x1080, the cinema format (which is where “2K” came from). Some do unofficially refer to 1920x1080 as 2K because it is roughly 2K wide, but that should be referred to as Full HD or FHD.
On a side note, 4K should refer to DCI 4K, which is 4096x2160, another cinema format; what we call “4K” on our monitors (3840x2160) is actually UHD. But because many TV manufacturers kept selling those TVs as “4K”, they have been given the special tag of “4K UHD” to differentiate them from the cinema version (DCI 4K).
I know! Sorry if I sounded a little complicated! It actually took me a while to understand it myself. I made the same mistake of thinking 1440p = 2K until I was corrected, and to be honest it makes sense once you know!
The 2K thing kind of bugs me as well. While it’s used by people everywhere, I find it tends to be used to refer to 1440p more by Brits and other Europeans than by North Americans, where we tend to call it by its correct name of 1440p. I understand what people mean, of course. But the naming is incorrect.
720p = HD
1080p = FHD
1440p = QHD (Quad-HD, or 4x 720p)
2160p = UHD (4x 1080p)
4K being used to describe 3840x2160 monitors is also a misnomer, but that generic term seems to be used by everyone as a catch-all for 16:9 TVs and monitors. 4K really refers to the 4096x2160 DCI cinema format, where you have 4000(+) pixels of width, just like 2K means 2048x1080. But it seems it’s been applied to 16:9 monitors by manufacturers in their marketing, even though it isn’t “really” 4K.
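If it helps, the numbers behind those names are easy to check yourself. A quick sketch using the standard resolutions mentioned above:

```python
# Common resolutions and the names discussed above.
resolutions = {
    "HD (720p)":        (1280, 720),
    "FHD (1080p)":      (1920, 1080),
    "QHD/WQHD (1440p)": (2560, 1440),
    "UHD (2160p)":      (3840, 2160),
    "DCI 2K":           (2048, 1080),
    "DCI 4K":           (4096, 2160),
}

hd_pixels = 1280 * 720
fhd_pixels = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name:18s} {w}x{h:<5d} width ~{w / 1000:.2f}K   "
          f"{w * h / hd_pixels:.1f}x HD pixels, {w * h / fhd_pixels:.2f}x FHD pixels")
```

QHD has exactly 4x the pixels of 720p HD and UHD exactly 4x 1080p FHD, while only the DCI formats actually come out at ~2K / ~4K pixels of width.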