Another reason I think it’s not a monitor issue per se is because I don’t have this problem with clouds on Ultra. It’s just the way the game renders clouds on High (and perhaps Low/Medium is even worse, I haven’t looked).
My system can usually handle clouds on Ultra, but it really makes my GPU work hard all the time and raises temperatures higher than I would like.
I too find Ultra not worth the 8 frames I lose, and anyway High is way better than it was before.
I suspect ChaoticSplendid is right that your problems may be temporal. Low latency mode in NVCP (on, off, or ultra) may or may not help, but with the latter you may well run into other problems, e.g. frame drops.
I get the same banding as @ChaoticSplendid too, sometimes, but I believe that is a somewhat different issue related to general color precision optimizations.
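To illustrate what I mean by precision-related banding, here is a toy Python snippet (my own example, nothing from the sim; the number of levels is made up): quantizing a smooth gradient to too few shades collapses neighbouring values into the same step, which is what shows up as visible bands.

```python
# Toy illustration (not from MSFS) of how reduced colour precision turns a
# smooth gradient into visible bands: quantizing to fewer levels collapses
# nearby shades into the same discrete step.
def quantize(value, levels):
    """Map a 0..1 value onto a limited number of discrete levels."""
    step = 1.0 / (levels - 1)
    return round(value / step) * step

gradient = [i / 31 for i in range(32)]                  # smooth 0..1 ramp
banded = [round(quantize(v, 8), 3) for v in gradient]   # only 8 shades
print(banded)  # long runs of identical values = the visible bands
```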
I don’t use Nvidia low latency mode because I’m CPU limited and that will only make things much worse.
You might be right. Looking closer at your screenshot, it seems this is a case where the low sample count of the clouds is failing to get denoised properly. The voxels themselves are very clearly visible.
In the pixelated/grainy clouds thread there was a discussion, to which DensestSnail693 contributed, regarding the effect the overall frame rate has on the appearance of clouds in motion. Basically, since the clouds rely on temporal accumulation (like many other rendering techniques in MSFS), increasing your frame rate gives the algorithm more samples to work with, and a couple of people found that targeting 60 FPS instead of 30 greatly reduced cloud pixelation.
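To make the temporal-accumulation point concrete, here is a rough Python sketch (just a toy model, not how MSFS actually implements it; the blend and noise parameters are invented): each frame blends a new noisy sample into a history value, so over the same one-second window a 60 FPS run feeds the filter twice as many samples as a 30 FPS run and settles closer to the true value.

```python
# Minimal sketch of temporal accumulation: each frame blends a noisy sample
# into a history buffer, so more frames per second means more samples and a
# result that converges closer to the true value in the same amount of time.
import random

def accumulate(true_value, frames, blend=0.1, noise=0.5):
    """Exponential moving average of noisy per-frame samples."""
    history = 0.0
    for _ in range(frames):
        sample = true_value + random.uniform(-noise, noise)  # noisy raw sample
        history = (1.0 - blend) * history + blend * sample   # temporal blend
    return history

random.seed(0)
# Same one-second window of accumulation: 30 samples vs 60 samples.
print("30 FPS estimate:", round(accumulate(1.0, 30), 3))
print("60 FPS estimate:", round(accumulate(1.0, 60), 3))
```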
Can you test the same case but with a higher frame rate and see if the problem becomes less apparent? If not, what happens if you reduce cloud motion as much as possible (by using Active Pause, keeping the camera still, and setting wind speed to zero)?
No difference at 60+ FPS with Active Pause, camera still, and wind speed 0.
The clouds are still dissipating in that scenario as well.
I’ve become a pro at reproducing this with the scattered clouds preset. Happens every time. Tried Low, Medium, High, and Ultra; they all have this issue to varying extents. But with Ultra I don’t notice it while flying, only when looking for it with the drone camera.
I think this type of phenomenon, or rather graphics glitch, appears when you are inside a cloud layer of very low density. I had something very similar when I was trying to simulate a fog layer with low-level overcast clouds of almost zero density.
I was really interested in which model of Nvidia GPU the OP is using. I looked back through this thread but never found it. Just wondering: is it a 1000, 2000, or 3000 series card?
Are these problems back (referring to the main post)? Pixelated, weird-looking clouds (DX12, High setting). Could this be a server issue? Is the quality of the clouds being lowered, then? Just a thought… it used to look fine.
There are many posts about cloud depiction issues or missing cloud types. In my opinion, the main general ones are listed below. At launch, FS2020 was able to render very nice clouds, but a regression happened between SU5 and SU7.