I’m on that driver too. What’s your GPU out of curiosity?
Another one with a few giant trees and severe FPS reduction on the ground. Went back to DX11.
I’ll have to check again and look at my VRAM usage.
Not sure I understand why I would reduce settings just to use DX12. Is there some other benefit that I’m not understanding?
DX12 uses the CPU's cores more efficiently, i.e., it spreads the load around more so you don't become CPU-limited as quickly. At least that's what I'm seeing on my system (i9-12900K), where I monitor CPU usage by core: that load spreading has meant fewer micro-stutters for me, which used to happen when one of the cores would hit a short spike to 100%. SU9 with DX11 had a lot of micro-stutters for me, and with DX12 in the SU10 beta they're gone. General FPS levels are about the same for me since I'm GPU-limited, but with DX12 I have smoother visuals and no micro-stutters anymore.
It’s an RTX 3090 Founders Edition. The rest of my specs are in my profile.
There have been similar glitches in the past and it's always been 3rd-party stuff such as 'Seasons' etc.
Theory sounds good, but haven’t seen this play out yet
I have AccuSeason running. It's not that.
I am seeing more use of the P- and E-cores across the CPU in the SU10 beta with DX12. Hard to say how much more efficient it's making the sim, but the micro-stutters I had before are gone. That could be due to server-side fixes, lower traffic on their servers since the beta has fewer users, better CPU utilization, who knows. But when I look at the average and peak usage of each core, I'm seeing better utilization in DX12 than before.
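For anyone who wants to put numbers on the "one core spikes to 100%" pattern described above, here's a minimal sketch. The core names and usage samples are entirely made up for illustration; it isn't tied to any MSFS tooling, just the kind of peak/average/spike summary you could run over per-core data you've logged yourself:

```python
# Hypothetical per-core usage samples (percent), e.g. collected once per
# second while the sim runs. All names and values here are invented.
samples = {
    "core0":  [62, 71, 100, 68, 65],   # brief 100% spike -> stutter candidate
    "core1":  [40, 38, 42, 41, 39],    # evenly loaded, no spikes
    "core14": [12, 10, 100, 100, 11],  # longer spike on an E-core
}

def spike_report(samples, threshold=98):
    """Return per-core (peak, average, spike_count) so uneven loading is visible."""
    report = {}
    for core, vals in samples.items():
        peak = max(vals)
        avg = round(sum(vals) / len(vals), 1)
        spikes = sum(1 for v in vals if v >= threshold)
        report[core] = (peak, avg, spikes)
    return report

for core, (peak, avg, spikes) in spike_report(samples).items():
    print(f"{core}: peak={peak}% avg={avg}% spikes={spikes}")
```

If DX12 really is spreading work better, you'd expect the spike counts to drop while the averages even out across cores.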
Anyone find it interesting that those with a 3090 have an amazing experience and those with a 3080 Ti or 3080 do not? Could it be something there?
People on lower cards are having good experiences as well, by the sounds of it. Seems like something specific to the 3080s. It's the fact that the 3090s aren't that much better, performance-wise, than the 3080/Ti that makes it odd.
Perhaps it's just the extra VRAM on the 3090.
RTX 3060 (12GB) here: never been happier.
But back to the glitches, IMO they are 100% a scenery mod so get testing.
Or maybe not?
I have AccuSeason too, but it basically just changes the biome definitions and the spawn ratio. If that were the reason, the effect would also be reproduced in DX11. It looks more like a LOD and reference-coordinates issue in DX12. There's no reason to spawn trees floating at 10,000 ft and at the wrong size unless the game is instructed to do so by the coordinate system and/or LOD configuration. Or maybe the trees themselves are just faulty, since they all look like the same type in the pics shared so far.
Cheers
The screenshots of the Developer Mode FPS from those on a 3080/3080 Ti tend to show that they have exceeded the VRAM limit of their GPU, whereas the 3090 users have 24GB of VRAM. I noticed in my own testing (on a 3090) that DX12 is using more VRAM (up to 12GB on standard airports/sceneries, nothing custom… some 3090 users posted 18GB of VRAM usage at highly custom payware airports). My theory is that the VRAM limit is being hit more often under DX12 than DX11, which kicks allocations over to regular system RAM and causes problems with latency, visual artifacts, and general performance degradation… those on a 3090 don't hit that VRAM limit and are getting very smooth and positive results. That's my speculation, but I'm glad it has been reported widely and is likely in the feedback to the SU10 beta team, which is the whole point of the beta testing.
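The spillover theory above can be sketched in a few lines. To be clear, this is just an illustration of the reasoning, not a measurement tool: the dedicated-VRAM capacities are the cards discussed in this thread, and the usage figures are the rough numbers posters reported, not my own data:

```python
# Once dedicated VRAM is full, further allocations fall back to shared
# system RAM over PCIe, which is far slower -> latency, artifacts, stutter.
DEDICATED_VRAM_GB = {"RTX 3080": 10, "RTX 3080 Ti": 12, "RTX 3090": 24}

def spills_to_system_ram(card, usage_gb):
    """True if the reported usage cannot fit in the card's dedicated VRAM."""
    return usage_gb > DEDICATED_VRAM_GB[card]

# ~12GB at standard scenery, ~18GB at heavy payware airports (per the thread)
for card in DEDICATED_VRAM_GB:
    for usage in (12, 18):
        if spills_to_system_ram(card, usage):
            print(f"{card}: {usage}GB would overflow into system RAM")
```

Which matches the pattern in the thread: both usage levels overflow a 3080, only the heavy-airport case troubles a 3080 Ti, and a 3090 never spills.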
I finally got a repro of the trees issue after a 1.5hr flight and messing around with graphics settings, such as switching from high to ultra preset and raising/lowering TLOD several times. I don’t have any addons/mods at all, and my caches/indexes were all cleared out. DX12 mode, no DLSS.
Good post and well spotted… I'm pretty sure it's VRAM that is causing the (or most of the) out-of-memory CTDs too, so maybe these visual glitches are part of an attempted fix. It's totally different cheddar, and a wonder that CUDA coders get paid so little.
I did a comparison.
Total GPU memory usage with DX12 was 46% higher than DX11. (dedicated+shared+commit)
Physical RAM usage was 25% higher. (peak working set)
They're not scenery-mod related. It happens with no mods in the Community folder.
Yes I seem to find the DX12 glitches when changing settings too (although I’ve also seen it happen at the spawn point).
If others are having "successful" DX12 performance, I'm curious what happens if you try to change your settings while in the sim. Go from ultra to low and back to ultra. Anything happen?