Folks with an RTX 3090 - Quick question

Just curious what your VRAM usage is like on a 3090. I'm running a 2080 Ti and I'm nearly maxed out on VRAM, especially in VR. Was wondering if this sim makes use of all that extra VRAM or not. Ultimately I'm trying to figure out what my next move is if and when anything comes back in stock.

I have not checked (I'm at work), but I am glad I went with the 3090 and not even a 3080. It kills it at 2D resolutions, you can max everything out, and even in very intensive VR simming the VR quality is amazing.

I have the EVGA 3090 FTW 24GB. Quite happy with it, but you need at least a 1000 W PSU; my 850 W PSU was not enough. This GPU spikes tremendously, and my PC kept resetting when I used the very FPS/power-intensive MSFS with the SDK activated.

Also you will need 3 separate power cables going from the PSU to the GPU.


I should be good in that department; I have a 1200 W currently (Corsair AX1200i). That's normally how I like to game in any title, maxed graphics. Sounds like a 3090 will be where it's at.

3090 ASUS ROG Strix OC here.
London City takeoff with VRAM over 14 GB.
Upgraded from a 1080 Ti and saw a big improvement.
Beast of a GPU.

Out of curiosity, what CPU did you pair that 3090 with, and do you have 32 GB of RAM?

8700K delidded at 5.1 GHz, plus Process Lasso, as it helps a lot with CPU stability and high performance.
Waiting for the 11900K for the next upgrade.


Yeah, that's the one I want, but nobody is selling it! :frowning: Any chance you'll go back to the 1080 anytime soon? :slight_smile:


Interesting bit about the PSU. Mine is 850 W and everything seems to run just fine with the 3090.

Unless money is no object to you, I don't see the point in the 3090. All the benchmark tests I've seen give it about a 10% advantage over the 3080, and unless you're very lucky it's about £700 more. If you pair a 3080 with a good processor you'll get similar performance for quite a lot less money. If 4K isn't essential, then a 3070 or 3060 Ti paired with a good processor is a good buy at this point.

I've got an overclocked 3080 paired with a Ryzen 5 5600X, and from what I can see any performance gains at this point will come down to Asobo optimising things better rather than chucking ever more expensive hardware at the problem. My VRAM is not maxed out with my 3080.

Very dependent on the scene displayed, but I have seen it as high as 16 GB.

My computer/PSU would only restart when I was using MSFS with the SDK activated (while doing some scenery design work), but putting in the 1000 W PSU fixed that. It would not happen when just playing MSFS.

Hi, firstly I don't use VR yet, so I can't comment on performance in VR. I recently went from a 2080 Ti to a 3090, and from settings around mid to high to almost everything at ultra. I'm getting around the same FPS now at ultra as I did with the 2080 Ti at med/high.

Not sure about VRAM usage, I will have to check that later when I get time, but the biggest improvement for me is the visual quality and clarity while maintaining reasonable FPS. I still get occasional minor stutters as I did with the 2080 Ti, but hopefully that will improve with DX12.

I have a Seasonic 1000 W PSU and got PC resets when starting MSFS. No other program did this, and I use some graphics-intensive programs like Lightwave and Maya, so I have had to undervolt the GPU slightly to prevent it. CPU is a 10900K with 64 GB RAM. Hope this helps.

Appreciate all the insight from everyone. I'm pretty much going to keep my sights set on a 3090 at this point. That should keep me nicely future-proofed for years to come, especially with the G2 resolution and future VR headsets.

I have a PNY 3090 paired with a 10900K running at 5.2 GHz on a 750 W power supply. No issues whatsoever.

My 3090 at 5120x1440 ultra settings will hit 14 GB of VRAM over London. If you increase the LOD via the *.ini file you can expect anything up to 20 GB+ of VRAM use; this is where the low VRAM on the 3080 misses out. I can't understand why they gave the 3080 less VRAM than the 2080 Ti.
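
For anyone wanting to script that tweak, here is a minimal sketch. It assumes (the post above only says "*.ini") that the LOD setting is a plain-text `LoDFactor` entry in MSFS's `UserCfg.opt` at the usual MS Store install location; both the path and the key name are assumptions, so check your own install and keep a backup.

```python
import re
import shutil
from pathlib import Path

# Assumed settings file for an MS Store install -- adjust for your setup
# (Steam installs keep UserCfg.opt under %APPDATA%\Microsoft Flight Simulator).
cfg_path = (Path.home() / "AppData/Local/Packages"
            / "Microsoft.FlightSimulator_8wekyb3d8bbwe/LocalCache/UserCfg.opt")
new_lod = 3.0  # assumed key; values above 2.0 exceed the in-game slider

# Back the file up before touching anything
shutil.copy2(cfg_path, cfg_path.with_name(cfg_path.name + ".bak"))

text = cfg_path.read_text()
# Replace every "LoDFactor <number>" entry with the new value
patched = re.sub(r"(LoDFactor\s+)[\d.]+", rf"\g<1>{new_lod:.6f}", text)
cfg_path.write_text(patched)
print("LoDFactor set to", new_lod)
```

Expect VRAM use to climb well past the 14 GB figure above once the factor goes beyond what the UI allows, which is exactly where the 24 GB card has headroom.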

VRAM usage or allocation? Are you measuring via the dev tools or something like Afterburner?

I was using Task Manager in W10. There's a dedicated section for GPU memory usage. I suppose I could cross-check that with Afterburner to make sure it's accurate.

Check the sim dev tools.

Afterburner/Task Manager doesn't show actual usage, just allocation.
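
To illustrate the difference, a small sketch (assuming an NVIDIA card and the `nvidia-ml-py`/`pynvml` bindings, which nobody in the thread mentions) that queries the same driver-level number Task Manager and Afterburner display. It reports memory *allocated* on the card, not what the sim actually touches each frame; for real per-frame usage you still need the in-sim developer-mode display.

```python
# pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):                        # older bindings return bytes
    name = name.decode()

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)       # values are in bytes
print(f"{name}: {mem.used / 2**30:.1f} GiB allocated of {mem.total / 2**30:.1f} GiB")

pynvml.nvmlShutdown()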