I bought my 3090 Ti 24GB card even after seeing nearly every YouTube influencer pan it:
“It’s pointless!”
“Nobody needs 24GB VRAM!”
“It’s too expensive for what it does!”
Who’s laughing now?
From what I read, it doesn't matter where the VRAM comes from; the system doesn't care whether it's a dedicated GPU or an iGPU. But yeah, I'll get back to you tonight, after I've installed my new 64GB RAM and tested in more detail.
I agree that just seeing more dedicated VRAM usage reported against RAM might not necessarily mean the game is actually using it, as the system may just reserve it for the GPU.
I'll try to make an accurate 1:1 comparison and see if HWiNFO can read more data about this than the very limited Task Manager.
Absolutely true, I didn't know the real numbers off the top of my head, but yeah, it's still fast. As said, though, it won't replace real VRAM, of course.
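For what it's worth, on NVIDIA cards `nvidia-smi` reports actual memory allocations rather than just OS-side reservations, so it can be a useful cross-check against Task Manager and HWiNFO. A minimal sketch parsing its CSV output (the sample line below is made up for illustration, not real data):

```python
# Parse the output of:
#   nvidia-smi --query-gpu=memory.total,memory.used --format=csv,noheader,nounits
# The sample line is fabricated for illustration; values are in MiB.
sample = "24564, 21310"

total_mb, used_mb = (int(x) for x in sample.split(", "))
pct = 100 * used_mb / total_mb
print(f"Used {used_mb}/{total_mb} MiB ({pct:.0f}%)")
```

Watching that percentage while changing texture quality should show whether the game is genuinely allocating less, independent of what Task Manager's "Dedicated GPU memory" graph claims.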
DDR5 is pretty fast, and I certainly don’t understand why the use of shared memory is causing such problems. Maybe it’s that the VRAM-to-GPU pipeline is as direct as it can get, whereas having to page data in and out of DRAM over the PCIe bus is not so direct.
Not really my area of expertise.
(But I did stay at a Holiday Inn Express last night!)
I am in no way diminishing others’ experiences, but it’s very weird that I have an i5, GTX 1660 6GB, 32GB RAM (I think, might be 24GB) and get mostly 40-50 fps with high terrain, buildings and clouds, and a mix of medium and low for everything else.
Turning on the AMD frame generator (performance mode) doubled it from 25 to a steady 50 on the medium overall setting, so I bumped some things up to high, as anything above 30 is fine by me as long as it’s stable.
Could it be to do with the resolution? I’m on a standard LG HD TV and so am on 1920 x 1080, which again is fine for me personally, but I totally get that if you have a nice swanky 4K TV and a high-end gfx card you want to utilise those.
Again, not dismissing anyone else’s experience, just thought some of the techies in here might be able to draw something from the opposite side of the coin.
Yeah, I think the speed itself could be pretty similar, but I’m really no expert in this, it was a random idea / Google thing.
From the very few bits I quickly read up on, I also think the difference is coming from the bus. But maybe a real expert can help us on this.
So far, I can only say: a test before, PC-12 at KASE, showed about a 10 FPS difference and reduced VRAM usage after this, with the exact same settings of course. But there might be other random factors at play here…
Only thing I can say for 100% sure: my low FPS issue is solely caused by excessive VRAM usage. As soon as I manage that, it’s very acceptable.
Low resolution and low/medium settings will reduce VRAM usage a lot, of course. So I’d say it’s about that. I tend to use way higher settings and 3440x1440 resolution. Also have texture quality on Ultra for now.
Just to really see where the biggest VRAM usage comes from, and then adjust accordingly.
As said, I’m just trying to find any way to get to a (for me) usable VRAM usage.
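On the resolution side, a rough back-of-the-envelope: each uncompressed RGBA8 render target costs width × height × 4 bytes. This is only a sketch of the framebuffer part; in practice textures (especially on Ultra) dominate VRAM usage far more than the framebuffers themselves:

```python
# Rough framebuffer cost per render target, assuming uncompressed RGBA8
# (4 bytes/pixel). Real engines keep several such targets, and textures
# dominate total VRAM usage -- this is just the resolution-scaling part.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

print(f"1920x1080: {framebuffer_mb(1920, 1080):.1f} MB per target")
print(f"3440x1440: {framebuffer_mb(3440, 1440):.1f} MB per target")
```

So 3440x1440 costs roughly 2.4x the framebuffer memory of 1080p per target, which compounds across the G-buffer, depth, and post-processing targets the renderer keeps around.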
I’m of an age where low resolution means 480 x 240
It just seems really odd to me that I get what I think is pretty good performance for a 6 year old PC that was only mid-range when new, whereas there are loads of others who have waaaaay better systems than me that struggle to get even playable framerates.
I hope whatever it is gets fixed soon for those having issues.
Absolutely. Consider the number of pixels per frame.
At 30 FPS the number of pixels per second is:
HD: 1920 x 1080 = 62,208,000
2K: 2560 x 1440 = 110,592,000
4K: 3840 x 2160 = 248,832,000
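The figures above can be reproduced with a quick script (assuming 30 FPS and the standard resolutions listed):

```python
# Pixels pushed per second at 30 FPS for common resolutions
resolutions = {"HD": (1920, 1080), "2K": (2560, 1440), "4K": (3840, 2160)}
FPS = 30

rates = {name: w * h * FPS for name, (w, h) in resolutions.items()}
for name, pixels_per_sec in rates.items():
    print(f"{name}: {pixels_per_sec:,} pixels/s")
```

4K works out to exactly 4x the pixel throughput of 1080p, which is why the same card can feel like two different machines at the two resolutions.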
All in all, we can keep trying to find workarounds, but it’s clear there is something wrong with the optimization on Asobo’s part. It can’t just start eating VRAM and then all of a sudden everything is back to normal and OK.
HXArdito Please let us know your benchmarks with 64GB RAM. I’m currently using 32GB DDR5 G-Skill Trident 6000 MHz CL30 RAM (2 x 16GB) with the 9800X3D and RTX 4080.
But if the 64GB is truly beneficial for MSFS 2024, I will also order it.
I’ll take your 100,000 pixels and raise you with BBC Mode 2, 160x256 on a black and white portable TV.
Yes, it’s latency. The latency difference between VRAM and DRAM is massive when it comes to GPU rendering. When the GPU has to fall back on DRAM, it becomes a massive memory bottleneck.
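Bandwidth tells a similar story. A quick comparison using approximate published peak figures for one example pairing (a 3090 Ti's GDDR6X versus a PCIe 4.0 x16 link; exact numbers vary by card, but the size of the gap is the point):

```python
# Approximate peak bandwidth figures, not measured values:
#   RTX 3090 Ti GDDR6X: ~1008 GB/s (21 Gbps x 384-bit bus)
#   PCIe 4.0 x16 link:  ~32 GB/s
vram_gbps = 1008
pcie4_x16_gbps = 32

ratio = vram_gbps / pcie4_x16_gbps
print(f"Spilling over PCIe is roughly {ratio:.0f}x slower than local VRAM")
```

So even before latency enters the picture, any asset paged out to system RAM is being fed back through a pipe roughly thirty times narrower than the card's own memory bus.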
7900XTX with 24 GB VRAM. It’s blisssssss
Yeah, I’ve been thinking about that one for a while, but I want to wait and see what Nvidia has to offer in January.
Will swap my 4080 for 5090 when it launches.
- bad allocation strategy
- frequently used assets going into the shared space
- all available VRAM being treated as one whole thing
- lack of planning and lack of testing
Same. I’ll wait and see how things turn out and what the 5000 series will be like. But if Nvidia is taking the ■■■■ and offers the 5080 with only 16GB of VRAM, I might switch to the 7900XTX, as I’ve heard from several people now that this card works very well with the new sim. It would save me a lot of money as well.