The review, “Is the RTX 4090 Bottlenecked by the 9900K?”,
startled me, shocked me, and got me thinking.
My first thought was: what fool would do this?
What a joke. How stupid.
This is based on gaming and not content creation.
It is not AMD vs Intel or who is the fastest.
I take it as generation vs generation.
(AMD Ryzen 9 7950X/Intel i9-13900K) vs (Intel i9-9900K)
This review was based on the configuration of the PCs in
the way that the author would actually play in real life.
As in 4K at the highest preset, not 1080p at the lowest.
Mostly native rendering but frame generation or DLSS if
actually used in real life.
The resulting FPS is not that much improved.
Not as much as I thought it would be.
Having recently upgraded from an i9-9900K to an i5-13600K,
I am completely happy. (Z390 to Z790) (DDR4 to DDR5)
But I was happy with the i9-9900K.
There was just this longing for the newest out there.
If I only had a 4090 or the new CPU.
And wondering how much greater FS2020 would be if I had the
newest/latest/fastest.
The difference/improvement is not what I thought it would be.
Note that my FPS is limited by my Intel ARC A770 16GB GPU.
For those with i9-9900K systems, keep on enjoying FS2020.
We need more reviews based on the way that users are actually using the reviewed product.
Conclusion
Thanks to @jokerproduction for a review that was well done and
thought provoking.
While it is an interesting comparison, for MSFS you’d have to keep in mind that MSFS relies a LOT more on CPU performance (and especially single threaded performance, including IPC for the main thread) than most other games.
I would say that if he had included MSFS into that video, you’d see a wildly different result.
Throw in a 7800X3D instead of the 7950X, and the difference would be even bigger, because MSFS really benefits from the extra cache on the CPU.
Interesting post/video. I am at the point where I am considering a complete rebuild of my rig. It’s over three years old and I had always planned a three year lifespan for this current build…
Current specs - Gigabyte z390, i9-9900k OC to 5.0GHz, 32GB DDR4, Asus TUF 3080Ti, Corsair RM850x 850 Watt 80 Plus Gold ATX PSU, Cooler Master MasterLiquid ML240R 240mm AIO
One thing I was considering is buying the new GPU - planned for Asus TUF 4090 - and trying it out with the existing build to see if I get any benefits. Nothing to lose, really, since I am still planning a complete rebuild. Since MSFS is, as mentioned above, so heavily dependent on CPU, I’m pretty sure I won’t see much benefit. But it would be fun to try. Also might see some benefit with the additional VRAM since I use VR only and stream most of my flights.
(planned upgrade is a z790, i9-13900k, 4090, 64 GB DDR5 RAM, probably a 1000w PSU and a new AIO)
I just upgraded to your proposed spec from an i7-8700K, Nvidia 1080Ti, 32GB RAM. One thing I noticed about the 1080Ti was that it actually ran surprisingly well in 1440p with ultra settings. Of course, I had to run at fairly low LOD, and the framerates were low, but MSFS was always stable and playable. I would fly the PMDG 738 out of, say, Ini Builds KLAX with FS Live Traffic enabled with no real issues. I suspect that had to do with the fact that it has 11 GB of VRAM. For that reason, I’d be very interested to see some comparisons between older generation/higher VRAM cards and newer generation/lower VRAM cards; e.g. a 2080Ti and a 3070. For people who value quality of display over raw frames, it might actually be the case that opting for more VRAM is a better choice than an otherwise higher-performing card with less VRAM.
Anyway, I’m extremely happy with my new rig. I was originally going to use an AMD 7950x3D, but changed my mind at the last minute after learning about the funky dual-die architecture on that chip and realizing how much work it would be to maximize performance with it in MSFS (not to mention the teething problems with Asus motherboards). I had also considered the AMD 7800x3D as it is unquestionably the performance king of gaming right now, but I didn’t want to limit myself to a CPU with only 8 cores–especially with MSFS 2024 on the horizon and the hope that it will be better optimized for multi-threading to take advantage of more cores. Finally, since I also upgraded my display to 4K and am GPU-bound, the performance difference between chips at that resolution wasn’t significant enough to matter, making the choice pretty easy for me.
I also thought about waiting a few more months for the Intel Raptor Lake refresh, but ultimately I decided that since MSFS will run well with an i9-13900K, there was no reason to wait a few months for better when good enough could be had today.
As I said, I’m extremely happy with my new setup. I haven’t looked at frame rates yet (as I don’t really chase them), but with DLSS enabled my experience at 4K with default ultra settings is exceptionally smooth.
I was going to build it myself, but I decided that I didn’t want to source the parts myself. I ended up buying this build from Dan’s Custom Built Gaming Beasts and I can highly recommend him. He uses top-quality components that are well thought out, is super responsive, and set the machine up properly with the latest BIOS, Windows updates, etc. All I had to do was drop in the video card (which is packaged separately), install MSFS, plug in my hardware, and transfer all of my add-ons over. It runs MSFS like a champ.
Deelee6800 - I have exactly the setup you mentioned for your planned upgrade. I fly VR only with a Reverb G2 and get superb performance; LOD clarity at distance is still a challenge and a limitation of MSFS VR, but I have no complaints. I OC’d my i9-13900K to a stable 5.4 GHz with no issues in MSFS. I had to do a fair amount of tweaking and experimentation with Nvidia and MSFS graphics settings to dial everything in. But you will see a positive difference.
I upgraded from an i9-10900/128GB DDR4 RAM system with RTX 3090 and noticed a difference in VR fluidity. The 128GB RAM is definitely overkill.
I don’t track FPS. I don’t limit FPS in Nvidia settings either.
Hey @MSFSRonS - your message formatting has a ‘code block’ set in it, which makes it a bit hard to read. They are done like this:
```code block```
…shows as…
code block
or
```
code block
some more
```
code block
some more
If you want to quote text you can use the > sign like this (or use the quote symbol in the editor toolbar with the entire text selected):
> Here’s a line of quotes
> Plus here.
…shows as…
Here’s a line of quotes
Plus here.
Note: If you want to show those sequences in a message like I did above, you can put a backslash \ before each character to ‘escape’ it so it is not treated in this special way.
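For example (assuming the standard Discourse-style Markdown this forum uses), typing this into the editor:

```
\> not a quote, just a line starting with >
\`\`\`not a code fence, just three backticks\`\`\`
```

shows the > and backtick characters literally instead of starting a quote or a code block.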
As for your topic, I’m still hanging on to my 9900K, as I have it overclocked to within an inch of its life (5.1 GHz), but I will revise that thinking when the 2024 beta comes out, as that might rebalance some things, and the team did talk about the threading architecture as something that might change in the future.
I use the </> option, which says “Preformatted Text” and displays:
type or paste code here
Then I paste my formatted text over the “type or paste code here”.
Basically, it keeps the formatting for my text, which I format in Notepad.
I tried typing directly into the editor, but I kept getting confused by the way the formatting works and never did learn or understand it.
I look at my reply above and don’t see any code block.
I scroll down the preformatted text block with my mouse wheel or
keyboard arrow keys and can read it.
I am not quoting text.
I am inserting my own new text from my clipboard. (copied to it)
Pre-formatted text is a code block; that’s just another name for it. On a phone or smaller screen it puts big scroll bars at the bottom and color-highlights programming terms like ‘and’ and ‘not’. Anyway, it’s been edited back to normal now, so no worries.
You pretty much have the same exact system I have, other than mine has 64GB and different manufacturers. I was originally planning on starting my upgrade path like you, by going for the RTX 4090, since like you I’m going to need it anyway. But after seeing the reviews showing that even a 13900K still won’t use the 4090 to its fullest potential, I’ve been rethinking my upgrade path. I might start with the Z790 and i9-13900K with 64GB of DDR5 RAM before going for the 4090. The price will likely be similar; it’s just a lot more work to change the motherboard instead of simply swapping in a GPU and power supply. Let us know how your upgrade goes if you decide to follow through with it.
@MSFSRonS I went from a 3090 with a 9900K to a 3090 with a 7950X3D. I was very aware of the actual uplift I would get, as I had done a tonne of research before purchasing the CPU (benchmark after benchmark comparing the two). I was struggling to hold 30 fps in heavy areas with TLOD 4.00, and found TLOD 3.50 to be the sweet spot that would get me anywhere.

Now, onto after the PC upgrade. Before the AAU2 update, TLOD 5.00 everywhere was the standard for my X3D while keeping my 40 fps cap. That update dropped just before I started really working my setup over, and it sent me back to TLOD 4.00 at 40 fps. Once I got my CPU OC’d (UV/PBO) and my memory OC’d (the sticks tweaked as far as they will safely and stably go), I’m back to TLOD 5.00 everywhere and holding 40 fps. It wasn’t a giant leap by any means, but it was hard fought.

The biggest difficulty is getting a big RTX GPU limited enough to get the boost from Nvidia Reflex + Boost. Almost everyone using MSFS is CPU limited, and it’s very difficult now to get it GPU limited. When the 5090 hits, most likely in the spring (as they’ve cancelled the 4090 Ti), it will be no different, or different only in that the GPU is massively overpowered and becomes impossible to get GPU limited. If what they say about 2024 is true, the balance of power in our systems might just change.
I can’t wait either!!! Your setup, and your projected setup, will probably work great with MSFS 2024. TLOD 5.00 at 27 fps is no slouch at all!!! TLOD 4.00 is pretty darn good too. As far as I can tell, @MSFSRonS, OLOD above 2.00 doesn’t do anything anymore except take away some fps. I used to have that one cranked myself. In my testing, objects do not appear any further out than OLOD 2.00. It’s broken.