AMD 5800X3D performance

Thanks but I really don’t think having temps in the 70s is an issue at all. And I’d rather not run random executables from the internet to mess with these things.


That is huge.

I hope this doesn’t happen. It will make me feel bad for jumping the gun and thinking I already bought the fastest (gaming) AM4 CPU. Yes I’m that selfish :joy:

I did a comparison test over Tokyo using the 5800X3D and 5950X, but it was a bit too large to put in this thread, so I started a new thread. Please take a look if you like.


Recently went from a 5800X to a 5800X3D. I was main-thread limited running the High End preset with some Ultra settings. Now I'm GPU-limited on Ultra with a 2080 Ti and get a consistent 52 FPS.


In the 5800X3D vs. 5950X comparison thread, we also did a 5800X3D SMT-on vs. SMT-off comparison, if you are interested.


MSFS 2020 is CPU-limited in the majority of cases.
If your system is GPU-limited, then it is a very well-balanced system.

Hello Waldo, I’m about to build a new rig this week with a 3090 Ti, everything at max specs. Would you recommend the 5800X3D over a 12900K, or even the 12900KS? I see that the X3D is awesome, but it is the last breath of AM4, and the lack of an upgrade route drives me crazy. Mainly I will fly MSFS and DCS in VR, plus some shooters…

No Intel CPU has an upgrade route either, and DDR4 is at its end.

This time around Raptor Lake will explicitly go into Alder Lake mobos. And support DDR4.


That’s a shocker, usually not the case.

I stand corrected.

Also, there are rumors of a 5900X3D that may have over 200 MB of L3, so we both may be wrong. :smiley:

Right now is not the best time for a new build. AM4 is indeed not going to provide an upgrade path, both Zen 4 and Raptor Lake are coming in the next few months, and the RTX 3090 Ti, besides being way overpriced right now, will lose its value once the next generation is released. GPU prices are slowly improving, and if the Ethereum merge takes place a couple of months from now, we could see another wave of used GPUs flooding the market.

If you really need a new build right now, then you should go for the 5800X3D if you want the absolute best performance at this time, or Alder Lake if you would like to upgrade to Raptor Lake later (riskier because outside of a few engineering sample leaks, we don’t know how Raptor Lake performs yet, and you will have spent more in the end).

As for the GPU, look into the mid-range segment or used market where you can find some decent deals if you can’t do with the one you have right now. Again, if you want the very best at this time, including lots of VRAM for DirectX 12, the RX 6900 XT is consistently selling below MSRP now and is a much better deal than the RTX 3090 series. You will miss out on DLSS and better ray-tracing performance, however FSR 2.0 is coming soon as well, and once ray-tracing is released you’ll probably want to upgrade to something even better anyway.

Hi, Banzonho.

Without question I prefer the X3D. My gut instinct is that you’ll get more performance per dollar with it. You can pair it with an excellent X570S board in the $200 range like the MSI Tomahawk. The X3D doesn’t come with a cooler, but neither does the 12900K. I’d pair the X3D with an inexpensive AIO water cooler to get the most out of the boost clocks, or a good high-end air cooler.

When it’s time to upgrade I usually sell my components to friends; IMO the X3D will have better-than-average resale value.

Right now it’s 92 degrees outside my window. The 12900K heated up the room noticeably, way more than the X3D does, and the X3D is easier on the electric bill too. I don’t really like Intel’s power-hungry, hot-running nature. With the 12900K I didn’t need to open the room’s heater vents in the winter, and I had to run the AC constantly to keep the room cool in the summer. I too have a 3090, and having a 450-watt video card is already bad enough. Less heat, less power, less fan noise, cooler room, happier universe. (You could solder with a 35-watt iron.)

One good thing is that the X3D doesn’t seem to be very picky about needing fast DRAM.

Forgot to mention that right now NVIDIA has lowered the prices of its cards substantially. They have a sizable backlog of 3xxx cards on the market which they need to burn through before they can launch the imminent 4xxx line, so 4xxx cards should drop any time now.

Like Salem stated, there are rumors of a 5900X3D that may have over 200 MB of L3, and the rumors say it might be on AM4. But you know how it is: in the PC world, the next thing is always right around the corner. You have to decide where to jump in.

Oh, wow.

I wasn’t expecting such a big difference.

Disabling SMT now.

Thanks for your amazing work!


I just upgraded tonight from a 3700x. Very interested to see what kind of improvements I get in VR.

Paired with a 3080.

I came from a 3800X and the difference in FPS was about 2x, sometimes better, sometimes less, all depending on graphics settings, etc. I have a 3080 as well.


The difference between the 3800X and 5800X3D (and perhaps the difference in general) shrinks the more raw computing power is required.
It is difficult to generalize, because different people have different demands on their simulators and different areas where they want them to run comfortably.
However, if you already have a GPU like the RTX 3080, it is not difficult to stay above 40 FPS overall with the 5800X3D.
If it manages that in most areas, people will probably find it comfortable.


I believe it is effective for VR. (I have owned every Oculus HMD since the DK2.)
I have not yet done any quantitative verification of MSFS 2020 in VR, although I have played with it. I assume that a large L3 is strong in scenarios with large amounts of parallel access to the same memory space, so I would guess it works well for VR.

A side note:
The inherent factor that makes frame rates worse in VR is that a single scene must be rendered from independent viewpoints for the left and right eyes.
NVIDIA has offered hardware support for this (generally for VR) with a solution that renders multiple viewpoints of a single scene in one pass, but it didn’t work very well.
This is because screen-space techniques such as SSAO, which have since become commonly used, cannot support different left and right viewpoints: they defer their shading calculations to a screen that has already been rendered. (The occlusion shading should naturally differ between the left and right views, but a naive application of SSAO can only generate one shadow shape, and this creates artifacts.)
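To make the left/right-eye mismatch concrete, here is a toy Python sketch of my own (not from any engine; the point, focal length, and IPD values are illustrative assumptions): the same world-space point projects to different screen coordinates in each eye, so any effect computed once from one eye's screen buffer cannot be correct for the other eye.

```python
IPD = 0.064  # assumed interpupillary distance in metres (typical value)

def project(point, eye_offset_x, focal=1.0):
    """Simple pinhole projection after shifting the camera sideways by eye_offset_x."""
    x, y, z = point
    vx = x - eye_offset_x          # view transform: move the world opposite to the eye
    return (focal * vx / z, focal * y / z)  # perspective divide

point = (0.5, 0.2, 2.0)            # a world-space point 2 m in front of the head
left  = project(point, -IPD / 2)   # left-eye camera
right = project(point, +IPD / 2)   # right-eye camera

# The screen-space x coordinate differs between the eyes, so a screen-space
# effect (e.g. SSAO) sampled at one eye's pixel is wrong for the other eye.
print(left, right)
```

This is why single-pass stereo helps with geometry submission but cannot fix per-eye screen-space shading on its own.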

Developers may not like this because it does not produce the intended on-screen effect.


(I am not trying to be hostile, just a trend analysis)

Presumably the 7950X3D will have a 2-CCD + 1-IOD configuration, similar to the 5950X. This is my guess, based on the fact that the 5800X3D is a scaled-down version of the configuration used in EPYC Milan-X (Zen 3) and EPYC Genoa-X (Zen 4).

The minimum configuration would be 1 CCD + V-Cache and 1 IOD; this is the 5800X3D.
(Currently the V-Cache die is only 64 MB, and there is no known solution, nor any micrographs, of stacking it to 128 MB with through-silicon vias; the V-Cache area already sits on top of most of the CCD, so there is no space.)
V-Cache will be limited to 64 MB per CCD for a while. This is also the case for EPYC: Milan-X has up to 768 MB of L3, but the cache a given CCD can access quickly is its own 32 + 64 MB; the rest is reached through other CCDs, which is slower in comparison.
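The Milan-X figure above works out as follows (a quick sketch; it assumes 8 CCDs per package, as on the top EPYC 7003X parts):

```python
# Milan-X L3 arithmetic: each Zen 3 CCD has 32 MB of on-die L3
# plus a 64 MB stacked V-Cache die, and the top parts carry 8 CCDs.
base_l3_per_ccd = 32   # MB, on-die L3
vcache_per_ccd  = 64   # MB, stacked V-Cache
ccds            = 8    # assumption: top-SKU package

fast_l3_per_ccd = base_l3_per_ccd + vcache_per_ccd  # 96 MB reachable quickly
total_l3        = ccds * fast_l3_per_ccd            # headline capacity

print(fast_l3_per_ccd, total_l3)  # 96 MB per CCD, 768 MB total
```

So the headline 768 MB is the sum across CCDs; any single core only sees its own 96 MB at full speed.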

The 7950X3D should operate with a topology very similar to the 5800X3D’s in terms of fast L3 connections: it would appear not as one integrated ~200 MB L3, but as 2 × 96 MB.
In other words, in MSFS 2020 I don’t think the speed will change dramatically; the gains will come from the IPC increase and clock improvement.
Since it will move to a TSMC 7 nm IOD with lower power draw (the current Ryzen IOD is on GlobalFoundries’ 14 nm++ process), the headroom for clock improvement may be larger than it is today.

Hi there,
planning to switch from a 3700X to a 5800X3D. I’m at 1080p for now and will stay at this resolution until the 4000 series becomes available at a reasonable price (let’s see about that… :smile:). Anyway, would you say this replacement will give me a nice boost while using an RTX 2070 Super?

Not sure whether I should go for it, but I’m planning to get the top-tier AM4 part and last on it for the next few years until Zen 5 appears…