Help in choosing new PC specs for MSFS

I wouldn’t be too sure - AMD did their testing at a meagre 1920x1080 https://www.amd.com/en/product/12731

That’s lame.

The Verge’s review was the same - 1920x1080 for the 7950X3D. Tom’s Hardware’s review didn’t actually do any testing, Eurogamer tops out at 1440, and it doesn’t seem like PC World’s testers own a decent screen - everything is 1080p.

We’ve been here before - initial benchmarks that don’t tell the full story. I’d hold off making any decisions until a few people actually pony up the money, buy them, and test them at higher resolutions - because that’s where the world is heading. Lies, damned lies, and benchmarks. Especially in the use case of MSFS, where framerate isn’t that critical past a certain point.

1 Like

That’s because that’s how you benchmark a CPU. You want to remove possible GPU bottlenecks so the CPU can push all the frames it can. It’s no different from benchmarking a GPU, where you do the opposite and use the best CPU possible to prevent it from being a bottleneck. If you benchmark a CPU at 4k but get GPU limited, the results will all show the same average fps until you drop to a CPU slow enough that it becomes the bottleneck again.
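
A toy model of that interaction (every number below is invented purely for illustration, not a measurement): the frame rate you actually see is roughly the minimum of what the CPU and the GPU can each deliver, so a low GPU ceiling at 4k flattens out exactly the CPU differences that 1080p exposes.

```python
# Toy model: displayed fps is capped by the slower of the CPU and the GPU.
# All numbers are hypothetical, chosen purely for illustration.

CPU_FPS = {"budget CPU": 90, "midrange CPU": 140, "flagship CPU": 200}

# What one GPU can render at each resolution (hypothetical ceilings).
GPU_FPS = {"1080p": 300, "4k": 85}

for resolution, gpu_fps in GPU_FPS.items():
    print(f"--- {resolution} ---")
    for cpu, cpu_fps in CPU_FPS.items():
        displayed = min(cpu_fps, gpu_fps)  # the bottleneck wins
        print(f"{cpu}: {displayed} fps")

# 1080p: 90 / 140 / 200 fps - the CPUs separate cleanly.
# 4k: 85 / 85 / 85 fps - the GPU ceiling hides every CPU difference.
```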

In traditional games, having a screaming CPU wasn’t necessary. With all the AI traffic and scenery going on in MSFS, plus all the add-ons that can tax a CPU, you’ll want the best you can get for your budget. For example, FSLTL is a popular traffic add-on known to cause roughly a 10-15 fps drop because it hits the CPU so hard - and that happens even when a plane isn’t being rendered by the GPU.

3 Likes

There are no lies here - this is done for a technical reason. CPU benchmarks are done at 1080p to force a CPU bottleneck. Doing a benchmark at 4K would largely be GPU bound, and the results would be pointless for assessing actual CPU performance.

1 Like

If you REALLY want to force a CPU bottleneck, you run at the maximum resolution. Look at how many systems can’t push out 4k without a CPU upgrade. Or if you want to test raw CPU-only performance, don’t use a game that can benefit from GPU acceleration - play 4k or 8k movies straight to a 4k or 8k display. Or do transcoding, where the same benchmarks show you’re still better off going with an i9-13xxx. These “benchmarks” have nothing to do with the real world, and nobody should base a purchasing decision on benchmarks anyway.

Here’s the thing: if you’re happy with 1080 or 1440, then you don’t need these CPUs. If you want 4k/8k or greater, these CPUs are still behind the i9-13900. So who are these CPUs for anyway? The simmers doing a new build? Either they can do better, or they’re running at 1080 or 1440, in which case they simply don’t need them.

“Future-proofing?” You’re better off saving your money; then, when the future actually gets here, you’ll be able to buy better, cheaper. That’s how it’s always been.

It’s sad, because I used to always buy AMD. But this round, Intel is still the champ. And if they wanted to rework their 13xxx series and get even more return on their investment, they could do a process feature shrink, get rid of the E-cores, and stuff 12 P-cores on the die. I’d pay a 50% premium for that. The advantages are (a) more deterministic code execution, and (b) raw performance. Of course they won’t - it would compete too well against their Xeons for use in labs, where raw power means saving time, which means saving money.

Anyone tried to run MSFS on a dual-CPU Xeon with a half-terabyte of RAM? Seems it would cost less than some of us already spend building out cockpits. The only poster running a Xeon has just 32 GB of RAM, so that’s certainly not representative of the product category. The motherboards I’m looking at for a new build in 2 years support dual CPUs, 16 slots for RAM, and 7 slots for video cards. Why you would buy a Xeon motherboard and then not USE it to a reasonable level of its capacity, IDK.

The 7 slots can only accommodate 4 dual-width cards, but hey … even that would be awesome. And with software bloat being what it is, and the ability to run data models that you simply can’t run in a reasonable time on a desktop machine … something to keep in mind. Because I have a data analysis project I’m probably going to run in 3 years, when it will be cheaper to build fractional-petabyte storage machines (I estimate I need 1/3 PB, and in 3 years that will be “affordable”, at least compared to, say, last year).

1 Like

This is incorrect. Increasing the resolution forces a GPU bottleneck, which is exactly why CPU benchmarks focus almost entirely on 1080p and not higher resolutions.

I would recommend giving this article a read, as it explains the technical reasons behind CPU benchmarks, as well as the misconceptions you’ve just repeated.

Recently we’ve started to see a more overwhelming number of comments and complaints about how we test hardware in our reviews – namely, CPU reviews that look at gaming performance. In this case, we typically focus on low resolution testing (1080p), as this helps to alleviate the potential for GPU bottlenecks, allowing us to more closely look at actual CPU performance.

4 Likes

The only test that didn’t rely on GPUs (video encoding) showed that these CPUs are still laggards in terms of raw processing power. So, no need for a new motherboard - just stuff an i9 into the box.

Throwing in GPUs adds another, confounding, variable to the mix. And both 1080 and 1440 will seem like ancient history in a few years. Most new content is available at 4k. Now that 4k TVs have crashed through the $1,000 price barrier, adoption is growing. The person who bought a 50" Sony FHD dumb TV 15-20 years ago for $3k isn’t going to balk at paying $1,000 for an LG or Samsung 65" 4k Smart TV today. The “fleet” of consumers capable of using 4k is just going to grow - and gamers are going to expect the same of their games as they get everywhere else. 4k or die. Same as nobody is buying FHD TVs any more - retailers don’t carry them, only the bottom-feeders on Amazon and Breault and Martineau selling stock that was discontinued in 2018.

1 Like

Extra L3 cache typically doesn’t benefit performance outside gaming. Because of its slightly lower base frequency, the 7950X3D can actually perform slightly worse than the 7950X in programs that don’t benefit from the extra cache. The 7950X3D is advertised primarily as a high-end gaming CPU for this reason.

2 Likes

And as the 5800X3D has shown, raw clock speeds and core counts aren’t always what matters. This is all in the context of MSFS anyway. If you have other uses, especially production work, then the best CPU for you specifically may very well be the 13900K.

2 Likes

A while back, someone on this forum was reporting an issue with MSFS on a Threadripper 5995WX workstation with 128 GB of RAM and triple 6900 XTs. Hopefully his problems were resolved.

1 Like

Anyone with such obscure hardware is very quickly going to run into trouble. Quite simply, Asobo don’t test with it, won’t tune performance for it, and you would probably need your own CUDA engineer just to get off the ground.

2 Likes

My point is that until we see actual users telling us how good they are, we should get the salt shaker out, along with a BIG shovel. Benchmarks rarely tell the whole truth. So let’s hear from some of the users who are looking to upgrade - gotta trust them more than benchmarks. Especially since the benchmarks don’t cover 4k - and most new content is being made in 4k, and games are doing 4k (at least).

Can't figure out why none of them tested at 4k. Even the Xbox Series X and Sony PlayStation can do 4k, and that's not exactly cutting-edge hardware any more.

My guess is that early access came with the requirement that they restrict benchmarks to 1440 and under.

1 Like

I will take benchmarks over people’s subjective “feelings” any day. While benchmarks can be scrutinized for various reasons, they shouldn’t be outright dismissed because you don’t agree with the numbers.

I would suggest you read the article I posted earlier, as this is exactly what the whole article tries to explain.

1 Like

Here’s the exact reason why you benchmark the CPU at lower resolutions, using the 13900k as an example.

Compare the charts at 11:56 (1080p) and 12:40 (1440p). Using your own ideal testing method, there is almost no difference in performance between the top 12 CPUs at 1440p. That would mean the 13900K is pointless and you should save your money by going with the 5600X. If you were to bump it to 4k (which they didn’t test), I suspect the cheapest CPU would perform almost as well as the 13900K, and the results would be flat. So what does the higher-resolution benchmark tell us? Nothing.

If you look at the 1080p chart, there you see noticeable differences between the CPUs. A few were bottlenecked by the GPU being used, so they showed a few more results later, after recently upgrading the GPU on their test bench. With the higher-performance GPU, there was more delineation between CPUs thanks to the reduced GPU bottleneck.

1 Like

To add to this point, these are CPU benchmarks for Cyberpunk at 4K and 1080p. For 4K, notice how the top performers cluster around the same numbers - a telltale sign of a GPU bottleneck. For 1080p, the results scale with CPU performance.

1 Like

Why would anyone spend X amount of extra dollars on a CPU that doesn’t improve performance at the resolution they play at? Forget 1080p benchmarks - what we really need is side-by-side video of 2K or 4K gameplay to make proper comparisons.

1 Like

Agreed 100% - the proper way to do a head-to-head CPU benchmark is to go low on resolution, and 1080p is the baseline nowadays.
Of course it does not tell the whole story. For instance, people would like to see 1% low comparisons for their preferred games, resolutions, and settings, ideally with as many setups as possible.
As you pointed out in a separate thread, it would be nice if MSFS had built-in benchmarking functionality to facilitate the exercise - maybe with the ability to automatically upload the resulting data anonymously to a centralized repository accessible by everyone.
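
As a rough sketch of what one of those anonymously uploaded records could look like (every field name and value here is hypothetical - no such feature or schema actually exists):

```python
import json

# Hypothetical payload for a built-in MSFS benchmark run.
# Nothing here is a real MSFS feature; it is purely an illustration.
result = {
    "benchmark_version": "1.0",
    "scenario": "fixed replay flight, live weather off",
    "cpu": "Ryzen 9 7950X3D",
    "gpu": "RTX 4090",
    "ram_gb": 64,
    "resolution": "3840x2160",
    "settings_preset": "Ultra",
    "avg_fps": 74.2,
    "low_1pct_fps": 51.8,  # the 1% lows mentioned above
}

# This record could then be uploaded anonymously to the shared repository.
print(json.dumps(result, indent=2))
```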

1 Like

Too bad that article is already outdated. The question was about AMD’s latest CPUs, which it obviously didn’t test. Whether those CPUs will outperform Intel’s i9-13900 is the question people buying in the near future want answered, and we really need to wait for real-world feedback from real-world users in real-world use cases. Because, as I keep reminding people, in theory, theory and practice should be the same, but in practice they almost never are. Any experienced hardware/software systems dev will tell you that.

1 Like

The review embargo for the X3D processors was already lifted. You can find numerous independent reviews for the 7950X3D, including a thread on this forum dedicated to the 7950X3D and MSFS.

1 Like

The article was written less than a month ago and is about the principles and methodology of testing, so it’s far from outdated. Also, the question wasn’t about any specific CPU at the time, but whether or not CPU gaming benchmarks at 4k are meaningful (and everything points to no here).

2 Likes

Not at 4k it’s not. And that’s where the present is for many making a buying decision. And of course, once Arc gets XeSS support, the CPU will matter even less than it does now. And I’d hate to see the specs at 8k, which is where I’m at (and where everyone will be within a decade). So, to sum up, AMD’s latest is not the greatest for the desktop, and it’s a wash for gaming at best. And that’s according to your own graphic.

Because seriously, if you’re doing a new build, you’re not running at 1080, and if your budget won’t allow you to build a decent 4k gaming rig, you’re better off getting the Xbox Series X, which CAN do 4k.

And that’s the title of the thread - Help in choosing new PC specs for MSFS. Most of the people asking this question aren’t dual-use - they want a rig to play MSFS, end of story. And they’ll save a bundle buying an Xbox Series X instead. The Xbox Series X takes up to 8 peripherals, so they can knock themselves out with all sorts of hardware add-ons. And WASM is going to be kind of a big thing on both Xbox and PC, allowing many add-ons to work pretty much the same on both. If I didn’t need a PC for my current and future projects, and was actually into games, I would probably have gotten an Xbox Series X instead. And it’s probably the best option for those who just want 4k gaming without building a rig.

1 Like