Is there a good guide to Intel vs. AMD?

The comment about better memory support on Intel is a bit misleading relative to the latest lineups. The current Ryzen lineup uses a 1:1 coupling between the Infinity Fabric clock and the memory clock, which significantly reduces memory latency but limits RAM overclocking. You can run high-frequency DDR4 kits if you decouple the Infinity Fabric clock, but doing so usually introduces a performance hit. Intel has essentially adopted the same concept with the “gear modes” on 11th gen and the upcoming 12th gen Alder Lake.
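To make the coupled vs. decoupled trade-off concrete, here is a rough sketch. It is a simplified model, not exact BIOS behaviour, and the DDR4 speeds and the 2:1 decoupled ratio are just illustrative numbers:

```python
# Simplified sketch of Ryzen memory vs. Infinity Fabric clocks (illustrative, not exact BIOS behaviour).
def fabric_clock_mhz(ddr_transfer_rate_mts, coupled=True):
    mclk = ddr_transfer_rate_mts / 2   # DDR is double data rate: DDR4-3600 -> 1800 MHz memory clock
    # In 1:1 ("coupled") mode the fabric clock tracks the memory clock.
    # Decoupling drops the fabric/controller side to roughly a 2:1 ratio,
    # which is where the latency penalty comes from despite the faster RAM.
    return mclk if coupled else mclk / 2

print(fabric_clock_mhz(3600))                  # 1800.0 -> typical 1:1 sweet spot
print(fabric_clock_mhz(4400, coupled=False))   # 1100.0 -> faster RAM, but a slower fabric clock
```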

It’s also important to note that early-generation DDR5 will have significantly worse latency than current DDR4. The same thing happened during the DDR3 to DDR4 transition. You’ll actually gain more by waiting a year or two before adopting DDR5.
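A quick back-of-the-envelope way to compare kits is CAS latency in nanoseconds rather than cycles. The DDR5 timings below are placeholder numbers typical of rumoured early kits, not a specific product:

```python
# Back-of-the-envelope first-word latency in nanoseconds: CAS cycles divided by the memory clock.
def cas_latency_ns(transfer_rate_mts, cas_cycles):
    clock_mhz = transfer_rate_mts / 2   # DDR: memory clock is half the transfer rate
    return cas_cycles / clock_mhz * 1000

print(round(cas_latency_ns(3600, 16), 2))   # DDR4-3600 CL16       -> ~8.89 ns
print(round(cas_latency_ns(4800, 40), 2))   # early DDR5-4800 CL40 -> ~16.67 ns
```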


TBF I have not built an Intel system since the 6700K I owned. I see no benefit for myself with Intel. Most of that is borne out of the fact that I have persevered with Ryzen and gone through the pain barrier. For someone new to building their own system, I could well imagine Intel being the easier alternative, and there is very little lost by going that way.

I specifically did not mention how the latency would affect Alder Lake. For me, Alder Lake has the potential to be Intel’s Ryzen first gen: it is Intel’s first attempt at gluing cores together. Of course it is in all our interests that they do it well. AMD is proving to be another Intel whenever they think they can get away with it, so this is one area a lot of people will be watching.

DDR5 was always touted as a server-first product, since servers rely less on latency and more on bandwidth. Unless you are prepared to waste your cash for the fun of it, wait for reviews before you commit. I know that the people who develop memory are well aware that latency only trumps bandwidth up to a point. I also know that they don’t spend money on things they don’t think will get a return. If it has no potential, the uptake will be weak.


(post deleted by author)


20% is also the performance increase that Alder Lake is rumoured to have, which is interesting. I wonder whether those numbers are pure CPU gains or also partly down to the shift to DDR5?

I’m really looking forward to seeing how both actually look in reality, and will probably hold off on any upgrade plans until AMD releases theirs and we have a better picture.

We have to take leaked benchmarks of Alder Lake with a grain of salt. I recall initial reviews of Rocket Lake were promising some impressive gains, but look how that turned out. Regardless, it looks like Alder Lake will compete quite nicely with the current AMD flagships. Competition is always good for the end-user.

It’s worth noting the promised 20% bump for stacked V-Cache is raw gaming performance, not just the IPC increase usually seen between generations. Gaming tends to be limited by on-die cache, which is why the stacked V-Cache approach is such an interesting concept. If you can get up to a 20% increase in gaming performance BEFORE any IPC increase, that opens up some impressive gains.

The 20% gains are rumour only. Manufacturers tend to spill rumours to build hype for their own products before any actual figures need to be confirmed. That doesn’t mean they have no validity, because they would have to be near the mark or no one would believe their leaks in future.

Moore’s Law Is Dead tends to be the most vociferous voice on leaks right now. He stresses the gains need to be taken with a grain of salt, but he has also said they should be close to the mark.

Another thing manufacturers do is lead with their best gains in press releases. AMD did badly in AVX2/AVX-512 in the past, so they avoided it like the plague. Intel’s best gains were in single-threaded work, so all their benchmarks were run at 1080p. Any information that comes from the vendor will always be skewed in their favour. If you want the best bang for buck, let someone else check it out first; there are always plenty of reviewers on YouTube making money from doing just that. Reviews are never in short supply on a large product launch.

Product reviews are also bound by non-disclosure agreements, where the vendors try to keep verified information under wraps until they can show it in the best possible light. A review showing something bad can never really be undone: once on YouTube, always on YouTube, and no one ever amends a bad review to say something has been fixed.


Exactly… 20% claimed gains typically turn into 5% in actual use.

It will most likely be true, but as I said it will be skewed in favour of the vendor. It could simply be 20% more cores: a 16-core part growing to 19 or 20 cores, say.
It will most likely be IPC gains rather than pure core speed. That would mean your PC does lots of tasks 20% faster; it does not mean it has 20% more frequency.
Ryzen has outstripped Intel on multicore performance at each launch through IPC gains, but up until the 5000 series Intel always had the better single-core performance, and hence was better for most games.
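As a toy model of why a headline figure needs unpacking: performance scales roughly with cores × IPC × clock (real workloads scale worse than this), so the same “20%” can come from very different places. All the numbers below are made up purely for illustration:

```python
# Toy model: throughput roughly scales with cores * IPC * clock (real workloads scale worse).
def relative_perf(cores, ipc, clock_ghz, baseline=(16, 1.00, 4.0)):
    b_cores, b_ipc, b_clock = baseline
    return (cores * ipc * clock_ghz) / (b_cores * b_ipc * b_clock)

print(relative_perf(16, 1.20, 4.0))   # ~1.20 -> 20% from IPC alone: every task gets faster
print(relative_perf(19, 1.00, 4.0))   # ~1.19 -> a similar headline from ~20% more cores,
                                      #          but only for well-threaded workloads
print(relative_perf(16, 1.10, 4.4))   # ~1.21 -> or a mix of IPC and clock speed
```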

There have been rumours stating that 6 GHz CPUs are not that far off. Remember that clock speed multiplies with the IPC gains we already have, so that would give us a real jump that most people will be able to feel. So for all that PCs are supposed to be hitting a saturation point, where people are happy with what they have, more is coming.


Man, I would love to see 6GHz. We’ve been stuck at around 4GHz for 11 years or more.


The newer CPUs will feel like a rocket to you. If your CPU is that old, the IPC difference will really stand out, and the cache in current CPUs is massive compared with some of the older ones. 3D V-Cache is touted to be AMD’s next big improvement, and is rumoured to be so good that AMD has reined it back in: they don’t want to add massive gains to the next CPU release since it is meant to be a second-gen 5000 series, and they are said to be planning to push the V-Cache benefits hard on the next revision of Ryzen. Intel are not slouching either; Moore’s Law Is Dead reckons they will catch back up to AMD by 2023/24, which sounds like the next big window for releases. RDNA4 will be on the horizon by then too, as well as Nvidia’s next iteration, whose name I cannot remember.

My current CPU is an i7-4790K (and I’m just now bringing an i7-11700K up). My point was that I’ve been running at ~4 GHz since about 2010, I think. There have been a few CPUs along the way since then. Yes, I know there are people running at 5 GHz and higher, but in general the maximum frequency of stock CPUs in stock configuration has been stuck at around 4 GHz for nearly 11 years. A stock 6 GHz CPU has been a long time coming, and I’m happy to see it. I agree with you that Moore’s law died a long time ago.


Yep, Intel gave little to nothing in the way of upgrades for a very long time. I have spent too long throwing cash at being a tech geek in the past. I had a 4790K as well, but I’ll be honest, I had no idea how to overclock until I had to learn with Ryzen, so I only ever ran at stock. I think it was 3.2 GHz stock on that one, guessing without googling.
I then got a 6700K, where I saw a 10% uplift in gaming. I think they ran at 3.6 GHz stock. I had a mate call me stupid because “you would never use the four cores”. I was into streaming for personal use, and I noticed just how bad the 6700K really was for streaming, so I picked up a Ryzen 1800X. It came out on par with a non-overclocked 6700K performance-wise. The Ryzen was a good bin too, I think.
I added a 2700X after that, which was a noticeable difference, before striking lucky and getting a 5950X last year. The 5950X was the biggest jump of all.
I think my itch is scratched now, though. I won’t upgrade until RDNA4 comes along.
Hopefully the bloody prices will be lower by then. I got everything at MSRP on this system, as well as the clone system I built for a mate. The prices right now are ludicrous.

The 4790K is 4 GHz stock.

I got pretty good at relatively mild CPU overclocking starting back in the late ’80s, just relying on fan-based coolers. I haven’t bothered to OC this machine, though.

Yes. Super sad about what’s happened to the market.

Are there any post WU6 benchmarks for MSFS out there?

It is not a good game for apples-to-apples comparisons.
For all that it is intensive, no one uses it for performance testing, AFAIK. Aside from the huge variation you get from one area to the next, it has not been a level playing field: Asobo have tweaked it massively in the few months I have had the game.

(post deleted by author)

Mobias7 I’m mostly doing music production and stock market programming (C++), and not much gaming except for MSFS…

I’m looking at a 3070 for graphics with either an 11900K or an AMD 5600X, or maybe a 5900X (32 GB RAM), with SSD storage… I use many music programs including Ableton, Code, Reason and some custom programming… What do you think?

Sounds good. Just be aware that the 3070 will struggle with 4K, but if you’re a 1440p simmer then it’s got great bang-for-buck performance, as does the 5600X. If you can stretch to the 5900X then obviously it’s even better, especially for the music production stuff.

SSD storage is good, but if you can get an NVMe drive, get one of those instead. Better future-proofing, I think.


(post deleted by author)


Nice summary, thanks for posting. It shows nicely that CPU choice doesn’t make that big a difference even in CPU-bound settings like 1080p, and matters even less at 1440p. At 4K and above, the GPU will be the bottleneck and processor choice makes essentially no difference.

TLDR: If you want to game at 4K, prioritize the GPU over the CPU. If gaming at 1080p or 1440p, you CAN prioritize the CPU for a slight edge, but it realistically won’t matter much; value versus performance matters more in that scenario.
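A crude way to picture that bottleneck point: the frame rate is capped by whichever of the CPU or GPU takes longer per frame. The frame-time numbers below are invented purely to show the shape of it:

```python
# Crude bottleneck model: FPS is capped by whichever of CPU or GPU takes longer per frame.
def fps(cpu_frame_ms, gpu_frame_ms):
    return 1000 / max(cpu_frame_ms, gpu_frame_ms)

# Hypothetical frame times: CPU work per frame barely changes with resolution,
# while GPU work grows sharply with pixel count.
print(round(fps(cpu_frame_ms=7, gpu_frame_ms=5)))    # 1080p: ~143 fps, CPU-bound -> CPU choice shows up
print(round(fps(cpu_frame_ms=7, gpu_frame_ms=16)))   # 4K:    ~62 fps, GPU-bound -> CPU choice barely matters
```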