When DX12 (Beta) was implemented I thought I'd give it a try, so I switched to DX12 and, as forecast, didn't really see any change running all settings on High. The CPU was still mostly utilising only two of the eight cores. As there were no ill effects, I left the option set to DX12.
So, system:
MB: MSI MEG X570 Unify
CPU: Ryzen 7 5800x
GPU: MSI Radeon RX5700XT Evoke
RAM: 32GB Corsair Vengeance 3600
3 x LG 1920 x 1080 monitors (2 x 24", 1 x 27")
I have just replaced the 27" 1080 monitor with a 34" 3440 x 1440 curved monitor because I got fed up of waiting for multi-monitor functionality. It is a big improvement. However, I noted pretty quickly that the fans were running at max chat continuously, so I ran the sim alongside the Radeon Software and saw that the GPU Junction Temps were running at over 100 degrees.

I logged the data, operated for about 5 minutes, then banged out and examined the log. In the whole time logging, the Junction Temperature did not drop below 100 degrees; in fact it hit 116 at one point, which is above the prescribed maximum (110). GPU Current Temps were also elevated at around 65-70, while CPU temps were fine at around 50. Now, I had expected an increase in GPU temps, after all the card is working harder, but not to that extent, and initially I thought I would be unable to run that monitor at all.

However, I dropped some of the more impactful settings to Medium, de-selected DX12 and tried again. The GPU temperatures all dropped, the Junction Temps significantly, and were now running at around 80 degrees. I then left the DX setting on 11, reset all of the graphics sliders to High and flew again, and the temps were fine: Junction running 75-85 and Current at around 60, with the fans running appropriately.
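For anyone who wants to run the same check on their own logged data rather than eyeballing it, here is a minimal sketch, assuming the Radeon Software performance log has been exported as a CSV. The file name and column header below are assumptions on my part; match them to whatever your actual export contains.

```python
import csv

# Assumptions: the performance log was exported as CSV and contains a column
# holding the junction (hotspot) temperature. Both names below are
# hypothetical placeholders -- adjust them to your actual log file.
LOG_FILE = "gpu_log.csv"
JUNCTION_COL = "GPU Junction Temperature"
JUNCTION_MAX_C = 110  # the prescribed maximum junction temp mentioned above

temps = []
with open(LOG_FILE, newline="") as f:
    for row in csv.DictReader(f):
        value = row.get(JUNCTION_COL, "").strip()
        if value:
            temps.append(float(value))

if not temps:
    raise SystemExit(f"No '{JUNCTION_COL}' samples found in {LOG_FILE}")

over_limit = [t for t in temps if t > JUNCTION_MAX_C]
print(f"samples: {len(temps)}")
print(f"min/avg/max: {min(temps):.0f} / {sum(temps) / len(temps):.0f} / {max(temps):.0f} deg C")
print(f"samples above {JUNCTION_MAX_C} deg C: {len(over_limit)}")
```

A five-minute flight at the default logging rate gives plenty of samples, so the min/avg/max line alone makes it obvious whether the junction temp ever dropped back under 100 or spiked past the limit.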
I don't pretend to understand why, but it appears that DX12 has had a big impact on GPU temperatures, particularly when combined with the extra demands of the new monitor.