https://www.benq.com/en-hk/knowledge-center/knowledge/10-bit-vs-8-bit-does-monitor-panel-bit-color-depth-matter.html
Have you seen any difference in performance between 8-bit and 10-bit? Do you see any difference in colours and banding in FS2020 between the two options? Is FS2020 optimized for 10-bit, and how much computer power does it take (and does that load fall on the CPU or the GPU)? Also, is the cable that came with my LG 27GL83A “good enough” for 10-bit, or do I need a special cable for that (I guess the answer is no since it seems to work)? =P
Any other good info on this matter? =)
The difference in CPU/GPU workload between 8-bit SDR and 10-bit HDR output is fairly small. The CPU's work is unrelated to the output render buffers, and the GPU does its work in floating-point space before anything reaches the final output stage, whether that's 8-bit SDR or 10-bit HDR.
High dynamic range is only supported in 10-bit, afaik. If you do manage to get HDR10 running at 8 bits per channel, it will likely show extra banding.
If you are not running HDR, 10-bit output would likely not change anything in a useful way.
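If it helps to see why the extra bits matter for banding: here's a minimal, hypothetical sketch (not anything FS2020 actually does internally) that quantizes the same shallow floating-point gradient to 8-bit and 10-bit output levels:

```python
# Minimal sketch: quantizing the same smooth floating-point ramp to 8-bit and
# 10-bit output. Rendering happens in float; banding appears when the final
# buffer has too few discrete steps to represent a gentle gradient.

def quantize(value, bits):
    """Map a float in [0, 1] to the nearest of 2**bits output levels."""
    levels = (1 << bits) - 1
    return round(value * levels)

# A very shallow gradient: 100 samples spanning just 1% of the full range.
samples = [0.50 + 0.01 * i / 99 for i in range(100)]

for bits in (8, 10):
    codes = [quantize(s, bits) for s in samples]
    distinct = len(set(codes))
    print(f"{bits}-bit output: {distinct} distinct steps across the gradient")

# Typical result: the 8-bit buffer collapses the ramp into ~3 visible bands,
# while the 10-bit buffer gets ~11 steps, so each band is much narrower.
```

The floating-point values are identical either way; the 10-bit buffer simply has four times as many steps to land on, which is where the reduced banding comes from.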
Any modern DisplayPort cable should do just fine.
For HDMI, make sure the cable's speed rating is high enough for the resolution, frame rate, and bit depth you want to run your monitor at. Consult the documentation that came with your monitor if necessary.
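As a sanity check on the cable question, here's a rough back-of-the-envelope estimate of the bandwidth 10-bit output needs at the 27GL83A's native 2560x1440 @ 144 Hz. The ~10% blanking overhead and the link data rates are ballpark figures I'm assuming, not exact timings:

```python
# Rough bandwidth check for 10-bit output at 2560x1440 @ 144 Hz.
# The blanking overhead is an assumed ballpark, not an exact timing mode.

width, height, refresh_hz = 2560, 1440, 144
bits_per_channel = 10
channels = 3                 # RGB
blanking_overhead = 1.10     # assumed ~10% extra for blanking intervals

required_gbps = (width * height * refresh_hz
                 * bits_per_channel * channels
                 * blanking_overhead) / 1e9

# Approximate usable data rates (after line coding) for common links, Gbit/s.
links = {
    "DisplayPort 1.2 (HBR2)": 17.28,
    "DisplayPort 1.4 (HBR3)": 25.92,
    "HDMI 2.0": 14.4,
}

print(f"Required: ~{required_gbps:.1f} Gbit/s")
for name, capacity in links.items():
    verdict = "OK" if required_gbps <= capacity else "too slow"
    print(f"  {name}: {capacity} Gbit/s -> {verdict}")
```

Anything that lands close to a link's limit is borderline and depends on the exact timing mode, which is why DisplayPort is the safer bet for 10-bit at 144 Hz on that monitor.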
OP, you can take that ^^^^ to the bank.