This argument has gone on for years. Here we will not try to judge the subjective question of which card's actual color rendering looks better; instead, we list two basic facts about limitations in the two vendors' drivers, and leave the verdict to each reader.
AMD's desktop graphics drivers supported 10-bit color output earlier than NVIDIA's desktop drivers did, and they even support 12-bit output. AMD's desktop drivers added 10-bit color output support long ago, while NVIDIA did not bring it to the desktop drivers for GeForce and Titan cards until the end of July this year.
In addition, AMD's drivers have long supported 12-bit output, an area where NVIDIA's drivers still have not caught up.
With 10-bit displays becoming increasingly common on the market, and native 12-bit panels possibly arriving in the future, users now have a real demand for greater output color depth; the sketch below shows why the extra bits matter.
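To make the difference concrete, here is a minimal illustrative sketch (Python, our own illustration, not tied to either vendor's driver) of how many distinct shades each per-channel bit depth provides, and how fine the steps in a black-to-white gradient become. The coarser 8-bit steps are what show up as visible banding in smooth gradients.

```python
# Illustrative sketch: how per-channel bit depth affects the number of
# distinct shades, and therefore how visible banding in a gradient is.

for bits in (8, 10, 12):
    levels = 2 ** bits                 # distinct values per channel
    # Step size when a full black-to-white ramp is spread across the
    # available levels, as a fraction of the whole brightness range.
    step = 1 / (levels - 1)
    print(f"{bits:2d}-bit: {levels:5d} levels per channel, "
          f"step = {step:.5f} of full range")

# Output:
#  8-bit:   256 levels per channel, step = 0.00392 of full range
# 10-bit:  1024 levels per channel, step = 0.00098 of full range
# 12-bit:  4096 levels per channel, step = 0.00024 of full range
```

Each extra two bits quadruples the number of shades per channel, which is why 10-bit and 12-bit output noticeably reduce banding on displays that can show it.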
NVIDIA's drivers previously defaulted to limited dynamic range for color output
The output dynamic range is simply the range of color values the card sends out. In a 32-bit color system (8 bits per channel), each channel can take values from 0 to 255; this is full range. For a long time, however, NVIDIA's drivers defaulted to limited-range output (16-235), particularly over HDMI connections, and users had to switch the setting to full range manually to avoid a washed-out picture.
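To illustrate what that default does, the sketch below (Python, our own illustration, not actual driver code) applies the standard full-to-limited RGB mapping, in which 0-255 is compressed into 16-235. Blacks are lifted to 16 and whites pulled down to 235, which is exactly why the picture looks gray and washed out when a full-range display receives limited-range output.

```python
# Illustration (not actual driver code): the standard mapping from
# full-range RGB (0-255) to limited-range RGB (16-235).
# limited = 16 + full * 219/255 for each channel.

def full_to_limited(value: int) -> int:
    """Compress an 8-bit full-range channel value into 16-235."""
    return 16 + round(value * 219 / 255)

for v in (0, 64, 128, 255):
    print(f"full {v:3d} -> limited {full_to_limited(v):3d}")

# Output:
# full   0 -> limited  16   (pure black becomes dark gray on a full-range display)
# full  64 -> limited  71
# full 128 -> limited 126
# full 255 -> limited 235   (pure white becomes slightly dim)
```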
Finally, because the two vendors' GPU architectures differ, their final image output pipelines necessarily differ as well; for example, some image post-processing algorithms are not the same. So even for the same frame, the picture the two cards actually present may differ slightly. Which one looks better? That remains a matter of personal opinion.