• Devorlon · 5 hours ago

    Currently, most monitors use 16 bits for colour (65,536 different possible colours).

    The human eye can see about 10,000,000 different colours.

    HDR / True colour is 24 bits, i.e. 16,777,216 colour variations, which is more than what humans can see.

    You should care because it means images on your device will look true to life, and as screens get brighter, materials like gold will look much nicer.
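
    The colour counts above are just powers of two; a quick, purely illustrative check in Python:

    ```python
    # Distinct values representable with a given total number of bits
    for bits in (16, 24):
        print(f"{bits} bits -> {2 ** bits:,} possible colours")
    # 16 bits -> 65,536 possible colours
    # 24 bits -> 16,777,216 possible colours
    ```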

    • Zamundaaa@discuss.tchncs.de · 4 hours ago

      That’s not right. Most monitors use 8 bits per color / 24 bits per pixel, though some are still using 6 bpc / 18 bpp.

      HDR doesn’t mean or really require more than 8 bpc; it’s more complicated than that. To skip all the complicated details, it means more brightness, more contrast and better colors, and it makes a big difference for OLED displays especially.
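
      To make the per-channel vs. per-pixel arithmetic concrete, here’s a small illustrative calculation (just the maths, nothing specific to any display):

      ```python
      # Shades per channel and total colors for common per-channel bit depths (RGB)
      for bpc in (6, 8, 10):
          bpp = bpc * 3            # three channels: red, green, blue
          shades = 2 ** bpc        # distinct levels per channel
          colors = shades ** 3     # all R/G/B combinations
          print(f"{bpc} bpc = {bpp} bpp: {shades} shades per channel, {colors:,} colors")
      # 6 bpc = 18 bpp: 64 shades per channel, 262,144 colors
      # 8 bpc = 24 bpp: 256 shades per channel, 16,777,216 colors
      # 10 bpc = 30 bpp: 1024 shades per channel, 1,073,741,824 colors
      ```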

    • accideath@lemmy.world · 3 hours ago

      That’s incorrect. While it can usually be assumed that HDR content uses at least 10-bit colour, that isn’t required of either the monitor or the content. The main difference is contrast and brightness. SDR is mastered for a brightness of 100 nits and a fairly low contrast. HDR is typically mastered for brightnesses of 1000 or even 2000 nits, since modern displays are brighter and capable of higher contrast, and can thus produce a more lifelike picture from the additional information within HDR.

      Of course, you need a sufficiently bright and/or contrasty monitor for it to make a difference. An OLED screen or a display with a lot of dimming zones would produce the best results there. But even a cheap 350-nit TV can look a bit better in HDR.
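
      As a toy sketch of why that is (my own simplification using a Reinhard-style curve, not what any real TV actually implements): content mastered up to 1000 nits can be rolled off smoothly into a 350-nit panel’s range, so bright detail is compressed rather than simply clipped.

      ```python
      # Toy example: roll 1000-nit content off smoothly onto a 350-nit display.
      # Real TVs use far more sophisticated tone mapping; this only shows the idea.
      def tone_map(nits: float, content_peak: float = 1000.0, display_peak: float = 350.0) -> float:
          x = nits / display_peak                    # luminance in units of the display peak
          w = content_peak / display_peak            # content peak in the same units
          mapped = x * (1 + x / (w * w)) / (1 + x)   # extended Reinhard curve
          return mapped * display_peak               # back to nits

      for nits in (10, 100, 500, 1000):
          print(f"{nits:7.1f} nits in the master -> {tone_map(nits):6.1f} nits on screen")
      # dark values pass through almost unchanged; 1000 nits lands exactly at 350
      ```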