Wednesday, 25 January 2017

color management - Can I use 10-bit effectively today and if so, how?


I found plenty of information, just nothing that would help me reach a final conclusion.




I could not really find a satisfying answer to any of these questions.


I am asking because I am interested in buying a new display (Eizo ColorEdge CS2730) to replace my pretty old FlexScan 23431W. I guess it will have much better image quality anyway. ;)


Bottom line so far seems to be that 10-bit support is pretty poor: no simple plug-and-play that works out of the box after connecting the monitor.



Answer



The direct effect of using more bits to represent colors is "just" to have a larger number of distinct colors. It's the same way that having three types of color receptors in our eyes is "just" about being able to perceive more colors. You can describe it that way, but more colors is more better.
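
To put rough numbers on that, here is a small sketch (plain Python, nothing assumed beyond the arithmetic) of how many distinct values each bit depth allows per channel and in total:

    # Distinct values per channel and total RGB combinations per bit depth.
    for bits in (6, 8, 10, 16):
        levels = 2 ** bits          # values per channel
        total = levels ** 3         # combinations across R, G, B
        print(f"{bits:>2}-bit: {levels:>6} levels/channel, {total:,} colors")

Going from 8 to 10 bits takes you from roughly 16.8 million to about 1.07 billion possible colors; the gamut itself doesn't grow, but the steps between neighboring colors get much finer.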


Color Calibration


More important than getting a display that supports more than the current standard is to calibrate the colors of whatever devices you do have. The bit depth won't matter much if the colors are all wrong anyway.


Image Editing


For image editing, the main benefit of having more colors is being able to edit more before artifacts appear, primarily banding, caused by the limited range of colors. This holds even when the output device, whether display or print, has a much lower color depth. Banding can occur with any file format or output device, regardless of how many bits it has to represent colors, if you select too narrow a range to spread across too large an area, but it is less likely when more colors are available.
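
A rough illustration of the mechanism, as a NumPy sketch rather than any particular editor's pipeline: quantize a narrow tonal range at a given working bit depth, then stretch it across the full output range, the way an aggressive levels or curves adjustment would.

    import numpy as np

    # A smooth gradient confined to a narrow tonal range (floats in 0-1).
    gradient = np.linspace(0.45, 0.55, 1920)

    def stretch_after_quantizing(values, bits):
        """Quantize to the working bit depth first, then stretch the
        0.45-0.55 range across the full 0-1 output range."""
        levels = 2 ** bits
        quantized = np.round(values * (levels - 1)) / (levels - 1)
        return np.clip((quantized - 0.45) / 0.10, 0.0, 1.0)

    for bits in (8, 16):
        distinct = len(np.unique(stretch_after_quantizing(gradient, bits)))
        print(f"{bits}-bit working space: {distinct} distinct output levels")

In an 8-bit working space only a couple of dozen distinct levels survive the stretch, which shows up as bands; in 16-bit nearly every one of the 1,920 samples stays distinct.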


Banding can also be mitigated by dithering, which is more effective when the "original" (higher bit-depth) colors are available for processing.
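
As a sketch of that idea (again NumPy, with a hypothetical quantize helper, not any specific tool): adding a little noise before rounding turns hard steps into grain, and the locally averaged tone stays close to the original, which only works well if the higher-precision original is still around to dither from.

    import numpy as np

    rng = np.random.default_rng(0)

    def quantize(values, bits, dither=False):
        """Quantize floats in [0, 1] to the given bit depth; optionally add
        a little uniform noise first so hard bands dissolve into grain."""
        levels = 2 ** bits
        if dither:
            values = values + (rng.random(values.shape) - 0.5) / (levels - 1)
        return np.clip(np.round(values * (levels - 1)), 0, levels - 1) / (levels - 1)

    ramp = np.linspace(0.0, 0.01, 10_000)        # a very shallow gradient
    banded = quantize(ramp, 8)                   # collapses to a few flat bands
    dithered = quantize(ramp, 8, dither=True)    # grain instead of hard edges

    # Locally averaged tone: dithering tracks the original far more closely.
    for name, arr in (("original", ramp), ("banded", banded), ("dithered", dithered)):
        print(name, round(float(arr[:2500].mean()), 5))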



Software support for editing in 16- and 32-bit color is good:



  • Photoshop, since before the CS versions even existed

  • GIMP, since 2.9.2

  • ImageMagick

  • Krita

  • All HDR processing software

  • All RAW image processing software

  • Many others



General Hardware and Software Support


Regardless of the number of bits available to it, the display cannot show more than it receives as input. There is simply no point in getting a 10-bit display just to show it 8-bit color, which is what will happen until JPEG is displaced and everything else is upgraded to output greater than 8-bit color. (Virtually everyone reading this has recently had a JPEG displayed on their screen.)
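
One way to think about it: the effective depth of the whole chain is just the minimum over its stages. A deliberately simplified, hypothetical sketch:

    # Hypothetical pipeline: every stage must exceed 8 bits before the
    # 10-bit panel gets anything better than 8-bit input to work with.
    pipeline = {
        "source file (JPEG)": 8,
        "editing software": 16,
        "graphics card output": 10,
        "cable and driver": 10,
        "display panel": 10,
    }
    print("effective bit depth:", min(pipeline.values()))   # -> 8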


If you decide to upgrade, special video cards and drivers are needed to use more than 8-bit color. That pretty much guarantees hours of fiddling to try to get everything working. Possible outcomes include thinking it's working when it's not but being unable to tell the difference, or simply giving up and settling for 8 bits. If you ever do manage to get it working, people will continue to send you JPEGs even though you've insisted they send only HEIC or BPG (or PNG or WebP or EXR). They will also complain about not being able to open your files, or about the colors in your images being "off" because they weren't considerate enough to also upgrade their equipment to display 10-bit color. (Or perhaps worse, they will compliment you on how warm the colors in your images are when you had intended cool tones...)


Is It Really 10-bits?


Apparently, some displays are really 6-bit panels pretending to be 8-bit. It's difficult to tell which are which because manufacturers aren't forthcoming with the information. How do we even know whether that new "10-bit" display isn't really 8-bit pretending to be 10-bit?


Some "10-bit" monitors do improve output by taking 8-bit input and "correcting" it in 10 bits. The value of this is a personal choice.


Why do manufacturers develop hardware that claims to use more bits?




  1. It's like the megapixel wars. They get bonus points for having bigger numbers.





  2. Early adopters are willing to pay more.




  3. If they make enough incremental improvements, eventually the difference will be significant.




What about Gamma... Linear... AdobeRGB... ???


Agh!!! Hours... Days... Weeks of fiddling with settings to get everything working properly.



Patience


If you wait until technology standards shift, it will be cheaper and easier to move to devices with increased color depth. They will be everywhere and widely supported without your having to seek them out specifically. You also avoid having your equipment become crippled, or even useless, overnight, should the industry decide to jump to 12-, 14-, or even 16-bit color. (Consider how many of us owned technology that became obsolete the moment the industry standardized in a different direction.)


It will be like the progression in video resolutions. Standard-definition television was good enough for everyone for decades. Early adopters bought into stuff like S-VHS and LaserDisc. Then DVD came around and made it all obsolete. That was good enough for a while, but then came HD DVD vs. Blu-ray. Now, with 8-megapixel (4K) displays, the deficiencies of 8-bit color will be more apparent and manufacturers will target color depth, especially since the next step in resolution, 32 megapixels, is a bit much.


Though 8-bit color has been "good enough" for decades, we're on the verge of high color depths taking over. Graphics and video editing are done at high bit depths. Digital cameras and camcorders support high color depths. Video codecs support high color depths (AVC, HEVC). Graphics formats support high color depths (TIFF, PNG), with another being pushed out (HEIC). Some graphics cards and monitors support high color depths (sort of). The technology is mostly here already, but it hasn't been widely adopted and doesn't work well together ... yet.


iPads and iPhones already use a different color space. They already capture in a format capable of high bit-depth color (HEIC). They could become the first mass-produced, widely distributed devices to support the capture and display of high bit-depth color. (Retina2 ®©™-$$$)

