Sunday 4 August 2019

color management - When should I use my display's native gamma over the standard 2.2?


When it comes to color management and calibration, official and unofficial sources give differing opinions: some say the display's gamma can be left "as is" = "native" (2.43 as measured in my case),


OR,



on the other hand, others say the gamma value should be set to 2.2.
How should I approach this? Which gives the "right" perceived color: the monitor's native gamma value or another value (e.g. 2.2)? Does gamma have any influence on the viewing process at all, assuming every other variable (room lighting, color temperature, etc.) is kept the same in both cases?


—Official source: pages 96–97 of this book, for example.


—Another source, Argyll.com:
Adjusting and Calibrating Displays
By default, the brightness and white point will be kept the same as the devices natural brightness and white point. The default response curve is a gamma of 2.4, except for Apple OS X systems prior to 10.6 where a gamma of 1.8 is the default. 2.4 is close to that of many monitors, and close to that of the sRGB colorspace.


—Another one, the dispcalGUI documentation:
Why has a default gamma of 2.2 been chosen for some presets?
Many displays, be it CRT, LCD, Plasma or OLED, have a default response characteristic close to a gamma of approx. 2.2-2.4. A target response curve for calibration that is reasonably close to the native response of a display should help to minimize calibration artifacts like banding, because the adjustments needed to the video card's gamma tables via calibration curves will not be as strong as if a target response farther away from the display's native response had been chosen.


Of course, you can and should change the calibration response curve to a value suitable for your own requirements. For example, you might have a display that offers hardware calibration or gamma controls, that has been internally calibrated/adjusted to a different response curve, or your display's response is simply not close to a gamma of 2.2 for other reasons. You can run “Report on uncalibrated display device” from the “Tools” menu to measure the approximated overall gamma among other info.

...


That's why I am still confused.



Answer



Setting the display gamma to 2.2 or sRGB (in the LCD's OSD settings, that is) will make its response closer to perceptually uniform (i.e. an efficient use of code values for human vision), and 8-bit images encoded with gamma 2.2 or sRGB will show less banding (better effective colour depth) when viewed at 100%.
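
A small numeric sketch of that banding point (plain Python; the simple 2.2 power law is used as a stand-in for the true sRGB curve, which is an assumption for illustration only) compares how finely 8-bit code values sample the shadows when stored linearly versus gamma-encoded:

    # Compare how finely 8-bit code values sample dark tones when stored
    # linearly vs. gamma-2.2 encoded (a stand-in for the sRGB curve).
    GAMMA = 2.2

    def reproduced_luminance(code, gamma):
        """Relative luminance (0..1) that an 8-bit gamma-encoded code maps to."""
        return (code / 255.0) ** gamma

    for code in (1, 2, 3):
        linear_step = code / 255.0                        # values stored linearly
        encoded_step = reproduced_luminance(code, GAMMA)  # values stored gamma-encoded
        print(f"code {code}: linear -> {linear_step:.5f}, gamma 2.2 -> {encoded_step:.7f}")

    # Gamma encoding packs far more of the 256 codes into the dark tones,
    # where the eye is most sensitive, so 8-bit gamma-2.2 / sRGB images
    # show less visible banding than linearly stored 8-bit images.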


However, there is a problem, at least with DisplayCAL: calibrating for gamma 2.2 means that a 1D calibration LUT will be created. This 1D LUT can be applied in two ways:



  • using the video card's LUT (most often 8-bit -> 8-bit)

  • using the display's internal LUT (which may have much higher precision, e.g. 8-bit -> 14-bit)


Here comes the problem. Assume you have an 8-bit LCD that has no gamma controls at all (or does not allow the gamma to be set close enough to the target) and also has no editable internal 1D LUT. If you calibrate such an LCD for gamma 2.2, DisplayCAL will make it lose colour depth, because the display offers no interactive gamma adjustment and the whole correction has to go into the 8-bit video card LUT. Your LCD will then effectively be less than 8-bit (more like 7.5-bit, for example).
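
To make that colour-depth loss concrete, here is a minimal sketch (plain Python; the native gamma of 2.43 is the value measured in the question, the rest is an assumed simplification of what an 8-bit video card gamma table does) that builds the 2.43 -> 2.2 correction curve at 8-bit precision and counts how many distinct output codes survive:

    import math

    NATIVE_GAMMA = 2.43   # measured native response (from the question)
    TARGET_GAMMA = 2.2    # calibration target

    # 8-bit -> 8-bit LUT: find the input the native display needs so that
    # its output matches the target curve, then quantise back to 8 bits.
    lut = []
    for code in range(256):
        v = code / 255.0
        wanted_luminance = v ** TARGET_GAMMA
        corrected_input = wanted_luminance ** (1.0 / NATIVE_GAMMA)
        lut.append(round(corrected_input * 255))

    unique_codes = len(set(lut))
    print(f"distinct output codes: {unique_codes} of 256")
    print(f"effective bit depth:   {math.log2(unique_codes):.2f} bits")
    # Fewer than 256 distinct codes means some formerly different input
    # values now collapse onto the same output, i.e. lost colour depth.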



If your entire workflow is 16-bit, there is no sense in creating a 1D LUT at all. However, since most LCDs have excess contrast, you can set your LCD's own controls to compensate for it BEFORE profiling and then create the profile with the tone curve set to "As measured".


Creating a 1D LUT in the video card (8-bit to 8-bit) will only reduce your colour depth.
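
For contrast, the same assumed 2.43 -> 2.2 correction applied with 14-bit output precision, roughly what the display-internal LUT from the second bullet above could do, keeps all 256 source codes distinct (again only a sketch, not any particular display's behaviour):

    NATIVE_GAMMA = 2.43
    TARGET_GAMMA = 2.2
    OUT_MAX = 2 ** 14 - 1   # 14-bit LUT output values

    # Same correction curve as before, but quantised to 14 bits instead of 8.
    lut14 = [round(((code / 255.0) ** (TARGET_GAMMA / NATIVE_GAMMA)) * OUT_MAX)
             for code in range(256)]

    # With 14-bit outputs no two of the 256 input codes collapse onto the
    # same value, so the source's full 8-bit depth is preserved.
    print(f"distinct output codes: {len(set(lut14))} of 256")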

