I saw this question from Leon Neal on Twitter a while back:
> If a computer screen needs constant calibration and checks, how come the screens on our cameras lack that facility?
I can think of a few potential answers, but I'm not sure any of them completely covers it:
- Because it's too small to be used for fully accurate work anyway (so near enough is good enough).
- Because it's only for previewing, not for editing.
- Because it's not usually the last screen we view a photo on: it's normally going to be copied onto a computer so colour management can be left until later.
However, if you think of a press photographer at a news event, perhaps shooting JPEG for speed and filing the pictures over the air straight from the camera to a news desk, then none of those reasons really holds water. Surely they have a valid need to check that the camera's colour reproduction is faithful?
So: is it in fact possible to calibrate a camera's LCD? If so, why do so few of us bother? And if not, why not?
Answer
Two words: "ambient" and "context".
At the risk of making what sounds like a "No true Scotsman" argument, "real" monitor calibration is always in the context of the ambient lighting conditions. Not only do the pertinent standards (ISO 12647 and related) specify the lighting levels and colour temperatures under which critical colour work should be performed, but pro-level calibration devices also sample the ambient light as well as what's coming from the monitor. Since both the actual value of the max black and the apparent value of the max white depend on the ambient lighting, you can't really make even the basic global contrast adjustment without regard to ambient. (And if you are calibrating your monitor under one set of lighting conditions and using it under another, you're fooling yourself -- you're not really doing that much better than an educated eyeball calibration would have gotten you.)
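To see how much ambient light matters, here is a minimal back-of-the-envelope sketch. It models the light reflected off the panel as a diffuse (Lambertian) contribution added equally to black and white; the panel numbers and the 4% screen reflectance are illustrative assumptions, not measurements of any particular camera:

```python
import math

def effective_contrast(white_cdm2, black_cdm2, ambient_lux, screen_reflectance=0.04):
    """Estimate the contrast ratio a viewer actually sees.

    Ambient light reflecting off the panel adds the same luminance to
    both black and white, which crushes the effective contrast. For a
    roughly diffuse surface, reflected luminance (cd/m^2) is
    illuminance (lux) * reflectance / pi.
    """
    reflected = ambient_lux * screen_reflectance / math.pi
    return (white_cdm2 + reflected) / (black_cdm2 + reflected)

# Assume a camera LCD with ~450 cd/m^2 white and ~0.45 cd/m^2 black (1000:1 panel):
for lux in (50, 500, 30000):  # dim room, bright office, direct daylight
    print(f"{lux:>6} lux -> {effective_contrast(450, 0.45, lux):7.1f}:1")
```

Under these assumptions the same panel drops from roughly 400:1 in a dim room to barely 2:1 in direct daylight, which is why a profile made under one set of conditions tells you little under another.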
In terms of actual calibration, then, you would pretty much need to do a fresh calibration every time you look at the LCD (or, at least, for every shooting session). And you'd have to hold the camera at the same angle under the same lighting every time you used that profile (or get used to using a sealed viewing environment, like a HoodLoupe or a dark cloth/viewing tent). If you have time to be messing about like that, you're probably not shooting breaking news or sports action.
Recent cameras will allow you to do an educated eyeball calibration, at least within the limits of the device. All you need is a reference image on the card. But unless the screen is way out of whack, I'm not convinced of the utility of anything other than the basic brightness setting (which is a contrast setting in the context of ambient lighting). If you can count the number of mireds you are off by on one hand, you won't be able to tell the difference without an external reference anyway, and I don't recall running across any cameras that allow you to manually adjust gamma conversion curves and so on in-camera (although many of the better ones will let you create your own presets externally and import them to the camera).
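For a sense of scale on the mired figure, recall that a mired is just a million divided by the colour temperature in kelvin, so the same kelvin error matters more at warmer temperatures. A tiny sketch (the 5500 K/5800 K figures are arbitrary examples):

```python
def mired(kelvin):
    """Mired (micro reciprocal degrees) for a correlated colour temperature."""
    return 1_000_000 / kelvin

def mired_shift(scene_k, rendered_k):
    """Signed mired difference between the scene and how the screen renders it."""
    return mired(scene_k) - mired(rendered_k)

# e.g., a screen rendering a 5500 K daylight scene as if it were 5800 K:
print(f"{mired_shift(5500, 5800):+.1f} mireds")  # about +9.4
```

A shift of five mireds or so is right around the threshold of what an unaided eye can pick out without a side-by-side reference, which is the point above.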
We have ways of calibrating the capture instead, which is of far greater utility. That includes setting a white balance (auto, camera presets, or custom, all with fine-tuning for preferred deviations like warmth); including calibration targets (from a simple grey card to more comprehensive targets like a ColorChecker or SpyderCube) in images; and using global and channel histograms and blinkies (highlight warnings). And if the colours really are that critical (say for advertisement product photography), you're almost always going to be using a "colour to target" tool in post to match a Pantone specification anyway.
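The grey-card workflow in particular reduces to simple arithmetic: average the pixels over the card and scale the channels so it comes out neutral. A minimal sketch of that idea, assuming you already have a linear RGB frame in memory (the loader call is hypothetical):

```python
import numpy as np

def grey_card_gains(image, box):
    """Per-channel white-balance gains from a grey-card patch.

    image: linear RGB array, shape (H, W, 3)
    box:   (top, left, bottom, right) pixel region covering the grey card
    """
    t, l, b, r = box
    patch = image[t:b, l:r].reshape(-1, 3).mean(axis=0)  # mean R, G, B over the card
    # Scale so the grey patch comes out neutral, normalising to the green channel.
    return patch[1] / patch

# img = load_linear_rgb("frame.dng")              # hypothetical loader
# gains = grey_card_gains(img, (100, 100, 160, 160))
# balanced = np.clip(img * gains, 0.0, 1.0)
```

This is exactly what a custom white-balance preset does in-camera: it calibrates the capture against a known-neutral reference, making the fidelity of the preview screen largely beside the point.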