Photography has been my hobby for far too many decades — first on film, and now digital (for about seven years). I recently had occasion to use exposure bracketing seriously for the first time, when I was shooting difficult subjects (high contrast, large light value range, intense lights in shot, etc). Much to my surprise, the one stop underexposed shot was virtually always the best shot. I have since been experimenting with this and find that it applies to virtually all the shots I take. It also applies if there is a lot of sky in the shot, and if there is very little. The colour range is more realistic, the level of detail is greater, and the balance much better.
So what is going on? Am I just getting old and decrepit with my vision crumbling? Do digital sensors react better to low exposures? Is the manufacturer's calibration suspect? Something else?
What is "correct" exposure for a modern digital system?
FWIW I use Nikon equipment, with all but one lens (a Tokina extreme wide-angle zoom) being Nikon. All the shots I refer to above were taken in natural daylight, and a substantial portion of them were landscapes.
Answer
I'm not sure if this is a "how is exposure defined" question or an "is my camera busted" question, so I'll try to address both. :)
Definition of proper exposure
ISO standard 1271 contains a definition for photographic exposure.
Bypassing the math, "correct" exposure averages a scene's luminance and renders that luminance at a particular (but arbitrary) level, measured in lux-seconds, at the image plane.
That level has been chosen based on an assumption that the average scene's peak luminosity is &lt;=7.8x its average luminosity (again, an arbitrary figure).
The standard provides manufacturers with a small amount of wiggle room (it specifies a constant K, the value of which the manufacturer can select, within a narrow, defined range) to compensate for transmission light losses in the optical pathway, as well as for a rendering a particular manufacturer feels is most pleasing.
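To make the role of K concrete, here is a sketch of the standard reflected-light meter equation, N²/t = L·S/K, solved for exposure time. The value K ≈ 12.5 and the example scene luminance are illustrative assumptions, not the exact firmware behaviour of any particular camera:

```python
def shutter_time(luminance_cd_m2, aperture_n, iso_s, k=12.5):
    """Reflected-light meter equation: N^2 / t = L * S / K.

    Solves for exposure time t (seconds). K is the manufacturer-chosen
    calibration constant discussed above; 12.5 is a commonly quoted value.
    """
    return (aperture_n ** 2) * k / (luminance_cd_m2 * iso_s)

# A hypothetical bright daylight scene of ~4000 cd/m^2 at f/8, ISO 100:
t = shutter_time(4000, 8.0, 100)
print(t)  # 0.002, i.e. 1/500 s
```

A manufacturer choosing a slightly higher K within the allowed range would produce a correspondingly longer metered exposure for the same scene, which is exactly the "wiggle room" described above.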
In simpler language, "correct" exposure maps a particular shade of "average" grey in a scene to a specific RGB value in the image.
Anything in the scene brighter or darker than this "average" simply falls where it falls in your image. Or put another way, depending on the average luminosity of the scene, the dynamic range of the scene, the dynamic range of your imager, etc., it is entirely possible to experience clipping (in shadows and/or highlights) in a "properly exposed" image.
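To illustrate that mapping and the clipping it permits, here is a sketch using the standard sRGB transfer curve (actual cameras apply their own proprietary tone curves, so the exact numbers are an assumption): a mid-grey of ~18% linear reflectance encodes to roughly 118 out of 255, while any linear value driven past the saturation point simply clips to 255.

```python
def srgb_encode(linear):
    """Standard sRGB transfer function. Input is linear scene-relative
    luminance with 1.0 = clipping point; output is an 8-bit code value."""
    linear = max(0.0, min(1.0, linear))        # out-of-range values clip
    if linear <= 0.0031308:
        v = 12.92 * linear
    else:
        v = 1.055 * linear ** (1 / 2.4) - 0.055
    return round(v * 255)

print(srgb_encode(0.18))   # mid-grey lands near 118
print(srgb_encode(1.5))    # a highlight ~0.58 stops over: clipped to 255
```

Note how non-linear the encoding is: half the 8-bit code values sit within about two and a half stops of mid-grey, which is why modest exposure shifts move the visible tones so noticeably.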
Is my camera exposing correctly?
In practice, manufacturers have developed sophisticated metering systems to properly weight or discount areas of an image to achieve a higher rate of pleasing images. In effect, this adjusts the shade of grey the camera considers to be "average" in the scene.
The fact that you prefer the image your camera delivers when its meter says the scene is one stop underexposed may indicate that your camera meter is out of calibration, or may simply be revealing your personal preferences.
Your histogram may offer some clues (be sure to evaluate an RGB histogram, not just a single-channel histogram), but it would be best to shoot a calibrated grey reflective target and see where your meter places that grey in your image file: it should sit at least 2.96 stops below saturation for a 12.8% grey target, or 3.46 stops below saturation for an 18% grey target.
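The "stops below saturation" check can be done directly on raw data. A minimal sketch, assuming a hypothetical 14-bit raw file (saturation level 16383), linear raw values, and no black-level offset:

```python
from math import log2

def stops_below_saturation(patch_level, saturation_level=16383):
    """How many stops below raw saturation a metered patch sits,
    assuming linear raw data (14-bit here) with no black-level offset."""
    return log2(saturation_level / patch_level)

# e.g. a metered grey patch averaging ~2100 in a 14-bit raw file
print(round(stops_below_saturation(2100), 2))   # ~2.96 stops
```

If your camera consistently places the metered grey much less than this below saturation, the meter calibration is suspect; if it lands on target and you still prefer minus one stop, that is simply your taste.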
Hope that helps,