Several years ago, I asked "Why are blown highlights particularly bad in digital photography?", and if you look there, you'll find some fairly convincing answers as to why that is.
In an answer to a much newer question, it's noted that "highlight recovery on most modern DSLRs is superb", and I've seen similar statements elsewhere.
Is this, in fact, the case? If the experiment from that earlier question were repeated today, would digital fare better?
If so, is this because of:
- Improved sensor technology?
- Improved features like highlight-protection at a hardware level?
- Better RAW conversion algorithms? (There's a rough sketch of what that might mean after this list.)
- Or, something else?
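
To make the third option a bit more concrete: one common idea behind raw-level highlight recovery is that, after white balance, the three channels clip at different points, so a pixel with a blown green channel may still carry usable red and blue data. The sketch below is only a toy illustration of that idea under my own assumptions; the function name, the `clip_level` parameter, and the "take the brightest surviving channel" rule are all hypothetical, not any particular converter's algorithm.

```python
import numpy as np

def recover_clipped_channels(rgb, clip_level=1.0):
    """rgb: an H x W x 3 array of linear, white-balanced values.
    Returns a copy where each clipped channel is replaced by the
    brightest surviving channel at the same pixel -- a crude stand-in
    for the per-pixel blending a real converter might do."""
    clipped = rgb >= clip_level                   # per-channel clip mask
    n_clipped = clipped.sum(axis=-1)              # channels clipped per pixel
    recoverable = (n_clipped > 0) & (n_clipped < 3)
    surviving = np.where(clipped, -np.inf, rgb)   # ignore clipped channels
    estimate = surviving.max(axis=-1, keepdims=True)
    return np.where(clipped & recoverable[..., None], estimate, rgb)
```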
Note that I'm not talking about expose-to-the-right, which is really just a fancy way of saying that there's more information where there's more light (which is kind of obvious when put that way). This is about what happens when the light reaching part of the frame hits the limit of exposure, and about how the falloff behaves in the surrounding areas as that part of the frame approaches the limit.
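
For what it's worth, the "limit of exposure" I mean can be pictured with a toy response curve: a sensor is essentially linear up to its saturation point and then clips hard, whereas negative film has a shoulder that compresses highlights gradually. The curve shapes below are illustrative only, not measured data:

```python
import numpy as np

exposure = np.linspace(0, 4, 500)        # light level relative to saturation

# Digital sensor (toy model): linear right up to full-well capacity,
# then a hard clip -- the slope drops to zero, so no detail survives.
digital = np.clip(exposure, 0.0, 1.0)

# Film-like shoulder (toy model): a saturating curve that keeps
# compressing highlights instead of cutting them off abruptly.
film_like = 1.0 - np.exp(-exposure)
```

Past the clip point the digital curve is perfectly flat, which is why truly blown areas are unrecoverable; any "recovery" has to come from pixels or channels that haven't yet reached that flat region.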