Wednesday 18 December 2019

Do RAW files really offer more dynamic range than shooting in JPG?



I've often heard people say that shooting in RAW offers better dynamic range than shooting in JPG, but I've always found this hard to believe.


So, the question is: is there any evidence for this?


Here is the (probably wrong) reasoning behind why I find it hard to believe that RAW really has a great advantage in dynamic range.


I know that having 12 bits per channel (instead of 8) makes it possible to store 16 times more shades (2^12 = 4096 levels versus 2^8 = 256), so theoretically a RAW picture can hold more information.
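
To make that arithmetic concrete, here is a minimal sketch in plain Python; the bit depths are just the ones mentioned above, and nothing camera-specific is assumed:

    # Number of distinct tonal levels a channel can store at a given bit depth.
    for bits in (8, 12):
        print(f"{bits}-bit channel: {2 ** bits} tonal levels")

    # Output:
    #   8-bit channel: 256 tonal levels
    #   12-bit channel: 4096 tonal levels
    #
    # 4096 / 256 = 16, so a 12-bit RAW channel can distinguish 16 times
    # as many shades as an 8-bit JPG channel (2 ** (12 - 8) = 16).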


But at the same time, I also know that the final result of even a perfect HDR process is displayed using only 8 bits.


So, if a JPG comes out with some zones burned out (clipped), I wonder why the firmware, which has the higher-dynamic-range RAW data available, prefers to clip those zones and delete detail instead of doing a simple HDR-style tone mapping on the fly to preserve it.
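
To make the clipping question concrete, here is a minimal NumPy sketch of the two conversion strategies. Everything in it is an illustrative assumption: the toy 12-bit sample values, the hypothetical white point, and the gamma curve standing in for a real camera's proprietary tone curve:

    import numpy as np

    # Toy 12-bit sensor values: a midtone and two bright highlights
    # near the 12-bit ceiling of 4095. Purely illustrative numbers.
    raw = np.array([1000, 3000, 4000], dtype=np.uint16)

    # Naive conversion: scale linearly into 8 bits and clip. With a
    # typical white-point choice, the brightest values saturate at 255.
    white_point = 2048  # hypothetical JPG white point, in RAW units
    linear = np.clip(raw / white_point * 255, 0, 255).astype(np.uint8)
    print(linear)       # [124 255 255] -> both highlights burned out

    # Tone-mapped conversion: compress the highlights with a simple
    # curve (a gamma here; real firmware uses its own tone curves) so
    # that distinct RAW values stay distinct in the 8-bit output.
    gamma = 0.5
    mapped = ((raw / 4095.0) ** gamma * 255).astype(np.uint8)
    print(mapped)       # [126 218 252] -> highlight detail preserved

This is exactly the trade-off in question: the linear mapping keeps midtone contrast but discards everything above the white point, while the tone curve keeps some highlight separation at the cost of flattening the midtones.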


Also consider that human vision behaves much like natural HDR: it's quite rare that the eye perceives the sky as pure white just because it is very bright.


See also this related question: Why is Adaptive Dynamic Range incompatible with ISO Expansion?



