Is there some kind of equivalency table or formula that expresses how many pixels a digital camera needs to produce roughly the same quality as film of a particular ISO rating? What other variables would influence this (focal length, exposure time, etc.)?
Answer
I remember seeing a figure of about 22 MP quoted as being "as good as" 35mm film resolution (of course, with film it isn't just the ISO that matters: the manufacturer and age of the film, the skill of the developer, and so on all play a part).
Higher-ISO film tended to have more grain, and higher-ISO digital shots exhibit more noise. The underlying cause is similar, but the visual appearance is different.
Digital ISO noise is related to the size of each pixel, because the noise is per-pixel (so the more pixels you have, the less obvious the noise is when the image is viewed at the same size).

One analogy I've used in the past to demonstrate this is to ask several people to time with a stopwatch how long it takes a car to drive around a car park, and then to time how long a person takes to make the same journey. Because the person is slower, the margin of error is smaller in proportion to the overall figure, even though different people will give timings within a few seconds of each other.
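The same proportionality can be shown with a few lines of Python (a minimal sketch; the photon counts are invented for illustration, not measured sensor values). Photon arrival follows Poisson statistics, so shot noise grows only as the square root of the signal, and a pixel that collects more light, like the slower pedestrian, has a proportionally smaller error:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate photon shot noise for a small and a large pixel.
# Shot noise is Poisson: std = sqrt(mean), so SNR = mean/std = sqrt(N).
# A pixel that collects more photons therefore has a proportionally
# smaller error, just like the slower pedestrian in the analogy.
for photons_per_pixel in (100, 10_000):  # hypothetical small vs. large pixel
    samples = rng.poisson(photons_per_pixel, size=100_000)
    snr = samples.mean() / samples.std()
    print(f"{photons_per_pixel:>6} photons: SNR ~ {snr:.1f} "
          f"(expected ~ sqrt(N) = {photons_per_pixel ** 0.5:.1f})")
```

Running it prints an SNR close to sqrt(N) in each case. It also hints at why viewing size matters: averaging a block of small, noisy pixels down to the same display size pools their photons, giving roughly the same SNR as one big pixel (at least for shot noise, ignoring read noise).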