I understand how an ISO number affects the sensitivity of film or of a digital image, but I'm curious where the numbers came from. Why do we talk about ISO 100, 200, 400, and so on instead of ISO 1, 2, 4 or some other arbitrary sequence of numbers that indicates the relative differences?
Answer
Let's start with a magical history tour: when the system we've inherited as linear ISO speed designations (the former ASA speeds) was developed, 25-speed film was pretty cutting-edge, high-speed stuff. Kodak's Tri-X (the "X" was for "extra speed" -- and it was ASA 160) was still the stuff of science fiction. There were at least two 25-speed films (and one that was slower than 25 when exposed and developed for continuous tone) on the market at the tail end of the Age of Silver, all from Kodak: Ektar 25 (later sold under the name Kodacolor Royal Gold 25), Kodachrome 25 and Kodak Technical Pan, which was generally shot at 16 or 20 for continuous-tone black and white. A scale based on multiples of 100 might seem arbitrary, but what you're seeing is the tail end of a lot of technological advancement. In the early days, it would not have been unheard of to use a film with a speed of 6.
The speed of a film was determined by a standard process. The film was exposed to a scene with a known luminance range, then developed (in a standard developer at a standard dilution, for a standard amount of time, at a standard temperature) to attain a standard contrast (density) range on the negative or transparency. That, of course, meant exposing the film for different lengths of time and at different apertures so that the developed image would eventually fall into the standard contrast range.
The contrast curve of the film was then examined to determine the amount of light required to make the minimum visible contrast difference between unexposed film (the fog density) and the darkest dark that was actually recorded. It is that amount of light (or, rather, the inverse of that amount -- 1/amount), measured in now-obsolete non-metric units, that determined the film's speed.
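If you want a feel for the arithmetic, here is a minimal sketch of that inverse relationship as it appears in the modern standard for black-and-white negative film (ISO 6); the 0.8 constant and the lux-second units come from that standard, and the function name is my own:

```python
# A minimal sketch of the modern speed arithmetic for black-and-white
# negative film (ISO 6): speed is the inverse of the exposure H_m (in
# lux-seconds) needed at the speed point -- a density 0.10 above
# base-plus-fog -- scaled by a constant chosen so that the results
# line up with the older methodologies.

def film_speed(h_m_lux_seconds: float) -> float:
    """Arithmetic (ASA-style) speed from the speed-point exposure H_m."""
    return 0.8 / h_m_lux_seconds

# Example: a film needing 0.008 lux-seconds at the speed point
print(film_speed(0.008))  # -> 100.0
```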
The process hasn't changed a lot. The calculations (for film) now involve a lot of conversion constants so that measurements made using current standard units closely match the speeds that would have been calculated using the older methodologies. You can't just obsolete all of the existing cameras and light meters on a whim, you know. And the results are rounded to the nearest standard film speed (based on the familiar 1/3 stop scale -- 100, 125, 160, 200, 250, 320,...).
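And here is a sketch of that last rounding step, assuming the conventional scale of nominal values (the list is the familiar one from film boxes and dials; the helper function is hypothetical, and it compares in log space because the scale is geometric):

```python
import math

# Conventional third-stop speed scale: each step is a factor of 2**(1/3),
# rounded to the customary nominal values.
STANDARD_SPEEDS = [6, 8, 10, 12, 16, 20, 25, 32, 40, 50, 64, 80,
                   100, 125, 160, 200, 250, 320, 400, 500, 640, 800,
                   1000, 1250, 1600, 2000, 2500, 3200]

def nearest_standard_speed(measured: float) -> int:
    """Round a measured speed to the nearest nominal value on the scale."""
    return min(STANDARD_SPEEDS, key=lambda s: abs(math.log(s / measured)))

print(nearest_standard_speed(0.8 / 0.0065))  # measured ~123 -> ISO 125
```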
Digital "film speeds" are calculated to adjust which data are used from the recorded data set, and match the exposure values (aperture and shutter speed) that you would have used had you been using film of that speed. The camera may do all kinds of mathematical trickery to increase or reduce the apparent contrast range when producing a JPEG to give a particular character to their "film", and may (depending on the camera) do a bit of analogue amplifying and bit-shifting in producing "raw" output as well.
I hope this comes close enough for government work -- I'd really prefer to avoid posting a bunch of graphs and equations if I can.