Given the wavelengths of light we see, presumably there is some limit of pixel size below which we cannot resolve any further detail; that is, packing four pixels into the area of one pixel would not better resolve the location of an incident photon.
But I'm not a physicist, so is this a relevant limit?
If so, how close to it are modern camera sensors?
Answer
We're there.
Diffraction and color (wavelength) determine the physical limits on resolution at the sensor surface. The best explanation of this (and the many related practical considerations for digital imaging sensors) is from http://www.cambridgeincolour.com/tutorials/diffraction-photography.htm:
Once two Airy disks become any closer than half their width, they are also no longer resolvable (Rayleigh criterion). Diffraction thus sets a fundamental resolution limit that is independent of the number of megapixels, or the size of the film format. It depends only on the f-number of your lens, and on the wavelength of light being imaged.
At an aperture of f/1.2 (which would produce the least diffraction) the Airy width for visible light is 1.6 microns. For a typical sensor (which is designed as a Bayer array) in an otherwise perfect electro-optical system this means that pixels smaller than about 0.7 microns offer no increase in resolution.
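The numbers quoted above follow from the standard Airy disk formula, d = 2.44 λN (diameter out to the first dark ring). As a sanity check, here is a minimal sketch that reproduces them, assuming a mid-visible wavelength of 550 nm (the original doesn't state which wavelength was used):

```python
def airy_disk_diameter_m(wavelength_m, f_number):
    """Diameter of the Airy disk to its first dark ring: d = 2.44 * lambda * N."""
    return 2.44 * wavelength_m * f_number

# Assumed mid-visible green light at the quoted f/1.2 aperture:
d = airy_disk_diameter_m(550e-9, 1.2)
print(f"Airy disk diameter:  {d * 1e6:.2f} um")   # ~1.6 um, matching the text
print(f"Rayleigh separation: {d * 1e6 / 2:.2f} um")  # half-width, ~0.8 um
```

The Rayleigh separation of roughly 0.8 microns is the spacing below which two point sources blur together, which is why pixels much smaller than that (the text's ~0.7-micron figure for a Bayer array) buy no additional resolution.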
We have the technical ability to mass-produce image sensors with pixels that small, but the smallest I've seen in production is just over 1 micron (common in cell phone cameras). Note that premium production cameras still typically have a pixel pitch of at least 4 microns. Apparently the primary motivation for larger pixels is their improved signal-to-noise ratio. Presumably, in a perfect electronic system that wouldn't be a factor, and all sensors would sport pixels about 1 micron in width.