Monday, 11 September 2017

sensor - How much light and resolution is lost to color filter arrays?


Color digital cameras are typically implemented by putting a color filter array (CFA) like a Bayer filter, plus an infrared cut filter, in front of a sensor that is sensitive to light frequencies spanning the full visible spectrum plus some range to either side of it.


The filters have two degradative effects:





  1. They exclude light from reaching the sensor. (E.g., a "green" sensor pixel may only receive photons that are within the range 500-570nm. Most others are rejected.)




  2. Resolution is lost to "mosaic" effects. (E.g., a green image component is only seen by half of the pixels in a Bayer filter.)




How are these losses quantified, and what is their typical magnitude in practice?



Answer




The idea that any particular wavelength of light is only allowed to pass through one of the three filter colors used in a Bayer mask has been repeated endlessly. Fortunately, it is false.


Here's a typical enough spectral response curve of a specific camera sensor.
[Figure: absolute quantum efficiency curves for the Sony IMX249 sensor]
The visible (to humans) spectrum ranges from 390 to 700 nanometers. Notice that the "green" pixels respond, to one degree or another, to the entire range of visible light. That response is greatest between about 500 and 570 nanometers, but it is by no means zero at other wavelengths. The same is true of the "red" and "blue" filters. Each allows some light from the entire visible spectrum to pass. What differentiates them is just how much of the light of a particular wavelength is allowed to pass through and how much is reflected or absorbed.
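The overlap is easier to see with numbers. Below is a minimal sketch that models each channel's QE curve as a broad Gaussian. The peak positions, peak QEs, and widths are made-up approximations, not the IMX249's published data, but they capture the general shape: every channel responds at least a little across the whole visible band.

```python
# Minimal sketch (hypothetical numbers, not real IMX249 data): model each
# Bayer channel's quantum efficiency as a broad Gaussian so the overlap
# between channels is explicit.
import math

def qe(wavelength_nm, peak_nm, peak_qe, width_nm):
    """Hypothetical Gaussian approximation of one color channel's QE curve."""
    return peak_qe * math.exp(-((wavelength_nm - peak_nm) / width_nm) ** 2)

# Rough, invented parameters chosen only to mimic published curves:
# broad peaks with long tails into the neighboring bands.
channels = {
    "blue":  dict(peak_nm=460, peak_qe=0.60, width_nm=55),
    "green": dict(peak_nm=530, peak_qe=0.65, width_nm=70),
    "red":   dict(peak_nm=600, peak_qe=0.55, width_nm=75),
}

for wl in (450, 500, 550, 600, 650):
    responses = {name: round(qe(wl, **p), 3) for name, p in channels.items()}
    print(wl, responses)
# Every channel reports a nonzero response at every wavelength; the ratios
# between the three responses are what later encode "color".
```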


There are Bayer masked CMOS sensors in current DSLRs that have quantum efficiencies approaching 60%. That should be enough to eliminate the fallacy that only 1/3 of the visible light falling on a Bayer masked sensor is allowed to pass the filter and be measured by the pixel wells. If that were indeed the case, the highest quantum efficiency of a Bayer masked sensor would be limited to 33%.
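A quick back-of-the-envelope check makes the point. Assuming spectrally flat broadband light, if each filter really passed only its own third of the spectrum, the sensor's overall quantum efficiency could not exceed about 33% even with a lossless filter and a perfect photodiode:

```python
# Rough check of the "each filter passes only 1/3 of the spectrum" claim,
# assuming spectrally flat broadband illumination.
passband_fraction = 1 / 3      # the claim: each filter passes only its own third
filter_transmission = 1.0      # even with a completely lossless filter...
photodiode_qe = 1.0            # ...and a perfect photodiode behind it
ceiling = passband_fraction * filter_transmission * photodiode_qe
measured_peak_qe = 0.60        # roughly what good current sensors report
print(f"ceiling under the claim: {ceiling:.0%}")     # ~33%
print(f"reported peak QE:        {measured_peak_qe:.0%}")  # exceeds the ceiling
```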


Note that the human response to visible light is similar. The cones in our retinas also overlap significantly in their spectral response.
[Figure: spectral response curves of the cones in the human retina]


What we perceive as colors are the differences in the way our brains process the varying responses of our blue, green, and red cones to different wavelengths and combinations of wavelengths.


In theory the infrared cut filter doesn't reduce the amount of visible light reaching the sensor, because none of the light it blocks is visible to human eyes. Infrared, by definition, begins just outside the range of visible light at 700 nanometers and extends to wavelengths of 1,000,000 nanometers (1 mm). Digital sensors are typically sensitive to IR light between roughly 700 and 1,000 nanometers. In practice, the near-infrared wavelengths just under 700 nanometers are sometimes attenuated slightly by IR-cut filters.


So just how bad are the "degradative effects" identified in the question?




They exclude light from reaching the sensor. (E.g., a "green" sensor pixel may only receive photons that are within the range 500-570nm. Most others are rejected.)



As covered above, the best current CMOS sensors in DSLRs and other cameras have quantum efficiencies in the visible spectrum of roughly 50-60%. In one sense you could say they lose roughly half the light that falls on them, or one photographic stop. But that's not much different from the human retina, so the argument could be made that they don't lose much of anything compared to what we see with our eyes.
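As a rough sketch, treating the only loss as the fraction of photons that go undetected, those QE figures translate into photographic stops like this:

```python
# Rough conversion of quantum efficiency to stops of lost light, assuming the
# only loss is the fraction of photons that go undetected.
import math

for qe in (0.50, 0.60):
    stops_lost = -math.log2(qe)
    print(f"QE {qe:.0%} -> about {stops_lost:.2f} stops of light lost")
# QE 50% -> about 1.00 stops; QE 60% -> about 0.74 stops.
```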



Resolution is lost to "mosaic" effects. (E.g., a green image component is only seen by half of the pixels in a Bayer filter.)



Again, all three colors in a typical Bayer array are sensitive to at least some of the "green" wavelengths between 500-570 nanometers. This overlap is leveraged when the monochromatic luminance values from each pixel well are demosaiced to create R, G, and B values for each pixel on the sensor. It turns out that, in terms of the ability to resolve alternating black and white lines, a Bayer masked sensor has an absolute resolution of about 1/√2 that of a non-masked monochrome sensor with the same pixel pitch.
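To put a number on that, here is a small illustrative sketch using a hypothetical 24 MP (6000×4000) sensor. The figures are only meant to show the scale of the 1/√2 factor, not to describe any specific camera:

```python
# Numeric sketch of the ~1/sqrt(2) rule of thumb: effective line-pair
# resolution of a Bayer sensor vs. a monochrome sensor of the same pixel pitch.
# The sensor size below is hypothetical.
import math

width_px, height_px = 6000, 4000            # hypothetical 24 MP sensor
mono_lp_per_ph = height_px / 2              # Nyquist: 2 pixels per line pair
bayer_lp_per_ph = mono_lp_per_ph / math.sqrt(2)
print(f"monochrome: ~{mono_lp_per_ph:.0f} line pairs per picture height")
print(f"Bayer:      ~{bayer_lp_per_ph:.0f} line pairs per picture height")
```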


