Tuesday, 20 August 2019

sensor - Why don't cameras offer more than 3 colour channels? (Or do they?)


Currently, most (all?) commercially available cameras capture light in three colour channels: red, green and blue. It seems to me that it would be very useful to have a camera with a greater spectral range and resolution, and so I'm wondering why cameras aren't available that capture more than three colour channels.



What do I mean exactly?


There were some queries in the comments (since deleted) about what I meant, so I'd like to give a better explanation. Visible light spans wavelengths from around 390 nm to 700 nm. There are an infinite number of wavelengths in between these two end points, but the eye has a very limited capacity to distinguish them, since it has only three types of colour photoreceptor. The response curves for these are shown in part (a) of the figure below. This allows us to see different colours depending on the frequency of light, since low-frequency (long-wavelength) light will have more of an effect on the red receptors and high-frequency (short-wavelength) light will have more of an effect on the blue receptors.
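To make the idea concrete, here is a minimal sketch of how three response curves turn a single wavelength into a triple of receptor responses. The Gaussian curves and their peak wavelengths are rough stand-ins I've assumed for illustration, not measured cone data:

```python
import numpy as np

# Crude Gaussian stand-ins for the three cone response curves (assumed
# shapes; peaks very roughly at S~440 nm, M~540 nm, L~570 nm).
def cone(wavelength_nm, peak, width=50.0):
    return np.exp(-0.5 * ((wavelength_nm - peak) / width) ** 2)

def response_triple(wavelength_nm):
    """(S, M, L) receptor responses to monochromatic light."""
    return np.array([cone(wavelength_nm, 440),
                     cone(wavelength_nm, 540),
                     cone(wavelength_nm, 570)])

blue_ish = response_triple(450)   # S receptor dominates: seen as blue
red_ish = response_triple(650)    # L receptor dominates: seen as red
```

Different wavelengths land on different response triples, and it is only that triple, not the wavelength itself, that the visual system (or a three-channel sensor) gets to work with.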


[Figure: (a) approximate response curves of the eye's three cone types; (b) a hypothetical fourth filter peaking between blue and green; (c) additional infra-red and ultraviolet channels.]


A digital sensor in a camera works by having filters in front of its pixels, and usually there are three types of filter. These are chosen with response curves as close as possible to figure (a) above, to mimic what the human eye sees.


However, technologically speaking there is no reason why we couldn't add a fourth filter type, for example with a peak in between blue and green, as shown in figure (b). In the next section I explain why that would be useful for post-processing of photographs, even though it doesn't correspond to anything the eye can see.


Another possibility would be to add additional channels in the infra-red or ultraviolet, as shown in figure (c), extending the spectral range of the camera. (This is likely to be more technically challenging.)


Finally, a third possibility would be to divide up the frequency range even more finely, producing a camera with a high spectral resolution. In this version, the usual RGB channels would have to be constructed in software from the more fine-grained data the sensor produces.
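A sketch of that software step, under assumed numbers: take a multispectral capture with many narrow bands and build each RGB channel as a weighted sum of the bands, weighted by an RGB response curve. The band spacing and Gaussian response shapes below are illustrative assumptions:

```python
import numpy as np

# Hypothetical 10-band capture: band centres from 400 to 670 nm in 30 nm steps.
wavelengths = np.arange(400, 700, 30)            # shape (10,)
bands = np.random.rand(4, 4, wavelengths.size)   # toy 4x4-pixel multispectral image

# Crude Gaussian stand-ins for RGB filter response curves (assumed shapes).
def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

responses = np.stack([
    gaussian(wavelengths, 600, 40),   # "red" weighting
    gaussian(wavelengths, 540, 40),   # "green" weighting
    gaussian(wavelengths, 460, 30),   # "blue" weighting
])                                    # shape (3, 10)

# Each RGB channel is a weighted sum over the narrow bands.
rgb = bands @ responses.T             # shape (4, 4, 3)
```

The point is that once the fine-grained data exists, the RGB rendering is just one of many possible weightings you could choose after the fact.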


My question is about why DSLRs don't commonly offer any of these options besides (a), and whether there are cameras available that do offer any of the others. (I'm asking about the kind of camera you'd use to take a picture - I know there are scientific instruments that offer these kinds of feature.)


Why would this be useful?


I've been playing around with editing black and white photos, from colour shots taken with my DSLR. I find this process interesting because when editing a B&W photo the three RGB channels just become sources of data about the scene. The actual colours they represent are in a way almost irrelevant - the blue channel is useful mostly because objects in the scene differ in the amount of light they reflect in that range of wavelengths, and the fact that it corresponds to what the human eye sees as "blue" is much less relevant.



Having the three channels gives a lot of flexibility in controlling the exposure of different aspects of the final B&W image. It occurred to me while doing this that a fourth colour channel would give even more flexibility, and so I wonder why such a thing doesn't exist.
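The channel-mixing step I'm describing can be sketched in a few lines; the weight values here are illustrative, with the first set chosen to resemble a classic luma-style mix:

```python
import numpy as np

# Toy RGB image: each channel is an independent data source about the scene.
rgb = np.random.rand(4, 4, 3)

def channel_mix(img, weights):
    """Weighted sum of colour channels -> a single B&W plane."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                    # normalise so overall exposure is preserved
    return img @ w

standard = channel_mix(rgb, [0.30, 0.59, 0.11])   # luma-style mix
dramatic = channel_mix(rgb, [0.80, 0.15, 0.05])   # red-heavy "dark sky" mix
```

With a fourth channel, the weights vector simply grows to length four, and every extra channel adds another degree of freedom to the mix.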


Extra colour channels would be useful for colour photography as well as black and white, and for the same reason. You'd just be constructing each of the RGB channels in the same way that you construct a B&W image now, by combining data from different channels representing light of different frequency ranges. For most purposes this would be done automatically in software, but it would offer a lot more flexibility in terms of post-processing options.


As a simple example of how this could be useful, we know that plants are very reflective in near-infrared. This fact is often used to generate striking special effects shots, in which plants appear to be bright white in colour. However, if you had the infra-red image as a fourth channel in your editing software it would be available for processing colour images, for example by changing the exposure of all the plants in the image, while leaving less IR-reflective objects alone.
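As a rough sketch of that kind of edit, assuming the NIR data were available as a fourth plane: derive a mask from the NIR channel and use it to boost only the high-NIR (foliage-like) regions. The threshold and boost amount are arbitrary illustrative values:

```python
import numpy as np

rgb = np.random.rand(4, 4, 3)          # visible channels, values in [0, 1)
nir = np.random.rand(4, 4)             # hypothetical fourth (near-IR) channel

# Mask that is 0 below NIR reflectance 0.5 and ramps up to 1 at 1.0,
# so only strongly IR-reflective objects (e.g. plants) are affected.
mask = np.clip((nir - 0.5) / 0.5, 0.0, 1.0)

# Boost exposure where the mask is high; low-NIR objects are untouched.
boosted = np.clip(rgb * (1.0 + 0.6 * mask[..., None]), 0.0, 1.0)
```

In a real editor this would presumably be exposed as just another channel in the mixer and masking tools, rather than hand-written arithmetic.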


In the case of infra-red I understand that there are physical reasons why it's hard to make a sensor that isn't IR-sensitive, so that digital sensors usually have an IR-blocking filter in front of them. But it should be possible to make a sensor with a higher spectral resolution in the visible range, which would enable the same kinds of advantage.


One might think that this feature would be less useful in the age of digital processing, but I actually think it would come into its own around now. The limits of what you can do digitally are set by the data available, so I would imagine that a greater amount of spectral data would enable processing techniques that can't exist at all without it.


The question


I would like to know why this feature doesn't seem to exist. Is there a huge technical challenge in making a sensor with four or more colour channels, or is the reason more to do with a lack of demand for such a feature? Do multi-channel sensors exist as a research effort? Or am I simply wrong about how useful it would be?


Alternatively, if this feature does exist (or has in the past), which cameras have offered it, and what are its main uses? (I'd love to see example images!)



Answer





Why don't cameras offer more than 3 colour channels?



It costs more to produce (producing more than one kind of anything costs more) and gives next to no (marketable) advantage over the Bayer CFA.



(Or do they?)



They did. Several cameras, including retail models, used RGBW (RGB + White), RGBE (RGB + Emerald), CYGM (Cyan, Yellow, Green, Magenta) or CYYM (Cyan, Yellow, Yellow, Magenta) filter arrays.



It seems to me that it would be very useful to have a camera with a greater spectral range and resolution, and so I'm wondering why cameras aren't available that capture more than three colour channels.




The number of channels is not directly related to spectral range.



Is there a huge technical challenge in making a sensor with four or more colour channels, or is the reason more to do with a lack of demand for such a feature?



The lack of demand is the decisive factor.


Additionally, CYYM/CYGM filters cause increased colour noise, because converting them to RGB requires arithmetic with large coefficients. Luminance resolution can be better, though, at the cost of that colour noise.
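A small sketch of why the conversion amplifies noise, using made-up overlapping filter responses in the spirit of real dye filters (not measured data): recovering RGB means inverting the filter matrix, and the inverse mixes the four raw measurements with both positive and negative coefficients, so independent per-channel noise adds up.

```python
import numpy as np

# Toy CYGM filter responses to pure (R, G, B) light -- assumed values.
A = np.array([
    [0.1, 0.9, 0.9],   # Cyan
    [0.9, 0.9, 0.1],   # Yellow
    [0.2, 0.9, 0.2],   # Green
    [0.9, 0.3, 0.9],   # Magenta
])

# Pseudo-inverse maps the four CYGM measurements back to RGB.
pinv = np.linalg.pinv(A)               # shape (3, 4)

# For independent noise of std sigma on each raw channel, the noise std of
# each recovered RGB channel is sigma * sqrt(sum of squared coefficients).
noise_gain = np.sqrt((pinv ** 2).sum(axis=1))
```

With these toy numbers at least one recovered channel comes out with a noise gain above 1; real CYGM conversion matrices involve still larger coefficients.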



Do multi-channel sensors exist as a research effort? Or am I simply wrong about how useful it would be?



You are wrong in thinking that the spectral range would be bigger with more channels; you are right that a fourth channel provides a number of interesting processing techniques for both colour and monochrome work.




Alternatively, if this feature does exist (or has in the past), which cameras have offered it, and what are its main uses?



The Sony F828 and Nikon Coolpix 5700, for example; they and a few others are even available second-hand. They are common-use cameras.




It is also interesting to know that spectral range is limited not only by the hot mirror present in most cameras but also by the sensitivity of the photodiodes which make up the sensor. I do not know exactly what type of photodiode is used in consumer cameras, but here is an exemplary graph showing the limitations of semiconductors:


[Figure: comparison of the spectral sensitivity of photosensitive semiconductor materials]




Regarding software which may be used to extract the fourth channel: it is probably dcraw, but it would need to be modified and recompiled to extract just one channel.


There is a 4x3 matrix for the F828 in dcraw.c which makes use of the fourth channel. Here is an idea: { 7924,-1910,-777,-8226,15459,2998,-1517,2199,6818,-7242,11401,3481 } is the matrix in linear form; most probably every fourth value represents the Emerald channel. You turn it into { 0,0,0,8191,0,0,0,0,0,0,0,0 } (I do not know what number should be there instead of 8191, it's just guesswork), recompile, and the output image gets the Emerald channel in the red channel after demosaicing (if I understand the sources correctly).


