Saturday 18 April 2015

Do megapixels matter with modern sensor technology?


Are more megapixels good?


Are more megapixels bad?


Do more megapixels increase detail? Do they make my images sharper? On the other hand, is there a point where more becomes too much? Do megapixels cause increased noise and other problems? How do print and viewing size come into it?


Widespread Internet-forum wisdom used to be that 6 megapixels was the sweet spot — below that just wasn't enough, but above that, there wasn't much benefit. File sizes got larger, but detail was lost to noise and other problems. The arguments are that cramming too many pixels into a small sensor makes each pixel too small to provide any real benefit, and that higher-megapixel sensors outresolve cheap lenses anyway. (Substitute something like 12 instead of 6 if we're discussing APS-C DSLRs, or 24 for full-frame.)


By 2011, every camera introduced was in the range of 12-18 megapixels. In 2016, most were in the 16-24 megapixel range. Certainly none are at 6 anymore; even smartphones come with 8 megapixel cameras. Does this offer any real improvement over that old "sweet spot"? Has technology improved to the point where the "wisdom" needs to be updated, or are we all suffering for the sake of marketing? Or have we gone past the sweet spot in some ways, but it's okay because of previously-unargued trade-offs? (For example: more noise, sure, but more detail as well.)


Within the 16-24 megapixel common-today range, for the same sensor size, is there any actual benefit to the higher end? How do megapixels directly affect image quality with today's technology? What are the benefits and when do they apply? What are the drawbacks, and when do they apply? How should I adjust my technique (and expectations) based on my camera's megapixel count?



Answer



From a purely theoretical point of view: more megapixels good.


People often say that high megapixel sensors now outresolve most lenses, so there's no point going higher unless you're using the very best glass. This is not always true. System resolution combines lens resolution and sensor resolution (the system MTF is the product of the component MTFs), so if you improve one, system resolution improves regardless of the other. You do eventually hit diminishing returns, but from a theoretical viewpoint a sensor can't outresolve a lens until diffraction effects take over.
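To make this concrete, here's a minimal sketch using a common back-of-envelope approximation (component blurs treated as Gaussian, so resolutions combine in quadrature: 1/R² terms add). The lp/mm figures are purely illustrative, not measurements of any real lens or sensor:

```python
# Sketch: combining lens and sensor resolution (illustrative numbers).
# Assumed approximation: component blurs are roughly Gaussian, so
# resolutions (in line pairs/mm) combine in quadrature:
#   1/R_sys^2 = 1/R_lens^2 + 1/R_sensor^2

def system_resolution(r_lens, r_sensor):
    """Approximate combined system resolution in lp/mm."""
    return (r_lens**-2 + r_sensor**-2) ** -0.5

# A modest lens still gains from a denser sensor:
cheap_lens = 40.0  # lp/mm, illustrative
print(system_resolution(cheap_lens, 60.0))   # ~33 lp/mm
print(system_resolution(cheap_lens, 120.0))  # ~38 lp/mm -- better, despite the "outresolved" lens
```

Note how doubling the sensor resolution still helps even though the sensor already "outresolves" the lens — just with diminishing returns, exactly as the argument above says.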



Theoretically, for a fixed final output size, noise is independent of sensor resolution. Yes, smaller pixels capture less light, so the per-pixel noise level is higher. But if you resize a high megapixel image to match a lower one, you average pixel values and the noise averages out. People regularly complain about noisy high-megapixel compacts when viewing images at 100% — but that's a totally unfair comparison, since you're looking at a much greater enlargement of the high megapixel image.
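A quick simulation illustrates the averaging argument. This is a sketch under simplifying assumptions (a flat grey patch, pure shot noise modelled as Gaussian, perfect 2x2 binning as the "resize"); the photon counts are made up for illustration:

```python
import random

# Sketch: why per-pixel noise comparisons at 100% are unfair.
# Four small pixels together collect the same light as one big pixel,
# so binning a high-res image down recovers the big-pixel noise level.

random.seed(0)
n = 100_000  # low-res pixel count (illustrative)

# Small pixels: 2500 photons each; shot noise std ~ sqrt(signal)
high = [random.gauss(2500, 2500**0.5) for _ in range(n * 4)]

# "Resize": 2x2 binning, summing four small pixels into one
binned = [sum(high[i:i+4]) for i in range(0, len(high), 4)]

def rel_noise(pixels, signal):
    """Noise std dev relative to the signal level (lower is better)."""
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean)**2 for p in pixels) / len(pixels)
    return var**0.5 / signal

print(rel_noise(high, 2500))     # ~2% per small pixel (noisy at 100%)
print(rel_noise(binned, 10000))  # ~1% after binning -- same as one big pixel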




From a practical point of view: more megapixels not bad.


From a practical viewpoint the noise situation is more complicated, but the evidence I've seen suggests that high-MP sensors are not much noisier when compared at the same image size (see above). I'll look up some links.


The situation on resolution is complicated by the fact that [most] sensors don't see in colour and thus use a Bayer grid, which requires an anti-aliasing filter. Aliasing occurs when detail frequencies exceed the Nyquist limit (half the sampling frequency), and is at its worst when they approach the sampling frequency itself. Increasing the megapixel count faster than the detail frequency increases should reduce aliasing, to the point where the traditional anti-aliasing filter can be removed.


There are other practical issues which relate to your ability to extract extra detail from your sensor:




  • The 1/focal length rule no longer applies as you increase megapixels: you need ever-increasing stabilisation, and also faster shutter speeds as subject motion becomes more apparent.





  • Diffraction becomes more of a problem as you increase megapixels as the pixels become smaller than the Airy disk.




  • Data processing and storage requirements are higher.




It's worth emphasising that these are not disadvantages of higher megapixel counts as such, since you can always downsize your images — you haven't lost anything compared to a lower megapixel camera. The exception is in-camera data processing, since the camera has to read out the whole sensor when shooting stills and somehow process that information.




So how high can you go? I've seen calculations putting the diffraction-limiting aperture for red light on a 350 megapixel full-frame sensor at f/2.8 (green and blue light requiring even larger apertures), so that gives you an idea. Personally I think the returns would get small past a 50 megapixel 35mm sensor, up to a maximum of maybe 75-100. Once you get noticeable diffraction at f/5.6, people are going to lose interest, and once you have to open up to f/2.8 with a lens that's razor sharp at f/2.8, the megapixel race is over.
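As a back-of-envelope check on figures like these, here's a sketch of the calculation. The criterion is an assumption (diffraction "sets in" once the Airy disk diameter, 2.44λN, spans about two pixel pitches), and different criteria shift the answer by around a stop, so don't read the exact f-numbers too literally:

```python
import math

# Sketch: aperture at which diffraction limits a given megapixel count
# on a 36 x 24 mm full-frame sensor. Assumed criterion: Airy disk
# diameter (2.44 * wavelength * N) reaching ~2 pixel pitches.

SENSOR_W, SENSOR_H = 36.0, 24.0  # mm

def pixel_pitch_mm(megapixels):
    """Pixel pitch for a square-pixel full-frame sensor of the given MP."""
    pixels_w = math.sqrt(megapixels * 1e6 * SENSOR_W / SENSOR_H)
    return SENSOR_W / pixels_w

def diffraction_limited_fstop(megapixels, wavelength_nm=640):
    """f-number where the Airy disk spans ~2 pixels (red light by default)."""
    pitch_mm = pixel_pitch_mm(megapixels)
    return (2 * pitch_mm) / (2.44 * wavelength_nm * 1e-6)

for mp in (24, 50, 100, 350):
    print(f"{mp:>3} MP: diffraction-limited near f/{diffraction_limited_fstop(mp):.1f}")
```

Under this particular criterion, 24 MP runs out of headroom somewhere past f/7, 50 MP past f/5, and 350 MP around f/2 — in the same ballpark as the f/2.8 figure quoted above.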



Larger formats allow more megapixels before diffraction sets in (at a given f-stop). However, depth of field is shallower at the same f-stop, requiring you to stop down more for the same depth of field, so there appears to be no intrinsic advantage when it comes to diffraction (though it's easier to make lenses that are sharp at the diffraction-limiting aperture for a larger format).


The existence of 80 megapixel medium format cameras shows it would be possible, diffraction-wise, given good enough glass. But users of such cameras point out how difficult it is to fully utilise 80 MP, which suggests that's a good practical limit, if not a theoretical one.

