I'm starting this thread to continue a tangent from another. Rather than derail that thread, and so as not to lose the discussion, I thought we could continue it here. I think there is important information to be gleaned from the discussion, which started when I responded to a comment by @rs:
jrista said:
rs said:
PS - I really hope Canon resists the temptation to take their 1.6x crop sensor up to 24mp. It'll suffer from softness due to diffraction from f/6.0 onwards - mount an f/5.6 lens on there and you've got little in the way of options. Even the legendary 300/2.8 II with a 2x TC III will underperform, and leave you with just one aperture option if you want to attempt to utilise all of those megapixels. Leave the MP lower, and let those lower processing overheads allow them to push the hardware of the small mirror and shutter to its limits.

Once again, this rhetoric keeps cropping up, and it is completely incorrect! NEVER, in ANY case, are more megapixels bad because of diffraction! That claim is frequently quoted, and it is just as frequently wrong.
You can follow the quote above to read the precursor comments on this topic. So, continuing on from the last reply by @rs:
rs said:
I'm not saying it's worse; it's just that the extra MP don't make any difference to the resolving power once diffraction has set in. Take another example: scan a photo which was a bit blurry. If a 600dpi scan looks blurry on screen at 100%, you wouldn't then think, "Let's find out if anyone makes a 10,000dpi scanner so I can make this look sharper." You'd know it would offer no advantages - at that point you're resolving more detail than is available. Weakest link in the chain and all that...
jrista said:
Once again, this rhetoric keeps cropping up, and it is completely incorrect! NEVER, in ANY case, are more megapixels bad because of diffraction! That claim is frequently quoted, and it is just as frequently wrong.
I think you are generally misunderstanding resolution in a multi-component system. It is not the weakest link alone that determines resolution: total system blur is the root sum of squares (the quadrature sum) of the blurs of all the components. To keep things simple for this forum, and in general this is adequate for most discussion, we'll factor in just the lens resolution and the sensor resolution, in terms of spatial resolution. The way I approach this is to determine the "system blur". Diffraction is what we call "blur" from the lens, assuming the lens is diffraction limited (and for this discussion we'll assume it always is, as determining blur from optical aberrations is more complex), and it is caused by the physical nature of light. Blur from the lens changes depending on the aperture used: as the aperture is stopped down, diffraction lowers the maximum spatial resolution the lens can deliver.
The sensor also introduces "blur"; however, this is a fixed, intrinsic factor determined by the size and spacing of the pixels, whether microlenses are used, etc. For the purposes of discussion here, let's just assume that 100% of the pixel area is utilized thanks to "perfect" microlensing. That leaves us with a sensor blur equal to twice the pixel pitch (the scalar size, horizontal or vertical, of each pixel), since a line pair - one light line plus one dark line - needs two pixels to resolve. Working in line pairs gets us lp/mm (line pairs per millimeter) rather than simply l/mm (lines per millimeter).
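To make the pitch-to-resolution conversion concrete, here is a minimal Python sketch (the function name is mine, and it assumes the idealized 100%-fill-factor sensor described above):

```python
# Sensor spatial resolution from pixel pitch, assuming perfect
# microlenses: one line pair spans two pixels, so the sensor's
# "blur" is twice the pitch and resolution is its reciprocal.
def sensor_lpmm(pitch_um):
    pitch_mm = pitch_um / 1000.0   # microns to millimeters
    blur_mm = 2.0 * pitch_mm       # one light + one dark line
    return 1.0 / blur_mm           # line pairs per millimeter

print(sensor_lpmm(4.3))   # 7D's 4.3 micron pitch -> ~116 lp/mm
```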
[NOTE: I assume MTF50, as that is the standard that historically represents what we perceive as clear, crisp, and sharp, with high microcontrast. MTF10, in contrast, is usually used to determine the maximum resolution at the lowest level of contrast the human eye can detect. That might be useful for determining the resolution of barely perceptible features on the surface of the moon, assuming atmospheric conditions are perfect, but otherwise it is not really adequate for the discussion here. Maximum spatial resolution at MTF10 can be considerably higher than at MTF50, but there is no guarantee that the difference between one pixel and the next is detectable by the average person (per the Rayleigh criterion, often described as the limit of human visual acuity for 20/20 vision); it is more of the "true mathematical/theoretical" limit of resolution at very low, barely detectable levels of contrast. MTF0 would be the spatial resolution where contrast approaches zero, which is largely useless for general photography outside of astronomy, where minute changes in the shape and structure of the Airy disk of a star can be used to determine whether it is a single, binary, or triple system, or other scientific endeavors where knowing the shape of an Airy disk at MTF0, or the Dawes limit (the theoretical absolute maximum resolving power of an optical system at near-zero contrast), is useful.]
For starters, let's assume we have a perfect (diffraction-limited) lens at f/8 on a 7D sensor, which has a pixel pitch of 4.3 microns. The lens, at f/8, has a spatial resolution of 86 lp/mm at MTF50. The sensor has a raw spatial resolution of approximately 116 lp/mm (assuming the most ideal circumstances, and ignoring the difference between green and red or blue pixels). Total system blur is derived by taking the root sum of squares of the blurs of each component in the system. The formula for this is:
Code:
tb = sqrt(lb^2 + sb^2)
Where tb is Total Blur, lb is Lens Blur, and sb is Sensor Blur. We can convert spatial resolution, from lp/mm, into a blur circle in mm, by simply taking the reciprocal of the spatial resolution:
Code:
blur = 1/sr
Where blur is the diameter of the blur circle, and sr is the spatial resolution. We get 0.01163mm for the blur size of the lens @ f/8, and 0.00863mm for the blur size of the sensor. From these, we can compute the total blur of the 7D with an f/8 lens:
Code:
tb = sqrt((0.01163mm)^2 + (0.00863mm)^2) = sqrt(0.0001353mm^2 + 0.0000745mm^2) = sqrt(0.0002098mm^2) = 0.01448mm
We can convert this back into lp/mm simply by taking the reciprocal again, which gives us a total system spatial resolution for the 7D of ~69lp/mm. Seems surprising, given the spatial resolution of the lens...but then again, that is for f/8. If we move up to f/4, the spatial resolution of the lens jumps from 86lp/mm to 173lp/mm. Refining our equation to stay in lp/mm:
Code:
tsr = 1/sqrt((1/lsr)^2 + (1/ssr)^2)
Where tsr is total spatial resolution, lsr is lens spatial resolution, and ssr is sensor spatial resolution, plugging in 173lp/mm and 116lp/mm for lens and sensor respectively gets us:
Code:
tsr = 1/sqrt((1/173)^2 + (1/116)^2) = 1/sqrt(0.0000334 + 0.0000743) = 1/sqrt(0.0001077) = 1/0.01038 = 96.3
With a diffraction limited f/4 lens, the 7D is capable of achieving ~96lp/mm spatial resolution.
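The whole calculation above can be wrapped in one small helper. Here is a sketch of the quadrature-sum formula in Python (the function name is mine), reproducing the ~69 lp/mm (f/8) and ~96 lp/mm (f/4) figures:

```python
import math

# Total system resolution from component resolutions (lp/mm):
# convert each to a blur (the reciprocal), sum the blurs in
# quadrature, then invert to get back to lp/mm.
def system_lpmm(lens_lpmm, sensor_lpmm):
    total_blur = math.sqrt((1.0 / lens_lpmm) ** 2 + (1.0 / sensor_lpmm) ** 2)
    return 1.0 / total_blur

print(system_lpmm(86, 116))    # 7D @ f/8 -> ~69 lp/mm
print(system_lpmm(173, 116))   # 7D @ f/4 -> ~96 lp/mm
```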
The debate at hand is whether a 24.1mp APS-C sensor is "worth it", and whether it will provide any kind of meaningful benefit over something like the 7D's 18mp APS-C sensor. My response is absolutely!! However, we can prove the case by applying the math above. A 24.1mp APS-C sensor (Canon-style, 22.3mmx14.9mm dimensions) would have a pixel pitch of 3.7µm, or ~135lp/mm:
Code:
(1/(pitch µm / 1000µm/mm)) / 2 l/lp = (1/(3.7µm / 1000µm/mm)) / 2 l/lp = (1/(0.0037mm)) / 2 l/lp = 270l/mm / 2 l/lp = 135 lp/mm
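Working the other way, the 3.7µm pitch itself can be estimated from the sensor dimensions and pixel count. A rough sketch (the function name is mine), assuming square pixels on a uniform grid:

```python
import math

# Estimate pixel pitch (in microns) from sensor width/height (mm)
# and total pixel count, assuming square pixels: the horizontal
# pixel count is sqrt(pixels * aspect ratio).
def pitch_um(width_mm, height_mm, pixels):
    pixels_wide = math.sqrt(pixels * width_mm / height_mm)
    return width_mm / pixels_wide * 1000.0

print(pitch_um(22.3, 14.9, 24.1e6))   # 24.1mp APS-C -> ~3.7 microns
```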
Plugging that, for an f/4 lens, into our formula from above:
Code:
tsr = 1/sqrt((1/173)^2 + (1/135)^2) = 1/sqrt(0.0000334 + 0.0000549) = 1/sqrt(0.0000883) = 1/0.0094 = 106.4
The 24.1mp sensor, with the same lens, produces a better result: we gained 10lp/mm, up to 106lp/mm from 96lp/mm on the 18mp sensor. That is an improvement of 10%! Certainly nothing to sneeze at! But...the lens is outresolving the sensor...so there wouldn't be any difference at f/8, right? Well...not quite. Because "total system blur" is a factor of all components in the system, we will still see improved resolution at f/8. Here is the proof:
Code:
tsr = 1/sqrt((1/86)^2 + (1/135)^2) = 1/sqrt(0.0001352 + 0.0000549) = 1/sqrt(0.00019) = 1/0.0138 = 72.5
Despite the fact that the theoretical 24.1mp sensor from the hypothetical 7D II is DIFFRACTION LIMITED at f/8, it still resolves more! In fact, it resolves about 5% more than the 7D at f/8. So, according to the theory, even if the lens is not outresolving the sensor, even if the lens and sensor are both thoroughly diffraction limited, a higher resolution sensor will always produce better results. The improvements will certainly be smaller and smaller as the lens is stopped down, thus producing diminishing returns. If we run our calculations for both sensors at f/16, the difference between the two is less than at f/8:
18.0mp @ f/16 = 40lp/mm
24.1mp @ f/16 = 41lp/mm
The difference between the 24mp sensor and the 18mp sensor at f/16 has shrunk by half again, to about 2.5%. By f/22, the difference is 29.95lp/mm vs. 30.21lp/mm, an improvement of only 0.9%. Diminishing returns...however, even at f/22 the 24mp sensor is still producing better results. Not that anyone would really notice, but it is still producing better results.
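To see the diminishing returns across the whole range, the same formula can be swept over apertures. This sketch (function names are mine) approximates the diffraction-limited MTF50 lens resolution as scaling inversely with f-number, anchored on the 86 lp/mm @ f/8 figure used above, so the f/22 values land close to, but not exactly on, the numbers quoted:

```python
import math

# Combined system resolution: blurs summed in quadrature, inverted.
def system_lpmm(lens_lpmm, sensor_lpmm):
    return 1.0 / math.sqrt((1.0 / lens_lpmm) ** 2 + (1.0 / sensor_lpmm) ** 2)

# Diffraction-limited MTF50 lens resolution, modeled as proportional
# to 1/N and anchored at 86 lp/mm @ f/8 (an assumption of this sketch).
def lens_lpmm(f_number):
    return 86.0 * 8.0 / f_number

for n in (4, 5.6, 8, 11, 16, 22):
    r18 = system_lpmm(lens_lpmm(n), 116)   # 18mp sensor, 4.3 micron pitch
    r24 = system_lpmm(lens_lpmm(n), 135)   # 24.1mp sensor, 3.7 micron pitch
    print(f"f/{n}: 18mp = {r18:.1f} lp/mm, 24.1mp = {r24:.1f} lp/mm")
```

The gap between the two sensors narrows as the lens is stopped down, but the higher-resolution sensor never falls behind, which is the point being argued here.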
rs said:
You've got some great shots there, very impressive - and it clearly does show the difference between good glass and great glass. But the f/9 300 II + 2x shot isn't 100% pixel sharp like your native 500/4 shot is. I'm not saying there's anything wrong with the shot - it's great, and the detail there is still great. It's just not 18MP-of-perfection great. A 15MP sensor wouldn't have resolved any less detail behind that lens, but that wouldn't have made a 15MP shot any better. This thread is clearly going off on a tangent here, as pixel peeping rarely has anything to do with what makes a great photo - it's just that we are debating whether the extra MP are worth it. And just to re-iterate, great shots jrista
jrista said:
The aperture used was f/9, so diffraction has definitely "set in" and is visible given the 7D's f/6.9 DLA. The subject, in this case a juvenile Baird's Sandpiper, comprised only the center 25% of the frame, and the 300 f/2.8 II w/ 2x TC STILL did a superb job resolving a LOT of detail:
No, it certainly isn't "18mp of perfection" great, because the subject is only a quarter of the frame. It is more like 4.5mp "great". My 100-400 wouldn't do as well, not because it doesn't resolve as much (at f/9 it would resolve roughly the same), but because it would produce lower contrast. Microcontrast from the 300mm f/2.8 II is beyond excellent; microcontrast from the 100-400 is bordering on piss-poor. There are also the advancements in IS technology to consider. I forgot to mention this before, but Canon has greatly improved the image stabilization of their new generation of lenses. Where we MAYBE got two stops of hand-holdability before, we easily get at least four stops now, and I've managed to get some good shots at five stops. As a matter of fact, the Sandpiper photo was hand held (with me squatting in an awkward manner on soggy, marshy ground that made the whole thing a real pain), at 600mm, on a 7D, and the BARE MINIMUM shutter speed to get a clear shot in that situation is 1/1000s.
So, I still stress...there are very good reasons to have higher resolution sensors, and with the significantly advanced new generation of lenses Canon is releasing, I believe we have the optical resolving power to not only handle a 24mp APS-C sensor, but up to 65-70mp FF sensors, if not more, in the future.
rs said:
You've got some great shots there, very impressive - /* ...clip... */ And just to re-iterate, great shots jrista
Thanks! ;D