Mikehit said:
bdunbar79 said:
vs. a sensor able to resolve the effects of diffraction.
Sorry to labour the point, but what are 'the effects of diffraction' that the higher MP lens is resolving (as in 'that there is diffraction, not a pixel limitation')? And if it is indistinguishable from other forms of blurriness then is this all just angels dancing on the head of a pin?
The image with diffraction blur is a convolution of at least 2 things:
1) The perfect image with no diffraction blur
2) The Point Spread Function (PSF) of the lens system, including that arising from diffraction
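As a rough sketch of that convolution model (pure NumPy; the Gaussian kernel is just a stand-in for the true Airy diffraction pattern, and all sizes here are arbitrary illustrations, not tied to any real sensor):

```python
import numpy as np

def gaussian_psf(size, sigma):
    """Normalised 2-D Gaussian kernel used as a simple PSF model."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return kernel / kernel.sum()

def convolve2d(image, kernel):
    """Direct same-size 2-D convolution (zero-padded borders)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    flipped = kernel[::-1, ::-1]          # convolution flips the kernel
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
    return out

scene = np.zeros((21, 21))
scene[10, 10] = 1.0                       # ideal point source of light

blurred = convolve2d(scene, gaussian_psf(7, sigma=1.5))
# The single bright pixel is now spread over a patch of pixels, but the
# total light is conserved because the PSF kernel is normalised.
```

The blurred frame is exactly case 1 convolved with case 2: the point source turns into a blur spot whose peak still sits at the source position.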
Imagine you sample the image at (let's take an extreme example) 2 resolutions:
a) 1000 x 1000 pixels: Here, the diffraction blur pattern extends over just, say, 2x2 pixels in (x,y);
b) 10,000 x 10,000 pixels: Here, the sampling is 10x finer in each direction, so the same blur pattern now spans 20x20 pixels.
In case (a), it's hard to say EXACTLY where the "centre" of the blur circle is. It's "somewhere in the 2x2 pixel circle/square on the sensor".
Imagine that we're imaging an ideal point source of light. The diffraction blur in case (a) makes it hard to tell exactly where in the image the point source of light actually is.
The point source of light becomes, after convolution with the PSF, a circular pattern on the imaging sensor.
When we go to case (b) above, we can be much more precise about where the PSF/blur circle sits, which in turn means we can deduce much more precisely where the ideal point of light is: typically at the centre of the blur circle.
So we have shown that despite a significant (but not overwhelmingly large) diffraction blur circle, we can better resolve the position of the point source of light when we sample the image at greater resolution.
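A quick numerical sketch of that claim (hypothetical numbers throughout: a Gaussian PSF stands in for diffraction, and an intensity-weighted centroid is used as the position estimate):

```python
import numpy as np

# A point source at a sub-pixel position is blurred by a Gaussian PSF,
# sampled on a coarse grid and on a fine grid, and then located by the
# intensity-weighted centroid of the sampled pixels.

def sample_blur(n_pixels, true_pos, sigma):
    """Sample a Gaussian blur spot centred at true_pos (in the unit
    square) at the midpoints of an n_pixels x n_pixels grid."""
    edges = np.linspace(0.0, 1.0, n_pixels + 1)
    centres = (edges[:-1] + edges[1:]) / 2.0
    xx, yy = np.meshgrid(centres, centres)
    return np.exp(-((xx - true_pos[0])**2 + (yy - true_pos[1])**2)
                  / (2.0 * sigma**2))

def centroid(image):
    """Intensity-weighted centroid, in the same unit-square coordinates."""
    n = image.shape[0]
    edges = np.linspace(0.0, 1.0, n + 1)
    centres = (edges[:-1] + edges[1:]) / 2.0
    xx, yy = np.meshgrid(centres, centres)
    total = image.sum()
    return np.array([(image * xx).sum() / total,
                     (image * yy).sum() / total])

true_pos = np.array([0.43, 0.57])  # arbitrary sub-pixel source position
sigma = 0.08                       # blur spot comparable to a coarse pixel

err_coarse = np.linalg.norm(centroid(sample_blur(5, true_pos, sigma)) - true_pos)
err_fine = np.linalg.norm(centroid(sample_blur(50, true_pos, sigma)) - true_pos)
# Finer sampling of the SAME blur pins down the source position better.
```

The blur is identical in both cases; only the pixel grid changes, and the finer grid recovers the source position with much smaller error.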
So our whole image (whether made up of idealised point sources of light or of a real scene) is better resolved at the higher sampling resolution, even though the diffraction blur is larger than the sampling pixels.