I checked through some of the late-night photos I have taken with a 28/1.8 and a 40D to see how large the star points actually are on the sensor. With 8-15 second exposures, I'm seeing that the star spots fall within 4x3 or 5x4 pixels (the elongation along one dimension dominates here, and is due to the exposure time and Earth's rotation).
This is measured from a straight-out-of-camera JPEG, at F/3.5 and with a lens that had a slight amount of frost on the front element. Even then the spots are quite limited. I do think that had the conditions been better (no frost, a better lens at the same aperture, and better ISO performance than the 40D has) and had I taken RAWs, most of the star images would fall within a 3x3 region (as I said earlier). The spread is mainly due to the AA filter; otherwise stars should fall within a single pixel, assuming any reasonable performance of the objective and F-numbers below 5.6. With a PSF of size 3x3 pixels, it is hard for me to see how this could be used to compute the MTF, even with sub-pixel sampling, without averaging over a larger area.
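The trailing from Earth's rotation can be checked with back-of-the-envelope numbers. The sketch below assumes the sidereal rate (~15.04 arcsec/s), a 28 mm focal length, a pixel pitch of roughly 5.7 µm for the 40D, and a star near the celestial equator (the worst case for drift); these values are my assumptions, not measured figures.

```python
import math

# Assumptions (not measured): sidereal rate, 28 mm lens, ~5.7 um pitch,
# star on the celestial equator (maximum drift rate).
SIDEREAL_RATE_ARCSEC_PER_S = 15.04
FOCAL_LENGTH_MM = 28.0
PIXEL_PITCH_UM = 5.7

def trail_length_px(exposure_s):
    """Star drift on the sensor, in pixels, over the given exposure."""
    angle_rad = math.radians(SIDEREAL_RATE_ARCSEC_PER_S * exposure_s / 3600.0)
    drift_um = FOCAL_LENGTH_MM * 1000.0 * math.tan(angle_rad)
    return drift_um / PIXEL_PITCH_UM

for t in (8, 15):
    print(f"{t:2d} s exposure -> ~{trail_length_px(t):.1f} px of trailing")
```

This gives roughly 3 px of trailing at 8 s and 5-6 px at 15 s, which, added to a compact PSF, is consistent with the observed 4x3 to 5x4 pixel spots.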
For example, 100 pixels with a 5 µm pitch represents about 0.5 mm, which is significant. If we are talking about a smaller averaging distance, say 30 pixels, the uncertainty of the slant angle itself would be about 2 degrees. I haven't seen many error estimates for the slanted-edge method, but I'm afraid I'll have to do that myself in a publication in the near future.
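The numbers above follow from a crude model (my own assumption, not a published error analysis): if the edge endpoints are each localized to about half a pixel, the slant angle over a run of N pixels is uncertain by roughly atan(1/N).

```python
import math

# Crude slant-angle error model (assumption): +/- 0.5 px localization at
# each end of an N-pixel edge run gives an angle uncertainty ~ atan(1/N).
PIXEL_PITCH_UM = 5.0

def slant_angle_uncertainty_deg(n_pixels):
    """Approximate slant-angle uncertainty, in degrees, over n_pixels."""
    return math.degrees(math.atan(1.0 / n_pixels))

for n in (30, 100):
    length_mm = n * PIXEL_PITCH_UM / 1000.0
    print(f"N={n:3d}: span {length_mm:.2f} mm, "
          f"angle uncertainty ~{slant_angle_uncertainty_deg(n):.1f} deg")
```

With N=30 this gives about 1.9 degrees of angle uncertainty, and N=100 covers 0.5 mm of the sensor, matching the figures quoted above.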
It is great to hear about your background in tomography; it helps me understand how you think about these issues. But I have to remind you that we are talking about optical systems in the visible wavelength range, where things are quite a bit different from radio waves (MRI), the THz region, or ultrasound. From my hazy memory, MRI actually measures time differences across different detectors, but it's been 11 years since I last had to think about anything related to NMR. These medical wavelengths are used mainly because of the requirement of non-invasiveness, and that's why you need to rely so heavily on image processing techniques.
I think a better comparison would be the image processing techniques used in astronomical telescopes to get state-of-the-art results in the visible wavelength range. Adaptive optics correction of the PSF allows ground-based telescopes to just about match Hubble, over a smaller field of view, on good nights.