Diffraction, Airy Disks and implications

AlanF

Desperately seeking birds
CR Pro
Aug 16, 2012
12,355
22,529
I am interested in the basic physics of optics, which provides useful information for choosing equipment and settings. Diffraction plays an important part in image quality, and it is becoming increasingly relevant as high-density sensors come into wider use. So, I thought I would briefly present a few ideas, which the experts will already know but which may be of use to those getting into the subject.

Light from a point source is diffracted by a circular aperture into a bright central spot, called the Airy disk, surrounded by a series of fainter concentric rings. The radius of the disk (centre to first dark ring) is given by: 1.22 x wavelength of light x f-number. For green light (wavelength ≈ 0.55 microns), radius ≈ 0.67 x f-number in microns. Each point of light is smeared by diffraction into a disk comparable in size to the pixels of a digital sensor, which are typically 2-7 microns. The smearing increases with increasing f-number. At wide apertures, a lens's resolution is usually limited by optical aberrations; at narrow apertures, diffraction is the limiting factor.
1_AiryDisk_Relabelled_final_titled.jpg
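The formula above can be sketched in a few lines of Python; the 0.55 µm wavelength is an assumed mid-green value, and other choices shift the numbers slightly:

```python
# Airy disk radius from the formula above: r = 1.22 * wavelength * f-number.
# The 0.55 µm wavelength is an assumption (mid-green light).

def airy_radius_um(f_number, wavelength_um=0.55):
    """Radius of the Airy disk (centre to first dark ring) in microns."""
    return 1.22 * wavelength_um * f_number

if __name__ == "__main__":
    for n in (2.8, 5.6, 8, 11, 16, 22):
        print(f"f/{n}: Airy radius ~ {airy_radius_um(n):.2f} microns")
```

At f/8 the radius is already about 5.4 microns, comparable to the pixel pitch of current sensors.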
 

AlanF

Two points of light are clearly resolved when they are separated by distances much larger than the radius of the disk. As the separation decreases, the disks start overlapping and the resolution decreases. When the separation equals the radius (the Rayleigh criterion), the two points are only just resolved; at closer separations they are not resolved on a sensor.
This problem has been the bane of astronomers and microscopists over the centuries.
2-OverlappingAiry1.jpg
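As a rough sketch of what this criterion implies: at best one line pair fits per Airy radius, so the diffraction-limited resolution at the sensor falls as 1/f-number (again assuming 0.55 µm green light):

```python
# Diffraction-limited resolution implied by the Rayleigh criterion:
# two points closer than one Airy radius are not resolved, so at best
# one line pair fits per Airy radius. Wavelength is an assumed 0.55 µm.

def rayleigh_sep_um(f_number, wavelength_um=0.55):
    """Minimum resolvable separation (one Airy radius) in microns."""
    return 1.22 * wavelength_um * f_number

def max_lp_per_mm(f_number, wavelength_um=0.55):
    """Best-case resolution at the sensor, in line pairs per mm."""
    return 1000.0 / rayleigh_sep_um(f_number, wavelength_um)

if __name__ == "__main__":
    for n in (4, 8, 16, 32):
        print(f"f/{n}: ~ {max_lp_per_mm(n):.0f} lp/mm")
```

Halving the resolution for each doubling of f-number is consistent with the roughly linear MTF decline shown in the next post.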
 

AlanF

An illustration of how increasing f-number degrades resolution: the light is spread out over a disk of increasing radius. MTF, which is a measure of resolution, is found in practice to fall roughly linearly with f-number for high-quality lenses, as seen in the graph I have plotted from ePhotozine data (average values from the best lenses on a 5DSR).

3-AiryDik_vs_f.jpg
3A_MTFvsf.jpg
 

AlanF

The effect of doubling the focal length of a lens on resolution: if the focal length is doubled and the f-number stays the same, the image is twice as large but the radius of the Airy disk is unchanged, so resolution is doubled, leaving aside any optical aberrations. But if the f-number also doubles, as when a 2xTC is used, the Airy disk radius doubles too, and resolution is not increased.
4-Airy_2xTC.jpg
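A minimal numeric sketch of the argument above; the base focal length and magnification are made-up illustrative values, not measurements. Projecting the Airy radius back onto the subject shows the gain from a longer lens at fixed f-number, and the cancellation with a 2x TC:

```python
# Sketch of the teleconverter argument above. Subject-space blur is the
# Airy radius divided by the magnification; magnification scales with
# focal length at a fixed subject distance. Base values are assumptions.

def airy_radius_um(f_number, wavelength_um=0.55):
    return 1.22 * wavelength_um * f_number

def subject_blur_um(focal_mm, f_number, base_focal_mm=400.0, base_mag=0.01):
    """Airy radius projected back onto the subject, in microns."""
    magnification = base_mag * (focal_mm / base_focal_mm)
    return airy_radius_um(f_number) / magnification

bare_400 = subject_blur_um(400, 4)   # bare 400mm f/4
long_800 = subject_blur_um(800, 4)   # hypothetical 800mm that stays at f/4
tc_800 = subject_blur_um(800, 8)     # 400mm + 2x TC: f-number doubles too
```

The 800mm at f/4 halves the blur on the subject (double the resolution), while the 2x TC combination lands exactly back at the bare lens's figure.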
 

AlanF

The radius of the Airy disk is independent of the pixel size of a sensor, but there is an interplay between the two in the overall resolution of the camera (sensor and lens). A high-resolution sensor will out-resolve a low-resolution sensor when the Airy disks are smaller than its pixels. But as the disks grow larger than the pixels, the higher-density sensor begins to lose its advantage. Eventually both sensors will have similar resolutions for very large disks, though the smaller pixels remain better in the intermediate region because of finer sampling. The aperture that gives a disk about twice the size of a pixel is called the Diffraction Limited Aperture, DLA; equivalently, the maximum at the centre of one disk then falls on the first minimum of a disk centred one pixel away. The DLAs listed in various tables, such as that on The-Digital-Picture.com, are for when the radius of the Airy disk equals the pixel pitch.
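The table convention described above (Airy radius equal to pixel pitch) is easy to sketch. The pixel pitches below are approximate, and published tables may assume a slightly different wavelength, so the exact f-numbers will differ a little:

```python
# DLA under the convention above: the f-number at which the Airy radius
# (1.22 * wavelength * N) equals the pixel pitch. Pitches are approximate
# values for the bodies named; the wavelength is an assumed 0.55 µm.

def dla(pixel_pitch_um, wavelength_um=0.55):
    """f-number at which the Airy radius equals the pixel pitch."""
    return pixel_pitch_um / (1.22 * wavelength_um)

if __name__ == "__main__":
    for body, pitch in [("5DS R", 4.14), ("5D IV", 5.36), ("50D", 4.7)]:
        print(f"{body}: pitch {pitch} microns -> DLA ~ f/{dla(pitch):.1f}")
```

Smaller pixels give a lower DLA, i.e. diffraction starts to bite at wider apertures on high-density sensors.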
 

stevelee

FT-QL
CR Pro
Jul 6, 2017
2,383
1,064
Davidson, NC
Thanks for the explanation.

Shortly after I got a 100mm macro, I tried a test to see the depth of field at different apertures at 1:1 magnification. I took a yardstick turned toward its metric side and focused on a particular line. Then I took pictures at the different lens openings. I noticed that at f/32 the picture had a soft look to it, presumably from diffraction. It was a rather pleasant softness, and I wondered if I might sometime use diffraction to get a particular look to a picture.

The little exercise did give me a sense of the trade-off between stopping down to get more in focus and opening up to lessen diffraction effects. I may try it again on the 6D2 to see how much difference a full-frame sensor might make.
 

Don Haines

Beware of cats with laser eyes!
Jun 4, 2012
8,246
1,939
Canada
(quoting stevelee's macro depth-of-field test above)
Interesting! If you still have your crop camera, it would be neat to see a shot from each.
 
Mar 25, 2011
16,848
1,835
(quoting stevelee's macro depth-of-field test above)

Don't use autofocus; you can't actually be certain which spot it will focus on. Distance has an effect: if close, the depth of field can be so shallow that you can't see the area in focus.

Here is a graph which clearly shows the loss of resolution as the aperture gets smaller. It's for a 50D; the drop-off will begin sooner with a 7D MK II.

mtf.png


For a FF body, a 5D MK II, the curve is different, in part due to the larger photosites (6.4 micrometers for the 5D MK II versus 4.7 for the 50D). Resolution measurements are always higher for a FF sensor with the same lens and a reasonably similar photosite size.

mtf.gif
 

AlanF

What interests me is the method by which Canon's DLO claims to reverse some of the diffraction effects. I've never read anything about why it's possible to do that. I can sort of understand reversing distortions.
I am no expert on this but have done some reading. A brief discussion is in https://photo.stackexchange.com/que...lution-reverse-softness-caused-by-diffraction which is very informative.
The smearing out of a point can be described by a point spread function. If you know the equation of the point spread function, then you can do a Fourier analysis to deconvolve the image and sharpen it, which is what the algorithms attempt to do. But it is complicated and affected by noise in the image.
Clarkvision http://www.clarkvision.com/articles/image-sharpening-intro/index.html describes some of his experiments with different methods and states that unsharp mask just gives an illusion of sharpening and does not restore detail, but a deconvolution method called Richardson-Lucy sharpening can give increased detail: https://en.wikipedia.org/wiki/Richardson–Lucy_deconvolution That method works when you know the point spread function. Clark does some testing where he blurs a chart with a Gaussian spread and then reconstitutes the original image by Richardson-Lucy. However, as he uses a known point spread function to blur the image, the task of its removal is greatly simplified.
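As a concrete toy illustration of Richardson-Lucy with a known point spread function - the easy case Clark describes - here is a minimal NumPy sketch using FFT convolution with circular boundaries. It is a sketch, not production code, and the bar-target test image is made up:

```python
import numpy as np

def _conv(a, kernel_ft):
    """Circular convolution via the FFT (kernel supplied in Fourier space)."""
    return np.real(np.fft.ifft2(np.fft.fft2(a) * kernel_ft))

def richardson_lucy(image, psf, n_iter=50, eps=1e-12):
    """Minimal Richardson-Lucy deconvolution with a known, centred PSF."""
    psf_ft = np.fft.fft2(np.fft.ifftshift(psf))
    psf_flipped_ft = np.conj(psf_ft)  # FT of the mirrored PSF (real PSF)
    estimate = np.full_like(image, image.mean())
    for _ in range(n_iter):
        blurred = _conv(estimate, psf_ft)
        ratio = image / (blurred + eps)            # observed / predicted
        estimate = estimate * _conv(ratio, psf_flipped_ft)
    return estimate

# Toy test: blur a bright bar with a known Gaussian PSF, then restore it.
n = 64
y, x = np.mgrid[:n, :n]
psf = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()

truth = np.zeros((n, n))
truth[28:36, 16:48] = 1.0
blurred = _conv(truth, np.fft.fft2(np.fft.ifftshift(psf)))
restored = richardson_lucy(blurred, psf)
```

With a known PSF and no noise, the restored image ends up much closer to the original than the blurred one. With real photos the PSF is only approximately known and noise limits how far the iteration can be pushed, which is why these tools need careful tuning.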
 

stevelee

(quoting Don Haines's suggestion above)
I still have the camera, but I also still have the test shots archived somewhere on a hard drive. I could just duplicate the setup with the 6D2 and shoot another round, assuming I can recall the details from looking at the old pictures.
 

stevelee

(quoting the autofocus advice above)

I don’t generally use autofocus when shooting in macro range, and I definitely didn’t in this test. I wanted the focus to be right on a centimeter line so I could count the millimeters in front and back of it that appeared to be in focus.

As I recall the f/22 shots didn’t look as bad as I would expect from your table. The big change came at f/32, and of course f/45 was more so. I had used tubes and bellows back in olden days, but this was my first experience with a real macro lens. I also tried experimenting a bit with focus stacking, and that mainly taught me that I needed to buy a rail.
 
Mar 25, 2011
(quoting AlanF's deconvolution post above)
I've heard that before (I had forgotten) but never followed up on reading it. I remember Microsoft and Adobe had some demos; I think I even downloaded the Microsoft demo years ago.

I generally don't see enough improvement in the Canon solution to justify the huge file.
 
Mar 25, 2011
(quoting stevelee's macro reply above)
I have a rail and a bellows. A bellows is generally superior to a rail; just focusing the lens is best if it's possible. Moving the camera and lens to stack images produces distortions that are a problem if you are really into it. With a bellows, the camera is fixed and the lens moves, which avoids the issue, as does just focusing the lens without moving the camera.

There is a really good resource discussing their usage here. I bought the Castel XL and a Minolta bellows after viewing the video; I picked it up on eBay for $100. I use (misuse) the rail to move my camera closer to and further from the subject; it's attached to a head bolted to a light table for product photos and does not get used for stacking.

https://lensvid.com/gear/choosing-the-best-focusing-rail-for-macro-photography/
 

AlanF

(quoting the previous post above)
I did some experiments with charts, stopping down and looking at the diffraction correction from DLO in DPP. It had no discernible effect, and DxO gave sharper images, as I have always found in actual practice.
 
So we played with very simple deconvolution in my degree class (many, many years ago) - just dividing out the PSF plus white noise in the Fourier domain - while the class genius put together something like this decon. It was very effective. However, the question in my mind is how much it really matters for photography. Unlike NASA, we are not trying to perform scientific analysis on our images; at a certain point 'apparent' sharpness is good enough, and it's just not worth all the extra work, especially considering how our images are going to be viewed.
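That Fourier-domain approach (divide out the PSF, with a constant to keep the noise from blowing up) can be sketched in a few lines. The noise-to-signal constant and the toy test image are assumed, hand-tuned values:

```python
import numpy as np

def wiener_deconvolve(image, psf, nsr=1e-3):
    """Divide out the PSF in the Fourier domain, regularised by an
    assumed noise-to-signal ratio so weak frequencies are not amplified."""
    H = np.fft.fft2(np.fft.ifftshift(psf))    # transfer function of the PSF
    G = np.fft.fft2(image)                    # spectrum of the blurred image
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)   # Wiener-style filter
    return np.real(np.fft.ifft2(W * G))

# Toy check: blur a square with a Gaussian PSF, then divide it back out.
n = 64
y, x = np.mgrid[:n, :n]
psf = np.exp(-((x - n // 2) ** 2 + (y - n // 2) ** 2) / (2 * 1.5 ** 2))
psf /= psf.sum()

truth = np.zeros((n, n))
truth[24:40, 24:40] = 1.0
H = np.fft.fft2(np.fft.ifftshift(psf))
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * H))
restored = wiener_deconvolve(blurred, psf)
```

On a noiseless toy like this the restoration is close to exact; on a real photo the nsr constant has to rise to match the actual noise, and the recoverable detail drops accordingly, which is exactly the practical trade-off in question.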
 
There is a utility for deconvolution from the National Institutes of Health called ImageJ. I downloaded it, but it failed to complete a process on any images I loaded into it. I don't know whether this is because it takes ages to process an image (it claims to be fast) or whether the download had a bug. I can imagine that a 10MP or larger image might take a lot of computing power for the Fourier analysis, though my PC is a 3GHz 64-bit machine, so I did not expect that to be a limitation. Has anyone else tried ImageJ?
BTW, I was really annoyed when I found out that most "sharpening" is only edge detection and contrast enhancement. I thought, before getting into DSLRs fairly recently, that algorithms had been developed since the late '90s to handle deconvolution, which is really the only true method of increasing sharpness. Further, if a lens's optical properties are known, they should be possible to feed into the decon algorithms so they know what they are looking for.
 

AlanF

I did some aperture tests with the 400mm DO II on the 5DIV (DLA f/8.6) and the 100-400mm II on the 5DSR (DLA f/6.7), using a small target about 12m away - the numbers on the target are in lp/mm. The 100% crops are in the collages.
I also used Reikan FoCal to calculate the "QoF", a measure of edge sharpness.
The DO/5DIV results were:
f/4 2074
f/5.6 2183
f/8 2092
f/11 2010
f/16 1857
f/22 1673
f/32 1511
The resolution and QoF start to drop at f/11 but are reasonably crisp at f/16.
The 100-400mm II/5DSR results were:
f/5.6 2028
f/8 1955
f/11 1944
f/16 1776
f/22 1621
f/32 1286
f/40 1164
The resolution and QoF again start to drop at f/11 but are reasonably crisp at f/16.
 