AA filter in the 5D mark IV

neuroanatomist said:
In a self-canceling AA filter, they omit the 1/4-wave plate and have the two birefringent crystals in opposite orientations, so the first one spreads the images (for example, vertically), then the second one recombines them.
Thanks for the explanation. But maybe you can answer one more thing:
Why on earth would you do that instead of just leaving the filter out? ;D
 
Loibisch said:
Thanks for the explanation. But maybe you can answer one more thing:
Why on earth would you do that instead of just leaving the filter out? ;D

Because doing so allows you to make cameras both with and 'without' the AA filter while keeping all the other aspects of the production the same, since both flavors will have the same thickness of glass/etc. on top of the sensor (they replaced the 1/4-wave plate with clear glass). For example, with the self-cleaning sensor, it's not the sensor itself that vibrates, but rather some parts of the filter stack over the sensor that are moved by a piezoelectric motor. Changing the composition of the filter stack would mean different designs would be needed for the self-cleaning unit. Those sorts of things mean higher design and production costs, which means lower profits. Companies care about those sorts of things... ;D

Both Canon and Nikon used the self-canceling filter for the 'two flavor' models (D800/E, 5Ds/R). With the D810, Nikon no longer offered an AA-filtered version, so that camera just omitted the lithium niobate filters entirely.
 
Another reason both Nikon & Canon did it that way was to maintain the same back-focus, since the filter stack is part of the overall optical design.
Retaining an AA filter in the 5D MkIV is almost certainly down to the 4K video element: aliasing/moiré on a moving image is not a pleasant feature, and high-end cameras like the Arri Alexa or Red Dragon still retain AA filters (Red even has different versions).
 
Scyrene
Here are some quantitative measurements from MTFs of various lenses on the 5DIII, 5DS and 5DS R:

https://www.lensrentals.com/blog/2015/06/canon-5ds-and-5ds-r-initial-resolution-tests/

Alan
 
East Wind Photography said:
It will be stronger due to the increase in pixel density. Every incremental increase in resolution requires a bit stronger AA filter.
Isn't the AA filter supposed to make the image satisfy the Nyquist–Shannon sampling theorem?
From this I would deduce that the AA filter strength should be proportional to the pixel pitch and therefore higher resolution requires a weaker AA filter.
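To put rough numbers on that deduction, here's a quick Python sketch. The pitches are approximate figures for these sensor classes, not official specs:

```python
# Nyquist frequency for a few pixel pitches: smaller pitch -> higher
# Nyquist frequency -> the AA filter needs to introduce *less* blur,
# i.e. a weaker filter, which matches the deduction above.

def nyquist_lp_per_mm(pitch_um):
    """Nyquist frequency in line pairs per mm for a given pixel pitch (um)."""
    pitch_mm = pitch_um / 1000.0
    return 1.0 / (2.0 * pitch_mm)

for pitch_um in (6.25, 5.36, 4.14):  # roughly 5DIII / 5DIV / 5Ds class
    print(f"{pitch_um} um pitch -> Nyquist ~{nyquist_lp_per_mm(pitch_um):.0f} lp/mm")
```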
 
midluk said:
Isn't the AA filter supposed to make the image satisfy the Nyquist–Shannon sampling theorem?
From this I would deduce that the AA filter strength should be proportional to the pixel pitch and therefore higher resolution requires a weaker AA filter.

True – East Wind Photography is incorrect.
 
I think the 5D-IV IQ looks very promising based on the DPR studio shots:
 

Attachments

  • DPR 5D-IV CCP - CMGB.jpg
  • DPR 5D-IV CCP - PMRY.jpg
  • DPR 5D-IV vs A7R II (moire).jpg
  • DPR 5D-IV vs 5Ds R (moire).jpg
neuroanatomist said:
midluk said:
Isn't the AA filter supposed to make the image satisfy the Nyquist–Shannon sampling theorem?
From this I would deduce that the AA filter strength should be proportional to the pixel pitch and therefore higher resolution requires a weaker AA filter.

True – East Wind Photography is incorrect.

Please share some links so we can better understand. Higher pixel pitch should produce more moiré on finer detail and therefore a higher degree of AA is required. However, I'm not afraid to stand corrected. Just trying to understand the reason for the opposite.

Incremental increases in sensor resolution have pretty consistently caused a loss of sharpness. Resolution does increase detail but the two are different.
 
East Wind Photography said:
Please share some links so we can better understand. Higher pixel pitch should produce more moiré on finer detail and therefore a higher degree of AA is required. However, I'm not afraid to stand corrected. Just trying to understand the reason for the opposite.

Incremental increases in sensor resolution have pretty consistently caused a loss of sharpness. Resolution does increase detail but the two are different.

I haven't run across anything that delves into clear detail about the 'strength' of an AA filter, but maybe it's confusion about semantics?

An AA filter essentially spreads the incoming light (introduces blur), and the amount of that spread is proportional to the pixel pitch. So, as pixel pitch gets smaller (more MP for the same size sensor), the amount of blur that the AA filter needs to introduce to counteract aliasing also gets smaller. I think convention would say that a filter that introduces less blur is a weaker filter.

What an AA filter does is to add blur to prevent the aliasing (e.g. moiré) caused by repeating patterns in a subject where the periodicity is approximately half that of the pixel pitch or higher (Nyquist limit). Incidentally, that's why the AA filter is also called an optical low pass filter (OLPF) – it allows frequencies lower than the cutoff to pass, while blocking (blurring out, in this case) higher frequencies. For example, if a sensor's pixel pitch is 6 µm (the 5DIII is close), then patterns that repeat every 3 µm would be at the Nyquist frequency, patterns repeating every 4 µm would be lower than the Nyquist frequency, and patterns repeating every 2 µm would be above it. It's the 'at or above' that the filter is designed to reduce/eliminate.

But it's a bit more complex than that, for two reasons. The first is that lenses aren't perfect. As pixel pitch decreases, eventually the blur introduced by the optics will reduce and potentially obviate the need for an AA filter. If you use crappy lenses, you won't have to complain about moiré. ;) The second (and for now, more important) reason is that manufacturers make choices regarding the strength of the AA filter – it's not simply 'set it equal to the Nyquist limit for the sensor' and be done.

There are plenty of examples of moiré in images from cameras with an OLPF; I know I've seen it in bird feathers and buildings with my 1D X. Also, moiré is more evident in video than in still photography – that's because it's not just the frequencies of the patterns, it's also the alignment of the patterns in the subject with the pixel array on the sensor. For example, if you take two shots of the same brick wall (same camera, lens, etc.) but move the camera 1 cm to the left, you may see moiré in one image but not the other. But if you're panning a video across that brick wall, you will see the moiré at some point in the footage.

So, if you're a camera maker you need to decide: do you make the AA filter stronger (set the cutoff lower than the Nyquist frequency for the sensor)? If you do that, you will reduce moiré in both stills and video, and if you make it strong enough, you can make pretty darn sure that none of your users see moiré. But as you make the AA filter stronger, you introduce more blur, and that means softer images. Granted, the softness introduced by an AA filter is very amenable to sharpening, but that can have undesirable consequences too (it accentuates noise, so you apply NR, but that softens the image again, etc.). Or, you can make the filter weaker (less blur) – or eliminate it entirely – which means a sharper native image but a higher propensity to show aliasing.
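For what it's worth, the 'where do you put the cutoff' decision can be sketched numerically. A common idealized OLPF model is a two-spot birefringent split with spot separation d, whose MTF is |cos(π·d·f)| with a null at f = 1/(2d) – I'm assuming that textbook model here, not Canon's actual filter recipe:

```python
import math

# Idealized two-spot OLPF: point spread = two spots separated by d,
# so MTF(f) = |cos(pi * d * f)| with its first null at f = 1/(2d).
# Larger d = 'stronger' filter = null pushed below the sensor's Nyquist.

def olpf_mtf(f_cycles_per_um, d_um):
    return abs(math.cos(math.pi * d_um * f_cycles_per_um))

pitch = 6.0                # um, a 5DIII-class sensor (assumed)
f_nyq = 1 / (2 * pitch)    # Nyquist frequency in cycles/um

for d in (4.8, 6.0, 7.2):  # weak / matched / strong spot separation
    null = 1 / (2 * d)
    side = "above" if null > f_nyq else ("at" if null == f_nyq else "below")
    print(f"d = {d} um -> null at {null:.4f} cy/um ({side} Nyquist {f_nyq:.4f})")
```

With d equal to the pixel pitch the null lands exactly at Nyquist; a weaker (smaller) split leaves residual response at Nyquist, hence possible moiré, while a stronger split suppresses detail well below Nyquist at the cost of extra blur.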

There's also a third reason, concerning your statement that, "Incremental increases in sensor resolution have pretty consistently caused a loss of sharpness." That's partly down to technique and camera build. With a lower resolution sensor, a given amount of camera shake (from any source, including mirror/shutter vibration) or subject motion might fall above the Nyquist frequency of the sensor. So for the 5DIII's 6 µm pixel pitch, if the camera is vibrating at an amplitude of 2.5 µm, you would not see any shake-induced blur. But if you switch to a 5Ds with a 4 µm pixel pitch, now that same amount of shake is below the Nyquist limit, and you'll see the effect.
 
scyrene said:
I know, and while I respect photographers' opinions, that's not the same as presenting evidence. Is it even possible to quantify sharpness?
Yes it is.

If you use the link above to dpreview you can download and/or check out the difference of the visible lines between the 5DIV and 5DS/R. It is - extremely - easy to see that the 5DS/R has much more detail than the 5DIV. You can simply see the lines along the ruler the whole way up to 50, while the lines blur to extinction already around the 38 mark on the 5DIV test photo.

I was quite surprised that the difference was so big and it seems accurate to say that the extra MPIX and no AA filter gives the 5DS/R a 50% advantage over the 5DIV.

However, noise seems clearly better on the 5DIV, and I expect (though I have no proof or tests) that DR will also be better, because Canon has said improved DR was one of three key goals with the 5DIV compared to the 5DIII (the 5DS/R already has up to 2 full stops better DR than the 5DIII).
 
Maiaibing said:
If you use the link above to dpreview you can download and/or check out the difference of the visible lines between the 5DIV and 5DS/R. It is - extremely - easy to see that the 5DS/R has much more detail than the 5DIV. You can simply see the lines along the ruler the whole way up to 50, while the lines blur to extinction already around the 38 mark on the 5DIV test photo.

I was quite surprised that the difference was so big and it seems accurate to say that the extra MPIX and no AA filter gives the 5DS/R a 50% advantage over the 5DIV.

Yes, the lack of an AA filter results in sharper images, if you compare ± AA filter with no sharpening or with the same amount of sharpening. However, you cannot apply much sharpening to an image from an AA-less camera before you start to see sharpening artifacts, whereas you can apply substantially more sharpening to the AA-filtered image. So in practice (i.e. where you appropriately post-process your images), although the lack of an AA filter will result in sharper images, the difference is nowhere near what the typical (somewhat misleading) comparisons show.

Of course, that applies when you're only looking at the effect of an AA filter. Comparing the 5DIV to the 5Ds, you'd see a significant advantage for the latter due to the extra resolution, with only a minor bump when going to the 5DsR (and assuming your image has no moiré). Also, there's a caveat about judging fine details on DPR's comparator at the present time, and they indicate that by coloring the little 'i' (information) icon yellow for the 5DIV images – they're processed using a beta version of ACR, and while Adobe has had time to make the software read the images, they likely have not optimized it for that particular sensor (which is why it's a beta version). Kudos to DPR for calling that out with the icon color...yellow = caution.
 
neuroanatomist said:
I haven't run across anything that delves into clear detail about the 'strength' of an AA filter, but maybe it's confusion about semantics?

An AA filter essentially spreads the incoming light (introduces blur), and the amount of that spread is proportional to the pixel pitch. So, as pixel pitch gets smaller (more MP for the same size sensor), the amount of blur that the AA filter needs to introduce to counteract aliasing also gets smaller. I think convention would say that a filter that introduces less blur is a weaker filter.

What an AA filter does is to add blur to prevent the aliasing (e.g. moiré) caused by repeating patterns in a subject where the periodicity is approximately half that of the pixel pitch or higher (Nyquist limit). Incidentally, that's why the AA filter is also called an optical low pass filter (OLPF) – it allows frequencies lower than the cutoff to pass, while blocking (blurring out, in this case) higher frequencies. For example, if a sensor's pixel pitch is 6 µm (the 5DIII is close), then patterns that repeat every 3 µm would be at the Nyquist frequency, patterns repeating every 4 µm would be lower than the Nyquist frequency, and patterns repeating every 2 µm would be above it. It's the 'at or above' that the filter is designed to reduce/eliminate.

But it's a bit more complex than that, for two reasons. The first is that lenses aren't perfect. As pixel pitch decreases, eventually the blur introduced by the optics will reduce and potentially obviate the need for an AA filter. If you use crappy lenses, you won't have to complain about moiré. ;) The second (and for now, more important) reason is that manufacturers make choices regarding the strength of the AA filter – it's not simply 'set it equal to the Nyquist limit for the sensor' and be done.

There are plenty of examples of moiré in images from cameras with an OLPF; I know I've seen it in bird feathers and buildings with my 1D X. Also, moiré is more evident in video than in still photography – that's because it's not just the frequencies of the patterns, it's also the alignment of the patterns in the subject with the pixel array on the sensor. For example, if you take two shots of the same brick wall (same camera, lens, etc.) but move the camera 1 cm to the left, you may see moiré in one image but not the other. But if you're panning a video across that brick wall, you will see the moiré at some point in the footage.

So, if you're a camera maker you need to decide: do you make the AA filter stronger (set the cutoff lower than the Nyquist frequency for the sensor)? If you do that, you will reduce moiré in both stills and video, and if you make it strong enough, you can make pretty darn sure that none of your users see moiré. But as you make the AA filter stronger, you introduce more blur, and that means softer images. Granted, the softness introduced by an AA filter is very amenable to sharpening, but that can have undesirable consequences too (it accentuates noise, so you apply NR, but that softens the image again, etc.). Or, you can make the filter weaker (less blur) – or eliminate it entirely – which means a sharper native image but a higher propensity to show aliasing.

There's also a third reason, concerning your statement that, "Incremental increases in sensor resolution have pretty consistently caused a loss of sharpness." That's partly down to technique and camera build. With a lower resolution sensor, a given amount of camera shake (from any source, including mirror/shutter vibration) or subject motion might fall above the Nyquist frequency of the sensor. So for the 5DIII's 6 µm pixel pitch, if the camera is vibrating at an amplitude of 2.5 µm, you would not see any shake-induced blur. But if you switch to a 5Ds with a 4 µm pixel pitch, now that same amount of shake is below the Nyquist limit, and you'll see the effect.

That is a great explanation and further clarifies a lot of what we are seeing in the real world. I guess that since many of these DSLRs are expected to shoot video, the AA filters are probably not set right at the Nyquist limit. An ideal still camera may be easier to tweak at the design phase than one that has to do everything and still look acceptable to the populace.

I also still see a lot of confusion over the terms sharpness vs. detail, and that may have been where I erred in my statement. All is well in the continuum.
 
neuroanatomist said:
What an AA filter does is to add blur to prevent the aliasing (e.g. moiré) caused by repeating patterns in a subject where the periodicity is approximately half that of the pixel pitch or higher (Nyquist limit). Incidentally, that's why the AA filter is also called an optical low pass filter (OLPF) – it allows frequencies lower than the cutoff to pass, while blocking (blurring out, in this case) higher frequencies. For example, if a sensor's pixel pitch is 6 µm (the 5DIII is close), then patterns that repeat every 3 µm would be at the Nyquist frequency, patterns repeating every 4 µm would be lower than the Nyquist frequency, and patterns repeating every 2 µm would be above it. It's the 'at or above' that the filter is designed to reduce/eliminate.
I think you did the factor of 2 in the wrong direction. With a 6 μm pixel pitch, your Nyquist limit is at 12 μm. A sinusoidal signal that repeats every 12 μm has its maximum and minimum 6 μm apart.

Of course the Bayer pattern complicates the situation for image sensors. You would have to take 12 μm as the pixel pitch to really be on the safe side.

And an additional problem with real-life LPFs is that the cutoff is not infinitely steep. If you want to suppress (nearly) all frequencies above the threshold, you will also lose lower frequencies to a lesser degree.
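The direction of the factor of 2 is easy to check numerically. A quick sketch (plain Python, toy 1-D sampling, ignoring the Bayer complication above and the filter's finite slope):

```python
# Frequency folding: after sampling at pitch p (sampling frequency 1/p),
# a spatial frequency f reappears folded into the band [0, 1/(2p)].

def aliased_period_um(pattern_period_um, pitch_um):
    """Apparent period (um) of a sinusoidal pattern after sampling."""
    f = 1.0 / pattern_period_um            # true spatial frequency
    fs = 1.0 / pitch_um                    # sampling frequency
    f_alias = abs(f - round(f / fs) * fs)  # fold into [0, fs/2]
    return float("inf") if f_alias == 0 else 1.0 / f_alias

# With a 6 um pitch the Nyquist period is 12 um. A 16 um pattern is
# reproduced faithfully, while a 4 um pattern aliases to a false 12 um one:
print(aliased_period_um(16.0, 6.0))  # 16.0
print(aliased_period_um(4.0, 6.0))   # ~12.0
```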
 
midluk said:
I think you did the factor of 2 in the wrong direction. With a 6 μm pixel pitch, your Nyquist limit is at 12 μm. A sinusoidal signal that repeats every 12 μm has its maximum and minimum 6 μm apart.

Headsmack. How is a blur magnitude smaller than a pixel actually a blur?!? Thanks for the correction!
 
neuroanatomist said:
Yes, the lack of an AA filter results in sharper images, if you compare ± AA filter with no sharpening or with the same amount of sharpening. However, you cannot apply much sharpening to an image from an AA-less camera before you start to see sharpening artifacts, whereas you can apply substantially more sharpening to the AA-filtered image. So in practice (i.e. where you appropriately post-process your images), although the lack of an AA filter will result in sharper images, the difference is nowhere near what the typical (somewhat misleading) comparisons show.

Of course, that applies when you're only looking at the effect of an AA filter. Comparing the 5DIV to the 5Ds, you'd see a significant advantage for the latter due to the extra resolution, with only a minor bump when going to the 5DsR (and assuming your image has no moiré). Also, there's a caveat about judging fine details on DPR's comparator at the present time, and they indicate that by coloring the little 'i' (information) icon yellow for the 5DIV images – they're processed using a beta version of ACR, and while Adobe has had time to make the software read the images, they likely have not optimized it for that particular sensor (which is why it's a beta version). Kudos to DPR for calling that out with the icon color...yellow = caution.

This is what I was thinking of when I asked the questions. Thanks for clearing it up! Lots of good info in this thread :)

East Wind Photography said:
I also still see a lot of confusion over the terms sharpness vs. detail, and that may have been where I erred in my statement. All is well in the continuum.

Am I right in thinking people are using the term 'sharpness' to mean two different things - resolution of fine detail, and microcontrast at edges?
 
Scyrene, you are right that there is sloppy usage of "sharpness". I also try to separate resolution and acutance. The AA filter does two things. It lowers resolution, and conventional "sharpening" can't restore detail that is no longer there. It lowers acutance, the local contrast you mentioned, and sharpening can restore that. If I have got it wrong, please correct me.
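A toy 1-D sketch shows that distinction (my own stand-in for a real OLPF, using a simple two-sample average): detail at the Nyquist period is erased outright, so no linear sharpening can bring it back, while coarser detail merely loses contrast that sharpening can boost:

```python
# Toy 1-D OLPF: average each pair of adjacent samples (a crude stand-in
# for a two-spot split). Detail at the Nyquist period is erased outright;
# coarser detail survives with reduced contrast, which sharpening can restore.

def two_spot_olpf(signal):
    return [(a + b) / 2 for a, b in zip(signal, signal[1:])]

nyquist_pattern = [1, -1, 1, -1, 1, -1]  # repeats every 2 samples
coarse_pattern  = [1, 1, -1, -1, 1, 1]   # repeats every 4 samples

print(two_spot_olpf(nyquist_pattern))  # all zeros: resolution is gone
print(two_spot_olpf(coarse_pattern))   # attenuated but recoverable contrast
```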
 
midluk said:
I think you did the factor of 2 in the wrong direction. With a 6 μm pixel pitch, your Nyquist limit is at 12 μm. A sinusoidal signal that repeats every 12 μm has its maximum and minimum 6 μm apart.

Of course the Bayer pattern complicates the situation for image sensors. You would have to take 12 μm as the pixel pitch to really be on the safe side.

And an additional problem with real-life LPFs is that the cutoff is not infinitely steep. If you want to suppress (nearly) all frequencies above the threshold, you will also lose lower frequencies to a lesser degree.

On a typical Bayer pattern the Nyquist frequency for green is twice that for red and blue, because half the sensels are green whereas only one in four is red or blue (red and blue repeat every 2nd sensel in each direction). After demosaicing, the absolute resolution limit of a typical regular Bayer sensor works out to an effective pixel pitch of about 1.414× (√2) the sensel pitch.
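In rough numbers, per raw channel and before any demosaicing cleverness (a back-of-envelope sketch; the √2 figure falls out of the green checkerboard's diagonal spacing):

```python
import math

# Per-channel sample spacing on an RGGB Bayer grid with sensel pitch p:
# red/blue occupy every 2nd sensel in each direction (spacing 2p), green
# forms a checkerboard whose nearest samples are p*sqrt(2) apart diagonally.
# Shortest resolvable period = twice the sample spacing (Nyquist).

def finest_period_um(p_um):
    return {
        "red/blue":      2 * (2 * p_um),
        "green":         2 * (p_um * math.sqrt(2)),
        "mono (no CFA)": 2 * p_um,
    }

for channel, period in finest_period_um(6.0).items():
    print(f"{channel}: finest period ~{period:.1f} um")
```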
 