Is a brand new 32.5mp APS-C sensor from Canon on the way? [CR1]

dtaylor

Canon 5Ds
Jul 26, 2011
1,805
1,433
For a start, the 5DsR does have an AA filter, plus a second filter that attempts to cancel it. So it's not a bare sensor.

There's very little difference between "canceled" and "no AA", though there is some, judging from D800E vs. D810 images.

But a lot of the 'detail' allegedly seen in non-AA photos is not real detail; it is the hard contrast boundary between photosites on the sensor. With an AA filter the transitions are softened.

I wouldn't use the term "detail" but I understand what you're getting at. Edge transitions can be harsh without an AA filter. The difference between the D800E (canceled AA) and D810 (no AA) isn't sharpness or resolved detail at extinction, but rather how harsh the edges look. So even a canceled AA filter is helping with edge aliasing somewhat.

At the end of the day though, for most print sizes, you won't see a sharpness advantage or edge aliasing (assuming a weak AA filter vs. a canceled/no filter). The one thing that can stand out is moiré when it actually breaks out. If you want to see how little difference exists, head on over to page 4 of the "All new 24mp" thread, where I posted a bunch of 5Ds and 5DsR comparisons.
 

Lee Jay

EOS 7D Mark II
Sep 22, 2011
2,250
175
The above is an assertion I don’t trust.

Then look up the proper application of the Nyquist-Shannon sampling theorem.

https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem

(emphasis added)

"...if the Nyquist criterion is not satisfied, adjacent copies overlap, and it is not possible in general to discern an unambiguous X(f). Any frequency component above
f_{s}/2
is indistinguishable from a lower-frequency component, called an alias, associated with one of the copies. In such cases, the customary interpolation techniques produce the alias, rather than the original component. When the sample-rate is pre-determined by other considerations (such as an industry standard),
x(t)
is usually filtered to reduce its high frequencies to acceptable levels before it is sampled. The type of filter required is a lowpass filter, and in this application it is called an anti-aliasing filter. "

So, without an anti-aliasing filter, it is not possible to discern a low frequency component from an aliased high frequency component.

I saw a brilliant example of this once and wish I had saved it. It was a picture of a picket fence and it looked mostly okay. But a tighter view showed that the number of pickets per section in the wider view didn't match reality. In reality, the fence had many more, smaller pickets, but those were aliased down to many fewer, wider pickets in the image by reflection about the Nyquist frequency. In other words, the original image was lying about the size and quantity of the pickets.
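
A minimal Python sketch of that picket-fence fold, with made-up frequencies for illustration: a 9-cycle pattern sampled 8 times per unit produces exactly the same samples as a 1-cycle pattern, i.e. the fine pickets reflect about Nyquist into coarse ones.

```python
import numpy as np

fs = 8.0                       # sampling rate; Nyquist = fs/2 = 4
t = np.arange(16) / fs         # 16 sample positions ("photosites")

f_fine = 9.0                   # fine pickets, above Nyquist
f_alias = f_fine - fs          # 9 - 8 = 1: the coarse pattern actually recorded

fine = np.sin(2 * np.pi * f_fine * t)
coarse = np.sin(2 * np.pi * f_alias * t)

# Once sampled, the 9-cycle pattern is indistinguishable from the
# 1-cycle pattern -- the samples are identical.
print(np.allclose(fine, coarse))   # True
```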

This isn't open to debate, it's a proven, mathematical theorem that's been known for about a century.

Camera engineers know this and that's why they put an expensive anti-aliasing filter into their cameras. They don't add cost to annoy customers, they do it so their cameras produce trustworthy results.
 
Jul 21, 2010
31,099
12,863
Then look up the proper application of the Nyquist-Shannon sampling theorem.

https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem

(emphasis added)

"...if the Nyquist criterion is not satisfied, adjacent copies overlap, and it is not possible in general to discern an unambiguous X(f). Any frequency component above
f_{s}/2
is indistinguishable from a lower-frequency component, called an alias, associated with one of the copies. In such cases, the customary interpolation techniques produce the alias, rather than the original component. When the sample-rate is pre-determined by other considerations (such as an industry standard),
x(t)
is usually filtered to reduce its high frequencies to acceptable levels before it is sampled. The type of filter required is a lowpass filter, and in this application it is called an anti-aliasing filter. "

So, without an anti-aliasing filter, it is not possible to discern a low frequency component from an aliased high frequency component.

I saw a brilliant example of this once and wish I had saved it. It was a picture of a picket fence and it looked mostly okay. But a tighter view showed that the number of pickets per section in the wider view didn't match reality. In reality, the fence had many more, smaller pickets, but those were aliased down to many fewer, wider pickets in the image by reflection about the Nyquist frequency. In other words, the original image was lying about the size and quantity of the pickets.

This isn't open to debate, it's a proven, mathematical theorem that's been known for about a century.

Camera engineers know this and that's why they put an expensive anti-aliasing filter into their cameras. They don't add cost to annoy customers, they do it so their cameras produce trustworthy results.
Trust or lack thereof is your own value judgement. The camera will behave as designed, with or without an AA filter (or OLPF, if you prefer). As pixel pitch decreases (or sampling frequency increases, if you prefer), the number of real-world scenes in which aliasing occurs drops.

Besides, your statements indicate that you believe an OLPF precludes all aliasing. It doesn’t, it merely alters the range of subject frequencies at which aliasing is evident. In other words, you can still get moiré with an AA filter. So by your logic, you can’t trust any camera. Guess you’ll just have to give up photography. How sad.
 

AlanF

Desperately seeking birds
CR Pro
Aug 16, 2012
12,355
22,534
Trust or lack thereof is your own value judgement. The camera will behave as designed, with or without an AA filter (or OLPF, if you prefer). As pixel pitch decreases (or sampling frequency increases, if you prefer), the number of real-world scenes in which aliasing occurs drops.

Besides, your statements indicate that you believe an OLPF precludes all aliasing. It doesn’t, it merely alters the range of subject frequencies at which aliasing is evident. In other words, you can still get moiré with an AA filter. So by your logic, you can’t trust any camera. Guess you’ll just have to give up photography. How sad.
Too true Neuro. Here is a discarded shot of a waxwing using the 5DIV and the 400mm DO II + 2x TC, a combination that is not the sharpest. The waxwing is slightly soft, and even so the wings at the back show moiré. You have to download the image to see its full extent. I do occasionally see moiré, because the repeating fine structure of bird wings is prone to it in close-ups, and I discard those shots. Otherwise, the extreme cropping I have to do gives images that never show moiré, and the absence of an AA filter helps because I have to sharpen less and so do not increase noise.

[attached image]
 

Lee Jay

EOS 7D Mark II
Sep 22, 2011
2,250
175
Trust or lack thereof is your own value judgement. The camera will behave as designed, with or without an AA filter (or OLPF, if you prefer). As pixel pitch decreases (or sampling frequency increases, if you prefer), the number of real-world scenes in which aliasing occurs drops.

Besides, your statements indicate that you believe an OLPF precludes all aliasing. It doesn’t, it merely alters the range of subject frequencies at which aliasing is evident. In other words, you can still get moiré with an AA filter. So by your logic, you can’t trust any camera. Guess you’ll just have to give up photography. How sad.

No, it's not a value judgement, it's fact. Every single scene will have frequencies beyond Nyquist.

And, yes, OLPFs are not perfect, but they don't fail the way you say, because they can't alter frequency. The way they fail is by passing the offending frequencies at reduced amplitude, so what remains still gets aliased, just more weakly. Less false information is preferable to more, obviously.
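
A real OLPF is a birefringent beam splitter, but a crude two-sample average is a software stand-in that shows the same amplitude behavior; the numbers below are illustrative, not any camera's actual filter. Frequencies near Nyquist pass at a fraction of their amplitude, so whatever still aliases does so weakly.

```python
import numpy as np

fs = 1000.0                          # illustrative sampling rate; Nyquist = 500
t = np.arange(4096) / fs

def gain_after_blur(f):
    """Amplitude gain at frequency f after a 2-tap average (crude OLPF stand-in)."""
    x = np.sin(2 * np.pi * f * t)
    y = 0.5 * (x[:-1] + x[1:])       # average adjacent samples
    return np.ptp(y) / np.ptp(x)     # ratio of peak-to-peak amplitudes

print(f"gain at  50 Hz: {gain_after_blur(50):.2f}")   # ~0.99, low freq passes
print(f"gain at 450 Hz: {gain_after_blur(450):.2f}")  # ~0.16, near-Nyquist crushed
```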
 

dtaylor

Canon 5Ds
Jul 26, 2011
1,805
1,433
I'm going to preface this by saying that "aliasing" and "moiré" are two related but separate things in my book.

As pixel pitch decreases (or sampling frequency increases, if you prefer), the number of real-world scenes in which aliasing occurs drops.

Aliasing is always there at high contrast edges. It's just less apparent or invisible at normal view/print sizes at higher resolutions. AA filters do very effectively eliminate aliasing.

Moiré is visible even at web page sizes. Higher resolution does reduce the number of situations where moiré will occur, but I don't know why people act like it's a thing of the past. At 50 MP it still occurs fairly often with man-made objects. A strong AA filter pretty much precludes moiré. A weak one will still show it, but less often and less severely.

No, it's not a value judgement, it's fact. Every single scene will have frequencies beyond Nyquist.

While I think the entire AA discussion is "mountain out of an ant hill", I have to agree with the statement that scientific accuracy in a digital sample suffers without a lowpass filter. You won't find anyone suggesting a scientific instrument doesn't need a lowpass filter, or will work better without one, unless they are absolutely certain that instrument's sampling frequency is higher than anything they will sample.
 
Jul 21, 2010
31,099
12,863
No, it's not a value judgement, it's fact. Every single scene will have frequencies beyond Nyquist.
trust /trəst/ verb
1. believe in the reliability, truth, ability, or strength of.

Sorry, trust requires belief in a fact, it's not the fact itself. That means it's a value judgement. The occurrence of aliasing is a fact. Frequencies beyond Nyquist are a fact. Your lack of trust in those occurrences is a value judgement. Trying to claim your opinions are facts is one of the worst forms of intellectual dishonesty, hubris, and often outright stupidity. As I stated, the camera/sensor will behave as designed, OLPF or not. That is also fact. You believe you won't like the results from an AA-less sensor; that's your belief – a value judgement. If it were fact that lack of an AA filter resulted in poor images, no one would ever have bought the Nikon D800E/810, Canon 5DsR, Sony a7III/a9/etc., PhaseOne MF cameras, etc.
 
Jul 21, 2010
31,099
12,863
While I think the entire AA discussion is "mountain out of an ant hill", I have to agree with the statement that scientific accuracy in a digital sample suffers without a lowpass filter. You won't find anyone suggesting a scientific instrument doesn't need a lowpass filter, or will work better without one, unless they are absolutely certain that instrument's sampling frequency is higher than anything they will sample.
The discussion was not about 'scientific accuracy', it was about 'trust'. Moreover, a camera (in the context of a photography forum) is not a scientific instrument (unless Lee Jay was being paid to accurately document the number of pickets in a particular fence and his count was wrong due to the lack of an AA filter...in which case, even though the instrument was performing as designed, the person who paid him for the job should not have trusted someone who would pick the wrong tool for the task at hand :p).

My superresolution and other confocal microscopy systems do not have an OLPF, although I do routinely sample at ~2.3x the optical resolution to satisfy Nyquist.
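
As a back-of-the-envelope illustration of that ~2.3x rule (the wavelength and NA below are assumed values, not the actual instrument's specs):

```python
# Illustrative Nyquist check for a confocal system; wavelength and NA
# are assumed values, not the poster's actual instrument.
wavelength_nm = 510           # emission wavelength (e.g. GFP)
numerical_aperture = 1.4      # oil-immersion objective

# Abbe lateral resolution limit, then sample ~2.3x finer than that.
resolution_nm = wavelength_nm / (2 * numerical_aperture)   # ~182 nm
pixel_size_nm = resolution_nm / 2.3                        # ~79 nm

print(f"optical resolution ~{resolution_nm:.0f} nm -> "
      f"pixel size <= {pixel_size_nm:.0f} nm to satisfy Nyquist")
```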
 

AlanF

Desperately seeking birds
CR Pro
Aug 16, 2012
12,355
22,534
The discussion was not about 'scientific accuracy', it was about 'trust'. Moreover, a camera (in the context of a photography forum) is not a scientific instrument (unless Lee Jay was being paid to accurately document the number of pickets in a particular fence and his count was wrong due to the lack of an AA filter...in which case, even though the instrument was performing as designed, the person who paid him for the job should not have trusted someone who would pick the wrong tool for the task at hand :p).

My superresolution and other confocal microscopy systems do not have an OLPF, although I do routinely sample at ~2.3x the optical resolution to satisfy Nyquist.
If I were in the lab solving a novel protein complex by cryo-electron microscopy, then I certainly would use low-pass filtering. However, if I am taking a photo of my grandson blowing out the candles on his birthday cake, I am confident that my 5DsR will not add a year or two to his age, or make him younger, by aliasing the candles, and I do trust that my camera will not add an extra pair of wings to an eagle to turn it into the avian equivalent of a biplane.
 

Michael Clark

Now we see through a glass, darkly...
Apr 5, 2016
4,722
2,655
I felt the same way about the original 7D. An amazing camera in every respect, but it was let down by a very poor sensor. In some ways the 7D paved the way for Canon's next-gen AF system and pretty much led to the host of pro-body upgrades that culminated in the 5D Mark III. But I was never particularly happy with the image quality. Sure, I got some great images from it, but I found the images noisier and softer than from any other Canon DSLR I've owned, and the RAW files couldn't take a lot of processing. Everything else about the camera was amazing. Even at ISO 400 I found noise everywhere. So I passed on the 7D II, which seemed to have similar issues. I loved the extra reach that the 1.6x crop offered, and on paper the camera looked like a very capable 1D X II lite...but the image quality wasn't in the same league.
Instead I went for a pair of 5D IIIs and haven't looked back. Still using them today... sure, I'd like to upgrade to a pair of 5D IVs, but the Mark IIIs are still working well for me.

I felt the same way about the original 7D. Much less so about the 7D Mark II. It's still an APS-C camera, though.

I shoot a lot with both the 7D2 and the 5D3, and each has uses where it outshines the other.

In flickering light (e.g. stadiums and gyms), the flicker reduction feature introduced with the 7D Mark II, which times the shutter release with the peak of the lights, gives you almost an extra stop over setting exposure for the average between peak and trough. You also get more consistent color from frame to frame, and very few to no frames where one side is bright and blue and the other side is dim and brown when shooting at Tvs faster than 1/120 (which is way too slow for sports).
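
A quick numeric check of that "almost an extra stop" figure, assuming 60 Hz mains (so light output pulses at 120 Hz, roughly as sin²) and a 1/1000 s shutter; all values are assumptions for illustration:

```python
import numpy as np

mains_hz = 60.0            # assumed North American mains
shutter_s = 1 / 1000       # typical indoor-sports shutter speed

def exposure(t_start, steps=10000):
    """Light integrated over one shutter window starting at t_start."""
    t = t_start + np.linspace(0, shutter_s, steps)
    return np.mean(np.sin(2 * np.pi * mains_hz * t) ** 2)   # pulses at 120 Hz

peak = exposure(1 / (4 * mains_hz) - shutter_s / 2)  # window centered on a crest
average = 0.5                                        # long-term mean of sin^2

print(f"peak-timed gain: {np.log2(peak / average):.2f} stops")  # ~0.98
```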

The 5D Mark IV, of course, also has flicker reduction. The 5D Mark III, however, does not.
 

Lee Jay

EOS 7D Mark II
Sep 22, 2011
2,250
175
trust /trəst/ verb
1. believe in the reliability, truth, ability, or strength of.

Sorry, trust requires belief in a fact, it's not the fact itself.

Belief is acceptance without evidence that justifies such a conclusion. This is a proven mathematical theorem, thus no belief is required. It's simply a fact.
 
Jul 21, 2010
31,099
12,863
Belief is acceptance without evidence that justifies such a conclusion. This is a proven mathematical theorem, thus no belief is required. It's simply a fact.
Semantic BS. :rolleyes: You can trust the 'proven mathematical theorem' (which is a tautology, since a mathematical theorem is proven by definition, although a theory is not and cannot be proven), but your distrust of the camera/sensor is not a theorem, it's your own opinion and value judgement. I've already stated my viewpoint on those who claim their opinions are facts. You continue to demonstrate the accuracy of that viewpoint.
 

Michael Clark

Now we see through a glass, darkly...
Apr 5, 2016
4,722
2,655
I would say "JUST WAIT AND SEE". Canon was not born yesterday, and Sony has always been a problematic player on the technical side. Canon, Nikon, Fuji, and Leica have been real photography makers from the start. And to be honest, how many of us take photos above ISO 1600 in the real world? Martin Osner. Any problem with the 5D IV?

Anyone who shoots night/indoor amateur sports all the way up to mid-sized colleges lives at ISO 3200 or above. Lots of event/wedding photographers. Concert/theatrical photographers. Photojournalists covering spot news at night. The list is nearly endless.

About the only folks that don't need high ISO these days are most landscape specialists, studio portraitists (including those who move their lights outdoors), product photographers, daylight sports and wildlife shooters who can choose to only work in ideal light, and some real estate shooters that have enough time and budget to light the interiors they shoot.


It is always a battle between the optical resolution gained from more pixels and diffraction. IMO more pixels always win (30D vs 7D vs 7D2) - unless you can show me the same image shot at high and low MP, presented at the same viewing size, that shows me any different.

It's the difference in viewing size that gets a lot of people. They don't realize that when pixel peeping a 20 MP image at 100% (one image pixel = one screen pixel) on a 24" HD monitor, they're looking at a piece of a roughly 54x36 inch full display size, but when they look at a 50 MP image at 100% on the same monitor, they're looking at a piece of a roughly 96x64 inch display size.
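
The arithmetic behind the 100% view, for anyone who wants to check it: the virtual display is just pixel count divided by the monitor's pixel density. The image dimensions and monitor below are assumed typical values, so the inches land in the same ballpark as the figures above rather than matching them exactly.

```python
import math

# Assumed 24-inch 1920x1080 monitor (~92 ppi).
ppi = math.hypot(1920, 1080) / 24

for name, w, h in [("20 MP (5472 x 3648)", 5472, 3648),
                   ("50 MP (8688 x 5792)", 8688, 5792)]:
    # At 100%, one image pixel = one screen pixel, so the full image
    # spans (pixels / ppi) inches in each direction.
    print(f"{name}: ~{w / ppi:.0f} x {h / ppi:.0f} inch virtual display")
```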

That also goes for noise and motion blur. If the display size is the same, the sensor size is the same, and the focal length is the same, then the same amount of camera movement will produce the same amount of motion blur regardless of the comparative sensor resolutions.

There is IMO no chance something like a 7D III is going to happen in the EF-M mount. First, ergonomics/size: you want your 7D III to be larger than an M5, right? Second, with EF-M there is no upgrade path to RF mount lenses. I think that if something like a 7D III is released one day, it will be the RF version. And once that happens, it will be the last nail in the coffin of the EF-M mount - not that it will die off, but it will definitely be anchored in the hobby segment.

The "hobby" segment is by far the biggest part of Canon's total sales units of ILCs and lenses. Maybe even the biggest part of revenue and profits from ILCs and lenses.


You shoot what you have to. If you are doing landscape, f/5.6 will probably give too little DOF - sometimes these diffraction comments ignore the necessities of the art and end up being purely philosophical.
Can you show me an image where a high-res sensor 'diffraction limited' photo has less resolution than a lower-res sensor image that (in theory) is less 'diffraction limited'??

At the DLA, diffraction begins to be evident when pixel peeping at 100%. If you're not pixel peeping at 100% on a monitor that allows your eyes to resolve a single pixel, diffraction will not be evident at the DLA. As the aperture is closed down further, the effect increases. It's just like depth of field: an image viewed at 8x10 appears to have more DoF than the same image viewed from the same distance at 16x20.


  • MILCs eliminate essentially all opto-mechanical components except optional sensor stabilisation, assuming the electro-mechanical shutter is replaced with an electronic one. This essentially leaves only digital electronics. There is a well documented trend since circa 1975 that digital electronics components typically drop in price at least 20% per annum for the same functionality or capacity.
  • Cheaper to make does not automatically translate to lower prices for the buyer, of course. Manufacturers would like to increase their margin, but competition is likely to force them to lower prices in the end. However, looking at it from the point of view of a manufacturer, it is very desirable to have low production costs. You can then command higher margins and still be competitive.
  • DSLRs are outselling MILCs because the bottom-range DSLRs are actually quite cheap. At least until recently, their prices were generally lower than those of MILCs. The Rebel T6 with kit zoom lists on the Canon US web site for $399.99, and the street price is much lower.
  • I would respectfully disagree with your view that adding gizmos to DSLRs can continue. In my view this is not the case, and certainly not on the scale possible in a MILC. In the AF area the main limitation is the number of AF points in a DSLR. The other limitation is that you by definition cannot display all these gizmo visual effects in the OVF. Of course, you can display them on the LCD in Live View, but then why have a mirror and OVF at all - you could just as well use your phone + Instagram. It is actually obvious to me that DSLRs have reached maturity. Of course, sensors still change slowly, but the main concept of the DSLR remains more or less constant. The last revolution apart from the transition from film to digital was autofocus, and that was 35 years ago, still in the film days. Maturity is a good thing in a way, and this is why I still use a DSLR.

By that logic, the Canon 5D Mark II was useless for video, since it could only record video with the mirror locked up in Live View. Ditto for every DSLR since the 5DII that has offered video recording. No one would ever think of using the 5D Mark II, the original 7D, the 1D X, the 1D C, 5D Mark III, 5D Mark IV, etc. to record award winning films, would they?


IF the readout speed is vastly increased, I could see the DIGIC 8 being fast enough to handle an APS-C sized sensor with increased pixel density. It's handling full frame just fine right now with Eye AF, especially with the new firmware. The big hangup has been how fast the readout from the chip is. If it were mirrorless, more than likely you'd be able to get 7D Mark II-like fps with that setup as well, since it's only having to process AF points from a much smaller chip, and the 90D will not be displaying that in real time. Even with increased pixel density it's still fewer points than the current R in live view. If you use the DIGIC 8 to interpret phase-detect focus systems, it will not even begin to tax the chip.

The 2020 Olympics-ready mirrorless sports 1D X-type camera would need both the increased readout speed AND a newer DIGIC 9 chip. If they release a full frame version of this newly designed chip in the 70-100 megapixel range, the increased readout speed of the sensor would make the current DIGIC 8 work in a hypothetical 5Ds replacement, but with slower fps than the R, just like the 5Ds versus the 5D III.

The more I think about it, the more getting rid of the 7D line makes sense, given what I always considered the primary reason the line existed: fast fps and 1D X-level tracking.
Both the RP and the R exhibit tracking on par with the 1D X, even if they can't fire off shots as fast as it can. This level of tracking will be in ALL future mirrorless bodies and all future bodies with live view. The only differentiation is in fps. The 80D was already only 2 fps behind the 7D Mark II, and had a better sensor to boot. So why wouldn't they add in the 1D X tracking and use the DIGIC 8 to handle it? Even with that advanced setup, in some respects it will have worse tracking than the R and RP, speed aside. If you add that in, at that point what you are losing is better weatherproofing and a joystick. And I'll bet we'll see at least one of those on the upgraded (and upgraded price) body.

Increasing readout speed by that much by 2020 is a BIG if since it hasn't seemed to move at all for Canon in the last 5+ years.

What do you think is the reason for only 30fps 1080p with full frame sensors?

What do you think is the reason for increasingly higher crop factors for 4K video as the sensor resolutions increase?

What do you think is the reason for the dismal AI Servo frame rate with the EOS R?

For all of the above it is sensor readout speed and probably nothing but sensor readout speed.

The EOS 1D X Mark II has dual DiG!C 6+ image processors, plus a DiG!C 5 processor for combining distance information from the PDAF sensor with color information from the RGB+IR metering sensor to assist in tracking moving subjects using EOS iTR AF. The 7D Mark II has dual DiG!C 6 image processors plus another non-DiG!C designated processor to handle EOS iTR AF.


I've been hoping for pet-eye-AF for a long time. My bearded dragon moves fast when the cat is chasing him around the house. Pet-eye-AF will be a godsend. Though I have to wonder whether the camera would confuse the lizard's eye with the holes in my Froot Loops if he happens to be near the bowl. Any word on insect-eye-AF development?

I must have Reptile Eye AF (REAF) in the next Canon model or I'm switching to Hasselblad!


Without a surprising new sensor, it's highly unlikely. To achieve the "zero blackout," the A9's sensor reads out in something like 1/150 s. Most sensors read out in closer to 1/30 s.

BINGO! it's all about sensor readout speed.
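
Rough arithmetic on why readout time dominates; the 1/150 s and 1/30 s figures come from the quote above, and everything derived from them is a back-of-the-envelope sketch:

```python
# Consequences of full-sensor readout time. The 1/150 s and 1/30 s figures
# are from the quoted post; everything derived from them is back-of-envelope.
for name, readout_s in [("stacked sensor (A9-class)", 1 / 150),
                        ("typical sensor", 1 / 30)]:
    max_fps = 1 / readout_s       # ceiling on full-res electronic-shutter bursts
    skew_ms = readout_s * 1000    # rolling-shutter skew, top row to bottom row
    print(f"{name}: <= ~{max_fps:.0f} fps e-shutter, ~{skew_ms:.0f} ms skew")
```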
 

Michael Clark

Now we see through a glass, darkly...
Apr 5, 2016
4,722
2,655
So much of the discussion about diffraction is confusing or misleading to those of us not so conversant with what the various terms mean. "Diffraction limited" gets used as if there is some zone where there is no diffraction, and then suddenly you hit a wall and your image is ruined. Some online calculators can reinforce that impression.

Of course there is diffraction at f/1.4, and even more at f/32. The effect becomes noticeable gradually. I would guess that the "limited" moment comes when diffraction starts to be the limiting factor more so than anything else. But I don't know for sure.

Diffraction Limited Aperture is the aperture at which the Airy disk of a point source of light at the focus distance becomes larger than the pixel pitch. It's where diffraction starts to become noticeable, but only if one is viewing at a display size large enough that one's eyes can barely differentiate a single pixel from another.
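
A rough sketch of that calculation with assumed values. Conventions differ on whether the Airy disk diameter should exceed one or two pixel pitches before diffraction "counts"; the two-pitch criterion below lands near commonly published DLA figures for a ~4.1 µm pitch.

```python
# Back-of-the-envelope DLA estimate. Wavelength, pixel pitch, and the
# "disk diameter = 2 x pitch" criterion are all assumptions for illustration.
wavelength_um = 0.55        # green light
pixel_pitch_um = 4.14       # e.g. a 50 MP full-frame sensor

# Airy disk diameter at f-number N is d = 2.44 * wavelength * N.
# Solving d = 2 * pitch for N gives the estimated DLA:
dla = 2 * pixel_pitch_um / (2.44 * wavelength_um)
print(f"DLA ~ f/{dla:.1f}")   # ~f/6.2, close to published ~f/6.7 values
```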


Yes, one needs to decide what is important in making the compromises. Obviously the choices will be different in shooting a misty woodland scene from doing product photography.

For the 2017 total solar eclipse, I was shooting with my T3i. My only telephoto lens at the time was the rather bad 75-300mm, which has a lot of chromatic aberration, among other faults. I was concerned about diffraction, but also needed focusing leeway from depth of field, since I didn't dare look through the OVF, and focusing on the screen, even shaded, was rather difficult. I also presumed that stopping down would help minimize the CA. Graphs I saw online suggested that f/11 was the best choice at 300mm on that lens. Days before, I put the filter on the lens and practiced shooting the sun. Even keeping the sun in the picture was a challenge, even though I had swung the floppy screen into the shadow of the camera itself. It turned out that f/11 gave sharp-looking pictures with that less-than-optimal lens. Sunspots showed up very clearly. My eclipse pictures turned out about as well as those by people using superior equipment.

Here's how I dealt with seeing the non-flippy screen on my cameras during the eclipse: foam board with holes cut just the right size to slide them onto the lens (from the rear before attaching it to the camera).

[attached image]
 

stevelee

FT-QL
CR Pro
Jul 6, 2017
2,383
1,064
Davidson, NC
Here's how I dealt with seeing the non-flippy screen on my cameras during the eclipse: foam board with holes cut just the right size to slide them onto the lens (from the rear before attaching it to the camera).

I'll think about that if I head to Dallas in 2024 (I think that's the year). There will probably be fun stuff going on on the SMU campus, where I went to grad school.

As it was, even with the flippy screen shaded by the camera, it was hard to see. I sent the friend I had come with back to the car to get a black umbrella. He eventually got back, after chatting with a radio crew and a woman who had come from France. And I could see decently well as long as he was patient enough to hold the umbrella. My first few pictures were not usable. I posted some of them and an assembled video at http://www.stevelee.name/eclipse/index.html

If I had not practiced on the sun a few days before, I could not have done much during the eclipse. I had to deal with the realization that to photograph the sun, one has to be out in bright sunlight. The flippy screen helped, but moving the camera to keep the sun in view was harder, because the direction of movement was uncoordinated with how it appeared on the angled screen.
 