The EOS 80D Replacement to be a Big Leap Forward [CR2]

Apr 23, 2018
canon - like other makers - uses "scene analysis" in addition to phase-af: face and eye-tracking, for example, as in the M50, and also for tracking AF (eg color information and possibly some "object identification AI/database"?) to keep a selected moving object in focus. i would think that poses an even greater challenge for (realtime) in-camera data processing; compared to that, "simple" phase-af operation might be an easy exercise.

but that's not what i am trying to find out. i would like to know "why 99 AF fields" - why not 999, or 10 million, when "each single (split) dual pixel can serve as an AF-field"? out of curiosity really, nothing else.

from a practical/user perspective i want to know the answer to the "AF orientation" question, ie whether there really is only AF sensitivity for vertical contrast edges/structures or not. that one i can and will check soon (on my daughter's new M50 - gift/box not presented/opened yet ;-)

i find it a bit strange that i cannot find any information on this in any of the reviews for M50 - or other canon DPAF cameras (DSLRs in live view mode, not in mirror-mode with detached AF sensor).
 
Mar 2, 2012
fullstop said:
from a practical/user perspective i want to know answer to the "AF orientation" question, ie whether there really is only AF sensitivity for vertical contrast edges/structures or not.

Think of them like traditional vertical line pair sensors: they are only looking horizontally (i.e. along the width of the sensor).

To further clarify, there is only phase AF sensitivity in the horizontal. CDAF is not similarly limited.

Sony’s OSPDAF cameras are set up like this too, albeit not with dual diodes per pixel.

Fortunately, in the real world there aren't many subjects which present only parallel lines in exactly the one orientation the sensor can't see, especially subjects which move so much that CDAF just can't cut it. However, multiple-orientation PDAF would be better, and it looks (see the aforementioned figure from a patent) like Canon recognizes that as the next step.
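To make the "line sensor" idea concrete, here is a toy sketch in Python/NumPy - my own illustration, not Canon's actual algorithm. The "left-looking" and "right-looking" half-pixel signals are modeled as opposite shifts of the same 1D scene, and the defocus is recovered by finding the shift that best re-aligns them:

```python
import numpy as np

def estimate_defocus_shift(left, right, max_shift=8):
    """Find the shift that best re-aligns the 'right-looking' half-pixel
    signal with the 'left-looking' one (sum of squared differences)."""
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(right, s)
        # compare only the central region unaffected by roll wrap-around
        err = np.sum((left[max_shift:-max_shift] - shifted[max_shift:-max_shift]) ** 2)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

# toy scene: one vertical edge, i.e. contrast along the horizontal axis
x = np.arange(64)
scene = (x > 32).astype(float)

# defocus displaces the two half-pixel views in opposite directions
left_view = np.roll(scene, -3)
right_view = np.roll(scene, +3)

print(estimate_defocus_shift(left_view, right_view))  # prints -6
```

The sign of the recovered shift tells which way to drive the lens and its magnitude roughly how far - which is why PDAF can jump toward focus in one move instead of hunting the way CDAF does.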

================================================

Reference material:

neuroanatomist said:
The DPAF pixels are indeed constrained directionally, the hemi-pixel divisions are all in the same orientation. I really think Canon needs to explore the idea of QPAF - quad-pixels (while a technical challenge, I think that would be better than alternating/mixing the orientation of the pixel divisions).

neuroanatomist said:
This image shows a little of the tech, notably that all the split pixels have the same orientation.
...
They could be sampled in any configuration, but still would be sensitive to phase differences in only one dimension (line sensors).

3kramd5 said:
one line is in essence a vertical pair (half the line looks left, half the line looks right).

3kramd5 said:
fullstop said:
If so, does Canon arrange these "AF strips/lines" of DP-pixels in vertical, horizontal and/or diagonal cross pattern/s on sensor? if not, why not?

Presumably they only establish lines in the vertical. Why not horizontal or diagonal? I think I covered that in (1) above.

Each pixel has two photodiodes. One “looks left” and the other “looks right.” To be effective as a horizontal line sensor, they would have to “look” up and down. That could be accomplished with quads, or by alternating the orientation of the photodiodes within each pixel, but that would require a new sensor fab.
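That orientation limit is easy to demonstrate numerically. In this hypothetical sketch (again my own model, not Canon's implementation), the two half-pixel views are opposite horizontal shifts of the scene: a vertical edge produces a measurable difference between the views, while a purely horizontal edge produces none at all:

```python
import numpy as np

def horizontal_disparity_signal(scene, shift=3):
    """Model the two half-pixel views as opposite shifts along x (the
    direction the left/right photodiodes 'look') and return how much
    the views actually differ."""
    left = np.roll(scene, -shift, axis=1)
    right = np.roll(scene, +shift, axis=1)
    return float(np.abs(left - right).sum())

y, x = np.mgrid[0:32, 0:32]
vertical_edge = (x > 16).astype(float)    # contrast varies along x: visible
horizontal_edge = (y > 16).astype(float)  # contrast varies along y only: invisible

print(horizontal_disparity_signal(vertical_edge))    # large and nonzero
print(horizontal_disparity_signal(horizontal_edge))  # 0.0
```

Shifting a horizontal stripe pattern sideways leaves it unchanged, so the left and right views are identical and there is simply no phase signal to measure - exactly the failure mode being discussed.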

neuroanatomist said:
Probably squares. But, current DPAF is restricted to detecting features in only one orientation, determined by the split in the individual pixels. It doesn't matter whether the pixel sampling is in the pattern of a cross, square, or perambulating pentagon...the orientation sensitivity is fixed to that same orientation.

Note I misread this post when replying previously, and have corrected it.
 
Apr 23, 2018
@3kramd5 - ok, thx. Vertically oriented ["virtual"] phase-AF line sensor, detecting contrast along the horizontal (ie vertical edges/structures).

But I'm still far from clear on the DPAF workings. Also: why do only 3 EF-M lenses [the recent 28 Macro, the recent 18-150 and the not-so-recent 55-200] enable 143 "AF fields" on the M50, while the other EF-M lenses get only 99 AF fields?

I would have expected/guessed a faster, more wide-angle lens like the 22/2.0 to be a more likely candidate for the expanded AF field coverage than a slow tele-zoom like the 55-200 or a slow 18-150 trans-zoom.

Canon only states:
For technical reasons, this expanded coverage is available (as of February 2018) with the EF-M 18–150mm, 55–200mm, and 28mm Macro lenses only.
http://learn.usa.canon.com/resources/articles/2018/eos-m50/eos-m50-autofocus.shtml

I really wonder what those "technical reasons" are. A specific chip in the lens? Specific firmware functionality in the lens? A difference in lens mount protocol support? But why then does it work with the "relatively old" 55-200 and not with the somewhat more recent 15-45 kit zoom?

So unclear to me, that I don't even have a marketing conspiracy/nerfing hypothesis for it. ;D ;D ;D
 
Mar 2, 2012
Good question, but again one nobody who writes on these walls is likely to know. Maybe whoever took over for Chuck Westfall would publish a write-up if asked.

My off the cuff guess: they tested it, and only enable the AF zones which they deem reliable on a per-camera and per-lens basis. What variables play into that reliability would be worse than guesswork, so I won’t even try.

I have a similar question about Sony's A9 e-shutter: it only works at 20 fps with some lenses. What possible relationship is there? Probably AF speed. If a lens can't keep up, they slow the framerate to avoid people complaining about 99 OOF shots from a 5-second burst.
 
Jul 21, 2010
3kramd5 said:
Good question, but again one nobody who writes on these walls is likely to know. Maybe whoever took over for Chuck Westfall would publish a write-up if asked.

My off the cuff guess: they tested it, and only enable the AF zones which they deem reliable on a per-camera and per-lens basis. What variables play into that reliability would be worse than guesswork, so I won’t even try.

Agreed, probably something only Canon knows. But I also agree that empirical testing may be the basis for it. I suspect (but really don’t know) that such testing formed the basis for all of those lens groups with the 1D X/5DIII AF system, with max aperture not being the sole determinant of which AF points were available as crosses, only lines, or not available at all.

Going back to an earlier post:

3kramd5 said:
Talys said:
That's the first I've seen a picture of a 1DXII PDAF sensor, I think.
That’s the 1Dx / 5D3 PDAF assembly.
The 1Dx II presumably has a different configuration of sensors to facilitate f/8 sensitivity at each location.
Correct. But they’re actually quite similar.

I think the f/8 functionality of the 1D X was also based on empirical testing. When the camera was launched, it required f/5.6 for AF... f/8 AF was added with a firmware update, so clearly no hardware change was needed to support f/8. Or the hardware was there but disabled, but then... why? More likely users asked for the feature, Canon tried it out and found it worked reasonably well with the center point, and enabled it. The fact that Tamron/Sigma f/6.3 lenses can AF with an f/5.6 AF point shows f/5.6 isn't a hard limit (and the physics of the AF sensor support that).
 
Apr 25, 2011
fullstop said:
Canon only states:
For technical reasons, this expanded coverage is available (as of February 2018) with the EF-M 18–150mm, 55–200mm, and 28mm Macro lenses only.
http://learn.usa.canon.com/resources/articles/2018/eos-m50/eos-m50-autofocus.shtml
Really wonder, what those "technical reasons are"? Specific chip in lens? Specific firmware functionality in lens? Difference in lens mount protocol support?
"Legacy" retrofocus design, which decreases microlens shading for pixels far away from the optical axis?
 
May 11, 2017
fullstop said:
canon - like other makers - uses "scene analysis" in addition to phase-af: face and eye-tracking, for example, as in the M50, and also for tracking AF (eg color information and possibly some "object identification AI/database"?) to keep a selected moving object in focus. i would think that poses an even greater challenge for (realtime) in-camera data processing; compared to that, "simple" phase-af operation might be an easy exercise.

but that's not what i am trying to find out. i would like to know "why 99 AF fields" - why not 999, or 10 million, when "each single (split) dual pixel can serve as an AF-field"? out of curiosity really, nothing else.

from a practical/user perspective i want to know the answer to the "AF orientation" question, ie whether there really is only AF sensitivity for vertical contrast edges/structures or not. that one i can and will check soon (on my daughter's new M50 - gift/box not presented/opened yet ;-)

i find it a bit strange that i cannot find any information on this in any of the reviews for M50 - or other canon DPAF cameras (DSLRs in live view mode, not in mirror-mode with AF sensor).

My guess is that several factors may have played into Canon's decision to go with "99 AF Fields". One factor may have been the need for a practical and effective user interface. Another may have been the need to aggregate data from clusters of pixels to cope with the fact that all pixels are oriented the same way. Initially, data is derived by comparing the output of the two halves of individual pixels, but it may then be aggregated to ensure that at least some pixels produce information that permits accurate focusing. If data from enough pixels is used, at least some of them are virtually certain to find lines that generate usable focus information. This would require the development of algorithms that could separate the wheat from the chaff among the data from the individual pixels. There may also be other technical factors influencing the design of the AF fields, such as manufacturing issues; apparently lens design can be a factor as well.
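That aggregation idea can be sketched as follows - purely hypothetical, with a 9×11 grid chosen only because it happens to yield 99 fields; the real field layout, grouping, and reliability test are unknown. Per-pixel disparity estimates are averaged within each field, keeping only pixels whose local contrast makes the estimate trustworthy:

```python
import numpy as np

def aggregate_af_fields(disparity, contrast, grid=(9, 11), min_contrast=0.1):
    """Collapse per-pixel disparity estimates into a coarse grid of AF
    fields, averaging only pixels whose local contrast is high enough
    for their estimate to be trustworthy."""
    h, w = disparity.shape
    gh, gw = grid
    fields = np.full(grid, np.nan)
    for i in range(gh):
        for j in range(gw):
            block = np.s_[i * h // gh:(i + 1) * h // gh,
                          j * w // gw:(j + 1) * w // gw]
            reliable = contrast[block] >= min_contrast
            if reliable.any():
                fields[i, j] = disparity[block][reliable].mean()
    return fields  # NaN marks fields with no reliable pixels

# hypothetical per-pixel data: uniform defocus, contrast only in the left half
disparity = np.full((90, 110), 2.0)
contrast = np.zeros((90, 110))
contrast[:, :55] = 1.0

fields = aggregate_af_fields(disparity, contrast)
print(fields.shape)  # (9, 11) -> 99 candidate AF fields
```

This is the "wheat from the chaff" step in miniature: individual dual-pixel readings are noisy or blind on featureless patches, but averaging over a block and discarding low-contrast pixels leaves each AF field with either a usable estimate or an honest "no data".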
 
any updated info on a 90D or 80D MkII? for months i've been thinking about buying a Nikon D7500 JUST for:
- it's a DSLR with a good OVF
- it's small but with a nice grip / good handling (i've had a test camera for a week - also had the D500 for testing but i want the smaller DSLR body)
- it has no AA filter !
- it has a nice sensor, just 20MP and very nice IQ
- it has a decent fps / AF and again it's small compared to my other bodies (except my m50 but i'm not too happy with the EVF thing)

I just need a DSLR that i always have with me, especially when i'm out with the dogs and need some nice action shots - i find myself not taking the big 1DX on the daily walks. As i said, i've tested the D7500 with the 35 1.8 (DX) and 85 1.8 (DX) and liked it a lot. I like the Canon menus more, but some of Nikon's ways of doing things are also very nice - so not a big problem to use the Nikon alongside the Canon (usability-wise). But hey, i don't WANT to buy it just because i can't have it from Canon. So i think the 90D should have all those minimum features (no AA please!) and add the better video AF and Wifi as well (that sucks on the Nikon).

So i find myself checking the price for the D7500 every second day and stop myself from pushing the buy button - but WHEN do i get the 90D please ??????
 