Patent: ND Filter to Increase Dynamic Range

3kramd5 said:
jrista said:
This is an excellent explanation. However, one caveat: I do not believe this has anything to do with increasing the dynamic range of the images...and everything to do with increasing dynamic range, and more importantly contrast, for the purposes of performing AF with a DPAF sensor. There has never been any indication that Canon intends to read the DPAF sub-pixels independently in order to combine the halves into a supposedly higher-dynamic-range image. (I don't think that is really viable, as it is not the same thing as what ML does...ML uses FULL pixels with two separate exposures to improve DR...with DPAF, each half-pixel exposure has half the signal, so the SNR is even lower to start with.)

Would creating contrast in the AF sensor which doesn't exist in the scene be beneficial? Seems like it would create uncertainty. Also DPAF is based on phase, so how would contrast improve it? The summary specifically mentions AF sensors, but I'm unsure how exactly this would improve AF.

Pure speculation pertaining to split pixels for imaging purposes, but don't they already read out each subpixel (and send the info to the AF brain) and for imaging use a logic device somewhere in the chain to determine total pixel charge based on each pair of diodes? If so, darkening half of each pixel could open some interesting options for the determination of total charge (i.e. it need not be a pure sum).

Indeed. Jrista, addressing your comment about the readout: the ND filter obviates the need for setting pixels at different sensitivities (as Magic Lantern does), because the difference in exposure seen by the "light" and "dark" pixels is achieved optically--by the filter itself. The information thus captured is superior to that of ML's method: it is analogous to taking two bracketed exposures simultaneously, where one exposure is rendered darker by an ND filter rather than by a change in ISO.

3kramd5 is correct: phase detection AF does rely in some sense on contrast differences between points that are not in phase, but putting an ND filter on one half doesn't necessarily improve that contrast. In fact, it could decrease or even reverse the apparent contrast between two adjacent AF pixels.
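For readers wondering how phase detection compares the two half-images, here is a toy 1-D sketch (the signal shapes, function name, and the simple correlation search are all made up for illustration; real DPAF correlation is more involved). It shows that the estimator recovers the displacement between the left and right sub-pixel signals, and that a normalized comparison is insensitive to a uniform ND-style darkening of one half:

```python
import numpy as np

def phase_shift(left, right, max_shift=8):
    """Estimate the displacement between the left and right sub-pixel
    signals by maximizing normalized cross-correlation."""
    left_n = (left - left.mean()) / (left.std() + 1e-12)
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        r = np.roll(right, s)
        # Normalization makes the score invariant to a uniform gain
        # difference, e.g. an ND filter darkening one half.
        r_n = (r - r.mean()) / (r.std() + 1e-12)
        score = float(np.dot(left_n, r_n))
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift

# A defocused highlight seen by the two halves, displaced by 3 pixels,
# with the right half behind a ~2-stop ND (scaled by 0.25):
x = np.arange(64)
scene = np.exp(-0.5 * ((x - 30) / 4.0) ** 2)
left = scene
right = 0.25 * np.roll(scene, -3)
print(phase_shift(left, right))  # → 3: the phase offset survives the ND
```

A uniform darkening leaves the detected phase alone here; chromophore's concern applies where the ND changes the contrast *relationship* between adjacent AF pixels rather than applying a uniform gain.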
 
My post is probably somewhat unrelated, but given that a Bayer mosaic is used for color, couldn't that also be used for dynamic range?

RGBG is currently used. I've heard Sony is coming out with an RGBW layout.

Imagine if the white pixel had an ND filter over it of 1 or 2 stops (or whatever would be most useful).
Then the picture could be extrapolated from the RGB pixels first (with the normal exposure limits we see currently, i.e. blown whites) and then subsequently modified by the under-exposed image taken from the fourth pixel (the white one) to fill in luma information where the RGB image was overexposed (and perhaps across the entire image).

If nothing else, it could perhaps be used like a film-style knee near maximum exposure, making blown whites look less blown.

Just a thought :)
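The RGBW idea above can be sketched numerically. Everything here is a hypothetical illustration (the 2-stop ND strength, the normalized full-well of 1.0, and the function name are assumptions, not anything from the patent), and it only sketches the luma side; a real implementation would need per-channel highlight reconstruction, since the white pixel sees broadband light:

```python
import numpy as np

FULL_WELL = 1.0   # normalized clipping point of any one pixel
ND_STOPS = 2      # hypothetical ND strength over the white pixel

def recover_luma(rgb_luma, white_nd):
    """Where the RGB-derived luma clipped, substitute the ND'd white
    pixel's reading scaled back up by the ND factor."""
    scaled_white = white_nd * 2**ND_STOPS
    return np.where(rgb_luma >= FULL_WELL, scaled_white, rgb_luma)

scene = np.array([0.2, 0.9, 1.7, 3.5])                  # true scene luminance
rgb_luma = np.clip(scene, 0, FULL_WELL)                 # RGB pixels blow out at 1.0
white_nd = np.clip(scene / 2**ND_STOPS, 0, FULL_WELL)   # white pixel, 2 stops darker
print(recover_luma(rgb_luma, white_nd))                 # → [0.2 0.9 1.7 3.5]
```

The two clipped samples (1.7 and 3.5) are recovered from the darkened white pixel, which is the "knee"-like highlight extension the post describes.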
 
chromophore said:
Indeed. Jrista, addressing your comment about the readout: the ND filter obviates the need for setting pixels at different sensitivities (as Magic Lantern does), because the difference in exposure seen by the "light" and "dark" pixels is achieved optically--by the filter itself. The information thus captured is superior to that of ML's method: it is analogous to taking two bracketed exposures simultaneously, where one exposure is rendered darker by an ND filter rather than by a change in ISO.

Sure, one "exposure" is rendered darker by the ND...without changing ISO...but at HALF the signal. That does NOT improve noise. The lower signal is actually going to reduce SNR.

It also does not improve the kind of noise that causes Canon's DR problem. The improvement in dynamic range achieved by ML is due to the fact that they read the pixels out at different ISOs. A higher ISO read amplifies the signal in the pixels before the signal hits the noisy electronics, so the read noise contribution is, in relative terms, a smaller portion of the signal. Blending that second higher ISO read with lower read noise into the first read's shadows is how you reduce read noise and gain DR.
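The mechanism jrista describes can be put in rough numbers. This is a sketch under a simplified noise model, where the ISO read is treated as a gain applied before a fixed downstream read-noise floor; the electron counts are illustrative assumptions, not measured Canon figures:

```python
import numpy as np

SIGNAL_E = 40       # illustrative shadow signal, in electrons
READ_NOISE_E = 12   # illustrative downstream read noise, in electrons

def shadow_snr(gain):
    """SNR of a shadow pixel when the signal is amplified by `gain`
    (the ISO read) before it reaches the noisy downstream electronics."""
    shot_noise = np.sqrt(SIGNAL_E)               # Poisson shot noise
    read_noise_referred = READ_NOISE_E / gain    # input-referred read noise
    return SIGNAL_E / np.sqrt(shot_noise**2 + read_noise_referred**2)

print(round(shadow_snr(1), 2))   # → 2.95: read noise swamps the shadows
print(round(shadow_snr(16), 2))  # → 6.28: read noise is 16x smaller relative to signal
```

With the higher-gain read, the shadow SNR approaches the shot-noise limit (about 6.3 here), which is why blending the high-ISO read into the shadows buys dynamic range.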

Slapping an ND filter on half of each photodiode pair and reading each half out independently does not achieve the same thing. Could you do HDR with it? Sure. You could overexpose the photodiode that does not have an ND filter, and properly expose or underexpose the other photodiode. But both of those photodiodes are going to have lower signal. That would be like using a camera with a sensor half the area and doing normal HDR. You're going to have considerably more noise in each half signal, which is going to offset a lot of the benefit of doing HDR. Could you actually gain the full 1-2 stops of additional dynamic range? Probably not.
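The half-signal objection can also be sketched. The numbers are illustrative (a hypothetical 10,000-electron highlight signal and a 3-electron read-noise floor), not measurements of any real sensor:

```python
import math

def snr(signal_e, read_noise_e=3.0):
    """Shot-noise-limited SNR with a small fixed read-noise floor."""
    return signal_e / math.sqrt(signal_e + read_noise_e**2)

full = 10_000        # electrons a full pixel would collect
half = full / 2      # each DPAF half sees roughly half the light
print(round(snr(full), 1))   # → 100.0
print(round(snr(half), 1))   # → 70.6
```

Halving the signal costs roughly a factor of sqrt(2) in shot-noise-limited SNR (about half a stop), which is the penalty that eats into whatever DR the ND-filtered half would otherwise add.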

chromophore said:
3kramd5 is correct: phase detection AF does rely in some sense on contrast differences between points that are not in phase, but putting an ND filter on one half doesn't necessarily improve that contrast. In fact, it could decrease or even reverse the apparent contrast between two adjacent AF pixels.

It depends. Canon also has patents for DPAF pixels of different sensitivities. I was never sure how that would work either...yet if you combine the ND filter patent with the dual-sensitivity patent...then it starts to make sense.
 