Patent: ND Filter to Increase Dynamic Range

Canon Rumors

HTML:
<p>A patent showing an ND filter to increase dynamic range for only half the field of view.</p>
<p>In English (<em><a href="http://www.northlight-images.co.uk/rumours.html" target="_blank">Thanks Keith</a></em>):</p>
<blockquote>
<p class="p1"><span class="s1">Introducing a neutral density filter for only half the field, you get differing amounts of light to AF pixels that are split on the same axis as the filter. This would increase the differential output of the sub-pixel pair as you pass through sharp focus.</span></p>
</blockquote>
<p class="p1">Patent Publication No. 2015-87401 (Google Translated)</p>
<ul>
<li class="p1">Published 2015.5.7</li>
<li class="p1">Filing date 2013.10.28</li>
</ul>
<p class="p1">Canon patents</p>
<ul>
<li class="p1">Dual Pixel AF</li>
<li class="p1">A half-ND filter is set up at the pupil position of the lens</li>
<li class="p1">The half-ND filter can be switched in and out</li>
</ul>
 
meywd said:
Not sure, but seems like an ND filter that is full of small holes, so some pixels get the normal amount of light and the others get N stops less light, so instead of Dual_ISO it's dual exposures

Wouldn't diffraction mean it acted just like a regular ND filter whose strength was reduced by the same factor as the area of the holes?
 
AcutancePhotography said:
I am stubbing my brain on this.

Same here; the only thing I (seem to) understand is that, given that a "half-ND filter at the pupil position of the lens" is mentioned, all current and past Canon lenses probably won't benefit from this technology. This patent puzzles me.
 
Woody said:
Idea is similar to dual ISO? In this case, for every dual pixel pair, one of the pixels has ND cover while the other does not?

I too am having a difficult time understanding what this does, but the idea of dual ISO seems to make more sense to me, as it's extremely simple to explain compared to this. Does it mean that dual-pixel AF is no longer functional while normal stills shooting is going on? It's a Canon patent, and these rarely see the light of day as new technology, but they do sometimes hint at new lenses.
 
Understandably, there's a lot of confusion about what exactly such a construction would do. How would it affect the image? Wouldn't it simply darken half of the frame?

The answer--perhaps strangely--is no. To understand what it does, think about a conventional lens shot wide open. Note that in out-of-focus areas of the image, circles of confusion ("bokeh balls") are created, and they are especially visible when a bright point light source not in the plane of focus is projected through the lens onto the image plane.

If you could alter the shape of the aperture in some way, you would see those out-of-focus highlights rendered in the shape of the aperture. This has been a popular trick exploited by numerous photographers to give bokeh in the shape of hearts, stars, or whatever cloyingly cute gimmick is in vogue these days. This is the key insight that should immediately tell you that placing an ND filter over half of the aperture will not darken half of the frame in such a crude way: instead, if you were to obstruct the light entirely on that half where the ND is being applied, you'd see out-of-focus highlights that were shaped like a semicircular disk.

But in the plane of focus, the image would still be sharp, albeit at an exposure exactly one stop slower than if that obstruction were not present (in the case of total density obstructing half the aperture). Diffraction effects at small apertures would be asymmetrical, but you would not notice this unless you were stopped down to, say, f/8 or slower, and you imaged bright point light sources (the familiar "stars" you see radiating from street lights in night photography, for example, would then be asymmetric).

Now, if the density were NOT complete--only partial--the effect would still be proportionately visible in the out-of-focus highlights when shot at fast apertures (or in such a way as to reveal the circles of confusion). But at, say, 3 stops of ND, you would still find it very difficult to see any effect on the image in the plane of focus.
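As a sanity check on those exposure figures (my own arithmetic, not from the patent), the overall light loss from covering half the aperture area with an N-stop ND works out like this:

```python
import math

def half_nd_exposure_loss(nd_stops: float) -> float:
    """Total exposure loss in stops when half the aperture area carries
    an ND filter of the given strength (the clear half passes everything)."""
    transmission = 0.5 + 0.5 * 2.0 ** (-nd_stops)
    return -math.log2(transmission)

# A fully opaque half (the "total density" case): exactly 1 stop.
print(round(half_nd_exposure_loss(float("inf")), 2))  # 1.0
# A 3-stop ND over half the aperture costs well under a stop overall.
print(round(half_nd_exposure_loss(3), 2))             # 0.83
```

So even a fairly strong partial density costs less than a stop overall, which is consistent with the effect being hard to see in the plane of focus.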

Okay, so now we roughly know what such a device would do to the image as it is perceived. What does any of this actually get you? What is this talk about dynamic range?

The idea is that, in the plane of focus, the incident light on the sensor is coming from two halves (as delineated by this hypothetical ND filter). The light "cone" comes to a (reasonably) sharp point on the image plane, but half of this cone is unimpeded, whereas the other half has its luminous intensity diminished by the ND filter, and so is darker. This dual nature (again, in the plane of focus only!) lets the designer of the system theoretically use a sensor for which two neighboring (sub-)pixels are oriented in such a way as to capture these two halves separately. This is essentially a more sophisticated version of the dual ISO Magic Lantern hack, conducted at an optical level; alternatively, it is a variant of the light field imaging idea but with the incoming image split into only two components (light and dark). The result is the recording of a wider exposure range after appropriate signal processing.
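To make that combination step concrete, here is a minimal sketch of how the two sub-pixel readings could be merged, assuming a 3-stop ND over one half and a 12-bit sensor (the function name, gain, and clip level are my illustrative assumptions, not from the patent):

```python
ND_STOPS = 3                 # assumed ND strength (illustrative)
GAIN = 2 ** ND_STOPS         # rescale the dark half to the bright half's scale
FULL_WELL = 4095             # 12-bit saturation level (assumed)

def merge_subpixels(bright, dark):
    """Combine clear-half and ND-half sub-pixel readings into one
    extended-range value per pixel: keep the bright reading where it
    hasn't clipped, otherwise fall back to the rescaled dark reading."""
    return [d * GAIN if b >= FULL_WELL else b
            for b, d in zip(bright, dark)]

# A midtone pixel keeps the low-noise clear reading; a blown highlight
# is recovered from the ND sub-pixel, extending usable range by ~3 stops.
print(merge_subpixels([1000, 4095], [125, 3000]))  # [1000, 24000]
```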

The final step of the idea is that if we already have this Dual Pixel AF technology, then the sensor is already essentially adapted to the use of this modified optical construction. But as we have already seen, this idea is not without its drawbacks: it will totally mess up the appearance of out-of-focus highlights.
 
chromophore said:
Understandably, there's a lot of confusion about what exactly such a construction would do. [...]

Thank you for the lengthy explanation. From what you said, it allows the AF sensor(?) to see both the normal image with OOF highlights and one without them, so it can focus better? I understand light field technology (if not how they capture it), but this doesn't seem to be like it: yes, storing both exposures would be similar, but what's OOF is already optically OOF, and you can't change that simply with an ND
 
"Light field" isn't necessarily about permitting refocusing the image after the capture. In its broadest sense, it is about capturing information about the incident light that depends on the angle and direction of incidence. A single circular aperture doesn't do this.

As the concept applies to refocusing, the idea is to place a secondary array of lenses between the primary lens and the image plane, thereby recording parallax information (each secondary lens sees the image from a slightly different point of view). Like the compound eye of an insect, the imaging sensor sees many smaller copies of the scene; this information is then recombined to generate a composite that can be refocused because depth information is now present.

A related concept is that of a coded aperture, in which the aperture of the lens can be made into a complex shape, and the resulting blur can be "deconvolved" (since we know what the shape of the aperture is). This is also related to the zone plate and other ideas in diffractive optics and computational photography. From there, we can see how this "half-ND filter" idea is similarly related. In such a case, we are recording information about which half of the image-forming light is striking a given pixel, thus enabling us to distinguish whether we should look at a given pixel for information in the upper half of the exposure range, or the lower half.

Keep in mind, this idea is not related to AF performance. It is strictly about improving the dynamic range with which a scene can be recorded, by sacrificing half of the spatial resolution of the sensor, much in the way that Magic Lantern's dual ISO trick interleaves bands of low-sensitivity pixels with high-sensitivity pixels by varying the readout levels. But the drawback of the ML hack is that these pixels are arranged in bands, so there can be some introduced artifacts when interpolation is performed (deinterlacing). However, if we use this modified aperture, then the high/low sensitivity pixels are no longer arranged in bands: they are arranged in individual pairs throughout the sensor plane, and the demosaicing should produce less noticeable artifacts.
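A toy sketch of that layout difference (my own illustration; the map functions and the two-row banding are assumptions for demonstration, not ML's or Canon's actual geometry):

```python
# 'H' = high-sensitivity sample, 'L' = low-sensitivity sample
# (the ND-covered or high-ISO-paired one).

def ml_dual_iso_map(rows, cols):
    """Magic Lantern dual ISO: sensitivity alternates in bands of two rows,
    so each spatial location has only one of the two sensitivities."""
    return [["H" if (r // 2) % 2 == 0 else "L" for _ in range(cols)]
            for r in range(rows)]

def half_nd_map(rows, cols):
    """Half-ND + dual-pixel sensor: every pixel holds an H/L sub-pixel pair,
    so both sensitivities exist at every spatial location."""
    return [["HL" for _ in range(cols)] for r in range(rows)]

for row in ml_dual_iso_map(4, 6):
    print("".join(row))
# HHHHHH
# HHHHHH
# LLLLLL
# LLLLLL
```

With the banded layout, interpolation has to bridge whole missing rows; with per-pixel pairs, both samples are co-located, so the reconstruction artifacts should be much less noticeable.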

The only reason why AF is mentioned here is that current sensor designs using these "dual AF pixels" are already suited to this half-ND filter concept. It isn't to improve AF detection or accuracy at all.
 
It's a very interesting concept but, as highlighted, not without its drawbacks, which is why they state it's user changeable. I'd be curious to know what effects it would have on ND grads at the front of the lens and on polarisers. The centre line on the built-in filter would need to be horizontally accurate, which is harder to do if it's user selectable, being at the point of focus. The ND would also need to be very neutral so as not to affect colour imagery.
 
JohnBran said:
What ?! Canon stop with this nonsense and just increase DR in your sensors.

By all reports their sensors are fine; it's the signal chain which would yield the most improvement. Regardless, why is this nonsense, and why stop it? If this is viable, then it's viable on a lower-noise system as well. People use grad ND filters to allow longer shadow exposures without blowing highlights on Sony-based systems...

Besides, the people designing things like this are almost certainly not the same people designing the camera guts. Most electronics engineers don't moonlight as optical engineers. Sure, you could say that there is limited R&D money and that Canon should fire everyone developing things not strictly related to sensor performance and throw all the money at dynamic range, but I don't think most Canon users would appreciate the end result.
 
3kramd5 said:
JohnBran said:
What ?! Canon stop with this nonsense and just increase DR in your sensors.

By all reports their sensors are fine, it's the signal chain which would yield the most improvement. [...]

Nope, I clearly do want Canon to continue on the path of bringing mind-blowing new lenses. I believe these might have more of an impact on the ability to create unique pictures than improvements in sensor technology.
 
JohnBran said:
What ?! Canon stop with this nonsense and just increase DR in your sensors.

If this is the basis of the class-leading dynamic range that the 1dx mk ii and future bodies will enjoy, and it can also be applied more cost effectively, does it matter how they do it? If it can be used for video and stills, at all ISOs and any speed, then is that not a good thing?

Secondly, I saw the tech as being independent of lenses....
 
Stu_bert said:
JohnBran said:
What ?! Canon stop with this nonsense and just increase DR in your sensors.

If this is the basis of the class-leading dynamic range that the 1dx mk ii and future bodies will enjoy and it also can be applied more cost effectively, does it matter how they do it? [...]

Secondly, I saw the tech as being independent of lenses....
It would have zero impact on the rest of the present range of lenses, as this requires the ND to be built in and thus requires new designs. It's less practical than doing it at the sensor level, and since sensors will develop faster than lenses, it may actually be problematic for the future.
 
chromophore said:
Understandably, there's a lot of confusion about what exactly such a construction would do. [...]

This is an excellent explanation. However, one caveat: I do not believe this has anything to do with increasing the dynamic range of the images...and everything to do with increasing dynamic range and, more importantly, contrast for the purposes of performing AF with a DPAF sensor. There has never been any indication that Canon intends to use DPAF subpixels in independent reads for the purpose of combining the halves into a supposedly higher dynamic range image. (I don't think that is really viable, as it is not the same thing as what ML does...ML uses FULL pixels with two separate exposures to improve DR...with DPAF, each half-pixel exposure has half the signal...so SNR is even lower to start with.)
 
Woody said:
Idea is similar to dual ISO?

ML's dual_iso has the drawback that one ISO is higher, i.e. with extreme settings, and especially on crop sensors, you're getting a noisier shot than with a bare-bones Sonikon sensor at low ISO. The ND filter solution (if it is a solution) would allow you to leave all pixels at one ISO value and thus avoid all further hassle ... including dual_iso's broken white balance.

Another difference is that ML is limited to a maximum of 3 stops (100/1600 setting), while I guess an ND filter can have an even larger EV spacing.
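For context on how large that spacing can get, here is the standard conversion from an ND filter's optical density to stops (my own worked example, not from the patent or ML):

```python
import math

def nd_density_to_stops(density: float) -> float:
    """Convert ND optical density (e.g. 0.9 for a common 'ND8') to stops:
    stops = density / log10(2)."""
    return density / math.log10(2)

print(round(nd_density_to_stops(0.9), 1))   # 3.0  -- a common ND8/0.9 filter
print(round(nd_density_to_stops(3.0), 1))   # 10.0 -- a "big stopper"
```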
 
jrista said:
chromophore said:
Understandably, there's a lot of confusion about what exactly such a construction would do. [...]

This is an excellent explanation. However, one caveat: I do not believe this has anything to do with increasing the dynamic range of the images...and everything to do with increasing dynamic range and, more importantly, contrast for the purposes of performing AF with a DPAF sensor. [...]

Would creating contrast in the AF sensor that doesn't exist in the scene be beneficial? It seems like it would create uncertainty. Also, DPAF is based on phase, so how would contrast improve it? The summary specifically mentions AF sensors, but I'm unsure how exactly this would improve AF.

Pure speculation pertaining to split pixels for imaging purposes, but don't they already read out each subpixel (and send the info to the AF brain) and for imaging use a logic device somewhere in the chain to determine total pixel charge based on each pair of diodes? If so, darkening half of each pixel could open some interesting options for the determination of total charge (i.e. it need not be a pure sum).
 