Canon Leads in Sensor Tech

tcmatthews said:
Pi said:
Speaking about emphasis: I find the recent news that Nikon is patenting an adjustable AA filter really exciting. I wish that was Canon.
I wish they would just remove the AA filter and be done with it. Solve the problem in software and down scale. But the movie crowd and some landscape photographers would scream bloody murder. And insist software is just not good enough.

Software is not good enough
 
Upvote 0
Re: Canon Leads in Sensor Tech - Dual Pixel more to it?

unfocused said:
After following the 30 pages (currently) of obscure debate over DXO ratings, I have to say this:

I am getting a little sick of the conventional wisdom that somehow Canon is "behind" in sensor technology. The more accurate statement is that Canon has placed a different emphasis in its sensor development than some of its competitors. And, it would also be correct that Canon has placed a different emphasis on its sensor development than a vocal group of participants in this forum would like.

Specifically, Canon has decided to push sensor technology that improves live view and video autofocus and has done so without compromising still image quality. Canon's competitors appear to be emphasizing marginal improvements in sensor performance for stills.

One can say Canon is "behind" only if one totally discounts the significant technological advancement that its dual-pixel sensor represents.

I think there are two points you may be overlooking, and nobody has commented on them yet.

A) The dual pixel technology means that underneath there is effectively a 40 MP sensor. While I have not seen any chip analysis yet, it seems rather likely that this was not, and could not have been, accomplished with the same old 500 nm process. So what is the actual status of Canon's production line?

B) Dual pixel technology does NOT only have to be about PDAF on the sensor. We have the recent
example of Magic Lantern achieving much higher dynamic range by reading alternating pixels at different
ISOs. With the new 70D dual pixel sensor, Canon would have the opportunity to implement the same trick
right there. All they need to do is provide the same capability that the 7D and the 5DIII have, i.e. the possibility to read each half-site with a different ISO applied. While the intensity data from the two sites has to be combined, no new, complicated demosaicing algorithm would have to be applied, since the two half-sites sit under the same microlens (unlike the ML solution, which has to redo everything).
So it should be easy to get a DR of 14 stops or more.
The RED camera mentioned earlier, with its 20 stops of DR, uses the very same trick of two different exposures; it's not a better sensor.
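
To make the idea concrete, here is a rough sketch of what the merge step could look like. This is purely illustrative Python on my part, not Canon's or Magic Lantern's actual pipeline; the function name, the 8x gain ratio and the clip threshold are all made-up assumptions.

```python
import numpy as np

def combine_dual_iso(low_iso_half, high_iso_half, gain_ratio=8.0, clip=0.95):
    """Merge the two half-site readouts of one dual pixel into one value.

    low_iso_half / high_iso_half: linear raw values in [0, 1] from the two
    photodiodes under the same microlens, one read at base ISO and one at a
    higher ISO (gain_ratio = high gain / low gain, e.g. 8x for +3 stops).
    """
    # Bring the high-ISO half back onto the low-ISO scale.
    high_rescaled = high_iso_half / gain_ratio
    # Where the high-ISO half has not clipped, it has the cleaner shadows;
    # where it has clipped, fall back to the low-ISO half for the highlights.
    return np.where(high_iso_half < clip, high_rescaled, low_iso_half)

# Toy usage: a smooth gradient seen by both halves of each pixel.
scene = np.linspace(0.0, 1.0, 10)
low = np.clip(scene, 0.0, 1.0)          # base ISO: keeps the highlights
high = np.clip(scene * 8.0, 0.0, 1.0)   # +3 stops: cleaner shadows, clips early
print(combine_dual_iso(low, high))
```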

So, don't discount the dual pixel design as only PDAF. Its technology lays the groundwork for much higher DR. The question is when and where Canon will implement it. In the 7DII? Is that how they will top the 70D?
 
Upvote 0
Jim O said:
The challenge for any company, as you say, is to identify their customers and figure out what they want now and in the future. However, growth comes by finding new customers and by increasing market share, not by losing it due to others getting ahead.

True. But, I don't believe for a single second that the small differences in sensor performance that currently exist have the slightest effect on market share.

The entire industry is facing significant challenges in two key areas: Economic problems in both mature and developing markets and the seismic collapse of the point and shoot market. Both of these are causing third party companies to try to get a larger piece of the Nikon-Canon DSLR market, which has traditionally been a more lucrative market. At the same time, both the DSLR market and the technology are maturing, slowing the growth in existing markets.

Add to this the fact that the "best" technology is almost never a path to success. Betamax is the most often cited example, but there are many others.

Someone mentioned Kodak earlier. But the fact is, Kodak did not fail because it had inferior technology, in fact, Kodak was built on inferior technology. The Box Brownie was inferior to other cameras available at the time. Kodak cameras, film and chemicals were always inferior to other brands that were available. Yet, Kodak was a success for more than a century because it followed the "good enough" path.

When Kodak finally failed, it wasn't because Agfa produced better products, it was because their market disappeared.

I am quite certain that is what Canon and Nikon are most worried about. The point and shoot market has disappeared. And, while the enthusiast market is still profitable and strong, they aren't seeing the growth there that they previously had.

You are correct that companies need to find new markets to succeed and stay in business. But while Canon may lose a tiny fraction of a fraction of a fraction of customers over sensor tech, that's not where the real threat is coming from.
 
Upvote 0
jrista said:
If banding did occur, and it was not overpowered by banding caused by high frequency components or photon shot noise, I suspect a very significant exposure push or shadow lift would be necessary to cause it to show up.

All this is speculation anyway, but the noise would be multiplicative, not additive. The noise coming from uneven channels is of this sort and can be seen in the midtones with normal development parameters.
 
Upvote 0
unfocused said:
Jim O said:
The challenge for any company, as you say, is to identify their customers and figure out what they want now and in the future. However, growth comes by finding new customers and by increasing market share, not by losing it due to others getting ahead.

True. But, I don't believe for a single second that the small differences in sensor performance that currently exist have the slightest effect on market share.

I didn't say that they did. But they probably do have a "slight" effect.


unfocused said:
The entire industry is facing significant challenges in two key areas: Economic problems in both mature and developing markets and the seismic collapse of the point and shoot market. Both of these are causing third party companies to try to get a larger piece of the Nikon-Canon DSLR market, which has traditionally been a more lucrative market. At the same time, both the DSLR market and the technology are maturing, slowing the growth in existing markets.

Very true.


unfocused said:
Add to this the fact that the "best" technology is almost never a path to success. Betamax is the most often cited example, but there are many others.

It depends on how one defines "the best". Look at Apple and my earlier comments. Windows was "good enough" but Mac OS X is "better" in almost every way [ducks].


unfocused said:
Someone mentioned Kodak earlier. But the fact is, Kodak did not fail because it had inferior technology, in fact, Kodak was built on inferior technology. The Box Brownie was inferior to other cameras available at the time. Kodak cameras, film and chemicals were always inferior to other brands that were available. Yet, Kodak was a success for more than a century because it followed the "good enough" path.

When Kodak finally failed, it wasn't because Agfa produced better products, it was because their market disappeared.

I guess it's in how one defines "good enough". Kodak made photography available to the masses and commoditized film and consumer-grade cameras. That was exactly my point. They found new markets. People wanted to take snapshots and Kodak made it possible. It was innovative! They also sold paper that was a better value for the typical consumer, and they made huge margins on it. BTW, I have black and white photos from the 1920s-1950s of my father, and many from the 1930s-1950s of my mother, including many of my father and his brothers from World War II, and they have held up quite well. Most if not all are on Kodak paper. So I guess it was "good enough".

Kodak didn't fail solely because their market disappeared. They failed because they didn't anticipate that disappearance in time. They were complacent. They had huge market share and thought it would never go away. Companies that are ahead of the curve survive. Companies that fall behind fail.

BTW, Agfa survived. So did Ilford. So could have Kodak but for extremely poor management.


unfocused said:
You are correct that companies need to find new markets to succeed and stay in business. But while Canon may lose a tiny fraction of a fraction of a fraction of customers over sensor tech, that's not where the real threat is coming from.

True today. And if I am a "typical" customer you are correct. Let the differences expand and that could change. I wouldn't buy a Model T today, unless I was a collector.
 
Upvote 0
Etienne said:
tcmatthews said:
Pi said:
Speaking about emphasis: I find the recent news that Nikon is patenting an adjustable AA filter really exciting. I wish that was Canon.
I wish they would just remove the AA filter and be done with it. Solve the problem in software and down scale. But the movie crowd and some landscape photographers would scream bloody murder. And insist software is just not good enough.

Software is not good enough

Beyond that, I have yet to see an example pair where the AA version couldn't be sharpened to look the same as the non-AA version. Perhaps someone could post such an example; I admit I haven't extensively tested this. But the couple of times sample pairs were available, a quick USM brought them even.
 
Upvote 0
hilduras said:
yes you are right, if I want a video camera, but for now I love to take stills

IMHO there is an improvement at high ISO with the 6D / 5D3 over the 5D2, and an edge over competitors like the D600. DxO may disagree, but when I look at DPReview and Imaging Resource test shots, that's what I see.

Likewise my EOS M is better at high ISO than my 7D. Not by leaps and bounds, but there was some improvement.

Critics portray Canon as standing still with antiquated sensor tech. They will point at the 18 MP APS-C line and say "look how old that is!", ignoring that each iteration brings some improvement. They're looking for a radical improvement in shadow noise (see the Sony patent) or MP, and ignoring small improvements over time.

Critics also rely on DxO measurements, which I find questionable. For example, DxO will tell you there was no DR improvement from the 10D/20D to the 7D. I owned all three and I noticed a significant DR improvement with the 7D.

Canon's moving forward in still photography sensor tech, just not at the pace some people want.
 
Upvote 0
dtaylor said:
hilduras said:
yes you are right, if I want a video camera, but for now I love to take stills

IMHO there is an improvement at high ISO with the 6D / 5D3 over the 5D2, and an edge over competitors like the D600. DxO may disagree, but when I look at DPReview and Imaging Resource test shots, that's what I see.

Likewise my EOS M is better at high ISO than my 7D. Not by leaps and bounds, but there was some improvement.

Critics portray Canon as standing still with antiquated sensor tech. They will point at the 18 MP APS-C line and say "look how old that is!", ignoring that each iteration brings some improvement. They're looking for a radical improvement in shadow noise (see the Sony patent) or MP, and ignoring small improvements over time.

Critics also rely on DxO measurements, which I find questionable. For example, DxO will tell you there was no DR improvement from the 10D/20D to the 7D. I owned all three and I noticed a significant DR improvement with the 7D.

Canon's moving forward in still photography sensor tech, just not at the pace some people want.

Agreed, for the most part. I think the lack of any significant revolutionary leap, in favor of very small evolutionary improvements, will hurt Canon in the long run, when revolutionary leaps are exactly what the competition has been making.

As for DXO, I wouldn't call ALL of their measurements questionable. Of all their measurements, the only one I truly find questionable is Print DR. I understand the need to normalize image size for cross-camera comparisons, but I find the notion that you can gain photographic DR (and stops of it, at that!) simply by downsampling to be extremely suspect. I believe the vast majority of photographers who care about DR are editing in RAW...and the simple fact of the matter is, you don't scale raw. You ALWAYS edit RAW at full size, so any additional DR would come from the Screen DR measurement.
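
To be clear about what that downsampling gain actually is, here is a toy simulation (my own illustration, not DxO's code; the 24 MP count and the pure read-noise model are assumptions) showing how averaging pixels lowers the measured per-pixel noise, which is all the Print DR number reflects:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: a flat mid-grey frame with nothing but additive read noise.
native_pixels = 24_000_000            # illustrative "native" resolution
signal, read_noise = 0.5, 0.01
frame = signal + rng.normal(0.0, read_noise, native_pixels)

# Downsample by averaging blocks of 3 pixels (roughly 24 MP -> 8 MP,
# the print size DxO normalizes to).
block = 3
small = frame[: native_pixels // block * block].reshape(-1, block).mean(axis=1)

print(f"per-pixel noise at native size:  {frame.std():.4f}")
print(f"per-pixel noise after averaging: {small.std():.4f}")
# The ratio is ~sqrt(3), i.e. roughly +0.8 EV of *measured* DR -- a gain in
# the measurement, not in what you can pull out of a single raw pixel.
```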

The worst thing from DXO is their linear, scalar "score". I find scores for such complex devices to be useless, misleading, and in many cases with DXO (e.g. their lens reviews), just flat out wrong.
 
Upvote 0
dtaylor said:
Etienne said:
tcmatthews said:
Pi said:
Speaking about emphasis: I find the recent news that Nikon is patenting an adjustable AA filter really exciting. I wish that was Canon.
I wish they would just remove the AA filter and be done with it. Solve the problem in software and down scale. But the movie crowd and some landscape photographers would scream bloody murder. And insist software is just not good enough.

Software is not good enough

Beyond that, I have yet to see an example pair where the AA version couldn't be sharpened to look the same as the non-AA version. Perhaps someone could post such an example; I admit I haven't extensively tested this. But the couple of times sample pairs were available, a quick USM brought them even.

First, let me state that the AA filter exacts a cost beyond sharpness, and with decreasing pixel size its need is being reduced. If the optical AA filter layer is removed from the sensor, it will improve edge performance on mirrorless cameras by reducing the thickness of the filter stack in front of the sensor.

The AA filter is basically a blur filter similar to a Gaussian blur (an optical low-pass filter), which is why USM brings back the sharpness. One way to remove moire is to apply a Gaussian blur and then use USM to restore the sharpness. All animation and almost all 3D games apply a Gaussian blur to their renderings, and most modern video cards can add it to video in real time. The big issue with this in photos is that you lose some image information. With a strong AA filter you lose information, only some of which can be brought back through USM.
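
As a rough illustration of that blur-then-sharpen idea, here is a sketch using SciPy on a synthetic test pattern. This is not anything from an actual camera pipeline, and the sigma and amount values are arbitrary assumptions on my part:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def optical_lowpass_sim(image, sigma=0.7):
    """Crude stand-in for an AA filter: a small Gaussian blur."""
    return gaussian_filter(image, sigma=sigma)

def unsharp_mask(image, sigma=0.7, amount=1.5):
    """Classic USM: add back the difference between the image and a blurred copy."""
    blurred = gaussian_filter(image, sigma=sigma)
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)

# Toy example: a high-frequency stripe pattern, "filtered" and then sharpened.
x = np.linspace(0.0, 1.0, 256)
pattern = 0.5 + 0.5 * np.sin(2 * np.pi * 40 * x)[None, :] * np.ones((256, 1))
aa_version = optical_lowpass_sim(pattern)
recovered = unsharp_mask(aa_version)

print("contrast of original pattern:", pattern.std())
print("contrast after the blur:     ", aa_version.std())
print("contrast after USM:          ", recovered.std())
```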

Second, when I say software I do not mean only post-processing software. There is nothing to say that Canon could not apply a two-pixel Gaussian blur to the raw readout before combining the phase pixels. This could act as a simple low-pass filter, in place of the optical one, implemented in either software (firmware) or electronic hardware depending on their design. Basically, solve it in software and downsample before you ever touch the file.

I have not looked into Nikon's patent, but if it is not a MEMS-based optical solution, it has to be either an electronic hardware trick or a software trick. I suspect it is an electronic readout low-pass filter trick.

But there will be plenty of people who will complain that all of the above is not enough and want their optical low-pass AA filter.
 
Upvote 0
neuroanatomist said:
jrista said:
The only future hardware update I can conceive of would be Quad Pixel AF, which would support sensor-plane PDAF in both the horizontal and vertical directions.

Could be done as orthogonal dual pixel AF, where half of the pixels are split vertically and half are split horizontally.

From an engineering perspective I doubt that this is necessary. It would be better if half were vertical and half were horizontal, but that is not strictly necessary. I am assuming that a single dual pixel pair is not enough to make a focus determination; they are likely sampling multiple pixel pairs in some given pattern. If you use some of the algorithms used in software defined radio (more specifically, software defined antenna arrays) you can determine vertical and horizontal information. Software defined radio and software defined antenna arrays are the current holy grail of RF research, and Canon has basically implemented an optical version of them to collect phase info for focusing.
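
For anyone curious what "sampling multiple pixel pairs" buys you, the core of on-sensor PDAF is estimating the shift between the image seen by the left halves and the image seen by the right halves. A minimal 1-D sketch of that idea (my own illustration with made-up numbers, certainly not Canon's algorithm):

```python
import numpy as np

def phase_shift_1d(left, right, max_shift=10):
    """Estimate the offset between the left- and right-photodiode signals
    along one row of pixel pairs by brute-force correlation search."""
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        score = np.dot(left, np.roll(right, s))  # simple correlation score
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift  # proportional to defocus; 0 means in focus

# Toy scene: an edge seen 3 pixels apart by the two half-pixel views (defocused).
edge = np.concatenate([np.zeros(50), np.ones(50)])
left_sig = edge
right_sig = np.roll(edge, 3)
print(phase_shift_1d(left_sig, right_sig))  # prints -3: the views are displaced
```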

I doubt that Canon implemented it entirely in software; it is likely hardwired to some extent. But it is easy to see where this is going in the future.
 
Upvote 0
tcmatthews said:
Second, when I say software I do not mean only post-processing software. There is nothing to say that Canon could not apply a two-pixel Gaussian blur to the raw readout before combining the phase pixels. This could act as a simple low-pass filter, in place of the optical one, implemented in either software (firmware) or electronic hardware depending on their design. Basically, solve it in software and downsample before you ever touch the file.

Even if the dual pixel sensor is truly a 40 MP one, 40 MP on FF still needs an AA filter (yes, I know that the D800E has none, or more precisely, a very weak one). On crop, we may be close to a pixel density that no longer requires an AA filter, but this is hard to predict.

But there will be plenty of people who will complain that all of the above is not enough and want their optical low-pass AA filter.

Those would be the people who understand Sampling Theory.
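
To make the Sampling Theory point concrete, here is a toy demonstration (illustrative numbers only, nothing to do with any particular sensor) of detail beyond Nyquist folding back as a coarser false pattern, which is exactly what an AA filter is there to prevent:

```python
import numpy as np

# Sample a fine pattern (70 line pairs/mm in the projected image) with a
# coarse pixel grid (50 pixels/mm, i.e. Nyquist = 25 lp/mm).
detail_freq = 70.0                               # lp/mm of the real detail
sample_rate = 50.0                               # pixels per mm on the sensor
x = np.arange(0.0, 1.0, 1.0 / sample_rate)       # sample positions over 1 mm
samples = np.sin(2 * np.pi * detail_freq * x)    # what the pixels record

# Those samples are indistinguishable from a 20 lp/mm pattern: the detail
# does not vanish, it folds back as a false coarse pattern (moire).
alias_freq = abs(detail_freq - sample_rate)      # 70 - 50 = 20 lp/mm
print(np.allclose(samples, np.sin(2 * np.pi * alias_freq * x)))  # True
```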
 
Upvote 0
Pi said:
tcmatthews said:
Second, when I say software I do not mean only post-processing software. There is nothing to say that Canon could not apply a two-pixel Gaussian blur to the raw readout before combining the phase pixels. This could act as a simple low-pass filter, in place of the optical one, implemented in either software (firmware) or electronic hardware depending on their design. Basically, solve it in software and downsample before you ever touch the file.

Even if the dual pixel sensor is truly a 40 MP one, 40 MP on FF still needs an AA filter (yes, I know that the D800E has none, or more precisely, a very weak one). On crop, we may be close to a pixel density that no longer requires an AA filter, but this is hard to predict.

But there will be plenty of people who will complain that all of the above is not enough and want their optical low-pass AA filter.

Those would be the people who understand Sampling Theory.

I understand Sampling Theory; my point is that we are getting to the point where it may make sense to remove it for good, especially for crop. And personally, I do not really need it there. Some moire does not really bother me, but that is a matter of preference. In cell phone sensors, they are getting to the point where removal of the UV filter likely makes sense as well because of pixel pitch.

You are right about FF; it likely still needs an AA filter for some time to come, at least until (or if) the pixel pitch makes it sensible to remove. I would just err on the side of a weak filter. For video, use software techniques when downsampling, or better, just give us raw.

My first post was really just bait.
 
Upvote 0
dtaylor said:
Beyond that, I have yet to see an example pair where the AA version couldn't be sharpened to look the same as the non-AA version. Perhaps someone could post such an example; I admit I haven't extensively tested this. But the couple of times sample pairs were available, a quick USM brought them even.

You need to try it and pixel-peep, but the difference is there.
You can sharpen the AA version to be about like the non-AA version.
But you can also sharpen the non-AA version to get wow factor... at least in the center region with a good lens.
Edit: I rarely use USM for sharpening fine detail; it messes up edges with too much contrast/halo compared with other sharpening methods.

I love my D800E for that reason; the textural detail it can deliver is very impressive.

I also prefer my K-5 IIs for the same reason: an AA-less 16 MP crop body that delivers amazingly crisp detail. I can see the difference when I go back and shoot sensors with AA filters; the pixel-level detail is not the same.
I haven't had a chance to shoot with the higher-resolution D7100 yet.

Moire hasn't often been a problem, but false color in fine highlight detail has been, so I rarely use these cameras for shots with energetic moving water and other fine specular reflections.
 
Upvote 0
tcmatthews said:
Pi said:
tcmatthews said:
Second, when I say software I do not mean only post-processing software. There is nothing to say that Canon could not apply a two-pixel Gaussian blur to the raw readout before combining the phase pixels. This could act as a simple low-pass filter, in place of the optical one, implemented in either software (firmware) or electronic hardware depending on their design. Basically, solve it in software and downsample before you ever touch the file.

Even if the dual pixel sensor is truly a 40 MP one, 40 MP on FF still needs an AA filter (yes, I know that the D800E has none, or more precisely, a very weak one). On crop, we may be close to a pixel density that no longer requires an AA filter, but this is hard to predict.

But there will be plenty of people who will complain that all of the above is not enough and want their optical low-pass AA filter.

Those would be the people who understand Sampling Theory.

I understand Sampling Theory; my point is that we are getting to the point where it may make sense to remove it for good, especially for crop. And personally, I do not really need it there. Some moire does not really bother me, but that is a matter of preference.

I dunno. It really kind of depends on the apertures you use most, and whether your lens produces enough diffraction at those apertures to produce blur larger than the pixel pitch. Sensors have a REALLY LONG way to go before they out-resolve diffraction-limited lenses at f/5.6 or wider, and there are some lenses on the market that produce diffraction-limited resolution, or near enough that the lens well outresolves the sensor, at those apertures. If you tend to shoot at such apertures, removal of the AA filter isn't necessarily going to do you any good. Your images might be crisper in some cases, but the potential for aliasing and moire will still be higher.
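
As a back-of-the-envelope check (my numbers: 550 nm green light, the Rayleigh criterion, and a ~4.1 µm pitch like the 70D's, all assumptions on my part):

```python
WAVELENGTH_MM = 550e-6   # 550 nm green light, expressed in mm

def diffraction_limit_lp_mm(f_number, wavelength_mm=WAVELENGTH_MM):
    """Rayleigh-limited resolution of an aberration-free lens, in line pairs/mm."""
    return 1.0 / (1.22 * wavelength_mm * f_number)

def sensor_nyquist_lp_mm(pixel_pitch_um):
    """Highest spatial frequency a sensor can sample without aliasing."""
    return 1.0 / (2.0 * pixel_pitch_um * 1e-3)

for f in (2.8, 5.6, 8, 11, 16):
    lens = diffraction_limit_lp_mm(f)
    sensor = sensor_nyquist_lp_mm(4.1)
    print(f"f/{f}: lens ~{lens:.0f} lp/mm vs 4.1 um sensor Nyquist ~{sensor:.0f} lp/mm")
# At f/5.6 the diffraction-limited lens still far out-resolves the sensor, so
# aliasing/moire remain possible; only around f/11-f/16 does diffraction itself
# start doing the AA filter's job.
```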

Nikon's patent regarding a toggleable AA filter is particularly intriguing here, as it would give you the option in the field, optically, to solve the problem if it was necessary, and maximize detail otherwise.

tcmatthews said:
In cell phone sensors, they are getting to the point where removal of the UV filter likely makes sense as well because of pixel pitch.

UV filter? That would mean pixels are approaching 380 nm (0.38 µm). At the moment, the smallest pixels are 1100 nm (1.1 µm), which is approaching the near-infrared spectrum. Once we achieve 700-800 nm pixels, then you could remove the IR cutoff filter (which most digital sensors have). If we ever actually get to 700 nm, then we would already be filtering some red light. Getting to pixels smaller than 400 nm is impossible...as we would then no longer be able to record visible light! :P
 
Upvote 0
tcmatthews said:
In cell phone sensors, they are getting to the point where removal of the UV filter likely makes sense as well because of pixel pitch.

How would removal of the UV filter based on small pixel pitch make sense? UV wavelengths are shorter than visible light. Perhaps you meant the IR cut filter?
 
Upvote 0
I'm pretty sure a sensor made up of pixels smaller than 700nm would just end up collecting excess heat if you let IR light hit it, so it's still in your best interest to filter light not being translated into a signal.
 
Upvote 0
9VIII said:
I'm pretty sure a sensor made up of pixels smaller than 700nm would just end up collecting excess heat if you let IR light hit it, so it's still in your best interest to filter light not being translated into a signal.

You are still gathering that heat. It will either be directly in the sensor itself, or a fraction of a millimeter above it. One way or another, the ambient temperature of the sensor is going to increase, so why put in an unnecessary filter?
 
Upvote 0
jrista said:
UV filter? That would mean pixels are approaching 380 nm (0.38 µm). At the moment, the smallest pixels are 1100 nm (1.1 µm), which is approaching the near-infrared spectrum. Once we achieve 700-800 nm pixels, then you could remove the IR cutoff filter (which most digital sensors have). If we ever actually get to 700 nm, then we would already be filtering some red light. Getting to pixels smaller than 400 nm is impossible...as we would then no longer be able to record visible light! :P

I don't think that's actually true. As best I understand the behavior of light, some light of a given wavelength still passes through holes even if the holes are significantly smaller than the wavelength. It just falls off in intensity, both as the size of the hole shrinks below the wavelength and as the distance between the hole and the detector on the other side increases.

So if we had pixels that were on the order of 300 nm, you might be able to get away with dropping the infrared filter, but you'd also have a lot of red loss by that point, some green loss, and a little blue loss.

That said, I am not a physicist, so I could be understanding things incorrectly.
 
Upvote 0
dgatwood said:
jrista said:
UV filter? That would mean pixels are approaching 380 nm (0.38 µm). At the moment, the smallest pixels are 1100 nm (1.1 µm), which is approaching the near-infrared spectrum. Once we achieve 700-800 nm pixels, then you could remove the IR cutoff filter (which most digital sensors have). If we ever actually get to 700 nm, then we would already be filtering some red light. Getting to pixels smaller than 400 nm is impossible...as we would then no longer be able to record visible light! :P

I don't think that's actually true. As best I understand the behavior of light, some light of a given wavelength still passes through holes even if the holes are significantly smaller than the wavelength. It just falls off in intensity, both as the size of the hole shrinks below the wavelength and as the distance between the hole and the detector on the other side increases.

So if we had pixels that were on the order of 300 nm, you might be able to get away with dropping the infrared filter, but you'd also have a lot of red loss by that point, some green loss, and a little blue loss.

That said, I am not a physicist, so I could be understanding things incorrectly.

Sure, it's not a sudden cutoff to zero light, but it falls off faster than you might think. You would be working with practically no visible light with 380 nm pixels. I don't see the usefulness of a sensor that filters out the vast majority of the light it is supposed to be sensitive to. It takes some rather specialized equipment, for example, to perform subwavelength EUV lithography for CMOS manufacture...I can't imagine a subwavelength sensor would be an easy or cost-effective thing. I guess color splitting in place of color filtration might help...
 
Upvote 0