Has it ever been rumored that there are prototypes in testing of what would essentially be a physical 1.6x crop of a FF sensor, putting the ISO capabilities of the 6D/5D3 in, say, an EOS M or xxD body?
I'd love me an EOS M with 6-9 megapixels of low light goodness!
Would this be stupid-expensive to develop? I can imagine the right advertising campaign could sell the concept of fewer pixels for low-light, arty, shallow-DoF shooting with the 22mm f/2, with results that are still 2-4x larger than necessary for Facebook.
Your post shows a fundamental misunderstanding of how cameras work.
There is no difference in low-light performance when you add more pixels within the range of pixel sizes that DSLRs typically have. The "more pixels = more noise" issue is only relevant to ultra-compact sensors, like those in smartphones.
There USED to be an issue with more pixels adding more noise, but that was BEFORE microlenses were invented.
Let's just do the math.
According to DxOMark's sensor tests, the Canon 70D has 40.4 megapixels, which are binned into 20.2 MP files, and its ISO performance matches the 5D Mark III (22.3 megapixels) at 2.5 times its ISO setting. The 5DIII has around 5 times lower pixel density than the 70D.
So ISO 1000 on the 70D = same noise as ISO 2500 on the 5D Mark III.
The 70D has a sensor area that is 1/2.5 that of the 5D Mark III. Wait a second... isn't that exactly the ratio between the 70D's and the 5D Mark III's ISO? Yes it is!
Because the sensor is 2.5 times smaller, the GAIN on the smaller sensor has to be 2.5 times higher for a given illumination to get the same exposure. Meaning that if you cropped the 5D Mark III's sensor to APS-C, ISO 2500 would now be called ISO 1000, even though the sensor gain is identical.
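That gain/area arithmetic can be sketched out quickly. A rough check, not an exact one: the 2.5x figure is this thread's rounding, and the sensor dimensions below are nominal Canon values I'm assuming, not numbers from the post.

```python
# Rough check of the gain/area argument above.
# Sensor dimensions are nominal Canon values (my assumption).
ff_area = 36.0 * 24.0      # 5D Mark III, mm^2 -> 864
apsc_area = 22.3 * 14.9    # 70D, mm^2 -> ~332

area_ratio = ff_area / apsc_area   # ~2.6; the post rounds this to 2.5

# Same illumination per unit area, same gain: the smaller sensor's ISO
# label ends up higher by roughly the area ratio.
iso_apsc = 1000
iso_ff_same_noise = iso_apsc * area_ratio
print(round(area_ratio, 2))        # ~2.6
print(round(iso_ff_same_noise))    # ~2600, i.e. roughly the post's ISO 2500
```

The exact ratio comes out nearer 2.6 than 2.5, which is why the measured numbers in the DxOMark comparison only line up approximately.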
With current sensor technology there is no meaningful difference between the noise sensitivity of larger and smaller pixels in DSLRs. That's why Canon could go from the 18 megapixel sensor in the 60D to a 40.4 megapixel sensor in the 70D and actually improve noise performance. That's why the 36 megapixel D800 and the 22.3 megapixel 5D3 have the same noise performance.
The reason why full frames are better than APS-C sensors is aperture and sharpness.
An APS-C camera has the exact same depth of field at equivalent focal lengths to a full frame camera when the APS-C camera is at f/1.75 and the full frame is at f/2.8. Or when the APS-C is at f/2.5 and the full frame is at f/4.0.
That's why the Sigma 18-35mm f/1.8 on crop gives you the same look as a 24-70mm f/2.8 lens on full frame.
What this means is that to match an f/1.4 lens on full frame, you would be shooting at f/0.85 on crop, and there are very few f/0.85 lenses. That's why full frame is better in low light: on crop you are shooting at a much slower equivalent aperture, and the lenses available for full frame tend to be faster.
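The equivalence numbers being used here all follow one simple rule: multiply both the focal length and the f-number by the crop factor. A quick sketch (the helper name is mine, not anything standard):

```python
CROP = 1.6  # Canon APS-C crop factor

def ff_equivalent(focal_mm, f_number, crop=CROP):
    """Full-frame equivalent focal length and f-number for a crop-body lens."""
    return focal_mm * crop, f_number * crop

# The examples from the post:
print(ff_equivalent(18, 1.8))    # Sigma 18-35/1.8 wide end -> ~29mm f/2.9
print(ff_equivalent(50, 1.75))   # f/1.75 on crop ~ f/2.8 on full frame
print(round(1.4 / CROP, 3))      # matching FF f/1.4 needs ~f/0.875 on crop
```

Note that 1.4/1.6 is actually f/0.875; the f/0.85 in the post is just a loose rounding of the same figure.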
Even though you can design faster lenses for crop than for full frame (f/0.85 40mm lenses, for example, exist for crop but not full frame), full frame is better than crop because it is much easier to design a lens with a given equivalent aperture and equivalent focal length for full frame. An f/1.2 50mm lens on crop will look much worse than an 85mm f/2 lens on full frame. The net result is that you have better availability of fast lenses, and at equivalent settings full frame typically has at least twice the resolution and far fewer aberrations.
For example, if you compare a 70-200mm f/2.8 IS II on a 70D @ 70mm f/2.8 vs. a 70-200mm f/4.0 IS L on a 5D Mark III @ 112mm f/4.5, the lens on the full frame has 3.3 TIMES the spatial resolution and detail.
You're talking about equivalence. In a sense, you're right: when it comes to aperture equivalence vs. total sensor area, an f/4 400mm lens used on a 70D is effectively equivalent to an f/8 800mm lens used on a 5D III. That does, however, assume that you actually use cameras that way. I'd like to present a real-world scenario that demonstrates why this isn't necessarily the case.
First off, smaller pixels don't "add" noise. That's a misnomer. Noise is present in the image signal; it is a NATURAL phenomenon derived from the physical nature of light. Smaller pixels divide up an image signal into smaller parts, thus the intrinsic noise in the discretized result is higher. That REQUIRES more gain. In equivalent terms that may not be an issue; however, again...in a real-world situation, things can be and frequently are not equivalent.
Second, microlenses are used on ALL sensors nowadays. The advantage of microlenses is not given solely to APS-C sensors, therefore they confer no relative advantage at all. Microlenses only serve to increase the incident light on the photodiode. That does not change the photodiode's capacity. I'd also point out that even with microlenses today, we aren't anywhere close to 100% capture. Even with double-layered microlenses, both above and below the CFA, there isn't 100% capture. Microlens power, something difficult to control but critical, is usually wrong for the given photodiode pitch. The photodiode is also not level with the readout wiring; the wiring creates a literal "well", deep inside of which the photodiode sits. Microlenses direct more light into the photowell; however, with smaller pixels a greater percentage of that light is lost as incident strikes on the wiring wall itself. Some even reflects back out of the photowell. An area of research right now in CIS manufacture is the production of more accurately curved microlenses that focus more light onto the photodiode itself. Other avenues of research, such as light piping, fill the photowells with a high-refractive-index substance that picks up where the microlenses leave off, channeling more light onto the photodiode. However, in NONE of these cases is light capture ~100%. You still have losses.
Third, the 70D is not a 40.4mp sensor. It is a 20.2mp sensor. Plain and simple. When you bin, you bin; there aren't 40 million much smaller pixels...there are 20 million slightly smaller pixels. The charges of both halves of a DPAF pixel are read out and combined ON THE SENSOR. There is effectively ZERO difference between that and having one full-sized pixel. The 70D is a 20.2mp sensor, always has been a 20.2mp sensor, always will be a 20.2mp sensor. The 32 million pixels (80%) that are used for DPAF are only read out individually for AF purposes. Gain will be twice as high, noise will be twice as high (at least), but this kind of readout does not produce images. It is only used for AF. It is an invisible factor that never affects IQ. From a photography standpoint, the 70D is a 20.2mp camera with a 26726e- FWC. Plain and simple.
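The binning claim is easy to picture with a toy model (entirely my own illustration, not anything from Canon's documentation): two half-photodiodes splitting the same light, summed before readout, are indistinguishable from one full-sized photodiode.

```python
# Toy model of on-sensor DPAF charge binning (my own illustration).
photons = 1000            # light falling on one full pixel site
left = photons // 2       # each half-photodiode collects ~half
right = photons - left
binned = left + right     # charges combined ON the sensor, before readout
print(binned == photons)  # True: same signal as one full-sized pixel
# Reading the halves separately (for AF) needs ~2x gain, hence more noise,
# but that readout path never produces an image.
```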
Finally, there is aperture in-equivalence. In absolutely no way are the smaller pixels of a sensor like the 70D equivalent to the larger pixels of a 5D III when you consider identical aperture at identical framing. Let me take my wildlife photography as an example. I LOVE blurry backgrounds! For blurry backgrounds, I use the fastest aperture I can get away with. I spent $13,000 on a 600mm f/4 lens to help me get blurrier backgrounds (and to get more reach once I moved to FF). I currently use a 7D. It's a great camera; it has served me well for over two years now. But it's just not cutting it when it comes to helping me achieve all of my goals. So I'll be switching up to a 5D III soon.
Now, here are the facts of how the 5D III will affect my shooting. First, I will lose the reach advantage...and the FPS advantage. At least, theoretically. I can always attach a TC or get closer to my subject...getting closer is not much of a problem, especially with wildlife, and when it is, the TC will do quite fine (I've never needed to use a TC in my wildlife photography...in fact I always feel I should be able to get closer, but the 7D crop factor is the limiting factor here). I do not need to stop down; the entire point is to reduce the depth of field, so aperture equivalence (ironically) doesn't apply here...instead of stopping down from f/4 to f/8, I simply leave the aperture at f/4. As for the FPS difference, the 7D has an inherent AF jitter...so it loses about 2-3 fps anyway, so there is really no difference there (the edge might even lean in the 5D III's favor). That leaves all the other differences. The 7D has a smaller sensor, so I have to be back farther to frame, meaning less bokeh. It has smaller pixels, so more noise (the result of higher gain). It has fewer...
The 5D III has the advantage in every respect. Its wider FoV allows me to get closer, which is exactly what I want. Once I'm closer, I frame the subject the same way. That means I'm not only using larger pixels (so less gain), but putting MORE of those larger pixels on my subject! There is no way around it here. Even in my previously reach-limited scenario, with a 600mm lens, reach is not nearly the issue it used to be at 400mm (a subject-area difference of 2.25x relative to the frame, so if the difference between a 7D/70D and a 5D III is 2.5x, moving to the longer lens left me with a mere potential 0.25x reach loss...however, given the 7D's AA filter, we can call it even).
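The reach arithmetic in that last parenthetical works out as claimed: framing scales with focal length, subject area with its square, and the 2.5x figure is this thread's rounded sensor-area ratio.

```python
# Checking the reach arithmetic above.
area_gain = (600 / 400) ** 2   # 400mm -> 600mm: 2.25x more subject area
sensor_ratio = 2.5             # the thread's rounded FF/APS-C area ratio
residual_loss = sensor_ratio - area_gain
print(area_gain)               # 2.25
print(residual_loss)           # 0.25 -- the "mere potential 0.25x reach loss"
```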
The simple fact of the matter is that, while a FF sensor at f/8 gathers roughly the same total light as an APS-C sensor at f/4.5, that isn't how people shoot. We aren't even forced to shoot like that by any real-world conditions most of the time. The crop-sensor advantage really only presents in literal reach-limited scenarios, where you are photographing small birds at a distance, and cannot get closer. The 70D or 7D would then have the advantage...you could use a 400mm f/5.6 lens on a cropped sensor, or an 800mm f/8 lens on a FF sensor...and you would then indeed finally experience the one case where equivalence directly applies. Outside of that...it's all just theory, theory that otherwise indicates that the FF sensor will always have the advantage at the same or faster aperture than the APS-C sensor is used at (i.e. 5D III + 600/4 vs. 70D + 400/5.6...5D III wins hands down every time no contest; beyond that, my 600mm f/4 with a 1.4x TC is 840mm f/5.6, so even when I am limited by reach, once I add the TC I am still not at an "equivalent" aperture...I'm at the same aperture, so the 5D III + 840mm lens STILL wins, hands down, every time, no contest).
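The teleconverter numbers in that last parenthetical also check out: a TC multiplies both focal length and f-number by its magnification, so a 600/4 with a 1.4x TC becomes an 840mm f/5.6. A quick sketch (the helper name is mine):

```python
def with_tc(focal_mm, f_number, tc=1.4):
    """Effective focal length and f-number after a teleconverter."""
    return focal_mm * tc, f_number * tc

focal, f = with_tc(600, 4.0)
print(round(focal), round(f, 1))   # 840 f/5.6 -- the same f-number as the
                                   # 400/5.6, not a slower "equivalent" one
```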
I'd also offer that photographer skill plays a significant role in combating the direct application of equivalence. Even as a bird photographer, the more you hone your skill, the less you will actually NEED a physical reach advantage. Professional bird photographers are much more skilled at getting close enough to frame fully with a FF camera without using a teleconverter than your average novice with a 7D and 400mm lens. Spend five years or more photographing small birds on a regular basis, and you'll ultimately find that you end up having too much reach, and actually need the FF sensor to give you back some FoV. (This is basically where I am now...I'm able to get far closer to the small birds in my back yard than I used to, and I certainly need less reach with the wildlife, and the 7D with my 600mm lens is actually more of a problem than not.)
The theory is sound, but there is more to equivalence than simply reducing camera equipment to equal use cases. The real-world case is that use cases are NOT equal, therefore lending the advantage to FF.