Read what I write: the real improvements are around 1.1 to 1.4 um sensel size, and there are no APS-C or 24x36 sensors from Canon or others yet with that small a pixel size.
BSI costs about 30% more than FSI.
Improvements like BSI typically improve image quality, both measurably and perceptually, by increasing QE and reducing effects originating from pixel stack height, when comparing two pixels of equal size. At 1.4 um pixel pitch the improvement offered by BSI is small. By 1.1 um pixel pitch, BSI offers a substantial advantage, unless some FSI breakthrough is made. BSI costs more to make, so there is motivation for the FSI breakthrough.
It really depends on the photodiode size. A 7D has 4.3 micron pixels, but the actual photodiode is smaller than that. The entire pixel is surrounded by 500nm (0.5 micron) transistors and wiring, which means the photodiode...the actual light-sensitive area embedded in the silicon substrate...is only about 3.3 microns at best (and usually the photodiode has a small margin around it, so closer to 3 microns). A 24.4mp sensor would have pixels in the range of 3.2 microns; however, with a 500nm process, the actual photodiode pitch is closer to 2 microns.
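That pixel-pitch-minus-wiring arithmetic can be sketched as a toy model (the function name and the margin figure are just illustrations of the reasoning above, not real fab design rules):

```python
def photodiode_pitch(pixel_pitch_um, process_um, margin_um=0.0):
    """Rough FSI photodiode pitch: assume roughly one process-width of
    transistors/wiring on each side of the photodiode, plus an optional
    margin. A deliberate oversimplification, not an actual layout rule."""
    return pixel_pitch_um - 2 * process_um - 2 * margin_um

# 7D-style 4.3 micron pixel on a 500nm (0.5 micron) process:
best_case = photodiode_pitch(4.3, 0.5)        # about 3.3 microns at best
with_margin = photodiode_pitch(4.3, 0.5, 0.15)  # closer to 3 microns

# Hypothetical 24.4mp APS-C pixel (~3.2 microns) on the same process:
small = photodiode_pitch(3.2, 0.5, 0.15)      # closer to 2 microns
```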
Canon has already demonstrated with the 1D X that larger pixels can be huge for overall SNR (and therefore actual light sensitivity). Despite being a FF sensor, the 1D X benefits greatly from its larger pixels, and thus larger photodiodes...as the light gathered scales with the square of the pixel pitch. Production of a BSI APS-C 24.4mp sensor would mean that it could have 3.1 micron photodiodes that perform at least as well as the 7D's 18mp sensor, as total electron capacity scales with photodiode area. A 24.4mp BSI 7D II could then be roughly as capable (~21,000 electron FWC @ ISO 100) as an 18mp FSI 7D.
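The area-scaling claim can be checked with a quick calculation (the ~24,000 e- reference for the 7D's FSI diode is an illustrative assumption to make the arithmetic concrete, not a measured spec):

```python
def scaled_fwc(ref_fwc_e, ref_diode_um, new_diode_um):
    """Scale full-well capacity by photodiode area, assuming FWC is
    proportional to light-sensitive area (the model used above)."""
    return ref_fwc_e * (new_diode_um / ref_diode_um) ** 2

# Assume ~24,000 e- for the 7D's ~3.3 micron FSI photodiode (illustrative):
fwc = scaled_fwc(24000, 3.3, 3.1)  # lands near the ~21,000 e- figure above
```

BSI's higher QE is what would let the slightly smaller diode still perform "at least as well" despite its roughly 12% smaller area.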
Personally, I find that to be quite a valuable thing, especially given that the 7D currently performs rather poorly by today's standards. A 2 micron photodiode in a 7D II would mean SNR suffers even more, which is going to have an impact on IQ, especially for croppers, so I can't imagine Canon doing that.
I was never quite sure about this topic; it seemed very electrical-engineer related, and there were a lot of acronyms and stuff that confused me and made my brain hurt, but this post by jrista is the first time I kinda understand what you guys are talking about! Thanks!
Rookie question - what do BSI and FSI stand for?
Glad it was helpful. Any engineering stuff aside, an image sensor is really just a circuit board with sensors that generate electric charge in response to light stimulus, surrounded by a bunch of electronic logic (transistors, capacitors/resistors, and wiring) designed to make it possible to "read" out the charge of each pixel when told to do so. Generally, as a matter of physics, the larger the area of the sensor, the more light can be detected and converted into charge.
BTW, BSI stands for Backside Illuminated; it has to do with the specifics of how the sensor is manufactured. These nano-scale circuit boards are "etched" onto the surface of highly polished, high grade silicon wafers. Etching occurs via light, which is beamed through a much larger scale "circuit board template" and onto the surface of the silicon (it's a lot more complicated than that, as etching a CMOS device is usually done in layers, with depositions of various materials for each layer and further etchings with different templates...but that's the gist). The "front" side is the side that is etched. Usually, all the logic is etched onto the front side, and the photodiode itself is simply appropriately doped silicon in a grid at the bottom of the "well" created by all the transistors and wiring. Sensors etched in such a way are FSI, or Front Side Illuminated.
Fig 1: You can see the photosite well in this image. The "pixel cathode" is the photodiode. Various wiring surrounds the photodiode. Above the pixel is a color filter and a microlens.
Fig 2: You can see the grid layout of pixels in this image.
A newer technique, originally designed to support the increasingly small photodiode area left available in small form factor sensors (such as the ones that are a fraction of a fingernail in size, for cell phone cameras, cheap point & shoots, etc.), puts the photodiode on the back of the silicon wafer, with the wiring then etched on the front side and connected to the previously etched photodiodes. Color filters and micro lenses are usually etched onto the back side as well, above the photodiode itself. The process is more expensive because, in an FSI design, usually only one side of the wafer needs to be etched or doped; the back side is usually just part of the "substrate", and the number of defects there (scratches, pits, other marks, or even particulate embedded in the surface) does not matter. Since both sides of the wafer are important in a BSI design, both sides of the silicon wafer must not only be polished, but defects must be kept to a minimum. Hence it is more expensive and harder to manufacture.
Fig 3: A Sony BSI sensor design. You can see all of the logic on top (front side), and microlenses, color filters, and photodiode on the bottom (back side). You can see where the photodiode for each pixel is connected to its logic in the middle.
An alternative to BSI design is the LightPipe design. Canon has patents, as well as prototype (and possibly production...not sure) designs, for a 180nm Cu (copper wiring) LightPipe sensor with a double layer of microlenses. LightPipes fill in the well with a high refractive index material. Normally, any light not directly incident on the photodiode itself will convert to heat or possibly reflect, resulting in a loss of light energy and reduced sensor sensitivity; the lightpipe instead guides that light down to the photodiode.
Fig 4: Canon's 180nm Cu LightPipe sensor cross section. This is for a very small sensor, possibly with pixels less than 2 microns in size (as evidenced by the very large wiring blocks next to each pixel, which on a 180nm process means these pixels are quite small).