If the sensor is a 16-bit sensor with some kind of active cooling (no, not necessarily a fan), and Canon doesn't completely botch the ISO 100 and 200 electronic noise, then it could stomp all over the D800. With an extra two bits of information they could push 15 stops of DR, maybe even a little more (but no more than 16.0, since bit depth caps recordable DR at one stop per bit).
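To put rough numbers on that: engineering DR in stops is log2(full-well capacity / read noise), and the ADC bit depth caps what can actually be recorded. A quick sketch, using illustrative made-up sensor figures (not specs for any real camera):

```python
import math

def dr_stops(full_well_e, read_noise_e, adc_bits):
    """Engineering dynamic range in stops: log2(full well / read noise),
    capped by the ADC bit depth (one stop per bit)."""
    dr = math.log2(full_well_e / read_noise_e)
    return min(dr, adc_bits)

# Illustrative numbers only -- not measured values for any real camera.
print(dr_stops(90000, 3.0, 14))  # sensor could do ~14.9, but 14-bit ADC caps it at 14
print(dr_stops(90000, 2.0, 16))  # ~15.5 stops, comfortably under the 16-bit ceiling
```

The point is simply that a 14-bit file can never encode more than 14 stops, no matter how clean the sensor is, which is why the 16-bit rumor matters.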
My guess is that it's still probably their same old sensor tech, but with some kind of efficient cooling to keep the sensor below room temperature (thereby reducing electronic noise), plus extra bit depth. Canon needs the active cooling because either they are incapable of innovating and patenting technology similar to, but different enough from, Sony Exmor, or there simply ISN'T another way to reduce noise electronically the way Exmor does, and Canon either has to pay Sony royalties or do something entirely different.
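The cooling idea is easy to quantify roughly: dark current in silicon sensors approximately doubles for every ~6 °C of temperature rise (the exact doubling interval varies by sensor, so treat that figure as an assumption). A back-of-envelope sketch:

```python
def dark_current(i_ref_e_per_s, t_ref_c, t_c, doubling_c=6.0):
    """Rule-of-thumb dark current scaling: doubles every `doubling_c` deg C.
    `i_ref_e_per_s` is dark current (e-/s/pixel) at reference temp `t_ref_c`."""
    return i_ref_e_per_s * 2 ** ((t_c - t_ref_c) / doubling_c)

# Assumed reference point: 0.5 e-/s/pixel at 25 C (illustrative, not a real spec).
print(dark_current(0.5, 25, 25))  # 0.5 at the reference temperature
print(dark_current(0.5, 25, 13))  # cooled 12 C -> roughly a quarter
```

So even a modest Peltier-style cooler dropping the sensor 10-15 °C below ambient would cut thermal noise several-fold without any new readout tech.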
Am I wrong in thinking that the amount of electronic noise stems from the placement of the image processing unit? Too close to the sensor and there's too much heat for, say, video; far enough away to dissipate the heat and you pick up more electronic noise as the signal passes through the camera?
Maybe they should redesign the chip and do a pure imaging camera meant for the absolute best stills possible.
Yes, bluntly. The biggest reason Nikon/Sony have better 'DR' is that they have a WHOLE lot of ADCs on chip and Canon doesn't. If Canon were to move from 8 channels of readout to something like 32 or 64, they would instantly get a stop more DR. If they manage to get more DR out of a 46mp chip, odds are they've gone on-chip with the ADC (like Sony), and then their DR would be the same as (or more likely a little better than) Sony's, since their sensel/pixel tech is apparently a little better than Sony's.
It's more complicated than that. Sony Exmor puts the ADC on the same die as the sensor itself, which shortens the channel distance from pixel to ADC. It is also a hell of a lot more than 64 ADCs...it's one per column (or per few columns), which means there are thousands of ADCs. That allows each ADC to operate at a far lower frequency, since each one only has to process a small fraction of the sensor's total pixels, and a large part of the reason ADCs add noise to the image is their high operating frequency (which tends to generate electronic noise).
From what I understand, the 1D X already uses a 16-channel readout (8 channels per Digic 5+ processor). Moving to 32 or 64 channels would complicate the image processor (probably at high cost...high-frequency ADCs of the caliber required for something like the 1D X aren't cheap), but probably not allow a full stop of DR improvement. Each ADC would still be responsible for processing nearly 720,000 pixels every time a 46.1mp sensor was read out...whereas with one ADC per column or per two columns, each one would only have to read about 5,500 or 11,000 pixels per readout. By the time you get to the ADC, you've already extracted the pixel...and that pixel already carries the bulk of the electronic noise present in the sensor. The ADC will add some, but it's minimal...a bit of additional noise from the high-frequency current and some quantization error, both of which look very natural and random. At the same time, the long readout path is burning in the nasty kinds of noise...fixed pattern noise, horizontal and vertical banding (crosshatch pattern noise), transistor differential noise (differences in efficiency between pixels), color noise, etc. Even though you have parallelized pixel conversion 64-fold, each ADC has to work with pixels from a lot of different columns, so they can't really correct vertical banding the way a column-parallel (CP-ADC) design can.
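The per-ADC workload figures above are simple division, and easy to sanity-check. Assuming a 46.1mp sensor laid out at roughly 8300 x 5550 (a 3:2 guess, not a confirmed spec):

```python
SENSOR_PIXELS = 46.1e6
COLS, ROWS = 8300, 5550  # assumed 3:2 layout for ~46.1mp; not a real spec

def pixels_per_adc(n_adcs):
    """Pixels each ADC must convert for one full sensor readout."""
    return SENSOR_PIXELS / n_adcs

print(round(pixels_per_adc(64)))        # ~720,000 for a 64-channel readout
print(round(pixels_per_adc(COLS)))      # ~5,500 for one ADC per column
print(round(pixels_per_adc(COLS / 2)))  # ~11,000 for one ADC per two columns
```

Since each ADC's required conversion rate scales directly with its pixel count per frame, the column-parallel design runs each converter over a hundred times slower than a 64-channel one for the same frame rate.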
Canon's pixel technology really isn't better than Sony's. The high ISO capability of the 1D X, 5D III and 6D is thanks to a weaker CFA, which basically allows a lot more green light into the red and blue channels. It was a "cheat", since at the time Canon really didn't have any other way to combat the onslaught of sensor tech improvements from SoNikon. That cheat requires stronger curves to be applied when processing RAW images to compensate and "remove" the extra green from the red and blue channels...so while color can still look great, it's not actually as pure and accurate as it could be.
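The cost of that weak-CFA cheat can be sketched with a color correction matrix: subtracting the green that leaked into red and blue means larger-magnitude coefficients, and for independent per-channel noise those coefficients add in quadrature, so noise is amplified. A toy illustration with entirely made-up matrix rows (not Canon's or anyone's actual coefficients):

```python
import math

def noise_gain(row):
    """Noise amplification of one output channel: with independent
    per-channel noise, coefficients combine in quadrature."""
    return math.sqrt(sum(c * c for c in row))

# Toy red-channel rows (made up): a strong CFA needs little green
# subtraction; a weak CFA needs a lot, so its coefficients are bigger.
strong_cfa_red = [1.2, -0.15, -0.05]
weak_cfa_red   = [1.8, -0.70, -0.10]

print(noise_gain(strong_cfa_red))  # modest amplification
print(noise_gain(weak_cfa_red))    # noticeably higher
```

In other words, the weaker CFA buys raw sensitivity up front but pays it back as extra chroma noise and less pure color after correction.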