« on: March 21, 2011, 02:16:20 PM »
I finally got my 7D, and it's an amazing camera. Upgrading from my still-pretty-amazing old 30D(inosaur). Pretty much everything that can be said about using this camera has likely been said on this forum.
One thing I haven't found anywhere online is a detailed description of how noise physically manifests at the pixel level on the 7D (or indeed pretty much any other DSLR). In my obsessiveness, I spent about half an hour looking at 10x enlargements of high-ISO live view and video. I noticed that almost all of the pixels have a specific style of unpredictability which is, well, predictable: certain pixels only glow in varying shades of blue (partially an artifact of the 4:2:0 chroma subsampling?), while others flash red and blue, and still others stay dark with intermittent flashes of white.
The important thing I noticed is that each pixel's behavior is consistent - one that flashes off-blue-white will always be in one of those three states, and won't flash red/green if it wasn't already predisposed to do so. Same for the dull blue ones, which stay in various shades of that state or, if they have the disposition, flash red. This seems to indicate nano-scale "flaws" or slight differences in the way each pixel was lithographed, but I'm not entirely sure. As I understand it, a lot of sensor noise comes from thermally generated electrons (dark current) and the readout electronics rather than from the light itself, so tiny structural differences could make some pixels more susceptible than others.
I understand that for stills there is a fairly effective noise reduction technique based on subtracting a "dark frame" of this noise (long-exposure noise reduction). But given the consistency and placement of noise activity across the sensor, shouldn't it be possible to build a more advanced noise reduction algorithm around the fact that each pixel has its own specific, fairly predictable noise signature?
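To make the idea concrete, here is a minimal NumPy sketch of that kind of per-pixel calibration. Everything in it is hypothetical - the simulated sensor, the per-pixel offsets and noise levels, and the frame count are all made-up illustration values, not anything Canon actually does - but it shows the principle: average many dark frames so each pixel's fixed component can be estimated and subtracted, leaving only the truly random part.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 4, 6  # a tiny toy "sensor" for illustration

# Hypothetical per-pixel noise signature: each pixel gets its own
# fixed offset (the "hot"/blue-glow behavior) and its own random
# noise level (how wildly it flickers).
offset = rng.uniform(0.0, 5.0, (H, W))   # fixed-pattern component
sigma = rng.uniform(0.5, 2.0, (H, W))    # random component per pixel

def capture(scene):
    """Simulate one exposure: scene + fixed pattern + random noise."""
    return scene + offset + rng.normal(0.0, sigma, (H, W))

# Calibration: shoot many dark frames (scene = 0) and average them.
# The random part averages toward zero, so the mean converges on each
# pixel's fixed offset; the per-pixel std is its noise signature.
dark_frames = np.stack([capture(np.zeros((H, W))) for _ in range(200)])
offset_map = dark_frames.mean(axis=0)
noise_map = dark_frames.std(axis=0)

# Correction: subtract the learned offset map from a real exposure.
scene = np.full((H, W), 10.0)
corrected = capture(scene) - offset_map

# The fixed pattern is gone; only the random component remains,
# and noise_map tells a denoiser how much to trust each pixel.
residual = np.abs(corrected - scene)
```

A real denoiser could go one step further and weight its smoothing by `noise_map`, trusting quiet pixels and averaging harder over the known noisy ones - which is roughly the "pixel-specific signature" idea in the post.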