sarangiman said:
jrista said:
The kicker is that RAW editors don't have to honor Canon's bias offset. The entire RAW signal is stored in Canon's files, and the offset is calibrated with a border of masked pixels. Who knows if editors like Lightroom, or DXO, or Aperture actually adhere to Canon's recommended offset. Even if they do, there is still negative signal information that can be pulled up, and the full noise signal is there. With Nikon RAW images...all that negative (deep noise) signal is simply discarded.
I use DeepSkyStacker and PixInsight to calibrate Canon RAW files for integration into a "stack". I use a 200-frame master bias image to subtract the bias signal from each light frame before integrating it. When the bias is removed from Canon RAW files, the dynamic range jumps by almost two stops...which puts it in the same range as Nikon files...
As Horshack mentioned earlier, this has changed with some recent Nikon models. In fact, the D810 has a bias offset of 600 in a 14-bit Raw file at base ISO. This might make it more suitable for your astrophotography, no, jrista?
Also: jrista, you mention averaged dark frame subtraction as increasing dynamic range (DR) by 2 stops for Canon DSLRs - putting it in the same range as Nikon/Sony sensors. I'm sure averaged dark frame subtraction to remove bias and some forms of FPN can help for certain use-cases (e.g. astrophotography), but I'm confused why you mention this here as if it would help any typical, say, landscape shooter suddenly get as much DR with a Canon DSLR sensor as you would with, say, the Sony A7R sensor.
That's simply not true. And I think you admitted this in a later post, but I do think it's important to stress the point.
I think it's unfair to say it's "simply not true." It depends. I've shot landscapes that are perfectly still, with only a slight amount of motion in the sky. Now, usually I photograph at ISO 100. However, if you're going to be integrating frames, you could get away with using a higher ISO and taking more frames at a faster shutter speed. If you bumped up to, say, ISO 400, took a bunch of frames, then integrated them together (you can do that with Photoshop, but it's still better to use a tool like DSS, as it can work on the RAW images themselves rather than demosaiced results), you would very likely GAIN DR in the end.
Why? Because when you integrate multiple frames, even if you don't do any kind of dark or bias frame subtraction to remove read noise, you're averaging those frames together. Averaging reduces noise. So, let's say you have the option of shooting one ISO 100 shot at 1/10th of a second, or four ISO 400 shots at 1/40th of a second. Integrate the ISO 400 shots, and you reduce noise by averaging. You reduce ALL noise, including deeper shadow read noise. In a Canon camera, ISO 400 has as much DR as ISO 100, so you did not lose anything by doing that, but because you could use a higher shutter speed, in the end, after integration, you gain something.
At a shutter speed of 1/40th second, you could probably get away with eight ISO 400 shots. Eight integrated ISO 400 1/40th-second frames are going to have roughly 2.8x (the square root of eight) less noise than the single ISO 100 shot at 1/10th second.
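For anyone who wants to see that in numbers, here's a minimal numpy sketch. It's synthetic data only: it simulates nothing but the per-frame random read noise (the signal level and noise sigma are made-up illustrative values, and the shot-noise difference from the shorter exposures is ignored), so it just demonstrates the square-root-of-N effect of averaging:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a flat patch of deep shadow: a tiny true signal plus
# per-frame random read noise (arbitrary DN units, illustrative only).
true_signal = 4.0    # assumed shadow level
read_noise = 8.0     # assumed per-frame read noise sigma
shape = (1000, 1000)

def integrate(n_frames):
    """Average n_frames simulated exposures of the same scene."""
    frames = true_signal + rng.normal(0.0, read_noise, size=(n_frames, *shape))
    return frames.mean(axis=0)

single = integrate(1)   # stand-in for the lone ISO 100 frame
stack8 = integrate(8)   # stand-in for eight integrated ISO 400 frames

print(single.std())                 # ~8.0
print(stack8.std())                 # ~2.8
print(single.std() / stack8.std())  # ~2.83, i.e. sqrt(8)
```

The ratio printed at the end is where the 2.8x figure above comes from.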
Now, such a thing is completely unnecessary with a camera that uses a Sony Exmor sensor. However, people who use D800s still do true HDR, and some of them will take a good dozen frames for an HDR merge. Firing off eight ISO 400 frames at a relatively high ISO is trivial in comparison.
So, "simply no true"? Really?
sarangiman said:
Now, I know you know this b/c clearly you have a grasp of all this, jrista (stunning image, by the way), but for everyone else - downstream (of ISO amplification) read noise essentially randomly varies the signal, so you can't simply 'subtract' out this random variation to reduce shadow noise (well, not without the usual costs typical NR software pay). Shadows suffer more simply b/c a constant source of electronic noise varies a smaller signal much more than a larger one; hence, shadows pay a larger SNR cost.
You don't subtract it out. Of course not. ;P You AVERAGE it out! Averaging attenuates the standard deviation of noise. You actually literally CANNOT subtract noise, because subtraction increases the standard deviation, making the noise worse. (This is intensely obvious when you start doing astrophotography...I accidentally subtracted a master flat frame once, and the noise was terrible, because both the flat and the light frame had random noise. You normally divide out flat frames to avoid that problem.)
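If you want to convince yourself of the difference, here's a tiny numpy illustration (purely synthetic noise with a made-up sigma, just to show how the standard deviations combine when two independently noisy frames are subtracted versus averaged):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 5.0                                 # same random noise sigma in both frames
a = rng.normal(100.0, sigma, 1_000_000)     # e.g. a light frame
b = rng.normal(100.0, sigma, 1_000_000)     # e.g. a second, independent frame

print((a - b).std())        # ~7.07 = sigma * sqrt(2): subtraction ADDS the variances
print(((a + b) / 2).std())  # ~3.54 = sigma / sqrt(2): averaging attenuates the noise
```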
sarangiman said:
The only way I know of recovering the 'sensor DR' (without the influence of downstream read noise) is to simultaneously shoot two ISOs and combine results. For example, what Magic Lantern does. One can show that a Canon 5D Mark III is - for most practical purposes - 'ISO-less' above ISO 3200. That means that if you simultaneously shoot different rows of pixels at ISO 100 & ISO 3200, you can effectively avoid the downstream read noise effects and get more of the actual sensor DR (which is quite good for modern Canon, and Nikon/Sony, sensors). But then there are all the downsides this method brings...
Shooting two ISOs is just a firmware hack to extract more dynamic range from Canon's whole readout pipeline. The best way to extract more dynamic range is averaging. Your basic noise reduction algorithm works by averaging. When you apply noise reduction to any single frame in Lightroom, or with Topaz Denoise, Nik Dfine, NoiseNinja, NeatImage, or any of those tools, you ARE increasing dynamic range. That's what noise reduction does. It increases dynamic range.
I know LTRLI disagrees, but what we're really talking about here is not actually dynamic range. We're talking about editing latitude. I wish these two things, DR and editing latitude, weren't so intrinsically linked, but they are. Editing latitude, as in the ability to lift shadows, is only one benefit of having more dynamic range. Fundamentally, DR is about less noise. Not just read noise, which only really matters in the shadows, but ALL noise, which exists at every level of the entire image...shadows, midtones, highlights, whites, blacks, everything. Denoise algorithms reduce noise, which means, by definition, they are concurrently increasing dynamic range.
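For what it's worth, the usual engineering definition of DR is just the full-scale signal over the noise floor, so anything that genuinely lowers the effective noise floor raises that number. A trivial sketch (the electron counts are made-up illustrative values, not measurements from any particular sensor):

```python
import numpy as np

full_scale = 15000.0   # assumed saturation level in electrons, illustrative only
noise_floor = 30.0     # assumed read-noise floor in electrons, illustrative only

# DR in stops = log2(full-scale signal / noise floor)
print(round(np.log2(full_scale / noise_floor), 1))         # ~9.0 stops

# Halve the effective noise floor (e.g. by averaging/denoising) -> +1 stop
print(round(np.log2(full_scale / (noise_floor / 2)), 1))   # ~10.0 stops
```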
While the ML Dual ISO technique is certainly one way of reducing Canon's nasty banded read noise, it's not the only way. I have used Topaz Denoise 5 for a few years now. It has both debanding and DR recovery features. The debanding works wonders. I've used it on some really heavily banded astrophotography images to great effect. I've used it to remove the 7D vertical banding as well (which is actually not that difficult to remove; it has a very strict 8-pixel-wide repeating pattern, and Denoise allows you to configure the band width or separation).
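Because the pattern repeats every 8 columns, the basic idea behind that kind of deband is easy to sketch. This is just a toy illustration of the principle (it is not how Topaz Denoise actually implements it, and the image values are synthetic): estimate the average offset of each of the 8 column phases and subtract it back out.

```python
import numpy as np

def deband_columns(img, period=8):
    """Remove a column banding pattern that repeats every `period` columns
    by estimating each phase's offset from the column medians."""
    img = img.astype(np.float64)
    col_med = np.median(img, axis=0)            # one robust level per column
    phases = np.arange(img.shape[1]) % period   # which of the `period` phases each column is
    # Mean offset of each phase, centered so the overall level is unchanged
    offsets = np.array([col_med[phases == p].mean() for p in range(period)])
    offsets -= offsets.mean()
    return img - offsets[phases]                # subtract the repeating pattern from every row

# Toy example: a flat, noisy frame with an 8-column repeating offset added
rng = np.random.default_rng(2)
clean = rng.normal(100.0, 2.0, size=(64, 64))
banding = np.tile(np.array([0, 3, 0, -2, 1, 0, -1, 2], float), 8)
noisy = clean + banding

print(np.std(np.median(noisy, axis=0)))                   # strong column structure
print(np.std(np.median(deband_columns(noisy), axis=0)))   # much reduced
```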
There are a lot of ways of recovering dynamic range. Canon doesn't clip their signal, they offset it instead, so all the image signal data is there and can be recovered by denoising and debanding (which is actually NOT the case with Nikon and Sony cameras). There are certainly caveats and limitations to post-process noise reduction (which, btw, could also be called dynamic range recovery...same thing!). Like any algorithm, push denoising too far and you'll start getting artifacts. But usually it doesn't take all that much to really improve an image. A light touch of full-image NR and maybe a pass of debanding, mask and scatter a very light bit of artificially generated noise in the really deep shadows (to get rid of posterization)...and voila. More DR!
sarangiman said:
Also, I haven't yet seen evidence that the increased DR at base ISO on the D810 is due to 'cooking' the Raw file. In fact, DxO's full SNR curves suggest some sort of nonlinearity introduced into capture at the lowest ISO. The SNR at clipping - where the noise is dominated by shot noise - remains the same for ISO 64 and ISO 100. That indicates to me that for an increased
You should search some of the astrophotography sites. There is a lot of information about how all the various DSLR manufacturers mess with their signals. Even Canon does, to a degree...Craig Stark from Stark Labs (maker of Nebulosity) actually wrote a fairly detailed article about an unexpected shift in Canon's noise curves as ISO is increased. Every DSLR maker cooks their signals...it's just that Nikon does it more. There is actually a Nikon hacker group that has been pounding away at Nikon's firmware in an attempt to remove their black point clipping and recover the entire signal. They seem very close to cracking that nut as well, and they have found that Nikon does indeed do quite a bit of signal cooking.