I just received my Nikon D600 yesterday. I was expecting tremendous performance from another soon-to-be legendary Nikon/Sony sensor.
Well... I was quite disappointed.
With both cameras at ISO 12,800 and a 1/1,600 s exposure, using comparable lenses (the Sigma 85mm f/1.4 in Nikon and Canon mounts, both at f/2.8), what I get from the RAW data is shown in the attached 100% crops.
This is really stupid, Nikon. I thought that with most of another year of sensor development, plus the Nikon/Sony legend thrown in, the D600 would outperform the 5D Mark III. Boy, was I wrong.
The Canon image looks almost as smooth as butter under identical processing. The Nikon image looks like it came from a pocket camera or something like that.
I am also starting to get very suspicious about DxO Mark. Why do their Nikon results always appear so quickly? And why do they measure dynamic range against a theoretical definition (black-point to white-point photon level) rather than measuring the actual amount of detail preserved in images that are under- or overexposed by a certain number of stops?
I am thinking about doing a simple and truly mathematical measurement of image noise now, just to see if maybe I'm not giving the D600 enough credit.
The real thing that matters isn't DR or any single spec. Once the space of recordable data is fixed (which it is for raw files), the only thing that matters mathematically is signal-to-noise ratio, which is basically an aggregate of precision and accuracy, the two components of any recording technology, including photography.
Here's my plan, before I do it.
Take two or more successive exposures in RAW at the same settings, and repeat this process to obtain other pairs of image data with various under/normal/over exposure settings and various ISOs.
Then I will measure the actual noise from the difference between the two nominally identical frames. The frame means will first be equalized to account for any tiny variation in exposure time, so whatever remains in the difference is purely the random variation of noise.
Then I will use the old relative-error formula from science, RE = (observed - expected) / expected * 100, and use the RE to calculate the signal-to-noise ratio.
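To make the plan concrete, here is a minimal NumPy sketch of the difference-frame measurement (the function name and the mean-equalization details are my own choices; loading actual NEF/CR2 raw data on top of this would need a library such as rawpy):

```python
import numpy as np

def pair_noise_snr(frame_a, frame_b):
    """Estimate sensor noise and SNR from two frames shot at identical settings.

    frame_a, frame_b: 2-D arrays of raw pixel values from successive exposures.
    """
    a = frame_a.astype(np.float64)
    b = frame_b.astype(np.float64)
    # Equalize the means to cancel any tiny variation in exposure time,
    # so the difference contains only random noise.
    b = b * (a.mean() / b.mean())
    diff = a - b
    # Differencing two independent frames doubles the noise variance,
    # so per-frame noise is the difference's std divided by sqrt(2).
    noise = diff.std(ddof=1) / np.sqrt(2)
    signal = a.mean()
    # One way to aggregate the per-pixel relative error, in percent.
    relative_error = noise / signal * 100.0
    return noise, signal / noise, relative_error
```

On synthetic frames with a known mean of 1000 and noise of 10, this recovers a noise near 10, an SNR near 100, and a relative error near 1%.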
And a single number is meaningless on its own. DxO Mark loves to report the highest ISO at which (in their testing methodology) the SNR falls below roughly 80-85%, a "critical point" of image quality.
But with cameras like the 1D X, the SNR barely falls any further for a long, long way.
However, cameras like the 5D Mark II fall off much, much faster past this "critical point," even though their DxO Mark score isn't very different from the 1D X's.
For example (made-up numbers):

Camera A:
ISO 1,000: SNR = 90%
ISO 10,000: SNR = 75%

Camera B:
ISO 1,000: SNR = 90%
ISO 10,000: SNR = 45%
Obviously, camera A holds up far better at high ISO (75% vs. 45% SNR), but the way DxO Mark reports things, there would only be a meaningless difference of about 1% of one stop in the "ISO score" of camera A vs. camera B.
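Just to put a number on the gap those scores hide (the arithmetic is mine; the SNR figures are the hypothetical ones from the example above):

```python
# Hypothetical ISO 10,000 SNR figures from the made-up example above.
snr_a, snr_b = 75.0, 45.0
# Camera A's SNR expressed as a percentage advantage over camera B's.
advantage = (snr_a - snr_b) / snr_b * 100.0
print(f"Camera A's SNR is {advantage:.0f}% higher at ISO 10,000")  # -> 67% higher
```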
So if I get time (tonight is a RARE few hours off for me), I will try to report the results. And I'll be totally unbiased. I use about half Nikon and half Canon equipment, and I have absolutely no grudges or favoritism on either side.