So sayeth DXO: "The a7R II poops on the 5DS"

Re: Sony A7R II scores 98, new king of DxO Mark (by a nose)

StudentOfLight said:
privatebydesign said:
StudentOfLight said:
privatebydesign said:
You are not averaging, you are adding. To get the lower noise you have to divide by the number of pixels you have added together, in this case four. Any one pixel of information has to fit within the bit depth (for 14 bit that is a maximum value of 16,383), and since the sum of four pixels can't exceed that range either, you have to divide by that same number to keep your range constant.

So take your last block of four. If noise becomes visible, your noise floor, at 3 (for example), you have two noisy pixels, px-ID-14 and px-ID-15. Add the four together and you get 8; divide by four and you get 2 per pixel, which you can't see. Voila, two pixels that had visible noise no longer show it, but you have lost the ability to differentiate detail within those four pixels, so you now have one noiseless pixel instead of four pixels, two of them noisy.

To be sure, your DR has not increased: you don't have a wider range, you can't see below your noise floor, and the bit depth has not increased, because adding four and dividing by four is a zero-sum operation when confined to whole numbers. You have lowered the noise level by averaging/downsampling, though.
Hi PBD, thanks for the reply.

I don't know if I'm just being dense, but I still don't get it. I did include a division step in the averaging in my original spreadsheet; here is an update where I just changed the layout to put the averages at the bottom of the table (see attached).

Is my concept of average image noise flawed (i.e. average image noise = sum of pixel noise divided by number of pixels)?

Yes, you don't average the first group.

So take your first four pixels and say the noise floor is 4: ID-2 and ID-4 are both noisy pixels, and at 100% view those pixels are garbage. Add the four together, divide by four, and the resulting value is 3, so that block of four pixels, now a single number, is no longer noisy. At 100% view that downsampled pixel (the four have become one) is not noisy, but the picture is 1/4 the size it was.
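That add-then-divide step can be sketched numerically; the frame size and noise level below are made up for illustration:

```python
import random
import statistics

random.seed(42)

# Hypothetical dark frame: 10,000 pixels whose true value is 0, each
# carrying Gaussian read noise with a standard deviation of 4.
pixels = [random.gauss(0, 4) for _ in range(10_000)]

# 4:1 downsample: add each block of four pixels, then divide by four
# so the values stay inside the original range.
binned = [sum(pixels[i:i + 4]) / 4 for i in range(0, len(pixels), 4)]

noise_before = statistics.stdev(pixels)  # close to 4
noise_after = statistics.stdev(binned)   # close to 2
print(round(noise_before, 2), round(noise_after, 2))
```

Averaging n independent pixels cuts the noise standard deviation by a factor of about sqrt(n), which is why the blocks of four land near 2 rather than 4.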

This is how stacking multiple exposures reduces noise on a same-size basis: take several exposures of the same thing, add them together, and divide by the number of exposures, and you get less noise while retaining the number of pixels. Basic astrophotography.
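A minimal sketch of that stacking idea, with a made-up scene value and noise level:

```python
import random
import statistics

random.seed(0)

N_PIXELS, N_FRAMES = 5_000, 16
TRUE_SIGNAL = 100.0  # hypothetical constant scene brightness
NOISE_SIGMA = 10.0   # hypothetical per-frame noise

# Each exposure records the same scene plus fresh random noise.
frames = [[TRUE_SIGNAL + random.gauss(0, NOISE_SIGMA) for _ in range(N_PIXELS)]
          for _ in range(N_FRAMES)]

# Stack: add the frames pixel by pixel, then divide by the frame count.
stacked = [sum(f[i] for f in frames) / N_FRAMES for i in range(N_PIXELS)]

single_noise = statistics.stdev(frames[0])  # about 10
stacked_noise = statistics.stdev(stacked)   # about 10 / sqrt(16) = 2.5
print(round(single_noise, 1), round(stacked_noise, 1))
```

The pixel count is unchanged; only the noise drops, by roughly the square root of the number of frames stacked.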
I've asked the question before in a different thread regarding how "noise floor" is calculated, but unfortunately got no reply. I thought that since this example is a dark frame, the average value of all the supposed-to-be-zero pixels would be the noise floor; hence I calculated the average value of 2.5 initially.

My understanding is that the noise floor is a somewhat arbitrary figure given as a percentage of signal. That is, anything below X and above 0 is noise, so in your example it would be more accurate to say any pixel value between 1 and 3 would be considered noisy, but the averaging works just the same: you get less noisy pixels, and fewer of them. So in the simplified imaginary version we are using, 0 is black, 4 is the darkest grey you can discern, and 1-3 are just noise.

Forgive me for having kept editing my reply above while you were replying to it!
 

StudentOfLight said:
I've asked the question before in a different thread regarding how "noise floor" is calculated, but unfortunately got no reply. I thought that since this example is a dark frame, the average value of all the supposed-to-be-zero pixels would be the noise floor; hence I calculated the average value of 2.5 initially.

The noise is not the average value, but the random deviation of the values around that average, i.e. the standard deviation.

If you calculate the relevant statistical values for your example:
Number of pixels 16 ==> avg = 2.5; standard deviation = 2.221
Number of pixels 8 ==> avg = 5; standard deviation = 2.268
Number of pixels 4 ==> avg = 10; standard deviation = 3.65 (although not very meaningful, because the number of pixels is too low)

The average is calculated as (sum of values)/(number of values).

Now, if you take avg/StdDev as a measure of the signal-to-noise ratio, you will see that the lower the number of pixels, the higher the SNR.
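That trend can be reproduced on a made-up dark frame (these are random values, not the ones in the attached spreadsheet): summing adjacent pairs doubles the average but grows the standard deviation only by about sqrt(2), so avg/StdDev climbs at each step.

```python
import random
import statistics

random.seed(7)

# Hypothetical dark frame: pixels that should read 0 but carry random
# positive noise values, like the spreadsheet example.
pixels = [abs(random.gauss(0, 2)) + 1 for _ in range(1024)]

snrs = []
level = pixels
while len(level) >= 64:
    snrs.append(statistics.mean(level) / statistics.stdev(level))
    # Sum adjacent pairs without dividing, as in the spreadsheet:
    level = [level[i] + level[i + 1] for i in range(0, len(level), 2)]

print([round(s, 2) for s in snrs])  # avg/StdDev rises as pixels are summed
```

Each 2:1 summing step multiplies the ratio by roughly sqrt(2), so four steps should roughly quadruple it.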

Oliver
 

StudentOfLight said:
I've asked the question before in a different thread regarding how "noise floor" is calculated, but unfortunately got no reply. I thought that since this example is a dark frame, the average value of all the supposed-to-be-zero pixels would be the noise floor; hence I calculated the average value of 2.5 initially.

It's very commonly defined as SNR=1
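As a sketch of what SNR = 1 means, take a toy sensor model (the 5 e- read-noise figure is made up): a signal of S electrons carries shot noise sqrt(S), so SNR(S) = S / sqrt(S + r^2), and the noise floor is the signal level where that ratio reaches 1.

```python
import math

def snr(signal_e, read_noise_e=5.0):
    # Toy model: shot noise sqrt(S) and read noise r, added in quadrature.
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

# Solve snr(S) = 1:  S^2 = S + r^2  =>  S = (1 + sqrt(1 + 4 r^2)) / 2
r = 5.0
floor = (1 + math.sqrt(1 + 4 * r * r)) / 2
print(round(floor, 2))       # noise floor in electrons
print(round(snr(floor), 3))  # 1.0 by construction
```

Anything below that signal level is dominated by noise; anything above it, the signal wins.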
 

StudentOfLight said:
jrista said:
3kramd5 said:
benperrin said:
3kramd5 said:
benperrin said:
Downsampling everything to 8mp clearly disadvantages the 50mp sensor.

Actually, it is advantageous to the higher-res sensor, in that the noise-reducing averaging improves dynamic range, which is the big-ticket item.

Yes, it helps in terms of noise reduction for the sake of the charts, but even if there is slightly more noise in a 50mp file at the per-pixel level, the extra detail of the 50mp sensor opens new possibilities, such as being able to push noise reduction further.

Within the DXO scoring methodology, downsampling helps, and the more you downsample the better. For whatever reason they don't consider sensor resolution in their camera scores, and instead consider it in their lens scores.

Downsampling always helps, not just with DXO. Downsampling averages pixel data together, which improves signal strength and reduces noise. That will always improve DR. Same reason why stacking multiple frames together when doing astrophotography improves signal strength and reduces noise.
Perhaps I don't understand downsampling and how averages work, but I constructed a simple experiment using Excel. (see attached)

I took a black frame with a hypothetical 16-pixel camera, i.e. a perfectly accurate camera would record a 0 value for each pixel. I then introduced a random noise value for each pixel. I then added pixels together, first in 2-pixel blocks, then 4-pixel blocks, then averaged to get the image noise. It appears the average image noise is the same regardless of how I scale the image down. Am I doing something wrong?
I see the add but where is the average?

bwa
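jrista's point can be shown directly on a simulated black frame (the frame size and noise level here are invented): the spreadsheet compared mean values, which binning leaves unchanged, while the noise, the spread around the mean, is what averaging actually shrinks.

```python
import random
import statistics

random.seed(3)

# Recreate the black-frame experiment with a larger frame: every pixel
# should be 0, but carries a random positive noise value.
frame = [abs(random.gauss(0, 3)) for _ in range(4096)]

# 4:1 downsample by averaging blocks of four (add, then divide by 4).
binned = [sum(frame[i:i + 4]) / 4 for i in range(0, len(frame), 4)]

mean_full, mean_binned = statistics.mean(frame), statistics.mean(binned)
noise_full, noise_binned = statistics.stdev(frame), statistics.stdev(binned)
print(round(mean_full, 2), round(mean_binned, 2))    # essentially identical
print(round(noise_full, 2), round(noise_binned, 2))  # binned is about half
```

Comparing the means, as the spreadsheet did, will always show "no change"; the noise reduction only shows up in the standard deviation.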
 

romanr74

sdsr said:
romanr74 said:
I have a terribly hard time believing that these film-era lenses will perform sufficiently well on such high-resolution sensors...

It all rather depends on what you mean by "sufficiently well". If you mean the sort of technical near-perfection of a Zeiss Otus, maybe not, but some come close to a degree you might find surprising (the hyper-critical Otus fan Ming Thein, for instance, is a big admirer of the Contax/Zeiss series from the 1970s-90s; being able to use them is among the advantages he cites for the Canon 5DS line and, especially, the Sony a7RII).

With fast lenses wide open you might often find plenty to complain about, but a little stopping down often makes them indistinguishable from good modern lenses, while their "flaws" wide open can add a degree of character/atmosphere that for certain sorts of photography is (for some of us, anyway) highly desirable. I use such lenses on my a7R/a7RII more than any other (it also doesn't hurt that they're often a pleasure to handle regardless of image quality, including the older Pentax/Takumars).

There are lots of examples online of images taken with so-called legacy lenses on cameras with high-resolution sensors, both DSLRs and mirrorless (where they're easier to use), including comparisons of vintage vs. new.

Can you share one or two links?
 

3kramd5 said:
dilbert said:
bmwzimmer said:
Looking at the signal-to-noise ratio, dynamic range, color tonality, etc. throughout the ISO range, in both screen and print, it really is a significant leap ahead of the D810 and 5DS R.
Props to Sony for a hell of a sensor design. Unfortunately for Sony, DxO only gives it a point more than the D810, when it should be heaps higher because it does so much better at high ISOs.

I wonder how much the lossy compression hurts the camera score here...

Probably not much. See D800 compared with A7R.


And we will never know. Per a DxO post on their website, they won't re-test it.
 
ahsanford said:
You've got to love DXO's Fox News-ness about their shtick. Amazing.

Number of Canon lenses retested on the 5DS? Zero. They wouldn't want the world's 1,064th best lens, the Canon 70-200 f/2.8L IS II, suddenly becoming the world's 20th best lens.

(Because their lens rating system is dominated by how many pixels are looking at it. You know. That.)

They could not re-test a *single* lens before they got their a7R II review completed. One might expect they'd now withhold all Canon lens retesting until they have the Sony lenses retested as well. Classic DXO.

- A

Forget about the sensor...how does that Sony perform in the rain? I'd like to see the scores on that. ;)
 

StudentOfLight said:
I've asked the question before in a different thread regarding how "noise floor" is calculated, but unfortunately got no reply. I thought that since this example is a dark frame, the average value of all the supposed-to-be-zero pixels would be the noise floor; hence I calculated the average value of 2.5 initially.

There is actually a very specific process you go through, using bias, dark, and flat frames: first calculate the system gain, then, once you have gain, calculate read noise and dark current. You have to account for the bias offset in order to measure the actual standard deviation of the noise, so getting proper bias frames is important. Getting proper flat frames is also important, as they are essential to calculating gain. Depending on what you want to know about dark current (if anything at all), you may need to take quite a few dark frames, and in a very meticulous fashion, in order to avoid oscillating temperatures between dark frames and the like.

When it comes to cameras with Bayer arrays, you need to account for that as well. Most CFA cameras will give you very bad results if you don't extract one color channel from the non-demosaiced (non-interpolated) RAW data; usually green is used.

If you really want to calculate gain, read noise, dark current, etc. you can follow this procedure:

http://www.stark-labs.com/craig/articles/assets/CCD_SNR3.pdf

Follow it to the letter, and you should get fairly accurate results.
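The core of that procedure (the photon-transfer method) can be sketched with a simulated sensor; the gain and read-noise figures below are invented so the recovery can be checked against them:

```python
import random
import statistics

random.seed(1)

TRUE_GAIN = 2.0      # electrons per ADU (hypothetical)
READ_NOISE_E = 6.0   # electrons (hypothetical)
N = 20_000

def expose(mean_electrons):
    """One frame in ADU: shot noise (Gaussian approx.) plus read noise."""
    out = []
    for _ in range(N):
        shot = random.gauss(mean_electrons, mean_electrons ** 0.5)
        out.append((shot + random.gauss(0, READ_NOISE_E)) / TRUE_GAIN)
    return out

bias1, bias2 = expose(0), expose(0)          # no light: read noise only
flat1, flat2 = expose(10_000), expose(10_000)  # uniformly illuminated

# Differencing two matched frames cancels fixed pattern; the variance of
# the difference is twice the per-frame random variance.
flat_var = statistics.variance([a - b for a, b in zip(flat1, flat2)]) / 2
bias_var = statistics.variance([a - b for a, b in zip(bias1, bias2)]) / 2

signal_adu = statistics.mean(flat1) - statistics.mean(bias1)
gain = signal_adu / (flat_var - bias_var)  # e-/ADU, should recover ~2.0
read_noise = gain * bias_var ** 0.5        # electrons, should recover ~6.0
print(round(gain, 2), round(read_noise, 2))
```

This is only the gain/read-noise half of the linked write-up; dark current needs temperature-matched dark frames on top of this.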
 