5DS scores at DxO **now posted**

Don Haines said:
You aren't getting it....

It doesn't matter what the formula is. It's the fact that they move from recording and reporting all the sub-scores (which is good) to creating a formula for an overall ranking....

It does not matter what the formula is. Having it is a bad idea and does a disservice to everyone.

Put a landscape photographer, a wedding photographer, a portrait photographer, a sports photographer, a news photographer, a bird photographer, a studio photographer, and a cat photographer in the same room and those 8 people will come up with 8 different ways to rank which one is the best camera.

A fair rating system and a universally accepted single rating number are mutually exclusive because different needs require different weights on different metrics. IT CAN'T BE DONE!
+1 I agree
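To make that concrete with a toy sketch (the sub-scores, weights and sensor names below are invented for illustration, not DxO's data or formula), the same two sensors swap places depending on whose weights you use:

```python
# Illustrative sketch only: invented sub-scores, weights, and sensor names, not DxO's data or formula.
sub_scores = {
    "Sensor A": {"dynamic_range": 9.0, "color_depth": 7.0, "high_iso": 6.0},
    "Sensor B": {"dynamic_range": 6.5, "color_depth": 7.5, "high_iso": 9.0},
}

# A landscape shooter might weight dynamic range heavily...
landscape_weights = {"dynamic_range": 0.6, "color_depth": 0.3, "high_iso": 0.1}
# ...while a sports shooter cares mostly about high-ISO behaviour.
sports_weights = {"dynamic_range": 0.1, "color_depth": 0.2, "high_iso": 0.7}

def overall(scores, weights):
    """Collapse sub-scores into one 'overall' number via a weighted sum."""
    return sum(scores[k] * weights[k] for k in weights)

for label, weights in [("landscape", landscape_weights), ("sports", sports_weights)]:
    ranking = sorted(sub_scores, key=lambda cam: overall(sub_scores[cam], weights), reverse=True)
    print(f"{label} ranking: {ranking}")

# The 'best' sensor flips with the weights, which is exactly the objection to a single number.
```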
 
Upvote 0
ajfotofilmagem said:
dilbert said:
Orangutan said:
dilbert said:
Orangutan said:
dilbert said:
Don Haines said:
This is the flaw in DXO... they went beyond taking measurements and introduced bias.....
Can you demonstrate with actual scores for different cameras where you see evidence of this bias?
If this were an academic publication, it would be DxO's obligation to reveal their methods so we can determine whether there is bias. The fact that they don't do that is suspicious.
But it's not an academic publication
Many engineering and science-oriented companies strive for that same level of integrity. The fact that DxO is not willing to do so as well is a strong indicator of their lack of confidence in their methods.
What do they gain by doing so, hmm?
They shift the debate in one small corner of the websphere from X to Y...
Do you really think people here will stop hating on DxO if they published their formula?
Will people stop hating a corrupt politician because he has made public his methods for stealing the money of the country he governs?

Some would (those on the receiving end of the theft)
 
Upvote 0
dilbert said:
Maximilian said:
dilbert said:
jd7 said:
...
It may be that the results of some of DxO's sub-tests (for want of a better description) provide useful info, but the fact is DxO promotes its camera scores and lens scores as an important part of its test results. Therein lies the problem, in my opinion.

And that is life.

When you watch Olympians compete in gymnastics, do they get one score out of 10 or do they get multiple scores for each aspect of their performance? Answer: one score out of ten.
I think you should watch gymnastics a little more often, because the scoring system changed a while ago ;)
It is the same as with figure skating:
They don't get just one score anymore but a summary of multiple points and factors for individual parts of their exercise, because there was too much cheating in the background. The scoring system was restructured and laid open for all to see because of that.

Hmm, when did that change? Or have I missed too much TV?

Oh, since the Olympics in Athens (Greece) in 2004.

Change my comment above and replace "gymnastics" with "diving."

In the end, it matters little which example is used.

When it comes to camera sensor scores, the problem here is that DxO publish numbers that some people here don't like, and rather than accept that the numbers are an aggregate of the sub-scores, they dispute them and their formulation. There's no evidence to show that the numbers are wrong and there's no evidence of bias - if there were any evidence of bias, it would be trivial for someone else to set up a similar facility to DxO and publish contradictory results. Nobody has, not even anyone from CR.

So jump up and down and shout all you like about the numbers being wrong or biased, etc, but until you've got your own test rig set up and are producing a better set of results than DxO, grow up and stop acting like a spoiled child.

DxO have got respect from a lot of people with far better credentials than those complaining here on CR. I won't be one to deny people their opinions, but I will remind people that until they've got evidence to show DxO are wrong in how they score sensors, what they're saying is just an opinion, and everyone has one of those...

Special pleading.
Burden of proof.
Appeal to authority.

That's a logical fallacy trifecta!
 
Upvote 0
neuroanatomist said:
9VIII said:
The problem is that the bias against Canon is intentional; they can't "fix" something when it's functioning just as they intend (as a smear campaign and propaganda).

In the past, I've defended DxO against claims of direct brand bias. Pretty disappointing to see the post today showing DxO comparing the 'Professional' D810 to the 'Semi-professional' 5DIII.

Maybe if Aglet asks them nicely, they'll fix that mistake, too. I guess he didn't notice them staunchly defending their initial 70-200/2.8 L IS vs MkII results that they changed a year later; he was probably too busy cursing at his inability to use his 5DII.
Nope, didn't bother looking at that lens on DxOMark, could tell by using it.
I sold it.
 
Upvote 0
scyrene said:
Everyone's entitled to their opinion, and everyone's opinions vary. But given how widely the 5DII has been praised, I have to place your experience as an outlier.

yes, I may have had an outlier of a camera, considering it was one of the earliest ones.
It sure left a bad taste, though. As did every other Digic 4 body I had except the G11.
I'm not re-hashin' that; my opinion, based on personal experience with those products, hasn't changed. The Digic 4 era was, to put it mildly, seriously disappointing to me because of serious FPN issues. It drove me to ABC.
 
Upvote 0
Don Haines said:
A fair rating system and a universally accepted single rating number are mutually exclusive because different needs require different weights on different metrics. IT CAN'T BE DONE!

Sure they can. They're only rating sensors, so it's a simple shortcut, and if you don't like it, they've provided all the base measurement data for you to compare yourself.

Try to fit a single-score rating to a whole camera; then your opinion's perfectly valid.
 
Upvote 0
So this has turned into a DxO-bashing... again.

Don't overlook the fact that the 5DS series provides no technology advance from Canon.
It merely presents a different mix of trade-offs, with a slight enhancement in in-camera processing ability that provides only a minuscule absolute improvement over previous generations of Canon sensors.
FWIW, that's just fine for many people who've wanted more pixels to play with.
At least Canon has FINALLY got the message that FPN was not a feature everyone wanted.
 
Upvote 0
Aglet said:
Don Haines said:
A fair rating system and a universally accepted single rating number are mutually exclusive because different needs require different weights on different metrics. IT CAN'T BE DONE!

Sure they can. They're only rating sensors, so it's a simple shortcut, and if you don't like it, they've provided all the base measurement data for you to compare yourself.

Try to fit a single-score rating to a whole camera; then your opinion's perfectly valid.
ISO? DR? Colour depth? How would you rank them, and with what weights? And why only low ISO? What about high ISO?
 
Upvote 0
Don Haines said:
Aglet said:
Don Haines said:
A fair rating system and a universally accepted single rating number are mutually exclusive because different needs require different weights on different metrics. IT CAN'T BE DONE!

Sure they can. They're only rating sensors, so it's a simple shortcut, and if you don't like it, they've provided all the base measurement data for you to compare yourself.

Try to fit a single-score rating to a whole camera; then your opinion's perfectly valid.
ISO? DR? Colour depth? How would you rank them, and with what weights? And why only low ISO? What about high ISO?

Don, most sensor metrics interact with each other, so yes, you can come up with some kind of simplistic scoring system. It's not perfect, but if someone's interested in comparing 2 similarly ranked sensors they can do so from the base data.

Go ahead, show me 2 similar sensor scores from different mfrs where you think there's some egregious flaw in individual metrics. I DARE YOU. :)

They bias the weighting towards low ISO because that's where overall design and implementation really matter to the maximum achievable results. At higher ISO, similarly sized sensors perform very similarly - it's just physics - so how would you propose to evaluate, rate, rank or score that when most high-ISO shot performance is due to raw conversion NR?
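For what it's worth, here's a back-of-envelope sketch (invented full-well and read-noise numbers, not measurements) of why the low-ISO end is where designs separate and why similar-size sensors converge at high ISO:

```python
import math

# Toy dynamic-range sketch with invented full-well and read-noise numbers (not measured data):
# DR (stops) ~= log2(full-well capacity / input-referred read noise).
def dr_stops(full_well_e, read_noise_e):
    return math.log2(full_well_e / read_noise_e)

# Hypothetical sensor X: noisy readout whose read noise only drops once analog gain rises.
# Hypothetical sensor Y: low read noise already at base ISO.
sensors = {
    "X": {100: (60000, 30.0), 3200: (1875, 3.0)},  # ISO: (full well e-, read noise e-)
    "Y": {100: (60000, 5.0), 3200: (1875, 2.0)},
}

for iso in (100, 3200):
    drs = {name: dr_stops(*vals[iso]) for name, vals in sensors.items()}
    print(f"ISO {iso}: X = {drs['X']:.1f} stops, Y = {drs['Y']:.1f} stops, gap = {drs['Y'] - drs['X']:.1f}")

# With these numbers the gap is ~2.6 stops at base ISO but only ~0.6 stop at ISO 3200:
# once gain swamps the read noise, similar-size sensors converge, so a low-ISO-weighted
# score is where designs separate the most.
```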
 
Upvote 0
+1000 Don.

It can not be done. It is for this reason that DxO beclowns itself when it takes its useful individual metrics and mysteriously compiles them into a magical overall score that says Sony is an "A" student and Canon is a "C" student (at least by the grading scale when I was in high school; an 81 was a high C). It is THAT moment when DxO goes from scientific journalism to editorial opinion. While I love the Sony A7s, DxO claiming it's the top "Sports" camera is just patently hilarious. And it gets that dubious ranking simply by having the most bad-ass high ISO performance. If you want to rank that model as the best at high ISO, say so. But saying Best Sports is a real stretch.
 
Upvote 0
Aglet said:
At higher ISO, similarly sized sensors perform very similarly - it's just physics - so how would you propose to evaluate, rate, rank or score that when most high-ISO shot performance is due to raw conversion NR?

DxO does its analyses on raw data (with the exception of their not-raw SuperRAW, presumably), so it's raw-conversion and NR independent.

PureClassA said:
... a magical overall score that says Sony is an "A" student and Canon is a "C" student (at least by the grading scale when I was in high school; an 81 was a high C)

It doesn't work that way. The score isn't a percentage and there is no ceiling. 70-80 isn't "C range," nor is 90-100 "A" range.
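For a sense of the scale: DxO's own description (quoted a bit further down in this thread) says a one-stop difference moves the Overall Score by roughly 15 points, so a rough conversion like this (the helper and framing are mine, purely illustrative) turns score gaps into approximate stops:

```python
# Rough conversion sketch based on DxO's description quoted below in this thread:
# "a difference of one f-stop offsets the Overall Sensor Score by approximately 15 points".
POINTS_PER_STOP = 15  # approximate figure from that description

def score_gap_in_stops(score_a, score_b):
    """Translate an overall-score gap into an approximate f-stop difference."""
    return abs(score_a - score_b) / POINTS_PER_STOP

# Using scores cited elsewhere in this thread (D7200/A7S at 87, 5D Mark III at 81, 7D Mark II at 70):
print(f"87 vs 81 -> ~{score_gap_in_stops(87, 81):.2f} stop")   # ~0.4 stop
print(f"87 vs 70 -> ~{score_gap_in_stops(87, 70):.2f} stops")  # ~1.1 stops
```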
 
Upvote 0
3kramd5 said:
neuroanatomist said:
3kramd5 said:
You don't need scores to show that there is bias, you need only their description of the scores.

...the Sensor Overall Score describes the results of measurements only on sensors and is essentially related to image noise (for example, a difference of one f-stop offsets the Overall Sensor Score by approximately 15 points)...

Image noise has a more significant impact on the sensor score than color depth, for example. That's called bias. It's perfectly fair for them to set up their own biases (another word could be 'weightings'); I just wish they'd disclose them.

How is DxO ONE SuperRAW™ getting its own separate score a 'measurement only on sensors'?? It's perfectly reasonable for them as a private company to test whatever/however they want, but it ruins their credibility as independent/impartial testers.

I agree completely. Calling it raw so that they can present it relative to single exposures is indefensible.
They are bragging on their website about it being equal to crop cameras (D7200/7D2) and the science behind this achievement. I think they created a raw HDR of 4 pics. Can they beat even the best Exmor scores/numbers by creating a composite of 6 or 8? Why did they stop with an HDR of 4 pics?

" The DxO ONE camera’s score of up to 85 puts it on par with many DSLR cameras, such as the Nikon D7200 and the Sony A7S (both with a score of 87), and is well above such Canon DSLRs as the EOS 5D Mark III (81) and the 7D Mark II (70)."
 
Upvote 0
Yeah, the four-shot RAW HDR file being scored like everyone else's one-shot RAW was a red flag. What I DO like is that someone managed to come up with a RAW HDR (if I'm understanding correctly what they're producing). Granted, we can all shoot 3-4 RAW frames and merge them in LR or PS into a final TIFF or JPG ... but making and KEEPING a RAW HDR file... pretty nifty. Would like to see more on that.

ritholtz said:
3kramd5 said:
neuroanatomist said:
3kramd5 said:
You don't need scores to show that there is bias, you need only their description of the scores.

...the Sensor Overall Score describes the results of measurements only on sensors and is essentially related to image noise (for example, a difference of one f-stop offsets the Overall Sensor Score by approximately 15 points)...

Image noise has a more significant impact on the sensor score than color depth, for example. That's called bias. It's perfectly fair for them to set up their own biases (another word could be 'weightings'); I just wish they'd disclose them.

How is DxO ONE SuperRAW™ getting its own separate score a 'measurement only on sensors'?? It's perfectly reasonable for them as a private company to test whatever/however they want, but it ruins their credibility as independent/impartial testers.

I agree completely. Calling it raw so that they can present it relative to single exposures is indefensible.
They are bragging on their website about it being equal to crop cameras (D7200/7D2) and the science behind this achievement. I think they created a raw HDR of 4 pics. Can they beat even the best Exmor scores/numbers by creating a composite of 6 or 8? Why did they stop with an HDR of 4 pics?

" The DxO ONE camera’s score of up to 85 puts it on par with many DSLR cameras, such as the Nikon D7200 and the Sony A7S (both with a score of 87), and is well above such Canon DSLRs as the EOS 5D Mark III (81) and the 7D Mark II (70)."
 
Upvote 0
PureClassA said:
Yeah, the four-shot RAW HDR file being scored like everyone else's one-shot RAW was a red flag. What I DO like is that someone managed to come up with a RAW HDR (if I'm understanding correctly what they're producing). Granted, we can all shoot 3-4 RAW frames and merge them in LR or PS into a final TIFF or JPG ... but making and KEEPING a RAW HDR file... pretty nifty. Would like to see more on that.

ritholtz said:
3kramd5 said:
neuroanatomist said:
3kramd5 said:
You don't need scores to show that there is bias, you need only their description of the scores.

...the Sensor Overall Score describes the results of measurements only on sensors and is essentially related to image noise (for example, a difference of one f-stop offsets the Overall Sensor Score by approximately 15 points)...

Image noise has a more significant impact on the sensor score than color depth, for example. That's called bias. It's perfectly fair for them to set up their own biases (another word could be 'weightings'); I just wish they'd disclose them.

How is DxO ONE SuperRAW™ getting its own separate score a 'measurement only on sensors'?? It's perfectly reasonable for them as a private company to test whatever/however they want, but it ruins their credibility as independent/impartial testers.

I agree completely. Calling it raw so that they can present it relative to single exposures is indefensible.
They are bragging on their website about it being equal to crop cameras (D7200/7D2) and the science behind this achievement. I think they created a raw HDR of 4 pics. Can they beat even the best Exmor scores/numbers by creating a composite of 6 or 8? Why did they stop with an HDR of 4 pics?

" The DxO ONE camera’s score of up to 85 puts it on par with many DSLR cameras, such as the Nikon D7200 and the Sony A7S (both with a score of 87), and is well above such Canon DSLRs as the EOS 5D Mark III (81) and the 7D Mark II (70)."
Hi,
Once you combine 4 RAW images to produce a single RAW file, it's not RAW anymore... it's just an image saved in RAW format. That's why a lot of serious astrophotographers didn't use Nikon cameras in the past, as Nikon applied some sort of NR to the RAW file which could not be turned off in the camera (not sure about the current Nikon models).

Have a nice day.
 
Upvote 0
PureClassA said:
Yeah, the four-shot RAW HDR file being scored like everyone else's one-shot RAW was a red flag. What I DO like is that someone managed to come up with a RAW HDR (if I'm understanding correctly what they're producing). Granted, we can all shoot 3-4 RAW frames and merge them in LR or PS into a final TIFF or JPG ... but making and KEEPING a RAW HDR file... pretty nifty. Would like to see more on that.

It's not 'a RAW HDR file'. It's four separate RAW images stored in a single file container (kind of like a zipped folder), that are opened and combined on your computer...but only by DxO's software.
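Conceptually, 'store the frames, merge later on the computer' amounts to something like the sketch below (toy data and invented file handling; the real SuperRAW container and merge step are DxO's proprietary software):

```python
import numpy as np

# Conceptual sketch of 'store the frames, merge later on the computer' (invented toy data;
# DxO's actual SuperRAW container format and merge tool are proprietary).
def merge_container(frames):
    """Average a stack of raw mosaics into one output mosaic."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

rng = np.random.default_rng(0)
# Four fake 'raw' frames standing in for the container contents (Poisson/shot noise only):
frames = [rng.poisson(lam=500, size=(4, 6)) for _ in range(4)]
merged = merge_container(frames)
print("single-frame noise (std):", round(float(np.std(frames[0])), 1))
print("merged noise (std):      ", round(float(np.std(merged)), 1))  # roughly halved with 4 frames
```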
 
Upvote 0
jthomson said:
I would like to understand the low ISO scores of the 5DS/5DSR. My understanding was that the larger pixels of the 5DIII and 6D were what gave them their low-light capability. The 5DS/5DSR have smaller pixels yet get similar ISO scores to the 5DIII and 6D. How are they achieving roughly twice the ISO score of the 7DII with a similar pixel size?

I would still argue this is hardware and not testing methodology. The big factor here is the sensor size. Equal pixel size on a bigger sensor produces less noise overall at equal technology, even if the per-pixel noise is the same. The pixels could also have higher quantum efficiency on the 5Ds. Maybe their QE is much higher than on the 5D3 or 6D? It's hard to tell until sensorgen (my favorite site) lists data/measurements.
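A quick idealized estimate (assuming identical pixels and QE and shot-noise-limited behaviour, with round-number pixel counts) of how much the sensor-size difference alone could buy at equal output size:

```python
import math

# Idealized back-of-envelope sketch: identical pixels and QE, shot-noise limited, so at the
# same output size the noise advantage scales with the square root of the total pixel count.
def size_advantage_stops(pixels_big, pixels_small):
    return 0.5 * math.log2(pixels_big / pixels_small)

# Roughly the same ~4.1 micron pitch gives ~50.6 MP on full frame vs ~20.2 MP on APS-C
# (5Ds-class vs 7D II-class counts, used only as round numbers):
print(f"~{size_advantage_stops(50.6e6, 20.2e6):.2f} stop advantage at equal output size")
# ~0.66 stop from sensor size alone; any remaining gap would have to come from QE or readout.
```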
 
Upvote 0
bdunbar79 said:
jthomson said:
I would like to understand the low iso scores of the 5DS/5DSR. My understanding was that the larger pixels of the 5DIII and 6D were what gave them the low light capability. The 5DS/5DSR have smaller pixels yet get similar iso scores to the 5DIII and 6D. How are they achieving roughly twice the iso score of the 7DII with a similar pixel size?

I would still argue this is hardware and not testing methodology. The big factor here is the sensor size. Equal pixel size on a bigger sensor produces less noise overall at equal technology. Even if the per-pixel noise is the same. The pixels could also have higher quantum efficiency on the 5Ds. Maybe the pixels' QE is much higher also than vs. the 5D3 or 6D? It's hard to tell until sensorgen (my favorite site) lists data/measurements.
IIRC, sensorgen just uses DxO data and does curve fitting to come to their figures. Their numbers will always agree with DxO because the source data is pulled from DxO.

So far, according to DxO's reported measurements, the 5Ds cameras are underwhelming. The 5Ds' pixels are performing slightly worse than those of the 7D-II, and clearly worse than the 6D. It's quite disappointing that a camera 6 months newer than the 7D-II and without the split pixels is somehow performing worse. (see attached)

Anyway, I'm quite skeptical of the published results and will definitely wait for Roger Clark's analysis before making any sort of purchase decision. Who knows, perhaps the 5D-IV, 6D-II or A7R-II will provide the right mix of features for my future needs.
 

Attachments

  • DR_px_-_5DsR_v_6D_v_7D-II.jpg
Upvote 0
StudentOfLight said:
bdunbar79 said:
jthomson said:
I would like to understand the low ISO scores of the 5DS/5DSR. My understanding was that the larger pixels of the 5DIII and 6D were what gave them their low-light capability. The 5DS/5DSR have smaller pixels yet get similar ISO scores to the 5DIII and 6D. How are they achieving roughly twice the ISO score of the 7DII with a similar pixel size?

I would still argue this is hardware and not testing methodology. The big factor here is the sensor size. Equal pixel size on a bigger sensor produces less noise overall at equal technology, even if the per-pixel noise is the same. The pixels could also have higher quantum efficiency on the 5Ds. Maybe their QE is much higher than on the 5D3 or 6D? It's hard to tell until sensorgen (my favorite site) lists data/measurements.
IIRC, sensorgen just uses DxO data and does curve fitting to come to their figures. Their numbers will always agree with DxO because the source data is pulled from DxO.

So far, according to DxO's reported measurements, the 5Ds cameras are underwhelming. The 5Ds' pixels are performing slightly worse than those of the 7D-II, and clearly worse than the 6D. It's quite disappointing that a camera 6 months newer than the 7D-II and without the split pixels is somehow performing worse. (see attached)

Anyway, I'm quite skeptical of the published results and will definitely wait for Roger Clark's analysis before making any sort of purchase decision. Who knows, perhaps the 5D-IV, 6D-II or A7R-II will provide the right mix of features for my future needs.

I'm talking about the 12.4 stops of DR for print DR. Sorry.
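For anyone comparing Print vs Screen numbers, here's my arithmetic (not DxO's tool) for the 8 MP normalization DxO describes, applied to that 12.4-stop Print figure on the assumption of a 50.6 MP sensor:

```python
import math

# My arithmetic for the 8 MP 'Print' normalization DxO describes (not their tool):
# downsampling N megapixels to 8 MP averages noise, adding about 0.5*log2(N/8) stops of DR.
def normalization_boost_stops(megapixels):
    return 0.5 * math.log2(megapixels / 8.0)

boost = normalization_boost_stops(50.6)  # assuming a 50.6 MP sensor
print(f"Print normalization boost: ~{boost:.2f} stops")
# If the quoted 12.4 stops is the Print figure, the per-pixel ('Screen') DR it implies is:
print(f"implied per-pixel DR: ~{12.4 - boost:.1f} stops")
```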
 
Upvote 0
Let me ask a naive question --

Let's assume that the DxO measures are correct and the Sony sensors have better dynamic range, ....

In the end, so what? How much of a difference really matters? Does DxO measure anything that really matters, or are the differences small enough not to be important except in a few rare cases?
 
Upvote 0