DXO uh-oh?

dilbert said:
sanj said:
MacroBug said:
Why does everyone respond to dilbert's nonsense? Can't we just ignore his posts and hope he goes away? It would make this forum much more enjoyable.

It is a bit extreme to stop anyone from posting their viewpoints.

The fun part is trying to get jrista/neuro to be open with people rather than hide their viewpoints and thoughts. "DxO and Nikon are joined at the hip". How many times has that been repeated now, with no substance given as to why anyone should think it, yet nobody wants to back away from saying it?

An alternative explanation is that most people really don't care one way or the other.

The inconvenient truth is that DSLR technology has progressed to the point where differences between brands and even between formats are so insignificant that they seldom, if ever, have any real-world impact on the final product – the photograph.

This forum provides daily proof of Sayre's law: "the intensity of feeling is inversely proportional to the value of the issues at stake."

Perhaps DxO is biased. Perhaps Nikon and Sony have decided to "build to the test." Perhaps the differences being tested are so insignificant that the ratings have only academic and no real-world application. Most likely it's a combination of all three.

It's not like the scores have the tiniest bit of impact on the market. So really, who cares?
 
Upvote 0
unfocused said:
It's not like the scores have the tiniest bit of impact on the market. So really, who cares?

Or even an impact on practical use. If there were any real, practical value to their 'metrics', photographers serious about low ISO performance would be deserting to Sony and Nikon in droves, yet they are not, because in the vast majority of low ISO circumstances there is just no difference, despite all the crap about read noise levels, FPN, etc.

I remember he-who-shall-not-be-named once posted two identical shots from a 5DII and a D800 to show, in his opinion, how much better the shadows were from the Nikon, but in his blinkered focus on pulled shadows he had overlooked the noise in the blue sky from the Nikon! When I pointed this out there was a very hasty edit ;D

So who cares? Well, unfortunately there are a growing number of web-based review sites that quote DxO, perhaps because of the way in which DxO present their data; it's seen as being very scientific. Obviously, to date this has had no detrimental impact on Canon's sales, so it would seem that at the moment the majority of purchasers don't take any notice of what DxO is saying. I wonder whether in time it could start to have an impact, but I suppose by then Canon may have a sensor that scores better on DxO.
 
Upvote 0
sanj, I never stated dilbert couldn't post. However, we control how and when we respond to his posts. I noticed he has not addressed any of the unemotional, logical points made in a number of posts regarding lens ratings. He continually beats the 'joined at the hip' statement to death, which got old in the first two pages of this thread. I learn a tremendous amount through this forum, but not in this thread. I'm out on the rest of this one...
 
Upvote 0
jrista said:
MLfan3 said:
As a multi-system user I have to agree with the DXO guys; they are honest, much more so than DPR or any other unscientific review site online.

The problem is that DXO's "science" is in dispute. How can you trust something that produces inconsistent and obviously incorrect results?
I would be a lot happier with them if they left out the magic ratings numbers.....
 
Upvote 0
jrista said:
MLfan3 said:
As a multi-system user I have to agree with the DXO guys; they are honest, much more so than DPR or any other unscientific review site online.

The problem is that DXO's "science" is in dispute. How can you trust something that produces inconsistent and obviously incorrect results?

Perhaps they should submit their 'science' to the Journal of Irreproducible Results. They may even be worthy of consideration for an Ig Nobel Prize.
 
Upvote 0
jrista said:
MLfan3 said:
As a multi-system user I have to agree with the DXO guys; they are honest, much more so than DPR or any other unscientific review site online.

The problem is that DXO's "science" is in dispute. How can you trust something that produces inconsistent and obviously incorrect results?

I agree. Science demands transparency so that the measurements can be replicated by others. Little about their process is transparent. It's not scientific; it's almost pseudo-scientific.
 
Upvote 0
neuroanatomist said:
jrista said:
MLfan3 said:
As a multi-system user I have to agree with the DXO guys; they are honest, much more so than DPR or any other unscientific review site online.

The problem is that DXO's "science" is in dispute. How can you trust something that produces inconsistent and obviously incorrect results?

Perhaps they should submit their 'science' to the Journal of Irreproducible Results. They may even be worthy of consideration for an Ig Nobel Prize.
DxO documents their sensor testing procedure here:
http://www.dxomark.com/About/In-depth-measurements/DxOMark-testing-protocols/Noise-dynamic-range

DxO results have been independently reproduced at various times. For example:
http://www.dpreview.com/forums/post/33806693
http://www.dpreview.com/forums/post/33833501
 
Upvote 0
jrista said:
MLfan3 said:
As a multi-system user I have to agree with the DXO guys; they are honest, much more so than DPR or any other unscientific review site online.

The problem is that DXO's "science" is in dispute. How can you trust something that produces inconsistent and obviously incorrect results?

So what are the so-called incorrect results?
I think everything they have posted is right; at least it mirrors my own experience. The only issue I have found with DxOMark is their stupid overall score (D810 = 97, D800E = 96, 5D3 = 81, etc.). Other than that, almost all the graphs and numbers they provide seem very correct. Why do you think they are inconsistent, with some obviously incorrect results?
But I think you know much more than I do about this kind of thing, so I would like to hear your view on DXO.
 
Upvote 0
horshack said:
neuroanatomist said:
jrista said:
MLfan3 said:
As a multi-system user I have to agree with the DXO guys; they are honest, much more so than DPR or any other unscientific review site online.

The problem is that DXO's "science" is in dispute. How can you trust something that produces inconsistent and obviously incorrect results?

Perhaps they should submit their 'science' to the Journal of Irreproducible Results. They may even be worthy of consideration for an Ig Nobel Prize.
DxO documents their sensor testing procedure here:
http://www.dxomark.com/About/In-depth-measurements/DxOMark-testing-protocols/Noise-dynamic-range

DxO results have been independently reproduced at various times. For example:
http://www.dpreview.com/forums/post/33806693
http://www.dpreview.com/forums/post/33833501

Your DxO link describes one of their Measurements, which as I've stated on multiple occasions (at least dozens, if not hundreds on these forums) I find generally well done and useful (except when they make errors and deny it, which seems to occur mainly in their lens tests). The problems are not with their Measurements, but with their Scores. Can you provide a link where DxO explicitly describes how their Scores are calculated from the Measurements? No, because they don't disclose the specifics of how those Scores are calculated. Nor do they explicitly describe the bias inherent in their Scores.

FWIW, Peter van den Hamer suggests an approximation he states usually falls within 1-2 points: DxOMark_Sensor_Score = 59 + 4.3*(ColorDepth-21.1) + 3.4*(DynamicRange-11.3) + 4.4*log2(ISO/663) -0.2. He also states, "My guess is that the actual formula is non-linear and may use (under some conditions) coefficients of 5/5/5 rather than 4.3/3.4/4.4." His suggestion that the 'master formula' which DxO uses may be modified under some conditions further supports the claim that DxO's scoring is biased. Yeah, that sounds like good science. NOT.
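For anyone who wants to play with van den Hamer's approximation, here is a quick Python sketch of it. To be clear: this encodes his reverse-engineered estimate, not DxO's actual (undisclosed) formula, and the sample inputs are illustrative numbers, not official measurements.

```python
import math

def approx_dxomark_score(color_depth_bits, dynamic_range_ev, low_light_iso):
    """Peter van den Hamer's published approximation of the DxOMark
    overall sensor score. A reverse-engineered estimate said to fall
    within 1-2 points of real scores; NOT DxO's own formula."""
    return (59
            + 4.3 * (color_depth_bits - 21.1)
            + 3.4 * (dynamic_range_ev - 11.3)
            + 4.4 * math.log2(low_light_iso / 663)
            - 0.2)

# At the formula's reference point every correction term vanishes:
print(round(approx_dxomark_score(21.1, 11.3, 663), 1))   # 58.8

# Illustrative inputs in the range published for recent full-frame sensors:
print(round(approx_dxomark_score(25.3, 14.4, 2853), 1))  # 96.7
```

The second call lands in the mid-90s, which is at least consistent with his claim of 1-2 point accuracy against published scores for high-end sensors.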

As for your 'independent reproduction,' I clicked your first link but to be honest, I stopped reading after, "For some obscure reason - sunspots or moon phase or other strangeness - photons are behaving better today, and I achieved higher FWC results for my D3's than I have before." Sorry, but independent verification of poor pseudoscience with worse pseudoscience is even less valid than two wrongs making a right.
 
Upvote 0
neuroanatomist said:
horshack said:
neuroanatomist said:
jrista said:
MLfan3 said:
As a multi-system user I have to agree with the DXO guys; they are honest, much more so than DPR or any other unscientific review site online.

The problem is that DXO's "science" is in dispute. How can you trust something that produces inconsistent and obviously incorrect results?

Perhaps they should submit their 'science' to the Journal of Irreproducible Results. They may even be worthy of consideration for an Ig Nobel Prize.
DxO documents their sensor testing procedure here:
http://www.dxomark.com/About/In-depth-measurements/DxOMark-testing-protocols/Noise-dynamic-range

DxO results have been independently reproduced at various times. For example:
http://www.dpreview.com/forums/post/33806693
http://www.dpreview.com/forums/post/33833501

Your DxO link describes one of their Measurements, which as I've stated on multiple occasions (at least dozens, if not hundreds on these forums) I find generally well done and useful (except when they make errors and deny it, which seems to occur mainly in their lens tests). The problems are not with their Measurements, but with their Scores. Can you provide a link where DxO explicitly describes how their Scores are calculated from the Measurements? No, because they don't disclose the specifics of how those Scores are calculated. Nor do they explicitly describe the bias inherent in their Scores.
You can read about the methodology of their scores here:
http://www.dxomark.com/About/Sensor-scores

And I describe in detail their low-light score, the score which typically produces the most Canon vs Nikon controversy in online debates:
http://www.dpreview.com/forums/post/41265241

neuroanatomist said:
As for your 'independent reproduction,' I clicked your first link but to be honest, I stopped reading after, "For some obscure reason - sunspots or moon phase or other strangeness - photons are behaving better today, and I achieved higher FWC results for my D3's than I have before." Sorry, but independent verification of poor pseudoscience with worse pseudoscience is even less valid than two wrongs making a right.
She wrote that tongue-in-cheek, and it actually represents humility and a willingness to be open to contrary points of view, signs of a good engineer/scientist. As for her credentials, if you follow her posts on dpreview you'll see she's one of the most informed technical minds on camera sensors. To cite a specific example, she reverse-engineered Nikon's long-exposure noise algorithm, identified serious problems with it, and devised a much-improved alternative that was relayed to Nikon by Thom Hogan and later adopted by Nikon in subsequent camera designs.
 
Upvote 0
horshack said:
You can read about the methodology of their scores here:
http://www.dxomark.com/About/Sensor-scores
One of the tenets of scientific research is that you publish your methods in sufficient detail that someone knowledgeable in the field can repeat your experiments and derive equivalent results. Sorry, but neither on the page you linked nor elsewhere on their site can I find where they state the formula used to calculate their scores. If that is an oversight on my part, can you please link to where they publish that part of their methods? If not, my claim of their "Image Science" being poor pseudoscience remains valid.

horshack said:
She wrote that tongue-in-cheek, and it actually represents humility and a willingness to be open to contrary points of view, signs of a good engineer/scientist.
So your contention is that, to paraphrase her post, 'I cannot get consistent absolute measurements from one day to the next, so I'll present relative data instead' is the sign of a good scientist/engineer? Sorry, but I develop and validate assays for a living, and absolute data with significant inter-run or inter-day variability means a poor assay that needs to be corrected appropriately if possible (and if not, an alternate assay must be developed).

What I am getting from your posts is a better idea of why your 'dot tune' method doesn't stand the test of independent validation, at least in my hands.
 
Upvote 0
horshack said:
You can read about the methodology of their scores here:
http://www.dxomark.com/About/Sensor-scores

I'm sorry, could you point out where the formula is given? I see a few hints about what's considered important, but not the actual formula for the score. All I was able to find was this:

How is Sensor Overall Score measured? The Sensor Overall Score is an average of the Portrait Score based on color depth, the Landscape Score based on dynamic range, and the Sports Score based on low-light ISO.


It says "an average": presumably that's a weighted average, but they don't give the weightings.

Thanks.
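To illustrate why the missing weightings matter, here is a toy sketch. The weights and camera numbers are invented for the example (DxO has not published theirs): two hypothetical cameras with mirrored sub-scores swap rankings depending on which weighting you pick.

```python
def overall_score(portrait, landscape, sports, weights):
    """Hypothetical weighted average of DxO's three sub-scores.
    The real weights are unpublished; these are purely illustrative."""
    wp, wl, ws = weights
    return (wp * portrait + wl * landscape + ws * sports) / (wp + wl + ws)

cam_a = (90, 80, 70)  # imaginary camera strong in color depth
cam_b = (70, 80, 90)  # imaginary camera strong in low-light ISO

# Weighting that favors color depth ranks cam_a first...
print(overall_score(*cam_a, (2, 1, 1)), overall_score(*cam_b, (2, 1, 1)))  # 82.5 77.5
# ...while weighting that favors low-light ISO flips the ranking:
print(overall_score(*cam_a, (1, 1, 2)), overall_score(*cam_b, (1, 1, 2)))  # 77.5 82.5
```

So without the weightings, two people can look at the same measurements and legitimately disagree about which camera "wins" the composite score.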
 
Upvote 0
horshack said:
She wrote that tongue-in-cheek, and it actually represents humility and a willingness to be open to contrary points of view, signs of a good engineer/scientist. As for her credentials, if you follow her posts on dpreview you'll see she's one of the most informed technical minds on camera sensors. To cite a specific example, she reverse-engineered Nikon's long-exposure noise algorithm, identified serious problems with it, and devised a much-improved alternative that was relayed to Nikon by Thom Hogan and later adopted by Nikon in subsequent camera designs.

Right. Scientific GARBAGE. There are many scientists on this forum, myself included, and this doesn't count, sorry. In science you don't get to be "tongue-in-cheek" or get it right only the majority of the time. Either you do good science that's meaningful or you don't, and DxOMark does NOT. We've all read that link, and they do NOT disclose how the scores are derived from the measurements.

Besides, DxOMark isn't relevant. Despite their scoring Nikon/Sony products higher and higher against Canon head to head, Canon has gone from a 4% market-share lead four years ago to a 20% market-share lead now. Nobody cares, or nobody believes them, for exactly that reason: the garbage "science" they are doing.
 
Upvote 0
"For some obscure reason - sunspots or moon phase or other strangeness - photons are behaving better today, and I achieved higher FWC results for my D3's than I have before."

Translation: "The results are inconsistent and I don't know why."

And that is supposed to give me confidence?
 
Upvote 0
The margin of error in her D3 measurements was very small; she included them for completeness, and was open about not understanding their source. The variance was small enough to be immaterial to the results.

As for the exact formula DxO uses for their composite scores, I have not seen it published. If you look at the scores for a cross-section of cameras and relate them to the individual data points DxO publishes (SNR, DR, color selectivity), you can get a general idea of their weighting, but yes, the precise formula is not published. If it were, I imagine we would instead be discussing how the weighting unfairly favors one camera over another, which is the natural consequence of any subjective composite score, and it is why DxO publishes the individual data points for those who want to look behind the curtain.
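The idea of getting a general idea of the weighting from published scores can be sketched with synthetic data: if the composite really is a fixed weighted average and you have enough (sub-score, overall-score) pairs, ordinary least squares recovers the hidden weights. Every number below is made up for the illustration; nothing here is real DxO data.

```python
import numpy as np

# Synthetic stand-ins for published (Portrait, Landscape, Sports)
# sub-scores across four imaginary cameras; not real DxO data.
sub_scores = np.array([
    [89.0, 95.0, 85.0],
    [76.0, 81.0, 79.0],
    [82.0, 88.0, 90.0],
    [70.0, 74.0, 72.0],
])
hidden_w = np.array([0.40, 0.35, 0.25])  # pretend these are the secret weights
overall = sub_scores @ hidden_w          # the "published" overall scores

# Least squares recovers the weighting from the scores alone:
recovered, *_ = np.linalg.lstsq(sub_scores, overall, rcond=None)
print(np.round(recovered, 2))  # ≈ [0.4  0.35 0.25]
```

Of course this only works if the real formula is linear and stable across cameras, which is exactly what is in dispute here.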
 
Upvote 0
horshack said:
The margin of error in her D3 measurements was very small; she included them for completeness, and was open about not understanding their source. The variance was small enough to be immaterial to the results.

Sorry, but your statement does not align well with hers:

I re-tested both of my D3 bodies, plus the new D3s, for this - just to make sure I produced a valid comparison. For some obscure reason - sunspots or moon phase or other strangeness - photons are behaving better today, and I achieved higher FWC results for my D3's than I have before. Because of this discrepancy, I am only going to report relative performance between the D3s and D3, instead of giving absolute measurements.

She states the discrepancy was significant enough that she would not report the absolute values. If the source of the discrepancy could not be identified, it cannot be assumed to be a systematic error, i.e. one which would affect the measurements of the new D3s with similar magnitude and direction as it would the old D3 bodies.
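A quick numerical sketch of that systematic-vs-random distinction (invented numbers, nothing measured): a shared, systematic bias cancels in a relative comparison, but an unexplained per-measurement error does not, so reporting only ratios is safe solely when you know the error is of the first kind.

```python
import random

random.seed(42)
TRUE_D3, TRUE_D3S = 100.0, 120.0  # hypothetical "true" values; ratio = 1.2

# A systematic (shared) bias scales both measurements equally,
# so it cancels when only the ratio is reported:
bias = 1.08
ratio_systematic = (TRUE_D3S * bias) / (TRUE_D3 * bias)
print(round(ratio_systematic, 6))  # 1.2, regardless of the bias value

# An unexplained per-measurement error does not cancel, so the
# reported ratio silently absorbs it:
err_a, err_b = random.uniform(0.95, 1.05), random.uniform(0.95, 1.05)
ratio_random = (TRUE_D3S * err_a) / (TRUE_D3 * err_b)
print(round(ratio_random, 3))  # deviates from 1.2 by an unknown amount
```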

Inconsistent data, flawed assumptions...bad science.


horshack said:
As for the exact formula DxO uses for their composite scores...yes, the precise formula is not published.

'Black box' methods...bad science.

I think we're done here.
 
Upvote 0
neuroanatomist said:
horshack said:
The margin of error in her D3 measurements was very small; she included them for completeness, and was open about not understanding their source. The variance was small enough to be immaterial to the results.

Sorry, but your statement does not align well with hers:

I re-tested both of my D3 bodies, plus the new D3s, for this - just to make sure I produced a valid comparison. For some obscure reason - sunspots or moon phase or other strangeness - photons are behaving better today, and I achieved higher FWC results for my D3's than I have before. Because of this discrepancy, I am only going to report relative performance between the D3s and D3, instead of giving absolute measurements.

She states the discrepancy was significant enough that she would not report the absolute values. If the source of the discrepancy could not be identified, it cannot be assumed to be a systematic error, i.e. one which would affect the measurements of the new D3s with similar magnitude and direction as it would the old D3 bodies.

Inconsistent data, flawed assumptions...bad science.

She has a high standard for what she publishes. Her relative D3s vs D3 results still match DxO's results, and her absolute results are very close as well.

neuroanatomist said:
'Black box' methods...bad science.

I think we're done here.

Seems we couldn't come to an agreement, but I appreciate the discussion.
 
Upvote 0
Admittedly, informational posts for an internet audience don't really need to be held to the rigorous standards of peer-reviewed scientific publication (not to mention that many things published in scientific journals turn out not to be independently reproducible).

horshack said:
Seems we couldn't come to an agreement, but I appreciate the discussion.

As do I...thanks!
 
Upvote 0