DxOMark Sensor Performance: Nikon vs. Canon

neuroanatomist said:
Have you tried DPP instead of LR?

Hi neuroanatomist,

Yes, I tried DPP, since threads quickly popped up (I think on dpreview) claiming that DPP fixed the issue. It most certainly did not, IMHO. To me, the default noise reduction settings are what helped cover up the banding. But once you turned all that off, the pattern noise was still there.

Furthermore, DPP offers less latitude for lifting shadows than Lightroom, making comparisons of more drastic edits impossible.
 
Upvote 0
DB said:
No academic journal would publish a DxO report without both Data (in an Appendix) and Methods (with a clearly defined algorithm stating the parameters for weighting each category). In the world of peer review, Black-Box methodology would simply have REJECTED stamped on it and returned.

Yes, it'd be wonderful if they published their full methodology & made their RAW files available. As you mention, that's what'd have to happen in a peer-reviewed journal. I think they might gain more credibility if they did these things.

As for the lens tests -- how many copies of the 70-200 f/2.8L II did they test? The II is most certainly sharper than the Mark I wide open on the few copies I've handled. Copy variability can definitely skew results, as has been mentioned before. Not so much the case with sensors, which is why I trust their sensor data.
 
Upvote 0
DB said:
neuroanatomist said:
Define capable, and in your definition please address their evaluation of the performance of the 70-200mm f/2.8L IS II, which they score lower than the MkI version of that lens. :o

No academic journal would publish a DxO report without both Data (in an Appendix) and Methods (with a clearly defined algorithm stating the parameters for weighting each category). In the world of peer review, Black-Box methodology would simply have REJECTED stamped on it and returned.

For a living I don't take pics, but I do publish papers in theoretical physics. Now, compared to the complete lack of rigor in photography testing (at least from my very limited experience as a reader of forums, blogs, and other online sources), the DxO tests seem outstanding, especially when compared to those of Ken Rockwell and company (including DPreview, which now roots for Canon like a cheerleader in the interpretation of their results). Are the DxO tests rigorous enough to be published in a serious scientific journal? Very likely, no. But for what they are meant to do (publicize their software), they are outstanding.
 
Upvote 0
My 3.2MP Pentax Optio S took wonderful pictures. My Canon 20D was soft over the entire 20K+ frames I took with it. The 5D Classic I had was superb. As is my 5DMkII. But I prefer the images from my old 1DsMkII. What does all the rhetoric actually mean without the burden of proof? Where are the PICTURES?!? 8 pages of hyperextended talk about this or that with zero pictures to back up or explain any of the viewpoints!

I like the way a Canon body fits my hand. I like the build of a Pro series 1D. I do not like the ergonomics of Nikon. Either one takes pictures. Neither one composes them. Not a single camera out there sets up the lighting for a perfect shot. Whose perfect shot? Well, that would have to be the client's. Not the guy who buys the camera, but the multiple clients that pay for it! My client last night was tickled to death with the shots of her business taken with Canon's 5DMkII. That's what counts. Does she know about DxO? Does it matter? Does she or any of my other clients know whether Canon, Nikon, Pentax, Sony or Olympus would give them the look they want? Or do they simply trust me to deliver? So in the end isn't it always a subjective issue based on the need and the delivery and NOT anything at all to do with test results in the lab?

I look at DxO. I look at SLR Gear. I also read DP Review's outlook as well as delve through the pages here. And the only thing I can say with absolute certainty is that I'm not getting my Client's wedding album built while doing any of this.

Show Me The Pictures! 8)
 
Upvote 0
mystic_theory said:
For a living I don't take pics, but I do publish papers in theoretical physics. Now, compared to the complete lack of rigor in photography testing (at least from my very limited experience as a reader of forums, blogs, and other online sources), the DxO tests seem outstanding, especially when compared to those of Ken Rockwell and company (including DPreview, which now roots for Canon like a cheerleader in the interpretation of their results). Are the DxO tests rigorous enough to be published in a serious scientific journal? Very likely, no. But for what they are meant to do (publicize their software), they are outstanding.

I agree.

I'm also in the sciences & therefore appreciate that DXO is significantly closer to rigorous testing (at least for their sensors) than most other tests out there. Roger Cicala is also doing a good job on his blog over at lensrentals. Bill Claff does excellent work as well. Emil Martinec has written outstanding treatises. I like that photography does, in many instances, attract science/engineering-minded folk!

Incidentally, I've been working on a methodology to rigorously quantify AF accuracy & precision (much like what Cicala has started to do, although he's using a different method), as I find a serious dearth of such information on any site (but lots of qualitative statements everywhere, like, 'I feel like this lens focuses 100% accurately!!'). I think problems like the D800 left AF issue should be easily identifiable by consumers; many other such problems go unnoticed but actually do affect real-world shooting, surfacing as reduced hit-rates of focus. For example, D800s sent in for the left AF issue, as well as some newer D800 bodies probably going through the same calibration center, now exhibit large front focus of the center AF point in relation to the leftmost & rightmost AF points (which agree with each other). This is not something that microadjustment can fix, and it's something that'd go unnoticed by most shooters who microadjust using only the center point.

When buying new lenses/bodies, it'd be great to know what sort of AF accuracy & precision we can expect from the combo, and it's not unreasonable to expect wildly varying results given what a complicated process AF is (e.g. correcting for a lens's spherical aberration, which affects the offset the AF system must apply to the phase data from the AF sensors, which themselves only evaluate light from the outer edges of the lens).

The thing is, this stuff is quantifiable. And I'm glad someone (DXO) is doing it properly for sensors. I'd like to see it done for AF, & Cicala has made a great start with his tests.
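
To give a rough idea of what I mean (just a sketch with hypothetical numbers, not my actual test protocol): accuracy is the systematic focus offset over repeated trials, precision is the shot-to-shot spread, and the hit rate is what you actually feel in the field.

Code:
import statistics

def af_accuracy_precision(offsets):
    """offsets: focus errors from repeated AF trials on the same target,
    e.g. in depth-of-field units or mm from the intended focus plane."""
    accuracy = statistics.mean(offsets)    # systematic front/back focus
    precision = statistics.stdev(offsets)  # repeatability of the AF system
    return accuracy, precision

def hit_rate(offsets, tolerance):
    """Fraction of shots whose focus error is within +/- tolerance
    (e.g. half the depth of field at the shooting aperture)."""
    return sum(abs(o) <= tolerance for o in offsets) / len(offsets)

# Hypothetical trials for one body/lens combo wide open:
trials = [-0.8, -0.5, -1.1, -0.2, -0.9, -0.6, -0.7, -1.0, -0.4, -0.8]
acc, prec = af_accuracy_precision(trials)
print(f"accuracy {acc:+.2f}, precision {prec:.2f}, "
      f"hit rate {hit_rate(trials, 0.5):.0%}")

A consistent negative mean is front focus that microadjustment can correct; a large spread can't be dialed out and simply shows up as a lower hit rate.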

As for dpreview, actually look at what they had to say about Canon & the 6D:

"Overall, though, it's difficult to shake the feeling that the EOS 6D simply lacks the 'wow' factor of its main rival. Whereas Nikon seems to have taken the approach of taking away as little as possible from D800 when creating the D600, Canon appears almost to have gone the other way, removing as much as it thinks it can get away with at the price. The result is the kind of conservative, slightly unimaginative design that's become the company's hallmark. It's still bound to be a very good camera, of course; just perhaps not quite as good as it could be."

http://www.dpreview.com/previews/canon-eos-6d/6

I think dpreview does a good job of trying to remain unbiased.
 
Upvote 0
neuroanatomist said:
sarangiman said:
Zlatko said:
I do weddings and portraits with the 5D3 and have never seen any banding issue.

I see banding in my 5DIII images just from having Lightroom automatically correct the vignetting for my 24/1.4 & 35/1.4 lenses. I just try to ignore it. :'(

Have you tried DPP instead of LR?

I use LR as that is what I have used for years now and it has a seamless route to CS5. I really want my 5D3 to work for me, but in areas where its dynamic range is challenged, it fails the test. There are many things it does very well. But I can't live with having to smudge all the details with Dfine just because Canon cannot get a grip on chroma noise banding at ISO 100-640. When it gets to 800 or above, the MkIII makes its mark. The only time as a landscape shooter I will need to use ISO 800 is when I have lost my tripod or am too lazy to get it out of my backpack. Both have happened before. I should be able to push a little detail out of the shadows without the whole shot being ruined. That I can do with the D800. I cannot with the MkIII.
 
Upvote 0
If you folks are concerned with DR then just go back to film because film is still the king.
Dedicated 35mm scanners are quite cheap these days, you know?

Anyone who claims his gear's limitations are what is holding him back from producing stunning images is a liar.
 
Upvote 0
PVS said:
If you folks are concerned with DR then just go back to film because film is still the king.

Not necessarily. If you require a higher acceptable SNR on the low end, digital sensors trump negative film.
http://www.dxo.com/var/dxo/storage/fckeditor/File/embedded/2012%20Film_vs_Digital_final_copyright.pdf

Of course, this also depends on the size of the film, since larger film formats will allow more detail to be pulled out of the shadows while maintaining acceptable SNR for reasonably sized prints.
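
Just to illustrate how per-pixel "engineering" dynamic range is usually defined (hypothetical numbers, not DxO's measurements): it's the ratio of full-well capacity to read noise, expressed in stops.

Code:
import math

def dynamic_range_ev(full_well_e, read_noise_e):
    """Dynamic range in stops for a pixel with the given full-well
    capacity and read noise, both in electrons."""
    return math.log2(full_well_e / read_noise_e)

# Two made-up sensors at base ISO:
print(f"Sensor A: {dynamic_range_ev(67000, 33):.1f} EV")  # noisier readout
print(f"Sensor B: {dynamic_range_ev(45000, 3):.1f} EV")   # very clean readout

Demanding a higher minimum SNR in the shadows effectively chops stops off the bottom of that figure, which is exactly where film falls behind.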

P.S. Just for that paper alone, DXO rocks :)
 
Upvote 0
PVS said:
If you folks are concerned with DR then just go back to film because film is still the king.
Dedicated 35mm scanners are quite cheap these days, you know?

Anyone who claims his gear's limitations are what is holding him back from producing stunning images is a liar.

Depends on the film. Velvia 50, for example, has very low DR. I use various films in medium format bodies. Dynamic range can be very important or not important at all. Images from the MkIII often have blocked shadows. You end up with the choice of blowing highlights, using HDR, or pushing detail from the shadows. I prefer the latter if I must, because there is only so much detail that you can get from a blown-out highlight, and I always shoot RAW. HDR is hit and miss, and you can lose too much contrast.

Banding chroma noise isn't acceptable to me, so I have to take action. The MkIII is capable of taking great images, but if there are shadows you want to lift, forget it. If I had to pixel peep to see it, it wouldn't be an issue. But I have had it visible at web-size dimensions. For £3200 it isn't acceptable.
 
Upvote 0
mystic_theory said:
...Are the DxO tests rigorous enough to be published in a serious scientific journal? Very likely, no. But for what they are meant to do (publicize their software), they are outstanding.

As I stated, their testing methodology is rigorous, and the data are of high quality. It's their data interpretation that's flawed.
 
Upvote 0
Nice post sarangiman!

Nice post sarangiman. It would be great to see a bit more scientific method in the testing and review of DSLRs.

I had to learn about the poor quality of Canon autofocus on my own, because I could find no really useful information on the net, back in the old days - say about 1-2 years ago.

I am currently awaiting the arrival of a 5D Mark 3 with great hopes for my future autofocus happiness. I've studied Roger Cicala's articles in great detail and I am prepared to buy Canon lenses that use the new autofocus method if I feel the need. I'm going to try the Mark 3 with my current lens collection first to see how it works after careful micro-adjust. I'll let folks on this forum know how it works out.

Mike
 
Upvote 0
I think you'll be happy with the Mark III, Mike. Actually Canon's AF as implemented in the Mark III in combo with its 24L, 35L, & 85L primes is the only reason I'm sticking with Canon right now. The DR & Auto ISO implementation on the D800 would otherwise be enough to make me want to switch. Well, and native use of the 14-24 :)

Note that for primes you'll want to microadjust based on your shooting habits. Since I tend to use the primes to shoot rather close subjects (that's more my style), I microadjust using a LensAlign at a distance of 25x the focal length. I think manufacturers tend to suggest 50x the focal length as the best compromise. Certainly, using 25x the focal length, infinity no longer focuses properly at wide apertures... it's a shame camera companies haven't implemented some sort of interpolation for microadjustment values based on subject distance (like they've done for focal length of zoom lenses). Maybe it's coming?
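
For reference, the 25x vs 50x distances work out as follows (trivial arithmetic; the focal lengths are just examples):

Code:
# Calibration target distance at 25x and 50x the focal length.
for focal_mm in (24, 35, 85):
    d25 = 25 * focal_mm / 1000.0  # metres
    d50 = 50 * focal_mm / 1000.0
    print(f"{focal_mm}mm: 25x = {d25:.2f} m, 50x = {d50:.2f} m")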

This does affect me in real-world shooting. For example, the other day I happened to be using my 85L in a non-standard way, shooting more distant subjects simply b/c of the way the event was set up. I noticed I got better results by resetting my microadjustment to 0 (which works best for distant subjects).

And then there are days where the microadjustment just seems to be off... haven't quite figured that one out yet... but luckily, it's not too often.

The good thing about the Mark III is that its precision is good enough that usually you can tell if the lens is front focusing or back focusing. Not so much with my Mark II, which had such low precision that I just couldn't tell b/c focus was all over the place.

Now, I realize I'm placing high demands on the system b/c I'm shooting below f/2.0... but why else buy a prime if not for that almost 3D look of having a subject pop out from a blurred background? That's generally what I'm interested in when I'm using primes... not always, but most of the time.

The Mark III brings me closer to achieving that, without having to take 100 shots just to get 15 or 20 in focus at f/1.4.

The Nikon D800 also has very good (similar to the Mark III) precision. But there are other issues with lenses that I won't get into here, though I will hopefully write an article about them soon. Furthermore, their focus points being totally miscalibrated with respect to each other is just something I don't wish to deal with. Luckily, my 5DIII focus points are pretty consistent (the leftmost one slightly backfocuses compared to the others, but it's acceptable and nowhere near as drastic as what I've seen testing 4 different D800 bodies).
 
Upvote 0
neuroanatomist said:
mystic_theory said:
...Are the DxO tests rigorous enough to be published in a serious scientific journal? Very likely, no. But for what they are meant to do (publicize their software), they are outstanding.

As I stated, their testing methodology is rigorous, and the data are of high quality. It's their data interpretation that's flawed.

Yes, this.
 
Upvote 0
sarangiman said:
dtaylor: How do you test DR?

Stouffer step wedge and visually inspect the results.

Furthermore, your results match DPReview? DPReview doesn't test RAW dynamic range...

They used to report RAW and JPEG.

My own 'real-world' tests also show ~3EV better DR on the D800 when I do side-by-side shots of high DR sunsets with my 5DIII vs. D800;

I have a hard time believing 3 stops, though I must admit I have not formally tested these bodies.

Put another way: I have to overexpose my 5DIII by 2 to 3 stops at the very least to get its shadows to match the cleanliness of lifted shadows of the D800 file that was underexposed to maintain highlights.

"Match the cleanliness" is a wide open question. Are you matching at 200% in PS or in a 20" print? And what constitutes a "match"? Does the 5D3 not "match" if there's a hint of noise that's irrelevant to 99% of uses? And to what degree does color play a role? (When you push RAW converters you can often recover detail that is correct in terms of tone, but incorrect in terms of color. How much of this you're willing to accept will alter the final judgement on DR.)

So, respectfully, I fail to see how DXO's DR & SNR, etc., numbers are the 'odd ball out'.

When I've compared their results to other sites, or to my own experience, they have not matched. One example: according to DxO the 7D (Canon's 18 MP sensor) has little DR gain over the 10D / 20D. I could tell you before formally testing them that it was large, 2 stops easily.

Now maybe I'm being too harsh. Maybe their current tests are better, or maybe it just so happens that the cameras I compared were the odd balls, not the entire testing methodology. I'll take another look. But DxO seems easily thrown by small factors, or easily gamed. Michael Reichmann was a big fan when they first started, then dropped them later because really tiny things would shove one score well above another, and not just on DR.

To clarify: I don't at all mean this to be a personal attack; just looking for clarification.

As was evident from the tone of your post, and I appreciate that. I'll look more carefully at DxO's latest results.
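
For what it's worth, here's roughly how the step-wedge inspection mentioned above could be turned into a number (a sketch only -- it assumes a wedge with 0.10 density increments, about 1/3 EV per step, and the SNR floor you pick is exactly the "what counts as a match" judgment call discussed above):

Code:
EV_PER_STEP = 0.10 / 0.301  # density 0.10 is roughly 1/3 stop

def wedge_dr_ev(step_means, step_stdevs, snr_floor=2.0):
    """Estimate DR in EV from per-step patch statistics of a raw capture
    (black level already subtracted): count the steps whose SNR stays
    above the chosen floor and convert the count to stops."""
    usable = sum(1 for m, s in zip(step_means, step_stdevs)
                 if s > 0 and m / s >= snr_floor)
    return usable * EV_PER_STEP

Move the floor from SNR 1 to SNR 4 and the reported DR can shift by a stop or more, which is one reason different testers disagree on these numbers.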
 
Upvote 0
sarangiman said:
As for dpreview, actually look at what they had to say about Canon & the 6D:

"Overall, though, it's difficult to shake the feeling that the EOS 6D simply lacks the 'wow' factor of its main rival. Whereas Nikon seems to have taken the approach of taking away as little as possible from D800 when creating the D600, Canon appears almost to have gone the other way, removing as much as it thinks it can get away with at the price. The result is the kind of conservative, slightly unimaginative design that's become the company's hallmark. It's still bound to be a very good camera, of course; just perhaps not quite as good as it could be."

http://www.dpreview.com/previews/canon-eos-6d/6

I think dpreview does a good job of trying to remain unbiased.

I see. Well, I have to admit that I haven't read a dpreview review in a long while, since I read one on the 600D that I found very biased. One excerpt says:
"The Highlight Tone Priority option (Custom Function II.6) is a method for capturing more information in the brightest parts of the scene. It does this by applying ... Turn this on and the 600D captures an extra stop in the highlights, resulting in an overall range that at least matches that of Sony and Nikon models."
Notice that Nikon models have similar options, and the D5100 has about 3 stops more DR. But maybe things have changed at dpreview.

Notice also that I own a 550D, since I wanted to keep my awesome Canon 10-22mm but didn't want to spend extra bucks to get the same sensor in the 650D (or in a 60D, a 7D, an EOS M, and the list goes on and on and on). I hope that within a few months there will be a new crop sensor from Canon with decent DR and low noise at high ISO: such a 70D would be perfect for me.
 
Upvote 0
Albi86 said:
Maui5150 said:
I love when idiots see some test, and then jump all over it.


Looks exactly like what you just did, since DXO is about sensors and not about cameras. When choosing a camera surely many other factors are to be considered, but that wasn't the point of these tests to begin with.

All your arguments therefore make little to no sense.

+1

Maui5150, before you call other people idiots (and you actually name them by quoting), make sure you know where you're treading - so you don't step in your own turds.
 
Upvote 0
Hi,

DxO said they test the sensor, but IMHO they are actually testing the camera, because they test using the RAW file from the camera, not the RAW data directly from the sensor. If the RAW file is not raw sensor data, then you are not testing the sensor, but how the camera handles the image.

Have a nice day.
 
Upvote 0
weixing said:
Hi,

DxO said they test the sensor, but IMHO they are actually testing the camera, because they test using the RAW file from the camera, not the RAW data directly from the sensor. If the RAW file is not raw sensor data, then you are not testing the sensor, but how the camera handles the image.

Have a nice day.

Yes, they're testing the camera's sensor and the part of the electronics that gets data off the sensor and turns it into a RAW image file. Nothing else is tested - not build quality, not sealing, not AF or metering, not any other functions, not anything except the sensor and the electronics directly involved in creating an image from it.
So what you say is only partly true.
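
You can see this yourself if you poke at a RAW file programmatically -- for instance with the third-party rawpy library (just an illustration; not something DxO says they use): what's in the file is already digitized Bayer data, with black and white levels set by the camera's readout chain.

Code:
import rawpy

# "IMG_0001.CR2" is a placeholder file name.
with rawpy.imread("IMG_0001.CR2") as raw:
    data = raw.raw_image  # undemosaiced sensor values, post-ADC
    print("black levels per channel:", raw.black_level_per_channel)
    print("white level:", raw.white_level)
    print("raw data:", data.shape, data.dtype)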
 
Upvote 0
bdunbar79 said:
neuroanatomist said:
mystic_theory said:
...Are the DxO tests rigorous enough to be published in a serious scientific journal? Very likely, no. But for what they are meant to do (publicize their software), they are outstanding.

As I stated, their testing methodology is rigorous, and the data are of high quality. It's their data interpretation that's flawed.

Yes, this.
+1
How they arrive at their numbers is very questionable. However, if you look at their data rather than their sensor score, you can find out what you need to know.
Trying to give weight to numerous important parameters and combine them into one score doesn't work; it does not tell you whether a parameter you value highly is being given a low weighting.
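
A toy example (made-up numbers and weights, NOT DxO's actual formula) shows how the same data can produce opposite rankings depending on the hidden weights:

Code:
cameras = {
    "Camera A": {"dr_ev": 14.4, "color_bits": 25.3, "low_light_iso": 2850},
    "Camera B": {"dr_ev": 11.7, "color_bits": 24.0, "low_light_iso": 3600},
}

def score(metrics, weights):
    # Normalise each metric to a rough 0-1 scale before weighting.
    scales = {"dr_ev": 15.0, "color_bits": 26.0, "low_light_iso": 4000.0}
    return sum(weights[k] * metrics[k] / scales[k] for k in weights)

dr_heavy  = {"dr_ev": 0.6, "color_bits": 0.2, "low_light_iso": 0.2}
iso_heavy = {"dr_ev": 0.2, "color_bits": 0.2, "low_light_iso": 0.6}

for name, m in cameras.items():
    print(name,
          f"DR-weighted: {score(m, dr_heavy):.3f}",
          f"ISO-weighted: {score(m, iso_heavy):.3f}")

With the DR-heavy weights Camera A comes out on top; with the ISO-heavy weights Camera B does. A single published number can't tell you which trade-off produced it.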
 
Upvote 0