DXOMark tests the Canon EOS R image sensor, scores it at 89

Nov 2, 2016
849
648
I disagree. Most aspects of a lens’ optical performance, including distortion, vignetting, transmission, coma, astigmatism, longitudinal CA, field curvature, etc., do not change from one sensor to another. But perhaps you’re someone for whom sharpness is the only important aspect of lens performance. In that case, if you don’t already know how, I’d recommend learning to interpret MTF curves – they give a good idea of lens sharpness with the caveat that, Zeiss notwithstanding, you’re seeing the ideal/best possible lens performance, which in practice is subject to the vagaries of production.
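(An aside for anyone who does want to learn to read MTF curves: the quantity plotted at each spatial frequency is just a contrast ratio. Modulation is computed from the peak and trough intensities of the test pattern, and the MTF is the image-side modulation over the object-side modulation:)

```latex
M(f) = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}},
\qquad
\mathrm{MTF}(f) = \frac{M_{\mathrm{image}}(f)}{M_{\mathrm{object}}(f)}
```

A perfect lens would hold MTF near 1 everywhere; real curves fall off toward higher frequencies and toward the edges of the frame.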
Oh, don’t be so condescending. I’ve been doing this, professionally, for decades.
 
Jul 21, 2010
31,168
13,006
Oh, don’t be so condescending. I’ve been doing this, professionally, for decades.
Your statement that testing a lens on a particular body tells you nothing about how that lens will perform on a different body is flat-out wrong, meaning that despite decades of professional experience, you have some major gaps in your knowledge. Don’t feel bad, I’m sure there are electricians with decades of experience who don’t know Coulomb's law, and carpenters with decades of experience who don’t know about xylem and phloem, etc. Doing something, even doing it very well, does not automatically confer understanding of the tools of one’s trade.
 
Nov 2, 2016
849
648
Your statement that testing a lens on a particular body tells you nothing about how that lens will perform on a different body is flat-out wrong, meaning that despite decades of professional experience, you have some major gaps in your knowledge. Don’t feel bad, I’m sure there are electricians with decades of experience who don’t know Coulomb's law, and carpenters with decades of experience who don’t know about xylem and phloem, etc. Doing something, even doing it very well, does not automatically confer understanding of the tools of one’s trade.
I honestly don’t think you really understand this as well as you think you do.
 
Jul 21, 2010
31,168
13,006
I honestly don’t think you really understand this as well as you think you do.
I see. In that case, please describe how the sensor behind the lens affects measurements of the lens’ distortion, vignetting, field curvature, etc. Feel free to leverage your decades of experience as you attempt an explanation.
 
Feb 28, 2013
1,615
280
70
That's literally how they test the lenses: on the bodies. I don't know how you think they test lenses otherwise.

The R's processor has to do a lot more than the 5D4's, and the battery isn't able to supply as much juice to the processor to begin with. That's one of the few areas where SLRs are still well ahead of mirrorless and will continue to be for the next couple of generations. There's less demand on the processor and battery in an SLR, so you can beef up the power going to the AF drive, clock up the processor, etc. The larger SLR bodies also typically feature bigger heatsinks, which further raise how hard the processor and battery can be pushed. On a mirrorless body the processor never gets a rest, heatsinks are typically smaller, and the battery is being drained constantly.

So it's not only unsurprising but completely normal that the same—or even fractionally newer—tech in a mirrorless body would perform a little worse than the SLR equivalent.

Also bear in mind that the DIGIC 8 is not outright better than the DIGIC 6+ found in the 5D4. As a general rule, a '+' DIGIC processor sits roughly two and a half generations ahead of its plain-numbered base. Don't forget that the 5D4 has a second DIGIC 6 processor, too, to help share the load.
Note that the 1DX2 uses two DIGIC 6+ processors and can do 4K60 with a 1.3x crop, while the 5D4, with a single DIGIC 6+ and one DIGIC 6, does 4K30 with a 1.74x crop, and the R, with a single DIGIC 8, does 4K30 with a fractionally tighter 1.75x crop. That's why the 1DX2 likely isn't going to be replaced until the DIGIC 8+ is ready; the DIGIC 8 isn't quite enough to replace the DIGIC 6+ yet.
It's a similar story with regular PC CPUs, where something like a brand-new 8th-generation i3-8300 is worse than a discontinued 6th-generation i7-6700K. Just because a processor is newer does not mean it's better.
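To put rough numbers on those crop figures, here's a back-of-the-envelope sketch. The 6720×4480 full-frame resolution and a 1:1 DCI 4K pixel readout are assumptions based on the published 5D4/EOS R specs; the 1DX2's 1.3x crop comes from a different readout mode and isn't modeled here.

```python
import math

# Back-of-the-envelope 4K crop estimate; illustrative numbers only.
# Assumes a 36 x 24 mm sensor with 6720 photosites across (the published
# 5D4/EOS R resolution) reading out a DCI 4K (4096 x 2160) region 1:1.
SENSOR_W_MM, SENSOR_H_MM = 36.0, 24.0
SENSOR_W_PX = 6720
CROP_W_PX, CROP_H_PX = 4096, 2160  # DCI 4K

pitch_mm = SENSOR_W_MM / SENSOR_W_PX              # ~5.36 um per pixel
crop_w_mm = CROP_W_PX * pitch_mm                  # ~21.9 mm
crop_h_mm = CROP_H_PX * pitch_mm                  # ~11.6 mm

full_diag = math.hypot(SENSOR_W_MM, SENSOR_H_MM)  # ~43.3 mm
crop_diag = math.hypot(crop_w_mm, crop_h_mm)      # ~24.8 mm
print(f"diagonal crop factor: {full_diag / crop_diag:.2f}")  # ~1.74
```

The ~1.74x figure quoted above falls straight out of the pixel pitch, which is why the 5D4 and R land in nearly the same place.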
There is a completely logical reason to test lenses and cameras separately: even in the digital world you have production tolerance variation, which may manifest itself as lowered performance of a given lens on a given camera body. Go over to the Lensrentals article on MTF; an optical bench, not a camera body, is the true measure if you're testing the lens itself and eliminating variances elsewhere. The same goes for cameras: we test them independently on a "mule" lens, using a consistent and exacting testing regime in a controlled environment (we have a very precise light source with gratings to measure dynamic range against a manufacturer's specification). Only then are the two tested together.
 
Feb 28, 2013
1,615
280
70
I see. In that case, please describe how the sensor behind the lens affects measurements of the lens’ distortion, vignetting, field curvature, etc. Feel free to leverage your decades of experience as you attempt an explanation.
It's why we have very expensive, precisely aligned lens projectors. The MTF bench only tells us part of the story.
 

AlanF

Desperately seeking birds
CR Pro
Aug 16, 2012
12,406
22,773
There is a completely logical reason to test lenses and cameras separately: even in the digital world you have production tolerance variation, which may manifest itself as lowered performance of a given lens on a given camera body. Go over to the Lensrentals article on MTF; an optical bench, not a camera body, is the true measure if you're testing the lens itself and eliminating variances elsewhere. The same goes for cameras: we test them independently on a "mule" lens, using a consistent and exacting testing regime in a controlled environment (we have a very precise light source with gratings to measure dynamic range against a manufacturer's specification). Only then are the two tested together.
Neuro made it absolutely clear that, quote: "Most aspects of a lens’ optical performance, including distortion, vignetting, transmission, coma, astigmatism, longitudinal CA, field curvature, etc., do not change from one sensor to another." He then made it clear that resolution would vary between sensors, and that if you are concerned only about resolution, you should read MTF charts.
 
Feb 28, 2013
1,615
280
70
Neuro made it absolutely clear that, quote: "Most aspects of a lens’ optical performance, including distortion, vignetting, transmission, coma, astigmatism, longitudinal CA, field curvature, etc., do not change from one sensor to another." He then made it clear that resolution would vary between sensors, and that if you are concerned only about resolution, you should read MTF charts.
My point was very specific, so let me explain it again. Production tolerances mean lens X on camera A may be soft but lens X on camera B may be just fine. Or lens X is soft on both camera A and camera B. The margin may be very small, but it's why you can adjust the back-focus of each lens to the camera. This can be done in-camera, or by shimming the lens, or in rare cases both. We need to establish, however, whether the lens or the camera is at fault.

An MTF machine measuring on axis will not tell you if a lens has distortion, vignetting, transmission loss, coma, astigmatism, longitudinal CAs, or field curvature, but a projector will show up most of those weaknesses. We test lenses on the projector more than on the MTF machine whilst they're in rental. The projector will identify lenses not hitting, say, 6 ft, and those with axial and transverse chromatic aberrations, vignetting, field curvature, and focus shift. We can see the differences between two types of lens, or between two copies of the same type.

Likewise, we can test the camera sensor measured from the mount and see any variance. So whilst I agree with everything Neuro posted, the testing by DXOMark is flawed, because cameras and lenses need to be tested independently and then together.
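To illustrate why the two have to be separated before blaming either one, here's a toy Monte Carlo sketch; every tolerance number in it is invented for illustration and is not a real manufacturing spec.

```python
import random

# Toy Monte Carlo: how independent lens and body tolerances stack.
# All numbers are invented for illustration, not real specs.
random.seed(42)

def total_blur_um():
    """Blur of a random lens on a random body, summed in quadrature."""
    lens_blur = abs(random.gauss(3.0, 1.0))  # intrinsic lens blur, um
    flange_err = random.gauss(0.0, 10.0)     # body flange-distance error, um
    defocus_blur = abs(flange_err) / 4.0     # crude defocus-to-blur model
    return (lens_blur**2 + defocus_blur**2) ** 0.5

trials = [total_blur_um() for _ in range(100_000)]
soft = sum(b > 5.0 for b in trials) / len(trials)
print(f"pairings that look 'soft': {soft:.1%}")
```

A lens that tests fine on one body can still land in the 'soft' tail when paired with a body whose flange error leans the other way, which is exactly why each is tested independently and only then together.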
 
Mar 2, 2012
3,188
543
You’re optically measuring the position and orientation of the sensor from the mount? That’s cool. Do you have information about the mfr tolerances, or do you just compare actuals to the nominal flange distance?

While I understand the sensitivity, I wonder how much it really matters in the real world. I’d bet most rigidly mounted camera sensors are on average better controlled than a frame of flimsy film being pulled past the gate.
 
"I’m one of those crazy people that doesn’t care about dynamic range though..."

I'm with you on that one. DPReview and others seem to salivate over that. In reality, in the past 8 years, I've yet to shoot a situation where I felt I didn't have enough dynamic range. For my normal commercial shoots, I've never said, "Gee, I wish I had a bit more dynamic range". For my personal landscapes etc., if I really feel I want or need more DR, I'll bracket and combine with HDR software.
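For what it's worth, the bracket-and-merge step doesn't even need dedicated HDR software. Here's a minimal sketch using OpenCV's Mertens exposure fusion; the bracketed file names are placeholders for your own captures.

```python
import cv2

# Minimal exposure-fusion sketch; file names are placeholders.
files = ["bracket_-2ev.jpg", "bracket_0ev.jpg", "bracket_+2ev.jpg"]
images = [cv2.imread(f) for f in files]

# Align handheld brackets via median-threshold bitmaps, then fuse.
cv2.createAlignMTB().process(images, images)
fused = cv2.createMergeMertens().process(images)  # float image, ~[0, 1]

cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

Mertens fusion skips explicit tone mapping entirely, which is why it tends to look natural for the 'keep both ends' situations described above.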
 
Jan 29, 2011
10,675
6,121
I've run into many situations where I didn't have enough DR. But in those situations, an extra stop or two would not have been enough.
Generally that is my experience as well, and I have been roasted here for saying it!

However, I do like the control the newer sensors give you over the shadows without the noise hit the older sensors gave you. Case in point: I print a good amount and find that, to get accurate renditions on paper, the darker shadows (Zones I and II, for those who still work the system) always need lightening. With my old 1Ds MkIIIs I'd get noise instantly using the shadows slider and generally had to use the exposure slider and drop highlights and blacks instead; basically, touching the shadows slider induced too much noise. With the newer sensors that isn't true: you can push and pull the shadows slider to get Zones I and II exactly where you want them without having to think about noise being amplified.
 
Feb 28, 2013
1,615
280
70
You’re optically measuring the position and orientation of the sensor from the mount? That’s cool. Do you have information about the mfr tolerances, or do you just compare actuals to the nominal flange distance?

While I understand the sensitivity, I wonder how much it really matters in the real world. I’d bet most rigidly mounted camera sensors are on average better controlled than a frame of flimsy film being pulled past the gate.
It's an "in-house device" that we developed for digital movie cameras, and no, we would not disclose how it works. We use another machine for measuring the DR, again developed originally for digital movie cameras (not the sphere but the DR grating), so if more than one camera is being used on critical visual effects we can get cameras that are as closely matched as possible.
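(For readers wondering what gets measured against such a grating: assuming it's a transmissive step wedge, the usual engineering definition of sensor dynamic range is the ratio of the clipping signal to the noise floor, expressed in stops:)

```latex
\mathrm{DR}_{\text{stops}} = \log_2\!\left(\frac{S_{\text{sat}}}{N_{\text{floor}}}\right)
```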
 

stevelee

FT-QL
CR Pro
Jul 6, 2017
2,383
1,064
Davidson, NC
I've run into many situations where I didn't have enough DR. But in those situations, an extra stop or two would not have been enough.

I think I have given examples in other threads, such as cathedral interiors where I don't want the room too dark to see, but I still want the stained glass windows to show detail and rich colors. Another was when I was shooting in the Garden of the Gods in Colorado Springs late in the afternoon, taking shots of back-lit rock formations; I also wanted detail and color in the back-lit clouds. Either would at best push Camera Raw's ability to recover highlights and lift shadows to its limits with one shot. Two shots combined and edited in ACR handled both situations with ease.
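To put hedged numbers on why two frames suffice, a toy stop calculation (every value here is invented for illustration):

```python
# Toy stop arithmetic; all values are invented for illustration.
scene_range_stops = 13   # highlight detail minus shadow detail, in EV
sensor_dr_stops = 11     # usable dynamic range of a single exposure
bracket_step = 4         # EV spacing between the two bracketed frames

# Two frames spaced by bracket_step cover sensor_dr + step stops,
# provided the step is smaller than the sensor's own range (overlap).
covered_stops = sensor_dr_stops + bracket_step
print(covered_stops >= scene_range_stops)  # True: two shots handle it
```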
 