Dynamic range testing of the Canon EOS R3 is complete

stevelee

FT-QL
CR Pro
Jul 6, 2017
2,383
1,064
Davidson, NC
I’ve been enjoying the discussion of Raw files and how raw are Raw. In real life I don’t have that much problem with noise. Even my shots of Venice taken near dark from a moving ship with the G5X II (1” sensor) are noisy mostly in the expanse of sky. I did apply some Photoshop noise reduction, although given that my printer uses just 8 or 9 inks, noise in the dark sky is unlikely to show up on a print. I like the best picture of the lot enough to be tempted to get the Epson printer that adds purple ink to the mix to see if expanding the gamut that direction will make the sky look a bit more like the very deep blue I see on screen. The picture now is framed and on display in three states, so I’m not the only one who likes it even as is.

More to the point of how raw is Raw, is the file that is written to the card already the product of demosaicing or is that handled by the software in the computer?

I gather that we can’t really see a Raw picture until it is interpreted in some way. Would it be clusters of red, blue, and two green dots, or would it be the half white non-gamma interpreted weirdness?
 
Upvote 0
More to the point of how raw is Raw, is the file that is written to the card already the product of demosaicing or is that handled by the software in the computer?

I gather that we can’t really see a Raw picture until it is interpreted in some way. Would it be clusters of red, blue, and two green dots, or would it be the half white non-gamma interpreted weirdness?

Generally speaking, Canon RAW stores the Bayer CFA data, i.e. "undemosaiced" pixels. There are differences in CR2 vs CR3, dual pixel, etc.
Your image editor (Canon DPP, Lightroom, etc.) reads the CFA-encoded data and demosaics it for your screen according to its knowledge of the physical color filter array layout.
There are many demosaic algorithms and I'm not sure which one Canon uses for its in-camera JPG files.
Note that the TIFF spec and, by extension, the DNG spec have provisions for Color Filter Arrays (like Bayer) or LinearRaw pixel arrays.
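If it helps to see the idea, here's a minimal demosaic sketch in Python, assuming an RGGB layout and plain bilinear interpolation (certainly not what Canon's in-camera engine or any serious converter actually uses):

```python
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(cfa, pattern="RGGB"):
    """Toy bilinear demosaic of a single-channel Bayer CFA image.

    cfa     : 2-D array of raw photosite values.
    pattern : assumed 2x2 Bayer layout (top-left 2x2), e.g. "RGGB".
    Returns an (H, W, 3) RGB array.
    """
    h, w = cfa.shape
    chan = {"R": 0, "G": 1, "B": 2}
    rgb = np.zeros((h, w, 3), dtype=float)
    mask = np.zeros((h, w, 3), dtype=float)
    # Scatter each photosite value into the channel its color filter covers.
    for i, letter in enumerate(pattern):
        dy, dx = divmod(i, 2)
        rgb[dy::2, dx::2, chan[letter]] = cfa[dy::2, dx::2]
        mask[dy::2, dx::2, chan[letter]] = 1.0
    # Fill the two missing channels at every site by a weighted average of
    # nearby samples of that channel (normalized convolution).
    kernel = np.array([[1.0, 2.0, 1.0],
                       [2.0, 4.0, 2.0],
                       [1.0, 2.0, 1.0]])
    for c in range(3):
        num = convolve2d(rgb[..., c], kernel, mode="same", boundary="symm")
        den = convolve2d(mask[..., c], kernel, mode="same", boundary="symm")
        rgb[..., c] = np.where(mask[..., c] > 0, rgb[..., c], num / den)
    return rgb

# Example: a synthetic 4x4 CFA of a flat mid-gray scene.
cfa = np.full((4, 4), 1000.0)
print(bilinear_demosaic(cfa)[1, 1])  # -> [1000. 1000. 1000.]
```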
 
  • Like
Reactions: 1 user
Upvote 0
Apr 29, 2019
271
260
Why should dynamic range be lower when using ES?!
Cameras that deliver 12-bit color depth in ES and 14-bit in MS do have less dynamic range. How would you transfer 12 stops of dynamic range with only 12 bits of code depth?

The R3 provides 14-bit color depth while shooting 30 fps in ES; the R5 provides 12-bit while shooting 20 fps in ES and 14-bit at 12 fps in MS.
That is why ES gives the impression of less dynamic range.
 
Upvote 0
Cameras that deliver 12-bit color depth in ES and 14-bit in MS do have less dynamic range. How would you transfer 12 stops of dynamic range with only 12 bits of code depth?

The R3 provides 14-bit color depth while shooting 30 fps in ES; the R5 provides 12-bit while shooting 20 fps in ES and 14-bit at 12 fps in MS.
That is why ES gives the impression of less dynamic range.
The EOS R5 manual quotes 13-bit for MS H+.
 
  • Like
Reactions: 1 user
Upvote 0
Lenses are an interesting parallel here. Canon now has three RF lenses, one of them an L-series lens, where they chose to sacrifice geometric distortion correction in the optical design in favor of forced correction in software (although obviously they can only force correction in their own software).
Wandering a little Off Topic (OT) but FWIW distortion correction is generally stored in the image as metadata and is then applied regardless of the software.
One way to see the data is to convert to DNG and look at the "opcode" tags.
You can also use programs like Exiftool to strip out the distortion correction and see the "raw" distorted image.
You might do this if you wanted to apply your own corrections.
In this respect it's not analogous to the raw image data issue since that cannot be "un-cooked".
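As a rough sketch of that workflow (assuming exiftool is installed and on your PATH, and that the DNG carries its corrections in the standard OpcodeList tags; the distortion warp, e.g. WarpRectilinear, usually ends up in OpcodeList3):

```python
import subprocess

DNG = "IMG_0001.dng"  # hypothetical file name for a raw converted to DNG

# List the DNG opcode tags; lens corrections such as the distortion warp
# are stored here as binary opcode data.
result = subprocess.run(
    ["exiftool", "-OpcodeList1", "-OpcodeList2", "-OpcodeList3", DNG],
    capture_output=True, text=True,
)
print(result.stdout)

# To inspect the uncorrected image you could then remove the relevant list,
# e.g. with "exiftool -OpcodeList3= IMG_0001.dng" (exiftool writes a backup
# copy of the original by default), and open the result in your converter.
```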
 
  • Like
Reactions: 2 users
Upvote 0
Cameras that deliver 12-bit color depth in ES and 14-bit in MS do have less dynamic range.
However, even if ES is the same bit depth as MS it often has higher read noise and lower dynamic range; the R3 is an example.
How would you transfer 12 stops of dynamic range with only 12 bits of code depth?
As it turns out you can capture n+0.5 stops of dynamic range with an n-bit Analog to Digital Converter (ADC).
The limiting factor is something called Quantization Error and it's very technical.
You can see Quantization Error in Practice but it's not light reading.
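For a rough numerical feel (this isn't the linked article's methodology): an ideal quantizer adds noise of about one code step divided by the square root of 12, roughly 0.29 LSB, and that combines in quadrature with the analog read noise, so the quantization floor sits well below a single code value:

```python
import numpy as np

rng = np.random.default_rng(0)

read_noise = 1.5   # assumed analog read noise, in LSB (DN)
level = 100.0      # arbitrary mid-tone signal level, in LSB

# "Analog" samples, then an ideal quantizer rounding to whole code values.
analog = level + rng.normal(0.0, read_noise, size=1_000_000)
digital = np.round(analog)

measured = digital.std()
predicted = np.hypot(read_noise, 1.0 / np.sqrt(12.0))  # add in quadrature

print(f"measured noise after quantization : {measured:.3f} LSB")
print(f"read noise + LSB/sqrt(12) model   : {predicted:.3f} LSB")
```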
 
  • Like
Reactions: 2 users
Upvote 0
Jul 21, 2010
31,088
12,851
Wandering a little Off Topic (OT) but FWIW distortion correction is generally stored in the image as metadata and is then applied regardless of the software.
One way to see the data is to convert to DNG and look at the "opcode" tags.
You can also use programs like Exiftool to strip out the distortion correction and see the "raw" distorted image.
You might do this is you wanted to apply your own corrections.
In this respect it's not analogous to the raw image data issue since that cannot be "un-cooked".
The reference to being 'forced' is that for the three lenses in question (RF 24-240, RF 16/2.8, RF 14-35/4L), Canon has disabled the ability to turn off distortion correction in-camera (it's ON and the selection is grayed out), and in Canon's DPP even when you set distortion correction to zero, it is still corrected. My testing of corrections for the RF 14-35L is posted here.

I was not aware that those corrections are written to metadata in a way that can be applied by other RAW converters. Certainly when images from those lenses were opened in non-Canon RAW converters (Adobe, Capture One) early on, prior to availability of lens profiles, the uncorrected distortion was manifestly evident to the great dismay of a small corner of the internet.
 
Upvote 0
The reference to being 'forced' is that for the three lenses in question (RF 24-240, RF 16/2.8, RF 14-35/4L), Canon has disabled the ability to turn off distortion correction in-camera (it's ON and the selection is grayed out), and in Canon's DPP even when you set distortion correction to zero, it is still corrected. My testing of corrections for the RF 14-35L is posted here.

I was not aware that those corrections are written to metadata in a way that can be applied by other RAW converters. Certainly when images from those lenses were opened in non-Canon RAW converters (Adobe, Capture One) early on, prior to availability of lens profiles, the uncorrected distortion was manifestly evident to the great dismay of a small corner of the internet.
Interesting. If Canon locks you in to their software (which others do not) I think that's too bad.
You're probably right but "proof" would be a raw file converted to DNG and then inspected for "opcode" tags.
Message or email if you're interested and need details.
 
Upvote 0
Jan 27, 2020
826
1,796
All the graphs are really "cool". But the "money" is in IQ.

The Z9 has superior IQ to the R3 and 1DXMKIII

....
Just curious as to how you have come to this conclusion.

Have you, by chance, used the Z9 and R3? That would, of course, be the only way you could compare their IQ and make a judgement.
 
Upvote 0

Pixel

CR Pro
Sep 6, 2011
297
187
I’ve been enjoying the discussion of Raw files and how raw are Raw. In real life I don’t have that much problem with noise. Even my shots of Venice taken near dark from a moving ship with the G5X II (1” sensor) are noisy mostly in the expanse of sky. I did apply some Photoshop noise reduction, although given that my printer uses just 8 or 9 inks, noise in the dark sky is unlikely to show up on a print. I like the best picture of the lot enough to be tempted to get the Epson printer that adds purple ink to the mix to see if expanding the gamut that direction will make the sky look a bit more like the very deep blue I see on screen. The picture now is framed and on display in three states, so I’m not the only one who likes it even as is.

More to the point of how raw is Raw, is the file that is written to the card already the product of demosaicing or is that handled by the software in the computer?

I gather that we can’t really see a Raw picture until it is interpreted in some way. Would it be clusters of red, blue, and two green dots, or would it be the half white non-gamma interpreted weirdness?
Actually a RAW file is black and white, or shades of gray, and the color is applied post-capture.
 
Upvote 0
Actually a RAW file is black and white, or shades of gray, and the color is applied post-capture.

In the demosaicing process?
It can be color or grayscale, whichever you prefer. From a software perspective, your raw file has numbers representing illuminance at photosites (after hardware and firmware have done their thing). Each photosite was covered by a red, green, or blue color filter; knowing the photosite's location in the sensor array will tell you which.
Now, if you *interpret* a photosite number as the filter's color brightness, and display it as such, then you get a (funky) color image.
But if you interpret a photosite number as luminance only (not caring which RGB color the number represented), then you have a grayscale image.
Until you demosaic the Bayer pattern data into RGB pixels, I suppose it's neither, or both.
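In code terms, the 'neither, or both' point looks roughly like this toy sketch (assuming an RGGB layout): the very same numbers become either a gray image or a speckled, un-demosaiced color mosaic purely depending on how you choose to place them:

```python
import numpy as np

def as_grayscale(cfa):
    """Treat every photosite value as plain luminance."""
    return cfa.astype(float)

def as_color_mosaic(cfa, pattern="RGGB"):
    """Place each photosite value into the channel of the filter above it.

    No interpolation: two of the three channels stay zero at every site,
    so the result is the 'funky' speckled color image, not a demosaiced one.
    """
    chan = {"R": 0, "G": 1, "B": 2}
    rgb = np.zeros(cfa.shape + (3,), dtype=float)
    for i, letter in enumerate(pattern):
        dy, dx = divmod(i, 2)
        rgb[dy::2, dx::2, chan[letter]] = cfa[dy::2, dx::2]
    return rgb

cfa = np.arange(16, dtype=float).reshape(4, 4)  # stand-in raw values
print(as_grayscale(cfa)[0, :2])     # [0. 1.]           - just brightnesses
print(as_color_mosaic(cfa)[0, :2])  # [[0. 0. 0.] [0. 1. 0.]] - an R site, then a G site
```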
 
  • Like
Reactions: 1 user
Upvote 0
My biggest problem with my Canon cameras is not the overheating but the mediocre dynamic range. See the lab test results from CineD. https://www.cined.com/lab-tests/ If the R3's new stacked BSI sensor is only comparable to, or slightly better than, the R5's dynamic range in ES mode, it puts it way behind its competition even after trying its latest tech on sensors. This is not the typical Canon I would expect. Canon should be the "if I want to get something good, I can" type of company. Until I see dynamic range being hyped in their marketing, I won't buy another Canon camera.
 
  • Haha
Reactions: 1 user
Upvote 0

RayValdez360

Soon to be the greatest.
Jun 6, 2012
787
555
42
Philadelphia
My biggest problem with my Canon cameras is not the overheating but the mediocre dynamic range. See the lab test results from CineD. https://www.cined.com/lab-tests/ If the R3's new stacked BSI sensor is only comparable to, or slightly better than, the R5's dynamic range in ES mode, it puts it way behind its competition even after trying its latest tech on sensors. This is not the typical Canon I would expect. Canon should be the "if I want to get something good, I can" type of company. Until I see dynamic range being hyped in their marketing, I won't buy another Canon camera.
Canon Europe actually stated the dynamic range for C-Log 3 on their site. Also, how is the DR crappy? It doesn't look wrong or unusable. You people kill me with the exaggerations. You go apeshit over a stop, or less than a stop, as if the camera is unusable or can't look comparable to any modern camera.
 
  • Like
Reactions: 2 users
Upvote 0
Jul 21, 2010
31,088
12,851
Interesting. If Canon locks you in to their software (which others do not) I think that's too bad.
You're probably right but "proof" would be a raw file converted to DNG and then inspected for "opcode" tags.
Message or email if you're interested and need details.
Thanks, Bill. No, there's no lock in – Canon RAW files are supported by every 3rd party converter of which I'm aware (personally, I use DxO PhotoLab). This is just another mainly academic issue, IMO, although perhaps a bit less so than 'baked-in' NR because in the case of the geometric distortion it's not baked in (just forcibly applied only by Canon's software). In fact, in the tests I linked above I found that the RF 14-35mm can actually deliver an FoV of ~13.5mm when processed with DxO's geometric corrections, so there's some real-world benefit in this case.
 
Upvote 0
Jul 28, 2015
3,368
570
The raw data is the value as it was read from the Analog to Digital Converter (ADC) probably with some offset (Black Level).
Plenty of cameras appear to report this raw data in their raw files.

If the value is manipulated in the firmware before it is written to the card then it is "cooked".
Occasional manipulation of isolated pixels such as bad photosite repair isn't considered "cooking".
Changing every (or most) pixels based on inspecting their neighbors is "cooking".

Nearest neighbor algorithms are detectable using a variety of mathematical tools to inspect "raw" files.
My tools can indicate whether an algorithm was applied.
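As a toy illustration of why that kind of "cooking" leaves fingerprints (assuming the manipulation is a simple 3x3 mean filter, not any particular camera's algorithm): for uncorrelated noise the variance of adjacent-pixel differences is about twice the pixel variance, and spatial averaging pulls that ratio well down:

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(1)

# A fake "raw" frame: flat field with uncorrelated noise.
raw = 1000.0 + rng.normal(0.0, 10.0, size=(512, 512))
cooked = uniform_filter(raw, size=3)  # hypothetical in-camera 3x3 averaging

def neighbor_ratio(img):
    """Variance of horizontal pixel differences over pixel variance (~2 if uncorrelated)."""
    return np.diff(img, axis=1).var() / img.var()

print(f"uncooked ratio: {neighbor_ratio(raw):.2f}")     # ~2.0
print(f"cooked ratio  : {neighbor_ratio(cooked):.2f}")  # well below 2
```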
Thanks Bill - this is my first visit to the site after I posted.
Even if you aim to be as technically precise as possible, there is still the possibility of misinterpretation. And when terms like 'noise' or 'dynamic range' become surrogate markers for how good a product is, it can be a case of 'a little knowledge is a dangerous thing', which is what I was hoping to highlight.

When it comes to the in-depth technical knowledge of these things I can easily get lost, but you seem to be able to strip it down to the essentials and get your point across very effectively, all with accessible data to support it. Kudos to you! :D
 
  • Like
Reactions: 1 user
Upvote 0