Dustin Abbott Nikon D850 vs Canon 5D Mark IV sensor comparison

Geek said:
[words cut for space]
You're overstating how much the ADC is doing and also how distinct it is from the rest of the processor. The ADC matters, yes, and in past generations some of the biggest leaps in image quality were down to advances in ADC efficiency, but for several hardware generations now we've been at a point where the ADC is (a) capable of far more than is being asked of it, and (b) so refined and 'direct' in its work that no particular image qualities can really be attributed to it. Additionally, it has been a long time since it was normal for the A-D conversion to be handled by a physically separate unit from the main processor (I say 'main'; of course many SLRs have been made with multiple CPUs, and which one is doing the most work varies from camera to camera). In many cases the whole imaging sequence (light capture by the sensor, conversion to digital, and processing and saving) is now handled within what is, for patent purposes, a single part, rendering the distinction between the ADC and the CPU not only irrelevant but actually inaccurate.

Analogue-digital conversion matters, but your take on it is a little out of date and even in the cases where it's not (e.g. Fuji cameras) it's still not really correctly crediting each part.
 
Upvote 0

Don Haines

Beware of cats with laser eyes!
Jun 4, 2012
8,246
1,939
Canada
Old tech.... the 7D2....

By going to an external A/D system like the traditional Canon design, you have 8 (or, with dual processors, 16) A/D converters and 20 million pixels to read at a burst rate of 10 FPS.... That means that in a dual-DIGIC setup, each A/D has to read a new value roughly every 80 nanoseconds.... To read video at 60 FPS, you are now looking at a new reading from each A/D every 13 nanoseconds... Darn fast! This is why there is no 4K on the 7D2.

Then switch to new tech.....

Compare this to an on-chip A/D.... you have one for each row.... for the same size sensor you are now looking at 5472 readings times 10 frames per second..... or about 18 microseconds per reading..... a heck of a lot slower, and that means greater accuracy and much lower noise.

The change to the type of A/D system means far more than which processor is running the algorithm to place the output data into a file......
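The arithmetic in the two posts above can be checked in a few lines. This is a sketch using the figures as given (a ~20 MP, 5472 × 3648 sensor, 16 off-chip converters in the dual-DIGIC case, and the "one A/D per row" framing for the on-chip case); the exact converter counts are the poster's assumption, not a spec sheet.

```python
# Per-ADC conversion time for the two readout architectures discussed above.
WIDTH, HEIGHT = 5472, 3648
PIXELS = WIDTH * HEIGHT            # ~19.96 million pixels

def time_per_reading_ns(readings_per_adc_per_frame, fps):
    """Nanoseconds each ADC has for a single conversion."""
    return 1e9 / (readings_per_adc_per_frame * fps)

# Off-chip: 16 converters share the whole sensor (dual-DIGIC setup).
off_chip_10fps = time_per_reading_ns(PIXELS / 16, 10)   # ~80 ns
off_chip_60fps = time_per_reading_ns(PIXELS / 16, 60)   # ~13 ns

# On-chip: one converter per row, each reading one 5472-pixel row per frame.
on_chip_10fps = time_per_reading_ns(WIDTH, 10)          # ~18,275 ns (~18 us)

print(f"off-chip @ 10 FPS: {off_chip_10fps:.0f} ns per reading")
print(f"off-chip @ 60 FPS: {off_chip_60fps:.0f} ns per reading")
print(f"on-chip  @ 10 FPS: {on_chip_10fps:.0f} ns per reading")
```

The on-chip converters get over 200× longer per conversion, which is where the "slower but more accurate" argument comes from.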
 
Upvote 0
Even in cases where the ADC is embedded in the same chip as the (digital) processor, I would not count it as part of the processor for this discussion. For me the ADC is part of the sensor (even if not on the same chip). The ADC of course has an influence on the noise performance of the image, but not on color (except for cases where it introduces crosstalk between pixels of different colors).

I have played around with the raw files from the 70D (using a self-made C++ program built on libraw), and there are at least two areas where the file appears not to be completely raw. The raw file contains metadata (including the preview JPEG) and a 14-bit number for each pixel. Each pixel is sensitive to only one color. On the edges of the sensor there are some special covered pixels, some black, some always saturated.

The black level in the file is always 4096, independent of exposure time and ISO. I would have expected it to rise due to dark current. It seems the camera is subtracting (or adding) an offset to all values to shift black to 4096. The camera can compute the offset from dedicated pixels at the edge of the sensor which are covered by an opaque material and therefore never receive light. It is, however, possible that the offset is applied entirely in the analog domain before the ADC, in which case the processor would not actually be altering the data from the ADC here.
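The offset idea can be sketched on synthetic data. The dark-current value, array sizes, and noise figures below are made up for illustration; this shows the principle of pinning black to 4096 using masked pixels, not Canon's actual firmware.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 14-bit "sensor": exposed pixels plus a strip of optically
# masked (always-dark) pixels at the edge, as described above.
TARGET_BLACK = 4096
dark_current = 37                                  # arbitrary per-exposure offset

image  = rng.integers(0, 12000, size=(100, 100)) + dark_current
masked = dark_current + rng.normal(0, 2, size=(100, 8))  # covered pixels, no light

# Estimate the true black point from the masked strip and shift every
# value so that black always lands at exactly 4096 in the file.
offset = TARGET_BLACK - masked.mean()
raw = np.clip(image + offset, 0, 2**14 - 1).astype(np.uint16)
```

Because the offset is derived from the masked pixels of the same exposure, the recorded black level stays at 4096 regardless of how much dark current accumulated, which matches the observation above.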

Known bad pixels (a list of bad pixels stored in the camera) are masked (likely replaced by the average of the surrounding pixels). My camera has developed two or three hot pixels over the years, which no longer appear in the raw file after letting the camera check for them during manual sensor cleaning (close the body cap, set the camera to manual sensor cleaning, wait ~30 s, turn the camera off, wait some more). The fact that they are not removed automatically before being registered with the camera leads to the conclusion that no additional noise removal (like Sony's "star eater") is performed on the raw data.
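The masking step could look something like this. It is a guess at the idea described above, not Canon's actual algorithm; the stride-2 neighbours reflect the fact that on a Bayer sensor the nearest same-colour photosites sit two pixels away.

```python
import numpy as np

def mask_bad_pixels(raw, bad_coords):
    """Replace each known-bad pixel with the mean of its nearest
    same-colour neighbours (stride 2 on a Bayer mosaic).
    Sketch only; real firmware may weight or select neighbours differently."""
    out = raw.astype(np.float64)
    h, w = raw.shape
    for (y, x) in bad_coords:
        neighbours = [out[ny, nx]
                      for ny, nx in ((y - 2, x), (y + 2, x), (y, x - 2), (y, x + 2))
                      if 0 <= ny < h and 0 <= nx < w]
        out[y, x] = sum(neighbours) / len(neighbours)
    return out.astype(raw.dtype)

# A hot pixel in a flat field disappears into the background level:
frame = np.full((6, 6), 1000, dtype=np.uint16)
frame[2, 2] = 16000                      # stuck-high "hot" pixel
fixed = mask_bad_pixels(frame, [(2, 2)])
```

The key point in the post stands either way: this only happens for pixels the camera has been told about, which is evidence against any blanket noise filtering of the raw data.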

The processor can do nothing to the color in raw files (except for adding metadata and the preview JPEG). If changes in color appear with a change of processor, they might be caused by changed color filter arrays which produce better results but need more processing power to generate the in-camera JPEG (which only the new processor can do reliably), or simply by changes to the default processing profiles in the camera, which are then also included as defaults in raw converters.
 
Upvote 0
9VIII said:
Geek said:
...
And yes, the raw file is the digital data from the output of the A/D converters, along with some other information required to turn the data into a usable image. I'm sure, as mentioned above, there is some processing applied to the raw data to reduce noise, adjust for ISO levels, etc. But that is relatively minimal compared to the processing required for final images.
...

This paragraph is self-contradicting.
All RAW files are baked; it is impossible to access the data exactly as it was generated by the ADC on any modern camera. The design of the processor will affect the final image. Hook up a DIGIC 4 to a 5DS and you’ll get a very different image, not just get it slower.
Maybe at one point things weren’t processed that way, but they are now.

I disagree completely!! Hook up a DIGIC 4 to a 5DS, use all of the same algorithms and the A/D converter from the 5DS, and you will get the same results as the 5DS with its dual DIGIC 6s. Much slower, but the same results. In fact the original DIGIC would give the same results: far too slowly to be used in a product, but still the same results.

If you call raw files "baked" because of perhaps a little noise reduction (even that is somewhat speculative) and offsets for ISO, dark current, etc., then I agree with you that the raw files are baked. However, the data provided in a raw file is representative of the digital values read from the individual photosites on the sensor, with minor tweaks specific to the camera hardware. The raw data does not have any significant processing applied.
 
Upvote 0
If any of y'all don't think the processor can or does make a difference to colour, you clearly were not around Canon during the big DIGIC II crisis. Go look up the history of that processor and its cameras. Note that they all used different sensors and different ADCs (as was the norm before the ADC was physically encapsulated within the CPU package), but they all ended up with the same (raw) colour (accounting for sensor-size variations, of course). That DIGIC II-specific colour is still frequently requested (especially for video), and the DIGIC II bodies have held their value better (relative to age) than the DIGIC III bodies as a result.


In any case, at least we're now (mostly) in agreement that it's not just the sensor responsible for producing everything, which was the original point being made.
 
Upvote 0

Don Haines

Beware of cats with laser eyes!
Jun 4, 2012
8,246
1,939
Canada
aceflibble said:
If any of y'all don't think the processor can or does make a difference to colour, you clearly were not around Canon during the big DIGIC II crisis. Go look up the history of that processor and its cameras. Note that they all used different sensors and different ADCs (as was the norm before the ADC was physically encapsulated within the CPU package), but they all ended up with the same (raw) colour (accounting for sensor-size variations, of course). That DIGIC II-specific colour is still frequently requested (especially for video), and the DIGIC II bodies have held their value better (relative to age) than the DIGIC III bodies as a result.


In any case, at least we're now (mostly) in agreement that it's not just the sensor responsible for producing everything, which was the original point being made.

Having been active in DSP since the late 1970s, I disagree.

You build up a RAW file by recording the various settings of the camera and the lens, then reading the sensor data from the output of the A/D converter (typically in 8×8-pixel blocks), compressing it, and proceeding to the next block until the entire sensor has been read. A JPEG file may also be created and stored separately, and a JPEG thumbnail may be appended to the RAW file.
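The pipeline described here (serialise the settings, compress the sensor data block by block, append a thumbnail) can be sketched as a toy container. This is emphatically not the CR2/CR3 format; the `TRAW` magic, the header layout, and the use of zlib are all invented for illustration.

```python
import json
import struct
import zlib

def build_toy_raw(sensor, metadata, thumb_jpeg=b""):
    """Toy RAW container: [magic | meta length | payload length | metadata
    JSON | per-block compressed sensor data | optional thumbnail].
    Illustrates the pipeline shape only, not any real camera format."""
    meta = json.dumps(metadata).encode()
    h, w = len(sensor), len(sensor[0])
    blocks = []
    for by in range(0, h, 8):                  # walk the sensor in 8x8 blocks
        for bx in range(0, w, 8):
            block = bytearray()
            for y in range(by, min(by + 8, h)):
                for x in range(bx, min(bx + 8, w)):
                    # 14-bit sample stored in a 16-bit little-endian slot
                    block += struct.pack("<H", sensor[y][x])
            blocks.append(zlib.compress(bytes(block)))
    payload = b"".join(blocks)
    header = struct.pack("<4sII", b"TRAW", len(meta), len(payload))
    return header + meta + payload + thumb_jpeg
```

Note that nothing in this loop depends on *which* CPU executes it; the output bytes are fixed by the input data and the algorithm, which is the point being made below.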

The processor used to do this has no effect on the image; it only affects speed and power consumption.
 
Upvote 0
aceflibble said:
If any of y'all don't think the processor can or does make a difference to colour, you clearly were not around Canon during the big DIGIC II crisis. Go look up the history of that processor and its cameras. Note that they all used different sensors and different ADCs (as was the norm before the ADC was physically encapsulated within the CPU package), but they all ended up with the same (raw) colour (accounting for sensor-size variations, of course). That DIGIC II-specific colour is still frequently requested (especially for video), and the DIGIC II bodies have held their value better (relative to age) than the DIGIC III bodies as a result.


In any case, at least we're now (mostly) in agreement that it's not just the sensor responsible for producing everything, which was the original point being made.

Again, to be precise, the processor does not have any impact on the final image. The lens, focusing mechanism, exposure mechanism, sensor, all of the analog circuitry, the A/D converters, and the algorithms (firmware/software) the processor uses to process images all have an impact on the image, but the processor itself does not.

As Don stated, the processor just impacts the speed and power used.
 
Upvote 0
"Everything that makes up and is used by the processor matters... but the processor doesn't matter."

... You two realise how you're contradicting yourselves there, right? Or did you skip the earlier part of the thread? We've already gone over how everything other than the sensor itself is encapsulated by the processor now, and has been for over a decade.
 
Upvote 0
aceflibble said:
"Everything that makes up and is used by the processor matters... but the processor doesn't matter."

... You two realise how you're contradicting yourselves there, right? Or did you skip the earlier part of the thread? We've already gone over how everything other than the sensor itself is encapsulated by the processor now, and has been for over a decade.

The problem is that some of what “we” have gone over is bad information buried in good: for example, the claim that what we call dynamic range comes from the processor throwing away data from adjacent pixels of equal magnitude, when it is in actuality a function of full saturation and the lowest meaningful signal (i.e., where SNR is greater than 1).
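That definition can be put into numbers. A minimal sketch, with made-up full-well and read-noise figures (real values vary per sensor and per ISO setting):

```python
import math

def dynamic_range_stops(full_well_e, read_noise_e):
    """Dynamic range as defined above: the ratio of full saturation
    (full-well capacity, in electrons) to the lowest meaningful signal,
    taken here as the SNR = 1 point where signal equals read noise.
    Expressed in stops (powers of two). Example figures are illustrative."""
    return math.log2(full_well_e / read_noise_e)

# e.g. a hypothetical 60,000 e- full well with 4 e- read noise:
dr = dynamic_range_stops(60000, 4)
print(f"{dr:.1f} stops")   # ~13.9 stops
```

Nothing in that ratio involves the processor discarding pixel data; it is set by the sensor's saturation point and the noise floor of the analog chain.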

What is lost in all this is that a processor merely does what it’s told to. It’s a series of logic gates configured programmatically by people. If you took the same logic from a DIGIC 5 chip and ported it to a DIGIC 2 (assuming the image fits), they would act the same on the same supplied, compatible data.
 

Attachments

  • A1592E65-E1CD-46EA-84C8-55B6B421852C.jpeg (218.3 KB)
Upvote 0
aceflibble said:
"Everything that makes up and is used by the processor matters... but the processor doesn't matter."

... You two realise how you're contradicting yourselves there, right? Or did you skip the earlier part of the thread? We've already gone over how everything other than the sensor itself is encapsulated by the processor now, and has been for over a decade.

I think I've been trolled!! Either that or these people are lumping way too much stuff into their definition of a processor. Apparently including some "magic"!

Uncle, I'm tapping out!
 
Upvote 0