I recall this very argument from you in one of the previous conversations. Digital post-processing manipulations are intrinsically flawed because they change the detail; NR simply decreases the resolution.
If that were all it did, it would still alter SNR. But that's not an accurate summary of modern NR algorithms. And it's completely wrong for color NR, color noise being arguably the most intrusive component.
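To be concrete about why color NR isn't just "decreased resolution": most chroma NR works in some luma/chroma separation and smooths only the color planes, leaving the luminance plane, where the detail lives, alone. Here's a minimal sketch of that idea; the YCbCr split, the Gaussian blur, and the sigma value are illustrative stand-ins, not any raw converter's actual algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def chroma_nr(rgb, sigma=3.0):
    """Toy color NR: smooth only the chroma planes of a YCbCr decomposition.

    rgb: float array, shape (H, W, 3), values in [0, 1].
    sigma: blur radius for the chroma planes (illustrative value).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # BT.601-style luma/chroma split (any opponent color space works for the demo)
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr =  0.500 * r - 0.419 * g - 0.081 * b
    # Luma is left untouched -- that's where the detail lives.
    cb = gaussian_filter(cb, sigma)
    cr = gaussian_filter(cr, sigma)
    # Back to RGB
    r2 = y + 1.402 * cr
    g2 = y - 0.344 * cb - 0.714 * cr
    b2 = y + 1.772 * cb
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 1.0)
```

Because Y is never touched, luminance detail is essentially unchanged; what gets removed is the low-frequency color blotching that makes pushed shadows look ugly in the first place.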
But it gets even worse when you apply NR to one image and not to the other.
No one is arguing that there's 0 read noise difference between a 5D4 and a D850. And no one is arguing that you can't also apply NR to a D850 file. The question is how perceptible the read noise difference is before processing. And if the answer is "not very," then what happens when NR shoves that difference further towards or below the absolute limits of human perception?
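To put rough numbers on that question, here's a toy simulation of a deep-shadow patch for two hypothetical sensors that differ only in read noise (the values are invented, not measurements of a 5D4 or D850), pushed 5 stops and then given a token NR pass:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

def shadow_patch(signal_e, read_noise_e, shape=(512, 512)):
    """Deep-shadow patch: Poisson shot noise plus Gaussian read noise, in electrons."""
    shot = rng.poisson(signal_e, size=shape).astype(float)
    return shot + rng.normal(0.0, read_noise_e, size=shape)

signal = 4.0          # electrons -- a very deep shadow
push   = 2 ** 5       # a 5-stop shadow push
for name, rn in [("camera A (rn=3.0e-)", 3.0), ("camera B (rn=1.5e-)", 1.5)]:
    patch   = shadow_patch(signal, rn) * push
    mild_nr = gaussian_filter(patch, sigma=1.0)   # crude stand-in for a light NR pass
    print(name,
          "pushed std: %.1f" % patch.std(),
          "after mild NR: %.1f" % mild_nr.std())
```

The ratio between the two cameras doesn't change, but the absolute amount of noise left on screen or paper shrinks for both, which is the whole point: the question isn't whether a difference exists, it's whether it's still above the visibility threshold after normal processing.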
In the real world it plays out this way: the D850 owner does a hard shadow push and prints. The 5D4 owner does a hard shadow push, maybe bumps LNR/CNR a bit, and prints. That's the source of my "it's a processing difference" statement. By analogy to resolution, we're not debating 50mp vs 20mp. We're debating 50mp vs 45mp.
5Ds/sR vs D850? Yeah, there are scenes where I will blend two exposures, but my friend can simply push the shadows. But the 5D4? Move the NR sliders a bit.
The 70D, which should be roughly the same as your 7D, is significantly worse than the 5DIV, but the A7RIII is a bit better, and I'd be extremely happy if Canon caught up.
That might be the first time I've seen you refer to a 5D4 vs. Sony/Nikon DR difference as "a bit better." I would call the Sony/Nikon sensors a bit better. I guess I would be happy too if Canon closed that gap. But I don't anticipate it because of DPAF.
I know. I still think there's some flaw in their method. 14 stops is the theoretical limit. Any measurement above that means they're applying digital manipulations, so they aren't actually measuring the sensor's performance.
Altering the view size simply trades spatial information for SNR. And it doesn't have to be through 'digital manipulation.' Make a print where the shadow noise seems unacceptable to you nose-on-print. Now view it from 10 ft away.
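The arithmetic behind the view-size trade is just noise averaging: combine n pixels into one output pixel (by downsampling, or optically by viewing from far enough away that your eye can't resolve them) and the read noise drops by roughly sqrt(n), i.e. DR gains about 0.5*log2(n) stops. That's how a normalized DR figure can honestly land above the per-pixel 14-bit ceiling. A sketch using the usual engineering definition of DR; the full-well, read-noise, and megapixel numbers are placeholders, and the 8MP normalization target is just a common convention:

```python
import math

def per_pixel_dr(full_well_e, read_noise_e):
    """Engineering DR of a single pixel, in stops."""
    return math.log2(full_well_e / read_noise_e)

def normalized_dr(full_well_e, read_noise_e, sensor_mp, target_mp=8.0):
    """DR after downsampling to target_mp: averaging n pixels cuts the
    read noise by sqrt(n), which adds 0.5*log2(n) stops."""
    n = sensor_mp / target_mp
    return per_pixel_dr(full_well_e, read_noise_e) + 0.5 * math.log2(n)

# Illustrative numbers only (not measured values for any camera):
fw, rn, mp = 60000.0, 4.0, 45.0
print("per-pixel DR:  %.2f stops" % per_pixel_dr(fw, rn))
print("normalized DR: %.2f stops" % normalized_dr(fw, rn, mp))
```

Per pixel the number stays under 14 stops; normalized to a fixed output size it comes out above 14, with no sleight of hand involved.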
It's not just a matter of human perception or emotion either. Again, at the extreme you can treat an entire visible-light digital camera sensor as if it were a single detector and, with a long enough light-blocked exposure, reliably and accurately measure small amounts of ionizing radiation. There's literally an app for that. But if you just looked at the SNR and noise specs for the sensor vs. the magnitude of the radiation signal being measured, you would assume it to be impossible.
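Back-of-the-envelope version of why that works, with invented numbers (and ignoring how such apps actually flag individual hits): the aggregate signal grows with the number of pixels N, while independent noise only grows with sqrt(N).

```python
import math

pixels        = 20_000_000   # the sensor treated as one big detector
read_noise_e  = 3.0          # electrons per pixel, per dark frame (placeholder)
signal_e      = 0.01         # tiny per-pixel signal from radiation (placeholder)

per_pixel_snr = signal_e / read_noise_e
total_signal  = signal_e * pixels
total_noise   = read_noise_e * math.sqrt(pixels)   # independent noise adds in quadrature
aggregate_snr = total_signal / total_noise

print("per-pixel SNR: %.4f  (hopeless on a spec sheet)" % per_pixel_snr)
print("aggregate SNR: %.1f  (easy detection)" % aggregate_snr)
```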
But in the field I don't care about the sensor design tricks. I care whether I should shoot with dual pixel enabled and adjust my exposure, hoping that DPRSplit will help get 1 stop more in the highlights. Yes, it helps, but somewhat randomly, and it often fails, so I can't rely on it.
Not arguing that at all. But the fact that it can work...sometimes...tells us that the 1 EV difference is not due to Canon's ADC design. It's due to the dual-pixel arrangement.
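For anyone following along, the mechanism as I understand it: a Dual Pixel raw carries the normal A+B frame plus the A-only sub-frame, which saw roughly half the light and therefore clips about a stop later. DPRSplit just extracts the two frames; the highlight gain comes from substituting scaled A-frame data where the combined frame clipped. A purely conceptual sketch, assuming you already have both frames as linear arrays; the fixed 2x ratio is an idealization:

```python
import numpy as np

def merge_dual_pixel(full_frame, half_frame, clip_level=1.0):
    """Conceptual highlight merge of a Dual Pixel pair.

    full_frame: linear A+B image, normalized so sensor clipping == clip_level.
    half_frame: linear A-only sub-image; it saw ~half the light, so where
                full_frame clips it may still be below clip_level.
    Returns an image with roughly 1 EV of extra highlight range
    (values can exceed 1.0).
    """
    clipped = full_frame >= clip_level
    merged = full_frame.astype(float)
    # Where the combined frame clipped, substitute the sub-frame scaled back up
    # by the ~2x exposure ratio (the exact ratio varies; 2.0 is the idealization).
    merged[clipped] = half_frame[clipped] * 2.0
    return merged
```

And that idealized 2x is exactly why it's hit-or-miss in practice: the real A/B split varies across the frame with lens and aperture, the A sub-frame is noisier, and it can clip too, so sometimes you get your extra stop and sometimes you don't.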
The difference is small, but the IV lags behind at many ISO points at high ISO, which is exactly where the pixel-size difference should be more prominent.
Not for DR. Even Canon's oldest, worst-DR designs converge with Sony's best and have similar DR at high ISO (within 0.5 EV) because high ISO is dominated by photon shot noise. Between the latest Sony A7R bodies the differences are so small that I don't think we can reliably tell if pixel size is related at all. Pixel size should be related...the 1DX2 and 1DX and D5 should absolutely dominate DR measurements...but for some reason that's not what we observe at this time.
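One common way to model that convergence (on top of shot noise swamping everything visually at high ISO) is the two-stage read-noise picture: input-referred read noise is roughly sqrt(upstream^2 + (downstream/gain)^2), so whatever the ADC and downstream electronics contribute gets divided by the analog gain and stops mattering once the gain is high. A sketch with made-up noise figures, not measurements of any of these cameras:

```python
import math

def dr_stops(full_well_e, upstream_e, downstream_e, iso, base_iso=100):
    """Engineering DR under a simple two-stage read-noise model.

    Analog gain scales with ISO. Downstream (ADC / off-chip) noise is divided
    by that gain when referred back to the pixel, while the usable full well
    shrinks with gain. All noise figures here are invented placeholders.
    """
    gain = iso / base_iso
    read_e = math.sqrt(upstream_e**2 + (downstream_e / gain)**2)
    return math.log2((full_well_e / gain) / read_e)

for iso in (100, 400, 1600, 6400):
    on_chip  = dr_stops(50000, 3.0, 2.0, iso)    # "modern on-chip ADC" style: low downstream noise
    off_chip = dr_stops(50000, 3.0, 20.0, iso)   # "older off-chip ADC" style: high downstream noise
    print(f"ISO {iso:5d}:  {on_chip:5.2f} vs {off_chip:5.2f}  (gap {on_chip - off_chip:.2f} EV)")
```

With these placeholder numbers the gap is a couple of stops at base ISO and collapses to a few hundredths of a stop by ISO 6400, which is the shape of the convergence being described.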