The concern over the effects of diffraction for high-resolution sensors is completely misplaced.
You don't lose sharpness to diffraction by increasing pixel density, because the size of the diffraction effect is strictly determined by the lens. To understand why, suppose you have two camera systems that are identical in every respect except that one has twice the linear pixel density of the other (i.e., every pixel in the low-resolution sensor is split into four pixels in a 2x2 arrangement in the high-resolution sensor). Ignoring the effect this has on noise (and of noise on perceived resolution), it is true that, as the f-number increases, the higher resolution sensor will reveal the effect of diffraction sooner than the low-resolution sensor. But the reason is that the low-resolution sensor is unable to resolve that effect, not that the effect is stronger on the high-resolution sensor. The Airy disks are IDENTICAL in the two systems because the lens is identical.
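To put rough numbers on this, here is a small sketch computing the Airy disk diameter (to its first minimum, d ≈ 2.44·λ·N) and comparing it to pixel pitch. The wavelength and the two pixel pitches are illustrative assumptions, not values from the thread; the point is that the disk depends only on the f-number, while only how many pixels it spans changes.

```python
# Sketch: the Airy disk is set by the lens (wavelength and f-number),
# not by the sensor. Pixel pitches below are illustrative assumptions.

WAVELENGTH_NM = 550  # mid-visible green light (assumed)

def airy_disk_diameter_um(f_number, wavelength_nm=WAVELENGTH_NM):
    """Diameter of the Airy disk out to its first minimum, in micrometres."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

for f_number in (2.8, 8, 16):
    disk = airy_disk_diameter_um(f_number)
    # 8.2 µm vs 4.1 µm: a sensor and one with 2x its linear pixel density (assumed)
    for pitch_um in (8.2, 4.1):
        print(f"f/{f_number}: disk {disk:.1f} µm spans "
              f"{disk / pitch_um:.1f} px at {pitch_um} µm pitch")
```

At f/16 the disk is about 21 µm either way; the fine-pitch sensor merely samples the same blur with more pixels.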
Therefore, increasing sensor resolution does not confer any disadvantage with respect to diffraction. You always have something to gain, and you never do worse than the low-resolution sensor. You might not gain as much as you theoretically could (i.e., a high-resolution sensor might not realize its full sharpness in the sharpest plane of focus at f/16 compared to when it is shot with a near-ideal lens at f/2.8), but you won't do worse than a low-resolution sensor that couldn't SEE the diffraction at f/16 in the first place.
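A quick back-of-envelope way to see "never worse, just diminishing returns" is to combine pixel blur and diffraction blur in quadrature. This root-sum-square model and the pixel pitches are simplifying assumptions (a proper treatment would multiply MTFs), but the trend holds: the finer-pitch sensor's total blur is always smaller, and the gap shrinks as diffraction dominates.

```python
import math

def total_blur_um(pixel_pitch_um, f_number, wavelength_nm=550):
    # Rough root-sum-square of pixel blur and diffraction blur
    # (a common back-of-envelope model, NOT an exact MTF analysis).
    diffraction_um = 2.44 * (wavelength_nm / 1000.0) * f_number
    return math.hypot(pixel_pitch_um, diffraction_um)

# 8.2 µm vs 4.1 µm pitch: assumed low-res sensor vs one with 2x linear density
for f in (2.8, 8, 16):
    lo = total_blur_um(8.2, f)
    hi = total_blur_um(4.1, f)
    print(f"f/{f}: low-res blur {lo:.1f} µm, high-res blur {hi:.1f} µm")
```

At f/2.8 the high-resolution sensor wins big; at f/16 both are dominated by the same ~21 µm Airy disk, so the advantage shrinks but never inverts.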
The hesitation to go with higher resolution because of fears of diffraction reveals a complete misunderstanding of the phenomenon. If you said "I don't want high resolution because I want better dynamic range," then I could be on board with that statement. But if you said "I don't want high resolution because I would be more severely diffraction-limited," I would tell you that you don't understand what you're talking about.
Reading the comments here, I think the hesitation is about "diminishing returns" from higher sensor resolutions.
With a 5D Classic I see losses of detail from f/8 onward when doing macro shots at 1:3 (the effective aperture is, I think, actually comparable to f/11). From that I extrapolate that the 5Ds should be stopped down no further than f/5.6 to use the lens's capabilities in similar situations. The lens used was the near-ideal EF 100mm f/2.8 Macro (non-L).

To use the full potential of the 5Ds you need focus stacking, at least for macro, but I think it will be necessary for landscape and architecture as well.
Is that a valid extrapolation? Go to the slrgear site, where they measure IQ at different apertures for both full frame and crop sensors. For all the good lenses there is minimal diffraction degradation at f/8, a tiny amount at f/11, and then more significant degradation at f/16. It is exactly the same for full frame and for the 7D, which has 5Ds-sized pixels.
5D III, 7D II, EOS-M, Powershot SX50, 300/2.8 II, 1.4xTC III, 2xTC III, 70-200/4 IS, 24-105, 15-85, 100-400 II, Sigma 10-20, EOS-M 18-55, f/2 22.