LOL. Sorry, but you entirely misunderstood the point of my blog article, which had to do with the myth of diffraction as it relates to pixel size: the myth presumes that once you stop a lens down past the diffraction-limited resolution of the sensor, you suddenly experience worse IQ than with a sensor with larger pixels (yes, many photographers actually DO believe that). That's a different issue, though.
Smaller pixels won't automatically make the result worse, except if their small size means relatively more space is dedicated to non-light-gathering circuitry. But the claims I saw in that blog go further: "That means softening caused by diffraction can fairly easily be corrected with some sharpening while post-processing." It then goes on to show that f/22 plus sharpening yields the same result as f/8, although even in this sample image the extra noise from the f/22-plus-sharpening combination is quite obvious.
And the reason for this extra noise is simple to explain: the diffraction-limited lens acts as a low-pass filter, but it unfortunately does not low-pass filter the sensor noise at the same time, which means the signal-to-noise ratio drops for the higher image frequencies. Once you boost those higher frequencies with sharpening, you also boost the high-frequency noise components, and that's what you see in that sample image.
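This noise-amplification argument can be sketched numerically: the sensor noise is added after the lens's low-pass filtering, so it is white, and a high-boost sharpening kernel multiplies its variance by the sum of the squared kernel taps. The 3-tap kernel, its strength `a`, and the Gaussian noise model below are illustrative assumptions of mine, not anything taken from the blog in question.

```python
import random

random.seed(0)

# White sensor noise: diffraction blurs the scene BEFORE the sensor,
# but read noise is added afterwards, so the noise itself is not blurred.
noise = [random.gauss(0.0, 1.0) for _ in range(10000)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Simple 3-tap high-boost ("unsharp mask" style) kernel with strength a.
# For white noise, the variance gain equals the sum of squared taps:
# a^2 + (1 + 2a)^2 + a^2, which is > 1 for any a > 0.
a = 0.5
kernel = [-a, 1 + 2 * a, -a]

def convolve(xs, k):
    r = len(k) // 2
    return [sum(k[j] * xs[i - r + j] for j in range(len(k)))
            for i in range(r, len(xs) - r)]

sharpened = convolve(noise, kernel)

gain = variance(sharpened) / variance(noise)
expected = sum(t * t for t in kernel)  # 0.25 + 4.0 + 0.25 = 4.5
print(gain, expected)  # measured gain lands close to the theoretical 4.5
```

So even a moderate sharpening pass more than quadruples the high-frequency noise power while only restoring (at best) the attenuated detail.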
Actually, the noise in the f/8 vs. f/22 example is primarily due to the fact that the image was saved as an animated GIF (256-color palette). The palette is based on the first frame, so all subsequent frames kind of get the shaft when it comes to their color and end up a little noisier: the exactly correct colors for the f/22 image cannot be found in the palette, so the nearest color is picked instead. You have to realize there was a pretty minimal amount of sharpening involved there...not enough to produce artifacts or enhance noise to the point where it is a visible problem.
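The palette effect is easy to demonstrate: each pixel in a later frame snaps to the nearest entry of the palette built from the first frame, and the leftover distance is exactly the quantization error that reads as extra noise. The three-entry palette and the sample color below are made-up values purely for illustration (a real GIF palette has up to 256 entries).

```python
# Nearest-color quantization, as a GIF encoder does per pixel.
def nearest(palette, color):
    return min(palette,
               key=lambda p: sum((a - b) ** 2 for a, b in zip(p, color)))

# Hypothetical palette derived from frame 1 (assumption for illustration):
palette = [(10, 10, 10), (200, 200, 200), (120, 60, 60)]

# A color that appears only in frame 2, absent from the palette:
frame2_color = (118, 64, 55)
q = nearest(palette, frame2_color)

# Euclidean distance between the true color and its palette substitute;
# this residual is what shows up visually as added "noise".
err = sum((a - b) ** 2 for a, b in zip(q, frame2_color)) ** 0.5
print(q, round(err, 1))
```

Every pixel in the later frame picks up an error like this, and because the errors vary pixel to pixel, the frame looks noisier than the first one even before any sharpening.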
As for sharpening, it mitigates the impact of diffraction; it does not eliminate the effects of diffraction entirely, or make lenses behave purely geometrically. Sharpening an f/22 image does not give it diffraction-limited f/2 performance. There are also limits to how far sharpening takes you the farther you stop down...sharpening an f/32, f/45, or f/64 image is certainly not going to reduce the impact of diffraction enough to produce geometric results. It does, however, reduce the muddiness of diffraction blurring that affects the f/16+ images to an acceptable level. But that's all post-processing. Lenses behave as lenses behave; anything you do in post does not actually change the behavior of the lens.
The diameter of an Airy disk is measured between its first minima, so yes, some extra pixel resolution below this diameter can be helpful, but once you put more than about three pixels across it in each dimension you will barely gain extra information from higher pixel density. As you stated it: f/16 will be OK on full frame, but f/32 will bring visible loss of detail. The whole "myth of diffraction" boils down to "diffraction hurts, but later than many believe" and is therefore no myth at all, although Sigma evidently wants us to believe so.
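For concreteness, the first-minima diameter of the Airy disk is d = 2.44 · λ · N (wavelength λ times f-number N), so the "about three pixels across" rule can be checked for any given sensor. The 4.3 µm pixel pitch below is a hypothetical full-frame value I picked for illustration, not something from the discussion above.

```python
# Airy disk first-minima diameter: d = 2.44 * wavelength * f_number.
# Using 550 nm (green light) as the reference wavelength.
def airy_diameter_um(f_number, wavelength_nm=550.0):
    return 2.44 * (wavelength_nm / 1000.0) * f_number  # in micrometres

pixel_pitch_um = 4.3  # hypothetical full-frame pixel pitch (assumption)

for N in (2.8, 8, 16, 32):
    d = airy_diameter_um(N)
    print(f"f/{N}: Airy diameter {d:.1f} um, "
          f"{d / pixel_pitch_um:.1f} pixels across")
```

Under these assumptions the disk spans roughly one pixel at f/2.8 but around ten pixels at f/32, which is consistent with the claim that f/16 is tolerable on full frame while f/32 shows visible loss.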
Again, you're not understanding the point of my article. I'd been asked on several occasions why someone would choose a sensor with smaller pixels, "because wouldn't diffraction just make the IQ worse when it affects the image at f/6.3 rather than f/8?" THAT is the myth I was aiming to debunk: that because diffraction STARTS affecting IQ on a sensor with smaller pixels at wider apertures than on sensors with larger pixels, smaller pixels are supposedly only useful if you use wider apertures. I wrote the article to explain to those people that diffraction is absolute, that it exists due to the nature of light as it passes through the lens, and that pixel size really has nothing to do with it...diffraction is a lens trait. Smaller sensor pixels simply allow the ever-present effects of diffraction to be resolved at a finer resolution.
Whether you have big pixels or small pixels, diffraction is going to affect the real image projected by the lens onto the sensor the same way. The difference is that smaller pixels will always be able to resolve more detail whenever more detail is there to be resolved (i.e., up through that first minima, which obviously grows as the aperture is stopped down). The point is that, as far as image resolution is concerned, smaller pixels can never be a bad thing, but they can be a good thing.
You're reading something into my article that simply isn't there if you're trying to make some argument about Sigma's geometric MTFs based on anything I've written. Please don't twist my words. In assuming purely geometric traits for their lens MTFs, Sigma is really just looking for a way to edge their MTF plots higher up the chart and make their lenses seem better. Unsuspecting customers who really don't know what an MTF is will inevitably compare geometric MTFs of Sigma lenses with diffraction MTFs of, say, Canon lenses, and the comparison will be invalid. It's a cheat. Not exactly unexpected; someone was bound to try it sooner or later.
We (humanity) have known for a very long time that lenses do not behave purely geometrically, and that they exhibit diffraction-limited behavior once aberrations are eliminated.