For the billionth time: more pixels do not mean more noise. In fact, for the same total sensor area and the same per-area sensor technology, more pixels mean less noise, or at worst the same. A bigger pixel does nothing but hard block averaging of the light falling on its area, while noise-reduction software working on smaller pixels can use far more sophisticated, detail-preserving approaches than a blind box average.
Think of it this way: a perfect sensor would record each photon's location, which is roughly equivalent to an "infinite" pixel count. Any larger "pixel" you might want can then be computed in software after the fact.
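A minimal sketch of why binning loses nothing, assuming pure photon shot noise (read noise ignored) and the same total light-gathering area in both cases:

```python
import numpy as np

rng = np.random.default_rng(0)
mean_photons = 10_000   # photons falling on one "big pixel" worth of area
trials = 100_000

# One big pixel: hardware-sums everything landing on its whole area.
big = rng.poisson(mean_photons, size=trials)

# Four small pixels tiling the same area, summed in software (block averaging).
small = rng.poisson(mean_photons / 4, size=(trials, 4)).sum(axis=1)

snr_big = big.mean() / big.std()
snr_small = small.mean() / small.std()
print(f"SNR, one big pixel:             {snr_big:.1f}")
print(f"SNR, four small pixels, binned: {snr_small:.1f}")
```

Both come out at about sqrt(10000) = 100, because a sum of independent Poisson counts is itself Poisson. The small pixels give up nothing by binning, and unlike the big pixel they keep the option of smarter, edge-aware noise reduction instead of a blind sum.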
"There are a couple of factors here. Hand-holding ability suffers when pixel density is too high, …"
No, it doesn't. The extra pixels simply show, in more detail, the blur that was already there. Reducing the pixel count doesn't remove camera-shake blur; it just hides it inside the additional blur caused by coarser sampling.
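A toy illustration of that sampling point (a sketch, not a camera model): the same Gaussian blur of fixed physical width is applied to a step edge, which is then sampled at high and low pixel counts, and the measured 10%-90% edge width is compared.

```python
import numpy as np
from math import erf, sqrt

BLUR_SIGMA = 0.002  # blur width in sensor-width units; identical optics/shake in both cases

def sample_edge(n_pixels, sigma=BLUR_SIGMA):
    """Sample a Gaussian-blurred step edge (located at x = 0.5) with n_pixels across the sensor."""
    x = (np.arange(n_pixels) + 0.5) / n_pixels  # pixel center positions
    return np.array([0.5 * (1 + erf((xi - 0.5) / (sigma * sqrt(2)))) for xi in x])

def rise_distance(profile):
    """Measured 10%-90% edge width, in sensor-width units."""
    n = len(profile)
    lo = np.argmax(profile > 0.1)
    hi = np.argmax(profile > 0.9)
    return (hi - lo) / n

fine = rise_distance(sample_edge(4000))   # high pixel count
coarse = rise_distance(sample_edge(200))  # low pixel count
print(f"measured edge width, 4000 px: {fine:.4f}")
print(f"measured edge width,  200 px: {coarse:.4f}")
```

The fine grid recovers the true blur width (about 0.005 sensor widths for this sigma); the coarse grid reports roughly double that, because the real blur gets smeared into the pixels' own footprint. The blur was identical in both cases; only the dense sampling can show it honestly.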
"On top of that, you run into diffraction earlier."
That's also baloney, for the exact same reason: diffraction blur is created by the lens aperture and lands on the sensor regardless of pixel pitch. Smaller pixels merely resolve that blur; bigger pixels hide it under their own sampling blur.
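The back-of-envelope numbers, using the standard first-minimum Airy-disk formula d = 2.44·λ·N and green light as the assumed wavelength:

```python
# Diffraction blur is set by the aperture, not the pixels:
# Airy-disk first-minimum diameter d = 2.44 * wavelength * f-number.
wavelength_um = 0.55   # green light, in micrometres
f_number = 8
airy_um = 2.44 * wavelength_um * f_number
print(f"Airy disk at f/{f_number}: {airy_um:.2f} um")

# The same blur lands on the sensor whether the pixels are big or small;
# smaller pixels just sample it with more points.
for pitch_um in (6.0, 2.0):
    print(f"  {pitch_um:.0f} um pixels: blur spans {airy_um / pitch_um:.1f} pixel widths")
```

At f/8 the disk is about 10.7 µm across either way; the 2 µm pixels don't "encounter" diffraction any earlier, they just render the same disk with more samples.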
"There are several drawbacks to cramming more pixels on small sensors. You can take as many shortcuts as you like, but eventually physics will emerge victorious - thus the booming full-frame market…"
The only real drawbacks to more pixels are practical ones: the sensor needs better manufacturing, the camera needs a faster readout and processing pipeline, the final files consume more storage, and post-processing takes more compute. There are no image-quality disadvantages except in a few extreme edge cases that nobody making this argument ever actually names anyway.