With a physics degree, you should have a good understanding of classical optics and wave optics. Then you should know that an 18 MP sensor is diffraction limited at F6.7. Any aperture smaller than that will cause unsharpness in the picture.
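For concreteness, here is a rough sketch of that calculation (just an illustration, assuming a typical 18 MP APS-C sensor about 22.3 mm wide with 5184 pixels across, and 550 nm green light; the exact f-number depends on which sharpness criterion you pick):

```python
# Rough diffraction-limit check for an assumed 18 MP APS-C sensor.
wavelength_um = 0.55          # green light, in micrometres (assumed)
sensor_width_mm = 22.3        # assumed APS-C sensor width
pixels_across = 5184          # 18 MP is roughly 5184 x 3456 pixels

pixel_pitch_um = sensor_width_mm * 1000 / pixels_across

def airy_diameter_um(f_number):
    # Diameter of the Airy disk to its first minimum.
    return 2.44 * wavelength_um * f_number

# A common (but not the only) criterion: diffraction starts to dominate
# once the Airy disk diameter exceeds about two pixel widths.
limit_f_number = 2 * pixel_pitch_um / (2.44 * wavelength_um)

print(f"pixel pitch                    : {pixel_pitch_um:.2f} um")
print(f"Airy disk diameter at f/6.7    : {airy_diameter_um(6.7):.2f} um")
print(f"f-number where Airy = 2 pixels : f/{limit_f_number:.1f}")
```

With the two-pixel criterion this lands around f/6.4, so the quoted F6.7 is in the right ballpark.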
Yes, of course, but so what?
Of course, at some point more sensor resolution no longer adds usable detail, but that is not what we are talking about right now.
The image the optics project onto the sensor depends only on the optics, not on the sensor.
Your previous comment, that the amount of light hitting the pixels depends only on the overall size of the sensor and not on the size of each pixel, is also wrong. Please go back to Optics 101. There is a big difference between integrating photons over time and position in hardware and doing it in software. If you integrate in software, you are also integrating the electronic noise from the circuitry of each pixel. If you have a larger pixel, you get more photons but the electrical noise of only ONE pixel. Therefore a larger pixel will have less noise.
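To put rough numbers on the electronics-noise part (just a sketch, using an illustrative 3 e- read noise per pixel and 10,000 photons in total; real cameras vary):

```python
import math

# Compare one large pixel against four small pixels that are summed
# ("binned") in software. Photon shot noise is the same either way,
# because the same total light is collected; read noise is paid once
# per readout and adds in quadrature.
photons_total = 10_000        # assumed total photon count
read_noise_e = 3.0            # assumed read noise per pixel, in electrons

shot_noise = math.sqrt(photons_total)

read_noise_one_big = read_noise_e                    # one readout
read_noise_four_small = math.sqrt(4) * read_noise_e  # four readouts

total_one_big = math.hypot(shot_noise, read_noise_one_big)
total_four_small = math.hypot(shot_noise, read_noise_four_small)

print(f"shot noise (both cases) : {shot_noise:.1f} e-")
print(f"total, one large pixel  : {total_one_big:.1f} e-")
print(f"total, 4 pixels summed  : {total_four_small:.1f} e-")
```

With these illustrative numbers the read-noise penalty of summing four small pixels is real, but tiny next to the photon shot noise.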
You need to understand the difference between SIGNAL and NOISE.
Integrating noise means averaging it away.
Noise is random. Integrating lots and lots of noisy samples makes it disappear.
Let me give you an example:
Let's say we have 1 million 50% gray pixels. Noise makes some pixels brighter and some darker than 50% gray. The number of too-bright pixels will be roughly the same as the number of too-dark pixels.
Downsampling the image means taking an average, and the average of this noise is ZERO (well, almost).
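A quick simulation makes the point (a sketch, assuming Gaussian noise with an arbitrary sigma of 0.05 and 4x4 block averaging):

```python
import numpy as np

# Simulate 1 million 50%-gray pixels with random noise, then downsample
# by averaging 4x4 blocks and compare the noise before and after.
rng = np.random.default_rng(0)
side = 1000                     # 1000 x 1000 = 1 million pixels
noise_sigma = 0.05              # assumed per-pixel noise level
image = 0.5 + rng.normal(0.0, noise_sigma, size=(side, side))

# Downsample: average each 4x4 block into one output pixel.
block = 4
downsampled = (image
               .reshape(side // block, block, side // block, block)
               .mean(axis=(1, 3)))

print(f"noise before downsampling : {image.std():.4f}")        # ~0.0500
print(f"noise after downsampling  : {downsampled.std():.4f}")  # ~0.0125
print(f"mean stays at 50% gray    : {downsampled.mean():.4f}")
```

Averaging 16 noisy pixels cuts the noise by a factor of 4 (the square root of 16), while the 50% gray signal stays put.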
Semiconductor production does not make you a noise expert.
You do not need to show your credentials to convince people that you are right, if you have a valid argument.
There are so many people around who do not know what they are talking about; I was just trying to set myself apart from them.
Sorry for this.