One of the primary things that determines the noise level is the pixel size.
Actually, pixel size in and of itself has no bearing whatsoever on noise. Once again, it's entirely a question of (absolute) enlargement.
The only reason pixel size appears to relate to noise is that people like to compare noise at a 100% pixel view. But, right there, you're now comparing different enlargements. A 36 megapickle full-frame camera has twice the linear resolution of a 9 megapickle full-frame camera. To compare the noise, you'd need to either show the one at a 50% view (turning the noisy small pixels into unnoticeable fine-grained smoothness) or the other at a 200% view (thus making the noise in those big pixels much nastier and blotchier). Or, much better, you could actually make prints and compare those. But, then again, your 24" x 36" print (or whatever) is going to be done at roughly 200 ppi by the one camera and only 100 ppi by the other.
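To put numbers on the linear-resolution claim, here's a quick back-of-the-envelope sketch (assuming an idealized 3:2 full-frame sensor; exact figures vary with real pixel dimensions):

```python
import math

def long_side_pixels(megapixels, aspect=3 / 2):
    """Long-side pixel count for an idealized aspect-ratio sensor."""
    # total pixels = long * short, and long = aspect * short,
    # so long = sqrt(total * aspect)
    return math.sqrt(megapixels * 1e6 * aspect)

# 36 MP has exactly twice the linear resolution of 9 MP
# (linear resolution scales as the square root of pixel count).
ratio = long_side_pixels(36) / long_side_pixels(9)  # -> 2.0

# On a 24" x 36" print, that works out to roughly 200 ppi vs 100 ppi.
ppi_36mp = long_side_pixels(36) / 36  # ~204 ppi
ppi_9mp = long_side_pixels(9) / 36    # ~102 ppi
```

Whatever the print size, the 2:1 ratio between the two cameras' ppi is fixed; only the absolute numbers move.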
The short version is that there will be the exact same S/N ratio (all else being equal) between the two; you just get to pick between more fine-grained noise or less large-grained noise. If it helps, imagine scanning film at different resolutions; the grain is still there no matter what, and all you get to do is decide how faithfully you want to render the grain.
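The equal-S/N claim is easy to demonstrate with a toy simulation. This is a hypothetical sketch, not real sensor data: pure photon shot noise, no read noise, two sensors of equal physical size where each small pixel covers a quarter of the area and so gathers a quarter of the photons:

```python
import numpy as np

rng = np.random.default_rng(0)

# Big-pixel sensor: each pixel collects ~400 photons on average.
# Small-pixel sensor (2x linear resolution): ~100 photons per pixel.
big = rng.poisson(400, size=(512, 512)).astype(float)
small = rng.poisson(100, size=(1024, 1024)).astype(float)

# Per-pixel SNR (mean/std) at 100% view: big pixels look cleaner.
snr_big = big.mean() / big.std()        # ~ sqrt(400) = 20
snr_small = small.mean() / small.std()  # ~ sqrt(100) = 10

# Now view both at the SAME enlargement by binning the small pixels
# 2x2: four Poisson(100) pixels sum to one Poisson(400) pixel, and
# the fine grain averages out.
binned = small.reshape(512, 2, 512, 2).sum(axis=(1, 3))
snr_binned = binned.mean() / binned.std()  # ~ sqrt(400) = 20
```

Pixel-for-pixel, the small-pixel image looks twice as noisy; resampled to the same output size, the signal-to-noise ratio is identical, which is exactly the film-scanning intuition.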
And, of course, the usual engineering caveats apply. Newer cameras are made with more megapickles and thus smaller pixels, yes, but also with newer, better, more efficient electronics that are therefore less prone to noise. And there may well be engineering realities that make larger photosites and their supporting circuitry easier to design well.