Consider a photo taken with a D3s and the same photo taken with a camera whose sensor measures 360 × 240 mm (with the same photosite size). There is no equivalence in directly comparing the two photos; the one taken with the larger sensor would have absolutely stupefying technical quality. Why? Because the larger sensor gathers 100 times more light for the same exposure, almost 7 stops (6.64 to be exact), while the noise level doesn't increase proportionally with the sensor size.
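The arithmetic behind that claim is just sensor area and a base-2 logarithm; a quick sketch (the dimensions are the full-frame 36 × 24 mm of the D3s and the hypothetical 360 × 240 mm sensor from above):

```python
import math

# Full-frame (Nikon D3s) sensor vs the hypothetical 10x-linear sensor,
# dimensions in millimetres.
ff_area = 36 * 24      # 864 mm^2
big_area = 360 * 240   # 86400 mm^2

ratio = big_area / ff_area   # light gathered at the same exposure
stops = math.log2(ratio)     # expressed in photographic stops

print(f"area ratio: {ratio:.0f}x, advantage: {stops:.2f} stops")
```

A 10× linear scale means a 100× area, and log2(100) ≈ 6.64, which is where the "almost 7 stops" figure comes from.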
I'm not very fond of this simplified argument: it removes the optical system from the question, which you really can't do. For your analogy to work, you have to assume that both detectors use the same (10xFF) lens. Now what if you were imaging a bird that just fit onto the small FF frame? Would the image of the bird be better with the 10xFF detector? No. Sure, you would capture 100x more photons, but 99% of those photons would come from the boring forest, of no consequence for the image quality of the bird.
A detector doesn't produce an image by itself; it needs optics. The reason an FF camera has an IQ advantage over APS-C is that it is easier to produce suitable optics for FF than the equivalent optics for APS-C. In your example, if you put a 50/1.2 lens in front of the FF and a 500/12 lens in front of the 10xFF, they would produce equivalent images: both lenses have the same ~42 mm entrance pupil and cover the same field of view, so they would collect the same number of photons, and there would be no difference in IQ. But while a 500mm f/12 lens can readily be produced at home by an amateur astronomer (they actually do a bit better), you need Canon's expensive top-of-the-line L optics to get a 50mm f/1.2 lens.
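The equivalence of those two lenses can be checked numerically: the entrance pupil diameter is focal length divided by f-number, and when focal length and f-number are scaled by the same factor as the sensor, the pupil (and hence the total light collected from the same field of view) is unchanged. A minimal sketch:

```python
# Entrance pupil diameter in mm: focal length / f-number.
def entrance_pupil_mm(focal_length_mm, f_number):
    return focal_length_mm / f_number

ff_pupil = entrance_pupil_mm(50, 1.2)    # 50/1.2 on the FF sensor
big_pupil = entrance_pupil_mm(500, 12)   # 500/12 on the 10xFF sensor

# Focal lengths scale with the sensor (50 mm on 36 mm wide vs 500 mm on
# 360 mm wide), so the field of view is identical; the pupils match too.
print(f"FF pupil: {ff_pupil:.1f} mm, 10xFF pupil: {big_pupil:.1f} mm")
```

Both come out to roughly 41.7 mm, which is the physical content of "they would collect the same number of photons."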