Development costs would be pretty high. DIGIC processors are only set up to handle one sensor. Balancing color and brightness between sensors, combining the images, synchronizing the readouts, and getting rid of 4x the heat would probably push the cost out of sight, and even then a single large sensor would likely outperform the combination.
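Just to illustrate the "balancing brightness between sensors" part: here's a rough NumPy sketch (all the numbers are made up) of why raw tiles from a 2x2 sensor array can't simply be pasted together. Each sensor has a slightly different analog gain, so each tile has to be normalized toward a common level before stitching, and that's before you even get to color or geometric alignment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 array of sensor tiles looking at the same kind of scene.
# Each sensor applies a slightly different analog gain, so the raw tiles
# disagree in overall brightness. (Gains here are invented for illustration.)
scene = rng.uniform(0.2, 0.8, size=(4, 100, 100))
gains = np.array([1.00, 1.05, 0.95, 1.10])
raw_tiles = scene * gains[:, None, None]

# Balance brightness: scale each tile so its mean matches the global mean,
# a crude stand-in for real per-sensor calibration.
global_mean = raw_tiles.mean()
tile_means = raw_tiles.mean(axis=(1, 2))
balanced = raw_tiles * (global_mean / tile_means)[:, None, None]

# Stitch the 2x2 grid of tiles into a single mosaic.
top = np.hstack([balanced[0], balanced[1]])
bottom = np.hstack([balanced[2], balanced[3]])
mosaic = np.vstack([top, bottom])

print(mosaic.shape)  # (200, 200)
```

And a real camera would have to do something like this, plus color matching, sub-pixel alignment, and seam blending, for every frame in real time, which is where the cost argument bites.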
I'd never buy one over a single sensor; there are too many unnecessary compromises. A multimillion-dollar aerospace system, where a supercomputer reassembles the images over a period of days with a staff of scientists to manage it, is one thing; a consumer camera is another.
Your reasoning makes a lot of sense. It seems like it would be pretty impractical to use an arrayed sensor setup like the ones they're using for telescopes. I just thought it would be pretty cool if it were actually cheaper than an equivalent-sized single sensor and could offer the same image quality. Thanks for the response!