If one can see moire in the image, there is not enough resolution on the sensor, and we are not lens limited yet. Waiting for a 64Mpx APS-C cam. muhahaha...
Actually, if one can see moire in the image, they have an improperly designed AA filter. We don't NEED to significantly oversample the lens to avoid moire. We've been avoiding moire for over a decade...the problem today is that manufacturers are removing the AA filters while we are still often UNDERsampling the lens. Moire shouldn't be a problem...the fact that it is, is because photographers and manufacturers are artificially making it a problem by systematically weakening and entirely removing AA filters from cameras that were doing just fine with them before.
We have a different angle of view on that. I tried to point to the fact that if you see moire, the lens certainly resolves more than the sensor itself. In that case, if we want to up the resolution, increasing sensor resolution still IS the way, because the lens can support it. Of course with some losses, but that's the deal. Still worth it. If you don't see moire in an image taken with an AA-less sensor where it should appear, then you're using your lens's resolution potential to its full capability, and that's where we (at least I) want to go one day. One day nobody will need to be bothered with sensor resolution. It will be absolute compared to the lenses we put in front of it; all that will matter will be DR, efficiency, noise suppression and such. The megapixel fight will move on to different aspects.
In that respect, I agree. We DO eventually want to get sensor resolution to the point that it oversamples, eliminating the NEED for AA filters. We're a pretty long way off from that day, though. If lenses like the Otus are any indication, we can push 400lp/mm from an ultra high quality lens at wide apertures. That means we would need pixels around 1.25µm in size to simply MATCH that resolution, let alone oversample it. The theoretical limit on useful minimum pixel size is 0.9µm (900nm, well into the wavelengths of near-IR light!). A full-frame sensor at 0.9µm would be a GIGAPIXEL sensor. Assuming we're at least at 16-bit ADC by the time such a sensor arrives, we would need in-camera data throughput of over 2.1GB/s just to process one frame per second, and data throughput of approximately 13GB/s to process six frames per second.
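The numbers above can be sanity-checked with some back-of-envelope math. This is just a sketch, assuming a 36x24mm full-frame sensor, 2 bytes per pixel for a 16-bit ADC, and decimal gigabytes:

```python
# Back-of-envelope check of the figures quoted above.

LP_PER_MM = 400                          # claimed peak lens resolution (line pairs/mm)
pitch_to_match_um = 1000 / (2 * LP_PER_MM)  # Nyquist: 2 pixels per line pair
print(pitch_to_match_um)                 # -> 1.25 µm, matching the 400lp/mm lens

PITCH_UM = 0.9                           # theoretical minimum useful pixel size
pixels = (36_000 / PITCH_UM) * (24_000 / PITCH_UM)  # pixels on a 36x24mm sensor
print(pixels / 1e9)                      # ~1.07 gigapixels

bytes_per_frame = pixels * 2             # 16-bit samples = 2 bytes each
print(bytes_per_frame / 1e9)             # ~2.1 GB per frame
print(6 * bytes_per_frame / 1e9)         # ~12.8 GB/s at six frames per second
```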
That kind of technology is beyond extreme. Relatively few things process data at such incredible speeds...high end, high power GPUs are one of the few that come to mind, along with the level three and lower data caches on a CPU. Those devices require considerable amounts of power to operate.
So, yes, the notion of a sensor that outresolves every lens you can put in front of it is the ideal...it's also a very lofty one. I think we may see sensors that outresolve lenses that peak in resolution somewhere between f/4 and f/2.8 at some point, as many current lenses already achieve their optimal near-diffraction-limited resolution somewhere around f/4. We're still talking about sensors with hundreds of megapixels, though, and the data throughput requirements are still rather insane by today's standards.
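For a rough sense of where the f/4-to-f/2.8 figures come from, here is a sketch using the diffraction cutoff of an ideal lens. These are idealized cutoff frequencies (assuming green light, lambda = 0.55µm), not real-world MTF50 numbers, so treat them as upper bounds:

```python
# Diffraction cutoff of an ideal (aberration-free) lens: 1/(lambda * N).

LAMBDA_UM = 0.55   # assumed mid-green wavelength

def cutoff_lp_per_mm(f_number):
    # Spatial frequency (line pairs/mm) where diffraction drives contrast to zero
    return 1000 / (LAMBDA_UM * f_number)

def nyquist_pitch_um(lp_mm):
    # Pixel pitch needed to Nyquist-sample that frequency (2 px per line pair)
    return 1000 / (2 * lp_mm)

for n in (2.8, 4.0):
    lp = cutoff_lp_per_mm(n)
    pitch = nyquist_pitch_um(lp)
    mpx = (36_000 / pitch) * (24_000 / pitch) / 1e6  # 36x24mm full frame
    print(f"f/{n}: ~{lp:.0f} lp/mm cutoff, {pitch:.2f} µm pitch, ~{mpx:.0f} MP")
```

Even at f/4, fully sampling the ideal diffraction cutoff on full frame lands in the 700MP range, which is why "hundreds of megapixels" is the right ballpark.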
That's to say nothing of the hardware requirements for the PCs that would be used to process such images, or all the pixel-peepers who would look at their images and freak out because of how "soft" they look (when, ironically, that's the entire point...to OVERsample).