Yes, you can get slightly more detail without an AA filter, but for many forms of photography that minor advantage is nothing compared to the major disadvantage. Some here have also argued that if the megapixel count is increased to the point where the sensor out-resolves most lenses, you can get away without one. My question is: in that case, why would you no longer need the AA filter? If the output is already optically blurred at the pixel level by the lens, removing the AA filter won't sharpen it back up. And what happens if you later buy a yet-to-be-released super-sharp lens that does out-resolve the sensor at certain apertures? Moiré, and you being forced to defocus some shots. So why not just keep the AA filter?
If we had 32MP APS-C sensors (83MP FF), I would just be jumping for joy if a lens came out that out-resolved them. Really, I doubt any future lens will significantly out-resolve current lenses at their best (usually macro lenses at f/5.6). If we pick a sensor resolution good enough for those, it shouldn't be a problem.
Diffraction-limited resolving power increases as the aperture widens. At f/5.6, at MTF50 (the standard contrast level for photographic resolution measurements), a diffraction-limited lens gives you 123lp/mm of resolving power. We already have sensors that resolve that much, and it is certainly no stretch to say that many lenses are diffraction limited at f/5.6. At f/4, diffraction-limited MTF50 resolving power increases to 173lp/mm; at f/2.8, it increases to 247lp/mm. Now we are really pushing resolution, however making a truly diffraction-limited f/2.8 lens is a much more difficult ordeal than making a diffraction-limited f/4 or f/5.6 lens. Zeiss once had a lens they had designed explicitly to test high-resolution films, which was capable of resolving about 400lp/mm (which would have made it diffraction limited at around f/1.7-1.8)...
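These MTF50 figures can be reproduced from the exact diffraction MTF of a circular aperture. Here's a quick sketch; the ≈585nm wavelength is my own back-solved assumption (it happens to reproduce the figures above, while the usual green-light value of ~550nm gives slightly higher numbers):

```python
import math

def diffraction_mtf(x):
    """Diffraction-limited MTF of a circular aperture at normalized
    spatial frequency x = f / f_cutoff, with 0 <= x <= 1."""
    if x >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(x) - x * math.sqrt(1.0 - x * x))

def mtf50_lpmm(f_number, wavelength_mm=585e-6):
    """Spatial frequency (lp/mm) where the MTF falls to 50%.
    Bisection works because the diffraction MTF is monotonic."""
    cutoff = 1.0 / (wavelength_mm * f_number)  # diffraction cutoff frequency
    lo, hi = 0.0, 1.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if diffraction_mtf(mid) > 0.5:
            lo = mid
        else:
            hi = mid
    return lo * cutoff

for n in (5.6, 4.0, 2.8):
    print(f"f/{n}: MTF50 = {mtf50_lpmm(n):.0f} lp/mm")
```

This prints 123, 173, and 247 lp/mm for f/5.6, f/4 and f/2.8 respectively, matching the numbers quoted above; the MTF50 point always falls at about 40% of the diffraction cutoff frequency.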
...however, I'd be doubtful that many major brand-name lenses, including any from Zeiss these days, are actually diffraction limited at apertures wider than f/4. If we assume future lenses get better, the only way they could resolve more would be to increase resolving power at apertures wider than f/4...maybe up to 200lp/mm at f/2.8 (not diffraction limited, but still better than anything we have today).
Now, don't forget that the final resolving power of an entire camera system is a convolution of its component parts. We can't really know the exact PSF of any lens or camera sensor (some manufacturers probably do, but they don't publish the information), but a simple formula approximates it well: SQRT(lensBlur^2 + sensorBlur^2) gives us total system blur, and since blur is the reciprocal of resolving power, the resolving powers combine the same way in reciprocal. From there we can extend the formula to tell us a whole lot of things. Because of the nature of system resolving power as demonstrated by this formula, you can never actually reach the maximum resolving power of your least powerful component...you can only approach it. That means, no matter how high your sensor resolution is, you could never actually "out"-resolve a diffraction-limited f/4 lens at f/4...at best, the whole system would resolve 172.99999lp/mm. The notion of either a sensor or a lens "out-resolving" the other is a bit of a misnomer.
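A minimal sketch of that arithmetic (the 173lp/mm diffraction-limited f/4 lens figure and the ~124lp/mm sensor figure are the ones from this post; everything else is just the quadrature formula):

```python
import math

def system_lpmm(lens_lpmm, sensor_lpmm):
    """Total system resolving power: blur adds as
    sqrt(lensBlur^2 + sensorBlur^2), and blur ~ 1/resolution,
    so 1/R_sys^2 = 1/R_lens^2 + 1/R_sensor^2."""
    return 1.0 / math.sqrt(1.0 / lens_lpmm**2 + 1.0 / sensor_lpmm**2)

lens = 173.0  # diffraction-limited f/4 lens at MTF50
for sensor in (124.0, 300.0, 1000.0, 10000.0):
    print(f"sensor {sensor:7.0f} lp/mm -> system {system_lpmm(lens, sensor):7.2f} lp/mm")
```

Even a hypothetical 10,000lp/mm sensor only gets the system to about 172.97lp/mm with that lens: you approach 173, but never reach it.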
To demonstrate how extreme, even a little ridiculous, this relationship gets: you would need an APS-C (1.6x, 22.3mmx14.9mm) sensor with 1115000x745000 pixels to render 172lp/mm of spatial resolution in an output image with a diffraction-limited f/4 lens...that is an 830.7 gigapixel sensor!
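Those numbers can be sanity-checked directly. A sketch, using the sensor dimensions and pixel counts quoted above, with Nyquist sampling (2 pixels per line pair) assumed:

```python
import math

width_mm, height_mm = 22.3, 14.9     # APS-C dimensions from above
px_w, px_h = 1_115_000, 745_000      # pixel counts from above

gigapixels = px_w * px_h / 1e9
pitch_nm = width_mm / px_w * 1e6     # pixel pitch, mm -> nm
sensor_lpmm = px_w / (2.0 * width_mm)  # Nyquist: 2 pixels per line pair

lens_lpmm = 173.0                    # diffraction-limited f/4 lens at MTF50
total_lpmm = 1.0 / math.sqrt(1.0 / lens_lpmm**2 + 1.0 / sensor_lpmm**2)

print(f"{gigapixels:.1f} gigapixels, {pitch_nm:.0f} nm pixel pitch")
print(f"sensor {sensor_lpmm:.0f} lp/mm -> system {total_lpmm:.3f} lp/mm")
```

The 830.7 gigapixel figure checks out, and note the pixel pitch works out to 20nm. The system lands at 172.996lp/mm...vanishingly close to, but still short of, 173.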
Obviously that's impossible...the pixels would be about 20 nanometers across (22.3mm divided by 1,115,000), far too small to pass visible light, so the sensor simply wouldn't function (not, at least, with visible light...it might function with gamma rays).
The only other way to increase total system resolution is to use a lens with higher resolving power, which can only be achieved at wider apertures. If we assume we have our 200lp/mm f/2.8 lens (which gives us plenty of headroom to work with), then to resolve 173lp/mm we would need a 158MP APS-C sensor (15380x10277 pixels @ 22.3mmx14.9mm). That comes out to 411MP FF (24830x16553 pixels @ 36mmx24mm). At any aperture narrower than f/2.8, your total system resolving power would again become diffraction limited and drop below 173lp/mm...yet it would still be higher than with a lower-megapixel sensor.
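The sensor requirement can be derived by inverting the same quadrature formula. A sketch (the 200lp/mm lens and 173lp/mm target are the figures above; Nyquist sampling of 2 pixels per line pair is assumed):

```python
import math

def required_sensor_lpmm(lens_lpmm, target_lpmm):
    """Sensor resolving power needed for the lens+sensor system to hit
    the target: invert 1/target^2 = 1/lens^2 + 1/sensor^2."""
    return 1.0 / math.sqrt(1.0 / target_lpmm**2 - 1.0 / lens_lpmm**2)

def megapixels(sensor_lpmm, width_mm, height_mm):
    """Pixel count at Nyquist sampling (2 pixels per line pair)."""
    px_per_mm = 2.0 * sensor_lpmm
    return (px_per_mm * width_mm) * (px_per_mm * height_mm) / 1e6

sensor = required_sensor_lpmm(lens_lpmm=200.0, target_lpmm=173.0)
print(f"sensor needed: {sensor:.0f} lp/mm")
print(f"APS-C: {megapixels(sensor, 22.3, 14.9):.0f} MP")
print(f"FF:    {megapixels(sensor, 36.0, 24.0):.0f} MP")
```

This gives a ~345lp/mm sensor, i.e. roughly 158MP APS-C and 411MP FF, matching the pixel counts above.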
Sorry if that comes off as too complicated, but it is the only way to be clear about "resolution". Sensors do not out-resolve lenses, and lenses do not out-resolve sensors. The two work together to create a final outcome, and that final outcome will continue to benefit from increasing the resolution of either the lens or the sensor for a long time to come. At the moment, the best APS-C sensors resolve around 124lp/mm, the best FF sensors resolve considerably less, and the actual output resolution of our photographs is lower still (and if we apply noise reduction, EVEN lower!). We have a LOT of headroom before we run into the limits imposed by current f/4 and faster lenses (many of which, while not diffraction limited, still offer resolving power well above 124lp/mm) and experience diminishing returns. A 32MP APS-C/83MP FF sensor doesn't even scratch the surface of how far we could take sensor resolution before diminishing returns made it pointless to keep shrinking pixels.
I'd actually bet that we could take pixels right down to the wavelength of light (the point at which they become too small to pass visible wavelengths) and still see a useful gain. As a matter of fact, with the new generation of small-form-factor CMOS image sensors (CIS) that will become mainstream through 2014-2015...the tiny sensors used in smartphones and the like...reaching 0.95µm, or 950nm, pixels are already at or below the wavelength of a good portion of the infrared range. Within the next generation or two of smartphone CIS designs, pixels will be about as small as they possibly can be...around 750nm...and we won't be able to shrink them any further without filtering out red light!
This convolution of lens and sensor resolution should also make it clear that there won't necessarily be any benefit to removing AA filters for a very long time to come. Everything I've discussed here is at MTF50, a 50% contrast level. As pixel size shrinks, sensors will also be able to resolve detail at lower and lower levels of contrast, so spatial resolution scales up (more lp/mm) as pixel size drops. I wouldn't go so far as to say a sensor could produce any meaningful result at MTF10 (10% contrast), but it is very likely that small pixels will be able to resolve detail at 20% or 15% contrast...and lens resolving power at those levels is considerably higher than at 50% contrast (e.g. at f/4 and MTF15, resolution is closer to 350lp/mm!). We will always need AA filters...there really isn't any good reason to get rid of them.
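The contrast dependence drops out of the same diffraction MTF curve used for the MTF50 numbers. A sketch; ≈585nm is my assumed wavelength (back-solved from the MTF50 figures earlier in this post), and at that wavelength the f/4 MTF15 value lands somewhat under the ~350lp/mm quoted above, so the exact number is sensitive to the assumed wavelength:

```python
import math

def diffraction_mtf(x):
    """Diffraction MTF of a circular aperture at normalized frequency
    x = f / f_cutoff (0 <= x < 1)."""
    return (2.0 / math.pi) * (math.acos(x) - x * math.sqrt(1.0 - x * x))

def lpmm_at_contrast(f_number, contrast, wavelength_mm=585e-6):
    """Spatial frequency (lp/mm) where the diffraction MTF falls to
    `contrast`; bisection works because the MTF is monotonic."""
    cutoff = 1.0 / (wavelength_mm * f_number)
    lo, hi = 0.0, 1.0
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if diffraction_mtf(mid) > contrast:
            lo = mid
        else:
            hi = mid
    return lo * cutoff

for c in (0.50, 0.20, 0.15):
    print(f"f/4 @ MTF{int(c * 100)}: {lpmm_at_contrast(4.0, c):.0f} lp/mm")
```

With these assumptions, f/4 goes from ~173lp/mm at 50% contrast to roughly 290-320lp/mm in the 15-20% contrast range...the same qualitative point: lower-contrast detail extends to much higher spatial frequencies.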
The only use case where not having an AA filter might result in sharper detail is if you only ever photograph scenes without ANY repeating patterns whatsoever. Landscape photography is the most likely scenario where you would encounter purely random detail at Nyquist, but even with landscapes...personally, I don't like them too sharp. I find the best landscape photography tends to have a certain "softish" quality: clean edges rather than finger-dicing-sharp edges, a bit like how "bloom" in modern games makes everything "soft", only generally done much better.