I am not sure if you are just being sarcastic, but you realize that at 62mp, not more than 2-3 lenses in Canon's entire lineup could resolve enough detail at any aperture to actually use all those pixels, right?
I'm really sick of people saying this sort of thing, since it's totally false.
Here's what the full-frame version of 184MP looks like on the old 100-400L (18MP 1.6-crop + 2x TC = 72MP on 1.6 crop = 184MP on full-frame)
Yeah, in the corners of many lenses you're going to start having trouble at higher pixel densities, but you're still going to get more detail than you would with less pixel density. You can't add pixels and get a less detailed shot, and on the better lenses, you're going to get more and more detail.
Here's 288MP on crop (18MP + 4x = 18*4*4 = 288) or 737MP (288*1.6*1.6) on full-frame through the 400/2.8L (the old one):
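For reference, the arithmetic behind those "effective megapixel" figures is just pixel-density scaling, sketched here (a 2x TC doubles the subject size in each dimension, and expressing a 1.6-crop density as full-frame multiplies the area by 1.6²):

```python
# Pixel-density scaling behind the "effective megapixel" figures above.
# A 2x TC magnifies the subject 2x in each dimension -> 4x the pixels on it;
# expressing a 1.6-crop density as full frame multiplies area by 1.6^2.

crop_mp = 18                          # 18MP APS-C (1.6-crop) sensor

with_2x_tc = crop_mp * 2**2           # 72 "effective" MP on crop
print(with_2x_tc, round(with_2x_tc * 1.6**2))      # 72 184

with_two_tcs = crop_mp * 4**2         # 288 "effective" MP with 4x total
print(with_two_tcs, round(with_two_tcs * 1.6**2))  # 288 737
```

Note that this is magnification arithmetic; whether the optics actually resolve that much detail is a separate question.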
I don't really get your math there...an 18mp sensor is an 18mp sensor; it will never produce an image with more than 18 million pixels in it. The number of pixels in the image has nothing to do with "resolution" as I'm using it. I believe you are mixing the concept of magnification (reproduction factor) with spatial resolution. They are not the same. Just because you magnify the moon enough that a collage of photos of the whole moon at a given reproduction factor would, when combined, produce a 288mp image does NOT mean that you have a 288mp sensor or lens, nor does it mean you can resolve beyond a certain degree of detail fineness. I'm not referring to total image resolution, magnification, reproduction factor, etc...I am referring to spatial resolution.
Spatial resolution is limited by optical aberrations at maximum aperture, which usually overpower diffraction, and by diffraction at narrower apertures. There are few lenses in Canon's lineup that offer near-perfect image reproduction (i.e. near-perfect lens characteristics) at maximum aperture, and once you stop down beyond about f/5.6, diffraction reduces spatial resolution below what can be captured by Canon's highest-density sensors (which are the 18mp APS-C sensors, or what would be a 47mp FF sensor). The 500/4, 600/4, 70-200/2.8 II, 300/2.8, and a couple of others get pretty darn close at maximum aperture, and nearly 100% at f/8 (which is limited to a max of 86 lp/mm by diffraction), but still less than perfect. Assuming perfection in a lens at f/4, you could resolve as much as 173 lp/mm, and some of Canon's lenses do indeed get very close to that (at least theoretically; I don't know if any of Canon's MTFs are actually real, and independent lab tests that produce real MTF charts for Canon lenses usually tend to indicate far lower than "perfect" lens resolution).

Here is a simple matter of PHYSICS:
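For what it's worth, those diffraction figures can be reproduced with the standard approximation for a diffraction-limited lens, MTF50 ≈ 0.38/(λN), at λ = 550nm (green light). A sketch, where the 0.38 factor is a common approximation I'm assuming rather than exact physics:

```python
# Diffraction-limited resolution at MTF 50% for a perfect lens, assuming
# lambda = 550nm (green light) and the common approximation
# MTF50 ~= 0.38 / (lambda * N). Not exact physics, but close enough here.

wavelength_mm = 550e-6  # 550 nm expressed in mm

def mtf50_lp_per_mm(f_number):
    """Maximum resolvable line pairs/mm at 50% contrast, perfect lens."""
    return 0.38 / (wavelength_mm * f_number)

for n in (4, 5.6, 8, 22):
    print(f"f/{n}: {mtf50_lp_per_mm(n):.0f} lp/mm")
# f/4: 173 lp/mm, f/5.6: 123 lp/mm, f/8: 86 lp/mm, f/22: 31 lp/mm
```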
Any and all lenses, no matter who designs them or how perfectly they may be designed, at f/5.6, MTF 50% (about the minimum for a camera to effectively resolve two nearly overlapping points of light as distinct...i.e. neighboring line pairs), are capable of an absolute maximum spatial resolution of 123 lp/mm, assuming total perfection. It would take roughly a 52mp FF sensor, or a 20mp APS-C sensor, to resolve exactly that much detail...WITH an AA filter. We know for a fact that the 100-400 is NOT a perfect, diffraction-limited lens at 400/5.6, by a fair percentage. As someone who shoots with this lens for roughly 12-16 hours every weekend, and several more hours during the week, I can state with confidence that this lens does not resolve more detail than my 7D can resolve itself...at best it can resolve just about enough in the center...116 lp/mm. That would be a loss of reproduction accuracy from "perfect" of about 5-6% (beyond margin of error).
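The sensor figures follow from Nyquist sampling (two pixels per line pair in each dimension); a sketch assuming 36x24mm full-frame and 22.3x14.9mm APS-C sensor sizes, plus the 7D's 5184x3456 pixel layout:

```python
# Megapixels needed to sample a given lp/mm at Nyquist (2 pixels per line
# pair), assuming 36x24mm full-frame and 22.3x14.9mm APS-C sensor sizes.

def megapixels(lp_per_mm, width_mm, height_mm):
    return (2 * lp_per_mm * width_mm) * (2 * lp_per_mm * height_mm) / 1e6

print(f"{megapixels(123, 36, 24):.0f}MP FF")         # 52MP FF at 123 lp/mm
print(f"{megapixels(123, 22.3, 14.9):.0f}MP APS-C")  # 20MP APS-C at 123 lp/mm

# Conversely, an 18MP APS-C sensor (5184 x 3456 pixels, 22.3mm wide) samples:
print(f"{5184 / 22.3 / 2:.0f} lp/mm")                # 116 lp/mm Nyquist limit
```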
Now, if you slap on TWO stacked 2x TC's, you reduce the maximum aperture to f/22!!! At f/22, MTF 50%, your absolute maximum resolution shrinks to a meager 31 lp/mm!!! A mere 3.3mp FF camera, or 1.2mp APS-C camera, would be sufficient to capture maximum detail at f/22. Every lens is diffraction limited by f/22, so our 100-400mm with two 2x TC's can certainly resolve that maximum of 31 lp/mm.
Let's assume a modern sensor is capable of resolving detail at Rayleigh, which is an MTF of 9%. The human eye can barely discern detail at this level, and it is a far superior imaging device with cones and rods packed to a density an order of magnitude higher than sensor pixels, not to mention it's powered by a vastly superior image processor. But let's just assume that a modern camera is capable of discerning detail at a contrast level of only 9%. At f/22, you could resolve about 68 lp/mm. You could capture maximum detail at that aperture with a 16mp FF sensor, or a 6mp APS-C sensor.
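Putting the f/22 numbers together, MTF 50% versus the Rayleigh criterion, here's a sketch assuming λ = 550nm, MTF50 ≈ 0.38/(λN), Rayleigh ≈ 1/(1.22λN), and Nyquist sampling:

```python
# f/22 resolution limits at MTF 50% vs the Rayleigh criterion (~9% MTF),
# assuming lambda = 550nm; frame sizes 36x24mm FF and 22.3x14.9mm APS-C.

lam, n = 550e-6, 22                   # wavelength (mm), f-number
mtf50 = 0.38 / (lam * n)              # ~31 lp/mm
rayleigh = 1 / (1.22 * lam * n)       # ~68 lp/mm

def mp_needed(lp, w, h):
    # Nyquist: 2 pixels per line pair in each dimension.
    return (2 * lp * w) * (2 * lp * h) / 1e6

for label, lp in (("MTF50", mtf50), ("Rayleigh", rayleigh)):
    print(f"{label}: {lp:.0f} lp/mm -> {mp_needed(lp, 36, 24):.1f}MP FF, "
          f"{mp_needed(lp, 22.3, 14.9):.1f}MP APS-C")
```

(The small differences from the rounded figures in the text just come from carrying full precision through the calculation.)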
That's nothing to say of the increase in optical aberrations with two 2x TC's stacked on top of the already less-than-perfect 100-400mm lens.
There is no way you're resolving enough spatial resolution to equal 288mp with the 100-400mm f/4.5-5.6 lens with TWO 2x TC's tacked onto the end. You might be able to achieve a reproduction factor that would produce a 288mp image if you took a collage of photos at 1600mm. When it comes to the moon, average contrast is very, very low, far lower than, say, the barbs on the feathers of a bird or even individual hairs on a deer or elk, or a myriad of other common subjects photographed with the 100-400mm L, so you're probably safe computing resolution based on an MTF @ 10-12% contrast, rather than 50%. You'll get closer to that physical maximum of 68 lp/mm rather than 31 lp/mm, but there is little chance you're actually going to resolve enough spatial resolution to utilize everything an 18mp sensor has to offer, let alone a 288mp sensor.
Don't confuse spatial resolution, which is how the resolution of lenses is measured with MTF charts, with reproduction factor, or the effective magnification that a lens is blowing a subject up by. Two very different things.
For reference: http://www.luminous-landscape.com/tutorials/resolution.shtml