To start, it's best to measure resolution in lp/mm, or line pairs per millimeter. This is a measure of spatial frequencies: the light/dark oscillations, or waveforms, that compose a two-dimensional image, be that the image projected by a lens, the image recorded by film or a digital sensor, etc. Measuring in line pairs, a light line paired with a dark line, is essential for measuring microcontrast, or the ability to discern the difference between a light line and a dark line that sit right next to each other.
(NOTES: For reference, the human eye is able to discern a line pair when contrast is as low as 9% (this is called the Rayleigh Criterion). Modern sensors are capable of barely resolving detail around the same level, probably closer to 12-15%, however at such low contrast levels detail becomes inconsistent and often "useless". Low-pass filters are usually used to cut off spatial frequencies somewhere below this level, to eliminate what effectively becomes additional image noise, as well as possible moire in the presence of regular repeated patterns. Sensors can only really resolve a line pair as consistently separate light and dark lines when contrast is about 50%. For any imaging medium to resolve a line pair, it must sample at twice the frequency of the detail being recorded...so for a sensor to resolve 100 lp/mm, it must have at least 200 rows of pixels per millimeter. This is called the Nyquist Rate, and the maximum resolution that can be captured...the maximum frequency that can be usefully sampled...is the Nyquist Limit.)
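As a rough illustration of the Nyquist relationship (my own sketch, not part of the original answer, assuming an idealized monochrome sensor with no Bayer interpolation or low-pass filter):

```python
# Sketch: pixel density needed to sample a given spatial frequency at the
# Nyquist rate. Assumes an idealized sensor (no Bayer array, no low-pass filter).

def pixels_per_mm_for(lp_per_mm):
    """Nyquist rate: at least 2 samples (pixel rows) per line pair."""
    return 2 * lp_per_mm

def nyquist_limit_lp_mm(pixel_pitch_um):
    """Highest line-pair frequency a given pixel pitch can usefully sample."""
    pixels_per_mm = 1000.0 / pixel_pitch_um
    return pixels_per_mm / 2.0

print(pixels_per_mm_for(100))            # 200 pixel rows/mm to resolve 100 lp/mm
print(round(nyquist_limit_lp_mm(4.3)))   # ~116 lp/mm for a ~4.3 µm pitch (7D-class sensor)
```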
Both lenses and sensors have resolution, and they can be measured independently of each other, as well as as part of a greater whole. When it comes to sensors, it's pretty easy to compute the theoretical resolution. This is usually pretty close to, although not exactly the same as, real-world resolution. Real-world resolution can differ a bit once you factor in Bayer interpolation, low-pass filters, Bayer array layout, etc. I'm going to quote from another answer I gave on another topic, as it has relevant information about sensor resolution:
I'm wondering, though, how the line widths / picture height (LW/PH) figures from lens tests translate to sensor resolution.
So 18MP result in 3456 "lines per picture height", while the highest LW/PH scores for APS-C I found were around 2600. If this was a 1:1 conversion, no APS-C sensor above 12MP would be of much use. So I'm guessing that's probably not it. I'd like to find a way to calculate the corresponding sensor resolution to any given lens resolution (and vice versa) OR know why this is not possible. Can anyone help?
If I understand how you're measuring LW/PH, then an 18mp APS-C sensor resolves the same as an 18mp FF sensor...both the 7D and the 1D X produce images that are 3456 lines tall. Generally speaking, a more tech-agnostic way to measure resolution is with lp/mm, or line pairs per millimeter (it's important to use the term line pair, which denotes the waveform nature of spatial frequencies: a light (white) line followed by a dark (black) line. For camera sensors, a line pair generally needs an MTF of about 50% contrast...not fully resolved, but about halfway there...to be clearly imaged as a full "line pair"; anything less and you are losing resolution to diffraction). In that respect, the highest-resolution APS-C sensors are able to resolve more detail than an 18mp FF sensor, which is exactly correct...the 7D (or, for that matter, the 24mp APS-C Sony A77) is a higher-resolution sensor, in terms of fineness of detail resolved, than the 1D X...it's just in a smaller package with a crop factor. In resolvable lp/mm, an 18mp APS-C sensor can resolve 115.97 lp/mm (3456 lines / 14.9mm sensor height = 231.94 l/mm, divide by two to get lp/mm). The 18mp FF sensor of the 1D X, however, can only resolve 72 lp/mm (3456 lines / 24mm sensor height = 144 l/mm, divide by two to get lp/mm). If you were interested, it is possible to derive the full-frame megapixel count that would produce the same fineness of detail as an 18mp APS-C sensor. Take the height and width of the APS-C sensor, calculate the lp/mm for both dimensions, and derive the FF image width and height by multiplying by the corresponding FF sensor dimensions:
3456L/14.9mm = 231.94 l/mm
5184L/22.3mm = 232.47 l/mm
231.94 l/mm * 24mm = 5566 L
232.47 l/mm * 36mm = 8368 L
5566 * 8368 = 46,576,288 pixels ~= 46.6mp
You would need a 47mp FF sensor to capture the same lp/mm, or "resolution", as an 18mp APS-C sensor. For reference, the 36.3mp Nikon D800 sensor resolves about 102.3 lp/mm, so even though it has more megapixels than the 18mp 7D, the 7D still resolves slightly more detail at the pixel level (barring any intrusive factors such as sensor noise...I can't say exactly how the D800's noise will turn out in real-world tests.)
The story is not quite as cut and dried as that, given that (Foveon excluded) most sensors are Bayer arrays, usually with a low-pass filter in front of them, so that mucks with the final resolution a little bit and makes it tough to nail down the Nyquist limit...but from a theoretical standpoint, there you have it.
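Here is the arithmetic from the quoted answer as a quick script (my own illustration, not part of the original). The sensor dimensions are the Canon APS-C (22.3 x 14.9 mm) and full-frame (36 x 24 mm) sizes used above.

```python
# Sketch: how many full-frame pixels it takes to match an APS-C sensor's lp/mm.

apsc_w_px, apsc_h_px = 5184, 3456      # 18 MP Canon APS-C image dimensions
apsc_w_mm, apsc_h_mm = 22.3, 14.9      # APS-C sensor dimensions
ff_w_mm, ff_h_mm = 36.0, 24.0          # full-frame sensor dimensions

lines_per_mm_h = apsc_h_px / apsc_h_mm      # ~231.9 l/mm (~116 lp/mm)
lines_per_mm_w = apsc_w_px / apsc_w_mm      # ~232.5 l/mm

ff_h_px = lines_per_mm_h * ff_h_mm          # ~5567 lines
ff_w_px = lines_per_mm_w * ff_w_mm          # ~8369 lines
print(round(ff_w_px * ff_h_px / 1e6, 1))    # ~46.6 MP full frame
```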
A lens projects an image that is simply recorded by the sensor; however, the image a lens projects does not have a single "resolution". Depending on the aperture setting, and whether you measure at the center of the lens or at its edge, lens resolution will vary considerably. Assuming an ideal, or "perfect", lens...one that is entirely free of any form of optical aberration...the maximum possible resolution at apertures of f/4 and wider can FAR outresolve current sensors at the minimum detectable contrast, and considerably outresolve them at 50% contrast (a key level, as noted above.)
Perfect lenses are also called "diffraction-limited" lenses, in that their resolution is limited only by diffraction, not by optical aberrations. Real-world lenses tend to be aberration-limited at wide apertures and diffraction-limited at narrower apertures, and the narrower the aperture, the more diffraction limits maximum resolution. That is why a photo starts to soften beyond f/11, and exhibits pronounced degradation beyond f/22, on an APS-C sensor. Because of optical aberrations at wide apertures and diffraction at narrow ones, lenses behave closest to ideal at middle apertures such as f/8. However, that's just about where things get dicey from a who's-outresolving-whom standpoint.
The highest-resolution Canon sensors on the market today, the 18mp APS-C sensors, resolve 116 lp/mm (see the quote above for how this number is derived.) If we assume a perfect lens at f/2.8 and 50% contrast, it can resolve about 247 lp/mm, slightly more than twice what Canon's highest-resolution sensors are capable of resolving (for reference, you would need a 210mp FF or 81mp APS-C sensor to capture that much detail.) Given that real-world lenses are aberration-limited at wide apertures like f/2.8, let's take a more realistic aperture. The Canon 7D's 18mp APS-C sensor is diffraction-limited at f/6.9, so if we assume an f/7.1 aperture, the lens can resolve roughly 95-100 lp/mm. The sensor is now outresolving the lens at this aperture, and at all apertures narrower than f/7.1. At f/8 the lens can only resolve 86 lp/mm, at f/11 it drops to 63 lp/mm, and at f/22 it is down to a mediocre 30 lp/mm! The same lens at f/6.3 would probably resolve about 118 lp/mm, just ever so slightly better than what the sensor itself is capable of resolving.
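To show roughly where those numbers come from, here is a small sketch (my own, not from the original answer) using the common diffraction approximation MTF50 ≈ 0.38 / (λ·N) for green light (λ ≈ 550 nm); the exact constant and wavelength vary a bit by source, so the results only approximately match the figures quoted above.

```python
# Sketch: diffraction-limited resolution at 50% contrast vs. f-number,
# compared against an ~18 MP Canon APS-C sensor (~116 lp/mm, see above).
# Assumption: MTF50 ≈ 0.38 / (wavelength * f-number), wavelength = 550 nm.

WAVELENGTH_MM = 0.00055      # 550 nm, expressed in millimetres
SENSOR_LP_MM = 116           # ~4.3 µm pixel pitch APS-C sensor

def mtf50_lp_mm(f_number, wavelength_mm=WAVELENGTH_MM):
    return 0.38 / (wavelength_mm * f_number)

for n in (2.8, 7.1, 8, 11, 22):
    lens = mtf50_lp_mm(n)
    limiter = "lens" if lens < SENSOR_LP_MM else "sensor"
    print(f"f/{n}: ~{lens:.0f} lp/mm (limited by the {limiter})")
# prints ~247, ~97, ~86, ~63, ~31 lp/mm respectively
```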
When it comes to resolution, it's not quite as simple as "Lens A outresolves Sensor A, but Lens B does not". For pretty much any lens these days, at f/8, pretty much all modern sensors with at least 15mp are capable of resolving enough detail to match the lens. It's at wider apertures that lenses have the potential to resolve considerably more detail, and how much more depends on how well aberrations (and flare) are controlled. The better a lens controls aberration and flare, the sharper it will be at wider apertures, and the more likely it will be to outresolve even the highest-density sensors.
As for Canon lenses, it depends on what you mean by current. Canon made a claim (I forget where...I've been searching for the reference) that their "newest" L-series lenses, which at the time seemed to mean their Mark II lenses and all "new entrants" (brand-new designs like the 8-15mm L Fisheye), are capable of resolving approximately enough detail for a 45mp full-frame sensor. This accounts for a fair number of lenses released in the last several years, possibly as far back as 2006-2007. I believe a large part of the reason Canon is starting to release more updated lenses, such as the new 24-70mm f/2.8 L USM II (despite the fact that its predecessor was considered one of their best lenses ever), is to get resolution "up to snuff" and ensure they are capable of resolving enough detail for upcoming (and even current, when accounting for their 18mp APS-C sensors) ultra-high-resolution sensor designs.
For top-end superteles like the 500mm L II and 600mm L II, given their stunning, near-perfect MTF charts, I would effectively consider them "perfect", diffraction-limited lenses at all apertures, and therefore capable of about 173 lp/mm at f/4. That's enough resolution for a 103mp FF sensor, or a 40mp APS-C sensor.
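The same back-of-the-envelope arithmetic, again as a sketch of my own (assuming MTF50 ≈ 0.38 / (λ·N) with λ = 550 nm), yields the pixel counts quoted above:

```python
# Sketch: diffraction-limited resolution at f/4 converted to sensor pixel counts.
lam = 0.00055                                # wavelength in mm (assumed 550 nm)
lp_mm = 0.38 / (lam * 4.0)                   # ~173 lp/mm at f/4
lines_mm = 2 * lp_mm                         # Nyquist: 2 pixel rows per line pair

ff_mp   = (lines_mm * 36.0) * (lines_mm * 24.0) / 1e6    # ~103 MP full frame
apsc_mp = (lines_mm * 22.3) * (lines_mm * 14.9) / 1e6    # ~40 MP APS-C
print(round(lp_mm), round(ff_mp), round(apsc_mp))        # 173 103 40
```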