EOS 7D Mark II Information [CR2]

Status
Not open for further replies.
neuroanatomist said:
Pi said:
jrista said:
I'm curious about the f/4.5 bit...how exactly does that work? Is that only for the outer points? (I believe the center AF point is still f/2.8 compatible like with most Canon AF systems.)

It works at f/2.8, of course, but that is equivalent to f/4.5, even though some people do not want to hear it. Assuming it has the same precision, 1/3 of the DOF or so, that is 1/3 (or whatever) of the f/4.5-equivalent DOF. It is like shooting with FF at f/4.5, with 1/3-DOF precision; that is, 1/3 of the DOF at f/4.5. Even if f/4.5 is all the DOF you need, your precision is lower. Some empirical evidence on that can be found on the FoCal site.

Sorry, but that's incorrect. The precision of the AF points at a given aperture isn't specified in terms of DoF. Well, ok, maybe it is...but in that case, you keep using the letter F in the abbreviation, and I do not think it means what you think it means.

The AF sensor precision spec is 'within one depth of focus' for a standard precision point, and 'within 1/3 the depth of focus' for high precision (f/2.8, usually) points. Depth of focus is in 'image space' and is measured in micrometer distances at the AF (and/or image) sensor. It is related to, but distinct from, depth of field, which is measured in larger distances in 'object space'.

Depth of field is determined by aperture, subject distance, and focal length (and CoC, but since that is related to sensor size, let's leave that out). When we discuss 'shallower DoF on FF', that's a function of either subject distance (with APS-C you're further away for the same framing) or focal length (with APS-C, you need a shorter focal length for the same framing).

However, depth of focus is relatively insensitive to subject distance (once you're out of true macro range) and focal length. Thus, depth of focus is primarily determined by aperture, and that doesn't change with sensor size.
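As a rough illustration of why depth of focus tracks aperture alone, the common thin-lens approximation is t ≈ 2·N·c. This is a sketch under that approximation (the function name and numbers are mine, not a Canon spec):

```python
# Depth of focus (image space) under the common thin-lens approximation:
# t ~= 2 * N * c, where N is the f-number and c the acceptable blur (CoC).
# The exact expression adds a (1 + m) magnification term, which only
# matters in true macro range -- matching the point made above.

def depth_of_focus_mm(n_stop, coc_mm):
    """Approximate total depth of focus at the sensor, in millimetres."""
    return 2 * n_stop * coc_mm

# Same aperture, any focal length -> same depth of focus:
t_f28 = depth_of_focus_mm(2.8, 0.03)   # f/2.8, 30 micron CoC
t_f56 = depth_of_focus_mm(5.6, 0.03)   # f/5.6, 30 micron CoC

print(f"f/2.8: {t_f28:.3f} mm")  # 0.168 mm, regardless of lens
print(f"f/5.6: {t_f56:.3f} mm")  # 0.336 mm -- aperture alone doubled it
```

Note there is no focal length or subject distance anywhere in the expression, which is the whole point.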

OTOH, as stated, from a practical standpoint the APS-C sensor does have a deeper depth of field. So, even though the specified AF sensor precision is the same, the manufacturing tolerances for APS-C could, in theory, be looser. Users of 1-series bodies have long known their AF is 'better' than consumer cameras. I wonder if part of the recent improvements in measured precision of AF with the 5DIII and 6D derive at least in part from Canon tightening up the manufacturing tolerances.

Thanks Neuro for a well-written explanation about DOF, sensor size, focal length, distance to subject & background, AF focussing accuracy, etc.

That's the way I have understood and worked with these variables for some time in my photography. It's a shame many people who take photos and own cameras / lenses don't understand or apply these. People should practice, practice, practice - like I did years ago - taking photos with a FF at f/2.8 or an APS-C at f/1.8 - and determining how to use and control DOF for impact in photos.

That's the reason I'm waiting for a new 50mm f/1.4 - f/2 lens; that's the focal length and DOF that I enjoy taking many photos on my APS-C (Canon 7D).

I wonder if part of the recent improvements in measured precision of AF with the 5DIII and 6D derive at least in part from Canon tightening up the manufacturing tolerances.

And this, in red font, above is one of the things I'm very keen to see in a 7DmkII. I have worked very well with my 7D's AF (again, I have practiced with many photos and different scenarios). I have been able to achieve photos with my 7D that I'm very happy with - including macro using AF (though I usually use MF for most of my macros), BIF, portrait, event photography, etc. There are a few scenarios where I would like the 7D's AF to be somewhat more accurate and consistent (like the 5DmkIII) - but the 7D is no slouch WHEN you know how to use it.

Regards all....

Paul
 
pj1974 said:
Don Haines said:
We seem to have two groups of people arguing....

One group says that FF has the best image quality.
The other group says that APS-C has the best image quality that they can afford.

Both sides are right.

:D Great summary.....

... though there is (at least now) a third group of people that say "option C please, both the above are true!"

And you (& I) are in that group, Don!
I was outside last night doing something stupid..... trying to hand-hold a 60D with a 400F5.6 and a 2x teleconverter and shoot the ISS as it passed overhead. Surprisingly enough, it worked and the resulting image is 22 pixels across. I'd have loved to have a 1DX instead of the 60D and an 800F5.6 instead of the 400F5.6, but with what I can afford to spend, that's just not going to happen. Like so many of us, I have to settle for the best I can afford.
 
Pi said:
jrista said:
The only difference then would simply be that the 7D frame is cropped, resolved by a higher density sensor, and thus appears to be zoomed more.

And that decreases the DOF and magnifies the AF errors (just another way to say the same thing).

Cropping does not change the depth of field. Depth of field is a function of the lens...the size of the square inside the imaging circle does not have any impact on the depth of field whatsoever. Circle of Confusion (which CAN be, but is not necessarily, a function of the pixel density) plays a role...but as I stated before, it is for all intents and purposes an arbitrary value. You can pick a number, so long as it is not smaller than twice the pixel pitch at the smallest, and use it for both APS-C and FF:

Code:
DoF = (2 * N * c * f^2 * s^2) / (f^4 - (N^2 * c^2 * s^2))

Where:

N = F#
f = focal length of LENS (crop factor need not apply)
s = distance to subject
c = circle of confusion

Let's say we are scaling for web. CoC is a non-factor...we can pick anything; let's say 30 microns (0.03mm, probably far too small, but it really doesn't matter). If we run that for a 600mm lens at f/4 with a subject at 40 feet, we get a DoF of 4" (four inches), or about a third of a foot. With a CoC of 20 microns, we get a DoF of about 2.6". If we pick a CoC that is some happy medium between three times the pixel pitch of the 5D III and 7D cameras (to allow for AA filters and the nature of a Bayer design), we get 16 microns. Our DoF is still about 2". If we are scaling down by 2x or more, these differences are moot...the effective CoC is FAR larger than any of these options.
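A quick sanity check of those numbers, evaluating the quoted formula directly (the unit conversions are mine):

```python
def dof_mm(n_stop, coc_mm, focal_mm, dist_mm):
    """Depth of field via the formula quoted above (all lengths in mm)."""
    num = 2 * n_stop * coc_mm * focal_mm**2 * dist_mm**2
    den = focal_mm**4 - (n_stop**2 * coc_mm**2 * dist_mm**2)
    return num / den

MM_PER_FOOT = 304.8
s = 40 * MM_PER_FOOT                         # subject at 40 feet

for coc in (0.030, 0.020, 0.016):            # 30, 20, 16 micron CoC
    d = dof_mm(4.0, coc, 600.0, s)           # 600mm lens at f/4
    print(f"CoC {coc*1000:.0f} um -> DoF {d:.1f} mm ({d/25.4:.1f} in)")
    # -> roughly 3.9 in, 2.6 in, 2.1 in: the ~4" / 2.6" / ~2" above
```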

We could even print somewhere near the native size of a 24mp APS-C or a cropped and scaled 23mp FF, something in the range of 16x24. There is a CoC difference, but from a practical standpoint, it doesn't produce a meaningful visual change in such a print. If we scaled up by 2x or so, then we'll probably start seeing a difference in DoF just by observing the print. Is it a meaningful difference? I guess it depends...if you're printing at 150ppi on 30x40, it's not really going to be the most significant factor affecting IQ or the sharpness of your subject, and your viewers will usually be standing back far enough to compensate for the difference. It may be an issue in this case, but so long as the important parts of your subject are in focus - which in my case is usually a bird's head and maybe the side of its body, not even necessarily the whole body; anything on the back side of a bird can be entirely out of focus since it isn't visible, and a bird angled towards the lens can have blurry tail feathers and it doesn't really matter so long as the head and eyes are in clear focus - again, CoC isn't really going to be the most important factor in determining the depth of field.

The real (actual) focal length of the lens, distance to subject, and selected aperture are the things that truly matter when it comes to DoF. Crop factor should NOT be factored into the focal length to produce an effective focal length first. Pixel pitch differences may need to be factored in if one intends to enlarge and print large, especially if they are printing at a higher resolution than 150ppi on anything other than canvas. Pixel pitch differences are effectively a non-issue if one intends to scale down and publish to the web.

In the event that you get closer with a 5D III+600/4 setup and frame the subject identically, then there would indeed be a fairly significant change in DoF. But that would be because the distance to subject shrunk. All things being equal, I would prefer to have the thinner DoF (and getting closer with a 5D III has the potential to pack even more pixels onto the subject than even a 7D can), even when photographing birds...but it is not always a possibility.
 
Pi said:
neuroanatomist said:
Sorry, but that's incorrect. The precision of the AF points at a given aperture isn't specified in terms of DoF. Well, ok, maybe it is...but in that case, you keep using the letter F in the abbreviation, and I do not think it means what you think it means.

The AF sensor precision spec is 'within one depth of focus' for a standard precision point, and 'within 1/3 the depth of focus' for high precision (f/2.8, usually) points. Depth of focus is in 'image space' and is measured in micrometer distances at the AF (and/or image) sensor. It is related to, but distinct from, depth of field, which is measured in larger distances in 'object space'.

I believe I was correct. While DOF and depth of focus are different, 1 D-O-focus is defined as the distance at which you get an image blurred by "1 DOF" (with a fixed COC), so 1/3 D-O-focus corresponds approximately to 1/3 DOF blur.

Another way to look at this: an f/2.8 eq. lens on crop is an f/1.75 one. The crop AF sensor cannot see rays coming from the periphery of such a lens. It has to somehow compensate for this by judging the phase difference of f/4.5 (eq.) rays. Of course, those are rays of a shorter FL, so this is not exactly a proof without knowing how the AF system exactly works.

Hmm, I know that the image sensor cannot see from the periphery of any EF lens. I was not aware that the AF sensor was also limited in the same way. The point spread is certainly smaller than on a FF sensor (but I always figured that was an advantage as it doesn't have to deal with vignetting). I do not believe that actually limits the sensor's periphery vision. Do you have some kind of reference for this?
 
jrista said:
Pi said:
jrista said:
The only difference then would simply be that the 7D frame is cropped, resolved by a higher density sensor, and thus appears to be zoomed more.

And that decreases the DOF and magnifies the AF errors (just another way to say the same thing).

Cropping does not change the depth of field. Depth of field is a function of the lens...the size of the square inside the imaging circle does not have any impact on the depth of field whatsoever. Circle of Confusion (which CAN be, but is not necessarily, a function of the pixel density) plays a role...but as I stated before, it is for all intents and purposes an arbitrary value. You can pick a number, so long as it is not smaller than twice the pixel pitch at the smallest, and use it for both APS-C and FF:

Another way to put it is:
What is the difference between a crop sensor and a FF sensor? An APS-C crop sensor is sampling the central 40% of the image circle, but typically at a higher sampling density. To say that F-stop or depth of field is different between FF and APS-C is to believe that if you shot an image with a FF sensor and then cropped it in Photoshop, the lens would magically change. A crop sensor throws away all but that 40% of light in the center of the image. If you built a FF sensor with the same technology and pixel size as an APS-C sensor, that 40% of pixels in the center of the FF image would be identical to the APS-C image and nobody could tell the difference.
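That thought experiment is easy to model with arrays (a toy sketch, assuming equal pixel pitch and ignoring noise; numpy, sizes arbitrary):

```python
import numpy as np

# Toy stand-in for a sensor readout: a FF-sized array of "pixel values".
rng = np.random.default_rng(0)
ff = rng.integers(0, 4096, size=(400, 600))     # full-frame, 12-bit-ish

# An APS-C sensor with the same pixel pitch just reads the central window
# (1/1.6 of each linear dimension, ~39% of the area).
h, w = ff.shape
ch, cw = int(h / 1.6), int(w / 1.6)
top, left = (h - ch) // 2, (w - cw) // 2
apsc = ff[top:top + ch, left:left + cw].copy()  # the "APS-C capture"

# Cropping the FF capture after the fact gives the identical pixels:
crop_later = ff[top:top + ch, left:left + cw]
print(np.array_equal(apsc, crop_later))         # True -- same light, same data
```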
 
Don Haines said:
One group says that FF has the best image quality.
The other group says that APS-C has the best image quality that they can afford.

Both sides are right.

I think some in the second group are also (erroneously) saying APS-C is (almost?) as good as FF, and that is hurting the ego of some in the first group, and feeding the vicious cycle.
The discussion of DoF seems like a breath of fresh air in these circumstances.
Not to digress, but these pointless discussions find so many contributors, yet if you need technical help, responses are pretty sparse...
 
Pi said:
neuroanatomist said:
Sorry, but that's incorrect. The precision of the AF points at a given aperture isn't specified in terms of DoF. Well, ok, maybe it is...but in that case, you keep using the letter F in the abbreviation, and I do not think it means what you think it means.

The AF sensor precision spec is 'within one depth of focus' for a standard precision point, and 'within 1/3 the depth of focus' for high precision (f/2.8, usually) points. Depth of focus is in 'image space' and is measured in micrometer distances at the AF (and/or image) sensor. It is related to, but distinct from, depth of field, which is measured in larger distances in 'object space'.

I believe I was correct. While DOF and depth of focus are different, 1 D-O-focus is defined as the distance at which you get an image blurred by "1 DOF" (with a fixed COC), so 1/3 D-O-focus corresponds approximately to 1/3 DOF blur.

Another way to look at this: an f/2.8 eq. lens on crop is an f/1.75 one. The crop AF sensor cannot see rays coming from the periphery of such a lens. It has to somehow compensate for this by judging the phase difference of f/4.5 (eq.) rays. Of course, those are rays of a shorter FL, so this is not exactly a proof without knowing how the AF system exactly works.

Sorry, but I don't think so. First off, an f/2.8 lens on FF is only equivalent to f/1.75 on APS-C if you change the focal length or subject distance. If you change neither, the DoF doesn't change. You don't have the same picture, of course...but the AF sensor neither knows nor cares. Second, your argument about the light from the periphery is irrelevant - neither AF sensor sees light rays from the periphery - for a variety of reasons, the phase detect AF points are clustered in the central region of the image frame (in fact, relative to the image frame, the AF points on crop bodies are actually more widely spaced, e.g., 7D vs. 5DII). (But note that this doesn't apply to the dual pixel CMOS AF, which uses ~80% of the frame. But with what accuracy?)

Pi said:
...so this is not exactly a proof without knowing how the AF system exactly works.

FWIW, my info on this comes from a rather lengthy email exchange with Chuck Westfall, who is in a position such that he can know how the AF system exactly works.
 
neuroanatomist said:
Sorry, but I don't think so. First off, an f/2.8 lens on FF is only equivalent to f/1.75 on APS-C if you change the focal length or subject distance. If you change neither, the DoF doesn't change. You don't have the same picture, of course...but the AF sensor neither knows nor cares.

If you do not change FL and distance, and keep it at f/2.8, DOF changes; cropping makes it shallower. AF errors are magnified.

Second, your argument about the light from the periphery is irrelevant - neither AF sensor sees light rays from the periphery - for a variety of reasons, the phase detect AF points are clustered in the central region of the image frame (in fact, relative to the image frame, the AF points on crop bodies are actually more widely spaced, e.g., 7D vs. 5DII). (But note that this doesn't apply to the dual pixel CMOS AF, which uses ~80% of the frame. But with what accuracy?)

The AF sensor does see rays from the periphery (of an f/2.8 lens) even in the center. You can try it at home. Here is a random diagram found with Google:

af_diagram.png


FWIW, my info on this comes from a rather lengthy email exchange with Chuck Westfall, who is in a position such that he can know how the AF system exactly works.

I am not sure whether you cite him correctly but I have read things from him that raise some eyebrows.

Let us make it simple. You do not need to know how the AF system works. Assume that Nikon ;) told you that their AF sensors are accurate within 1 DO-Focus with f/2.8 lenses. In plain language, that means that your f/2.8 images are just barely in focus, on average, and that does not depend on FL or sensor size, because ... Nikon told you that. Now, you get the same defocus blur with 50/2.8 on crop and 80/2.8 on FF (assume 1.6 crop factor) but the former is 80/4.5 eq., i.e., stopped down.
 
I probably shouldn't even try, because it seems that you won't be convinced, but why not one last go? ;)

Pi said:
If you do not change FL and distance, and keep it at f/2.8, DOF changes; cropping makes it shallower. AF errors are magnified.
Sorry, no. Cropping doesn't change DoF. Cropping then magnifying does change DoF, but sensor size doesn't affect magnification. Basic stuff, and if you don't understand how DoF works, then understanding how AF works is even more problematic.

Pi said:
The AF sensor does see rays from the periphery (of an f/2.8 lens) even in the center. You can try it at home. Here is a random diagram found with Google:
af_diagram.png
Ah, yes. A random diagram. A little knowledge is a dangerous thing. Ever see little notations on diagrams, like 'not to scale'? So, say I select an AF point, like the one on the butterfly:

0.jpg


According to your diagram, light from the purple flowers at the bottom of the frame, from the OOF brown smudge at the top left corner, light from the OOF yellowish flower (?) under the far right AF point, all of those are 'seen' by my selected AF point? How was focus on the butterfly achieved? Yes, the AF sensor sees light from many parts of the frame (but not the true periphery). But an individual AF point sees light from only a small part of the frame. That's the point of having a point, if you get my point.

By the way, here's the actual spread of the 7D's AF points and metering zones, in context of the image frame. (Total non sequitur, but when you look at this, you can see that the 1-series feature of spot metering linked to any AF point is pure marketing, since on the 7D there's a single one of the 63 metering zones associated with each of the 19 AF points.)

zu5.jpg


Each AF point is sampling a small dedicated region of object space (small, but slightly larger than that little box that represents the AF point). Light from that region of object space is split across two parts of a sensor line, and a phase difference (magnitude and direction) is determined. The accuracy of that determination is dependent on the physical baseline separation of the two parts of the sensor (f/2.8 sensor line pairs are physically further apart) and/or the pixel density of the sensor line itself (the latter is how some AF points on the 1DIII, 1DIV, and 1DsIII have f/2.8 accuracy with an f/4 lens). The precision of that determination is relevant only in the image space (at the AF sensor), and neither focal length nor subject distance has a significant influence on that precision.
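For anyone curious what "a phase difference (magnitude and direction) is determined" means mechanically, here is the textbook idea in toy form: two line images of the same scene slice, formed through opposite halves of the pupil, shift apart under defocus, and the shift is found by correlation. This is only an illustration of the principle - not Canon's actual implementation - and the signal and numbers are made up:

```python
import numpy as np

# Toy phase-detect sketch: the same slice of the scene is imaged through
# two sides of the lens pupil onto two line sensors.  Defocus shifts the
# two line images in opposite directions; AF measures magnitude and sign
# of that shift.

def phase_shift(line_a, line_b):
    """Integer shift of line_b relative to line_a, via cross-correlation."""
    n = len(line_a)
    corr = np.correlate(line_b - line_b.mean(), line_a - line_a.mean(), "full")
    return int(np.argmax(corr)) - (n - 1)

x = np.linspace(0, 6 * np.pi, 200)
scene = np.sin(x) + 0.5 * np.sin(3 * x)   # some contrast pattern

defocus = 7                               # pixels of separation per half
line_a = np.roll(scene, -defocus)         # image through one pupil half
line_b = np.roll(scene, +defocus)         # image through the other half

print(phase_shift(line_a, line_b))        # 14 = 2 * defocus, signed
```

The sign tells the body which way to drive focus, and a longer baseline (the f/2.8 line pairs being physically further apart) makes the same defocus produce a larger, easier-to-measure shift.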

Pi said:
Let us make it simple. You do not need to know how the AF system works. Assume that Nikon told you that their AF sensors are accurate within 1 DO-Focus with f/2.8 lenses. In plain language, that means that your f/2.8 images are just barely in focus, on average, and that does not depend on FL or sensor size, because ... Nikon told you that. Now, you get the same defocus blur with 50/2.8 on crop and 80/2.8 on FF (assume 1.6 crop factor) but the former is 80/4.5 eq., i.e., stopped down.
In fact, I do understand how the AF system works. I don't think you can say the same. But we're back to the sensor size affecting DoF, which as stated above, it does not (directly, only indirectly after you make some other change - focal length or distance to subject - to compensate for the changed angle of view). Why would I assume a Nikon crop was 1.6x? It isn't, it's 1.5x. But ok, assume something incorrect, fine. 50mm f/2.8 on crop and 80mm f/2.8 on FF. The framing is identical. The depth of field of the FF shot is shallower, because the focal length has been changed. But the depth of focus will be approximately the same in both cases, from the perspective of the AF sensor.
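Plugging the thread's own DoF formula into that 50/2.8-vs-80/2.8 scenario bears this out (the subject distance and the conventional per-format CoC values below are my assumptions):

```python
def dof_mm(n_stop, coc_mm, focal_mm, dist_mm):
    """Depth of field via the formula quoted earlier in the thread (mm)."""
    num = 2 * n_stop * coc_mm * focal_mm**2 * dist_mm**2
    den = focal_mm**4 - (n_stop**2 * coc_mm**2 * dist_mm**2)
    return num / den

s = 3000.0   # assumed subject distance: 3 m, the same for both bodies

# Common print-referred CoC conventions: ~0.030 mm for FF, ~0.019 mm for
# a 1.6x crop (the crop image is enlarged 1.6x more to the same print size).
crop = dof_mm(2.8, 0.019, 50.0, s)   # 50mm f/2.8 on APS-C
ff   = dof_mm(2.8, 0.030, 80.0, s)   # 80mm f/2.8 on FF, same framing

print(f"APS-C 50/2.8: {crop:.0f} mm")   # ~385 mm
print(f"FF    80/2.8: {ff:.0f} mm")     # ~237 mm -- noticeably shallower
```

The FF shot comes out shallower in object space because the focal length changed, exactly as described above; none of this says anything about the depth of focus at the AF sensor.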
 
neuroanatomist said:
I probably shouldn't even try, because it seems that you won't be convinced, but why not one last go? ;)

I'll give it a try.....

You have two cameras, one is a 51.2 megapixel FF camera, the other is a 20 megapixel APS-C camera. Both have sensors made from the exact same process with the exact same pixel size. The center 40% of the FF sensor is then EXACTLY the same as the APS-C sensor. An image cropped from the central 40% of the FF sensor is indistinguishable from an image taken with the APS-C sensor.... the tech is the same, the pixel size is the same, the location of the pixels relative to the lens is the same.... everything is the same.

If you do not crop the FF image, the depth of field and the lens F stop do not change. To argue otherwise would be to suggest that cropping an image taken yesterday in Photoshop today will somehow go back in time and change the optical properties of the lens.

A crop sensor does not multiply focal lengths, it does not change F stops, these are optical properties that are intrinsic to the lens. What it does do is to only sample a subset of the light that passes through the lens, and that reduced sample area results in a reduced field of view, but it has no impact on the lens properties.
 
Don Haines said:
pj1974 said:
Don Haines said:
We seem to have two groups of people arguing....

One group says that FF has the best image quality.
The other group says that APS-C has the best image quality that they can afford.

Both sides are right.

:D Great summary.....

... though there is (at least now) a third group of people that say "option C please, both the above are true!"

And you (& I) are in that group, Don!
I was outside last night doing something stupid..... trying to hand-hold a 60D with a 400F5.6 and a 2x teleconverter and shoot the ISS as it passed overhead. Surprisingly enough, it worked and the resulting image is 22 pixels across. I'd have loved to have a 1DX instead of the 60D and an 800F5.6 instead of the 400F5.6, but with what I can afford to spend, that's just not going to happen. Like so many of us, I have to settle for the best I can afford.

Wow, that's actually quite cool - taking a photo of the ISS with quite a focal length!

I've seen the ISS - but never got close enough to take a photo (maybe I should stand on a ladder next time)... Plus my 7D and 70-300mm L don't have quite sufficient reach :P

Hmmmm... keeping in theme with this thread, I wonder if depth of field is an issue though as you might have focussed on the distant tip of the ISS, rather than the nearest tip - particularly if you couldn't AFMA on the 60D... ;D

So- would you care to share your 22 pixels?

Paul
 
pj1974 said:
Don Haines said:
pj1974 said:
Don Haines said:
We seem to have two groups of people arguing....

One group says that FF has the best image quality.
The other group says that APS-C has the best image quality that they can afford.

Both sides are right.

:D Great summary.....

... though there is (at least now) a third group of people that say "option C please, both the above are true!"

And you (& I) are in that group, Don!
I was outside last night doing something stupid..... trying to hand-hold a 60D with a 400F5.6 and a 2x teleconverter and shoot the ISS as it passed overhead. Surprisingly enough, it worked and the resulting image is 22 pixels across. I'd have loved to have a 1DX instead of the 60D and an 800F5.6 instead of the 400F5.6, but with what I can afford to spend, that's just not going to happen. Like so many of us, I have to settle for the best I can afford.

Wow, that's actually quite cool - taking a photo of the ISS with quite a focal length!

I've seen the ISS - but never got close enough to take a photo (maybe I should stand on a ladder next time)... Plus my 7D and 70-300mm L don't have quite sufficient reach :P

Hmmmm... keeping in theme with this thread, I wonder if depth of field is an issue though as you might have focussed on the distant tip of the ISS, rather than the nearest tip - particularly if you couldn't AFMA on the 60D... ;D

So- would you care to share your 22 pixels?

Paul
Here it is blown up.... this may well be the worst picture ever shown on this forum....
 

Attachments: IMG_9300-4.jpg
neuroanatomist said:
I probably shouldn't even try, because it seems that you won't be convinced, but why not one last go? ;)

Pi said:
If you do not change FL and distance, and keep it at f/2.8, DOF changes; cropping makes it shallower. AF errors are magnified.
Sorry, no. Cropping doesn't change DoF. Cropping then magnifying does change DoF,

Of course, you magnify when you crop (by crop, I mean use a smaller sensor). Do you use a smaller screen to display your smaller-sensor photos? DOF by definition is relative to a reference print size. That size is kept the same regardless of the sensor. Do I need to spell out everything?

but sensor size doesn't affect magnification. Basic stuff, and if you don't understand how DoF works, then understanding how AF works is even more problematic.

Sensor size does change the enlargement. To print at the same reference size, you enlarge the crop image 1.6x more. If you ever worked with film of different size and optical printing, you must have experienced that firsthand.

Ah, yes. A random diagram. A little knowledge is a dangerous thing. Ever see little notations on diagrams, like 'not to scale'? So, say I select an AF point, like the one on the butterfly:

0.jpg


According to your diagram, light from the purple flowers at the bottom of the frame, from the OOF brown smudge at the top left corner, light from the OOF yellowish flower (?) under the far right AF point, all of those are 'seen' by my selected AF point? How was focus on the butterfly achieved? Yes, the AF sensor sees light from many parts of the frame (but not the true periphery). But an individual AF point sees light from only a small part of the frame. That's the point of having a point, if you get my point.

This is fundamentally wrong. You are under the wrong impression that somehow, the image projected on the sensor is projected on the lens as well. You want more pictures? Here we go:

http://www.google.com/images?q=<phase detect af>

By the way, here's the actual spread of the 7D's AF points and metering zones, in context of the image frame.

Completely irrelevant, with one exception - the points away from the center cannot see the needed rays well (call it vignetting).


Each AF point is sampling a small dedicated region of object space (small, but slightly larger than that little box that represents the AF point).

But it is sampling rays focused there coming from very different parts of the lens. Which is the whole point. Focus on this (pun intended). It is the key to understanding how phase detection works. Here is how a single point on the (image) sensor is created:

soft-focus-1.jpg


Why would I assume a Nikon crop was 1.6x? It isn't, it's 1.5x.

A simple exercise in abstract thinking, helps sometimes.

But ok, assume something incorrect, fine. 50mm f/2.8 on crop and 80mm f/2.8 on FF. The framing is identical. The depth of field of the FF shot is shallower, because the focal length has been changed. But the depth of focus will be approximately the same in both cases, from the perspective of the AF sensor.

Not true, actually: "In small-format cameras, the smaller circle of confusion limit yields a proportionately smaller depth of focus" (Wikipedia). But it is still irrelevant, because the guaranteed accuracy is just what it needs to get a barely focused image, in my example. And that happens at different DOF!
 
Don Haines said:
If you do not crop the FF image, the depth of field and the lens F stop do not change. To argue otherwise would be to suggest that cropping an image taken yesterday in Photoshop today will somehow go back in time and change the optical properties of the lens.

Well...............

Cropping won't change the properties of the lens, obviously. But magnifying that cropped image does. Say you viewed your 51 MP image full-screen on your 27" Thunderbolt Display, and on the second daisy-chained 27" Thunderbolt Display next to it, you viewed the 20 MP image cropped from the middle of the frame, but also full-screen. By doing so, you've changed the magnification of the image...and thus, you've also changed the DoF. Or, say you printed the FF image at 24x36", and the center crop at 15x22". If you hang them side-by-side on the wall, the DoF is the same. But if you set them both on easels and place one 5' away and the other 15' away, you've changed the viewing distance (the apparent magnification from your vantage point), and thus the DoF is different.

No time machine required. ;)
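The easel example can be put in numbers with the formula quoted earlier: enlarging the crop 1.6x more to reach the same final size is equivalent to tightening the sensor-referred CoC by 1.6x (the lens, aperture, and distance below are arbitrary choices of mine):

```python
def dof_mm(n_stop, coc_mm, focal_mm, dist_mm):
    """Depth of field via the formula quoted earlier in the thread (mm)."""
    num = 2 * n_stop * coc_mm * focal_mm**2 * dist_mm**2
    den = focal_mm**4 - (n_stop**2 * coc_mm**2 * dist_mm**2)
    return num / den

# Same lens, same distance, same capture -- only the output magnification
# changes.  Viewing the central crop at the same final size means 1.6x
# more enlargement, i.e. a 1.6x smaller sensor-referred CoC:
base      = dof_mm(4.0, 0.030, 100.0, 3000.0)        # viewed as shot
magnified = dof_mm(4.0, 0.030 / 1.6, 100.0, 3000.0)  # crop blown up 1.6x

print(f"as shot:       {base:.0f} mm")       # ~216 mm
print(f"enlarged 1.6x: {magnified:.0f} mm")  # ~135 mm, about 1/1.6 as deep
```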
 
@ Pi - diagrams are intended to present a simplified view to aid in understanding. The idea is that you then extrapolate to gain a complete understanding. Are all of your camera lenses a single, biconvex element? Mine aren't.

News flash - crop sensors don't directly result in deeper DoF any more than wide angle lenses have a deeper DoF.

Regardless, I'll just stop. Agree to disagree and all that.
 
pj1974 said:
Don Haines said:
pj1974 said:
Don Haines said:
We seem to have two groups of people arguing....

One group says that FF has the best image quality.
The other group says that APS-C has the best image quality that they can afford.

Both sides are right.

:D Great summary.....

... though there is (at least now) a third group of people that say "option C please, both the above are true!"

And you (& I) are in that group, Don!
I was outside last night doing something stupid..... trying to hand-hold a 60D with a 400F5.6 and a 2x teleconverter and shoot the ISS as it passed overhead. Surprisingly enough, it worked and the resulting image is 22 pixels across. I'd have loved to have a 1DX instead of the 60D and an 800F5.6 instead of the 400F5.6, but with what I can afford to spend, that's just not going to happen. Like so many of us, I have to settle for the best I can afford.

Wow, that's actually quite cool - taking a photo of the ISS with quite a focal length!

I've seen the ISS - but never got close enough to take a photo (maybe I should stand on a ladder next time)... Plus my 7D and 70-300mm L don't have quite sufficient reach :P

Hmmmm... keeping in theme with this thread, I wonder if depth of field is an issue though as you might have focussed on the distant tip of the ISS, rather than the nearest tip - particularly if you couldn't AFMA on the 60D... ;D

So- would you care to share your 22 pixels?

Paul

I think I am going to resolve to try to be more like you two. I don't even know why I get suckered into these silly and endless debates where the same topic is rehashed over and over again.

I got to spend two hours tonight processing images from a Saturday shoot. I've probably got another 30 to 40 hours ahead of me in the coming evenings. Much work, but much more enjoyable than this debate/discussion.

And, I'm doing something good: giving starving young actors much-needed headshots that they couldn't possibly afford otherwise. While learning every step of the way.

Need to remind myself that Robert Frank, Henri Cartier-Bresson, Brassai, August Sander, Garry Winogrand, Lee Friedlander, Diane Arbus and hundreds of others aren't remembered for Image Quality, but for Quality Images.
 
Pi said:
neuroanatomist said:
I probably shouldn't even try, because it seems that you won't be convinced, but why not one last go? ;)

Pi said:
If you do not change FL and distance, and keep it at f/2.8, DOF changes; cropping makes it shallower. AF errors are magnified.
Sorry, no. Cropping doesn't change DoF. Cropping then magnifying does change DoF,

Of course, you magnify when you crop (by crop, I mean use a smaller sensor). Do you use a smaller screen to display your smaller-sensor photos? DOF by definition is relative to a reference print size. That size is kept the same regardless of the sensor. Do I need to spell out everything?

Actually, DOF, by definition, is this:

Code:
DoF = (2 * N * c * f^2 * s^2) / (f^4 - (N^2 * c^2 * s^2))

The factor for CoC, circle of confusion, is c. It is effectively arbitrary. One does not necessarily know what the final output size of their photo will be in the end, or even whether there may be multiple output sizes. If one wishes to be as concrete as possible, the CoC is physically limited by pixel size. To be "safe" when using a Bayer-type sensor, one should usually use a CoC that is at least twice the pixel pitch to account for the uneven sampling. Because of the even sparser nature of red and blue pixels relative to green, and due to the fact that an AA filter is usually used, it is better to use a CoC roughly three times the pixel pitch. That would be the only truly concrete definition of CoC for any given sensor.

Assuming one uses the pixel pitch x3 for CoC, that greatly simplifies the initial argument and does away with the notion of a reference print size. One could assume that the pixel pitch for a FF sensor and an APS-C sensor are identical. If that is the case, then one could photograph the same subject with the same lens at the same distance with both sensors, crop the FF to the same image dimensions as the APS-C, and the depth of field will be 100% identical in every respect, regardless of what size the images are scaled to. Therefore, depth of field has nothing to do with crop factor or field of view.

Nor, for that matter, does it really have anything to do with a reference print size. I would also offer the argument that even in print, as print size increases, so too does the most comfortable viewing distance. If you have the luxury of 80mp of MFD goodness, you might be able to print a highly detailed photo at an immense 40x60" size at 360ppi, drawing your viewers to within a few feet to examine all the detail. If you are incredibly meticulous, careful scaling might eke out enough detail to do something similar from something like the D800 or a hypothetical 40-50mp Canon FF. Generally speaking, as print size increases, ppi drops, and so too grows the viewer's desire to stand back farther and farther to take the whole thing in. As visual acuity is also a function of distance, CoC could, for all intents and purposes, be a constant...and therefore a non-factor when one needs to determine their depth of field. You can pick whatever CoC you're "comfortable" with, and use that same value every time you compute DoF...at which point the formula above proves the point once and for all.

I guess, therefore, that one could state that DoF is, by definition, purely a function of the lens and relative to the viewing distance of the final output at the time the photo is taken. ;)
 