Is the Canon EOS R7 the next camera to be announced? [CR2]

unfocused

Photos/Photo Book Reviews: www.thecuriouseye.com
Jul 20, 2010
7,184
5,483
70
Springfield, IL
www.thecuriouseye.com
As I said... Maybe(!) for sports shooters (which I'm not!), APS-C(!!) wide angle lenses don't make sense. But for hiking purposes it makes perfect sense... at least if you want a good compromise between wide angle, reach and weight!
It seems like the RF 16 f2.8 would be a very useful lens for APS-C. Compact, tiny, cheap, fast, just over 24mm and no edge vignetting in crop mode.
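For what it's worth, a quick sketch of the crop-factor arithmetic behind that framing claim (assuming Canon's usual 1.6x APS-C crop factor, which isn't stated in the post):

focal_length_mm = 16.0
crop_factor = 1.6  # Canon APS-C crop factor (assumption)
ff_equivalent_mm = focal_length_mm * crop_factor
print(f"{focal_length_mm:.0f}mm on APS-C frames roughly like {ff_equivalent_mm:.1f}mm on full frame")
# -> 16mm on APS-C frames roughly like 25.6mm on full frame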
 

jd7

CR Pro
Feb 3, 2013
1,064
418
Leaving aside the +, does 2 2=4 make any sense?

Magnification is the amount of enlargement applied to the image between the size at which it is captured and the size at which it is viewed. Therefore, enlargement ratio and viewing distance are actually part of magnification – you can't 'leave them aside'.
When I referred to magnification, I was thinking about reproduction ratio (as discussed on various websites such as https://photographylife.com/what-is-magnification, i.e. the same sort of magnification as when we talk about a lens having a particular maximum magnification), and I was considering enlargement and viewing distance separately because (as you say in relation to enlargement) they come into play after the image is captured. Re-reading Michael Clark's email now, I can see he was referring to magnification in the same sense you are, where enlargement and viewing distance are inherent elements of magnification.
 

Michael Clark

Now we see through a glass, darkly...
Apr 5, 2016
4,722
2,655
Leaving aside enlargement ratio and viewing distance, and to be picky, I thought DOF was (essentially) dependent on entrance pupil and subject distance, or on f-stop (relative aperture) and magnification. Have I got that wrong?

Depth of field is an illusion. Though, as Einstein might say, it's a rather persistent one.

There is only a single distance that is in sharpest focus. Everything closer to or further from that distance is blurry to one degree or another. What we describe as the depth of field of an image is the total distance, from in front of to behind the actual focus distance, within which things look just as sharp to our eyes as things at the actual focus distance.

There's no magic barrier at which everything closer to the focus distance is perfectly sharp and everything further away is hopelessly blurry. It's a gradual progression from sharper to blurrier. The size of the entrance pupil (the effective aperture, if you insist on using that less precise term) affects how far from the focus distance things on either side of it have to be before they are blurry enough for us to perceive them as blurry.

Our perception is based on the size of the blur as it is projected by our corneas onto our retinas. If it's smaller than the limits of our perception, we see it as a sharp point. If it's larger than the limits of our perception, we see it as an area of blur.

The two things that affect how far from the actual focus distance things can be and still be perceived by our eyes as sharp in a photo are the size of the lens' entrance pupil and TOTAL magnification, which is the ratio of the size of those things as projected onto our retinas to their real-life size.

Subject distance is one factor in magnification.

How far the subject is from the camera affects how large the subject will be in the image projected by the lens onto the sensor or film. If the subject is 1/100 life size in the projected image, that's a magnification ratio of 0.01X. If the subject is life size in the projected image, that's a magnification ratio of 1.0X. If the subject is one-third actual size in the projected image, that's a magnification ratio of 0.33X.

It doesn't matter whether the focal length is 100mm and the distance is 10m, or the focal length is 200mm and the distance is 20m; the size of the subject in the projected image will be the same in both cases, because the doubling in focal length is countered by the doubling in distance. Both factors combine to determine how large the subject is as projected on the sensor.
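A minimal sketch of that arithmetic, using the thin-lens approximation m ≈ f / (d − f) (my shorthand, with the example numbers from above):

def magnification(focal_length_mm, subject_distance_mm):
    # Thin-lens approximation: m = f / (d - f), good when d is much larger than f
    return focal_length_mm / (subject_distance_mm - focal_length_mm)

print(round(magnification(100, 10_000), 4))  # 100mm lens, subject at 10m -> ~0.0101x
print(round(magnification(200, 20_000), 4))  # 200mm lens, subject at 20m -> ~0.0101x (same projected size)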

Focal length is also one factor in magnification.

So are display size and viewing distance.

Why? Because we don't usually view images at the same size they are projected onto a camera's sensor or film. We enlarge them significantly. When we enlarge the image, we also enlarge the size of any blur in the image. The larger blur will be more easily perceived as blur by our eyes unless we also increase the viewing distance by the same factor. If we enlarge a 135 format negative to view the printed positive at 8x12 inches, we've enlarged by a factor of about 8.5X. If we back up from one foot to 8.5 feet to view the image, then the amount of blur in the 8x12 inch print will look the same as the amount of blur in the 36x24mm contact print viewed at one foot. But we don't usually view 8x12" prints at a distance of 8.5 feet. We usually view them at a distance of about one foot. Thus the blur can appear 8.5X larger in the 8x12" print than in the contact print from the negative. (That is, unless we lean even closer and use a magnifying glass when viewing our contact sheet with all the frames printed at life size.)
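A rough sketch of that enlargement arithmetic (assuming the long edge of a 36x24mm frame is enlarged to the long edge of an 8x12" print):

frame_long_edge_mm = 36.0
print_long_edge_mm = 12 * 25.4                    # 8x12" print: long edge is 304.8mm
enlargement = print_long_edge_mm / frame_long_edge_mm
print(f"Enlargement factor: {enlargement:.1f}x")  # ~8.5x

# Backing up by the same factor keeps the blur looking the same size as on the contact print:
contact_print_viewing_distance_ft = 1.0
print(f"Equivalent viewing distance: {contact_print_viewing_distance_ft * enlargement:.1f} ft")  # ~8.5 ft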

The total magnification is the only one that really matters, because all of those things (distance, FL, sensor size/enlargement ratio/display size, and viewing distance) combine to determine how large the subject is in the image projected by the viewer's cornea onto the viewer's retina. Ultimately it is how large the parts of an image are on the viewer's retina that counts.

This is because whether blur is perceived as blur or perceived as a single point is determined by how many seconds of arc that blur occupies in the viewer's vision.
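A sketch of that angular-size idea; the ~1 arcminute acuity figure and the 0.03mm blur circle are my assumptions, not numbers from the post:

import math

def angular_size_arcmin(blur_diameter_mm, viewing_distance_mm):
    # Angle the blur spot subtends at the viewer's eye, in arcminutes
    return math.degrees(2 * math.atan(blur_diameter_mm / (2 * viewing_distance_mm))) * 60

ACUITY_ARCMIN = 1.0          # rough limit of normal vision (assumption)
blur_on_frame_mm = 0.03      # a commonly quoted blur circle for a 36x24mm frame (assumption)
viewing_distance_mm = 305    # about one foot

# On the contact print the blur sits below the acuity limit; after an 8.5x enlargement it does not.
print(angular_size_arcmin(blur_on_frame_mm, viewing_distance_mm))        # ~0.34 arcmin -> seen as a point
print(angular_size_arcmin(blur_on_frame_mm * 8.5, viewing_distance_mm))  # ~2.9 arcmin -> seen as blur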

This is easy enough to test. The next time you are looking at images from a shoot, try to determine which ones have the "sharpest" subjects looking only at the "filmstrip" sized thumbnails. Then open each image in a main window with the thumbnails in a "filmstrip" underneath or beside the viewing window and look again. Then view each image at 100% (one screen pixel equals one image pixel). It will be obvious that things that look sharp at postage stamp sizes don't always look sharp at post card sizes, and even things that look sharp at post card sizes don't all look sharp at poster board sizes when all three sizes are viewed from the same distance.
 

Michael Clark

Now we see through a glass, darkly...
Apr 5, 2016
4,722
2,655
Leaving aside the +, does 2 2=4 make any sense?

Magnification is the amount of enlargement applied to the image between the size at which it is captured and the size at which it is viewed. Therefore, enlargement ratio and viewing distance are actually part of magnification – you can't 'leave them aside'.

Total magnification includes everything that influences the angular size of the subject on the viewer's retina (see the sketch after this list):

Subject distance
Focal length
Enlargement ratio
Viewing distance
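A rough sketch chaining those four factors together (the function name, the thin-lens shortcut, and the 250mm reference viewing distance are my assumptions, not Michael's wording):

def total_magnification(focal_length_mm, subject_distance_mm,
                        enlargement_ratio, viewing_distance_mm,
                        reference_distance_mm=250.0):
    # Magnification at the sensor (thin-lens approximation)
    sensor_mag = focal_length_mm / (subject_distance_mm - focal_length_mm)
    # Scale up by the enlargement to the displayed size, then down for viewing
    # distance relative to a conventional "near" distance of ~250mm (assumption).
    return sensor_mag * enlargement_ratio * (reference_distance_mm / viewing_distance_mm)

# Example: 100mm lens, subject at 10m, 8.5x enlargement, viewed from ~1 ft (305mm)
print(total_magnification(100, 10_000, 8.5, 305))  # ~0.07; the subject's angular size on the retina scales with this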
 

Michael Clark

Now we see through a glass, darkly...
Apr 5, 2016
4,722
2,655
As I said... Maybe(!) for sports shooters (which I'm not!), APS-C(!!) wide angle lenses don't make sense. But for hiking purposes it makes perfect sense... at least if you want a good compromise between wide angle, reach and weight!

What you actually said was:

"Maybe sports shooters aren't interested in wide angle lenses."

What I said in reply was:

"I'm a sports shooter and I'm definitely interested in wide angle lenses."

But the vast majority of us who are interested in WA lenses aren't interested in using them on APS-C bodies. We've got FF bodies to maximize the AoV provided by our WA lenses and we might (or might not) have cropped bodies to maximize the "reach" of our telephoto lenses, or to reduce the size/weight/cost of a specific amount of "reach".

None of us, regardless of what sensor size(s) we are using, has time to change lenses in the instant it takes an athlete running straight towards us to go from more than about 15-20 yards away from our shooting position to closer than that. We don't really even have time to switch which camera we're shooting with. So we begin reaching for the "wide" body as soon as we take the last shot with the "long" body and move our right hand off the shutter button and controls, and we often start shooting even before we have time to raise the viewfinder to eye level.

My friend Gary Cosby, Jr. has had hard news photos published on the front page of the New York Times. He's had an entire photo project printed in ESPN The Magazine. He's now the chief (only one left) photographer at the Tuscaloosa News, which is owned by Gannett. A large percentage of his current assignments are pushing images from University of Alabama athletic events to Gannett's wire services. I often see him on TV coverage of Alabama football games.

Back in about 2015, covering a high school game I was also at, he got several published images between the moment the ball was batted away by a defender and the moment the receiver, the defenders, and the ball hit the turf. At the time he was shooting this sequence, he was also rising up off his knees to get out of the way. He still didn't have the second body all the way up to his eye when the play was over. It was roughly 1.1 seconds, based on the EXIF sub-second field, between my first and last frames below. (Don't be too tough on Gary's images in terms of color. The flicker in that stadium is horrible, and he was shooting straight to JPEG - his filing deadline was less than two hours after the end of the first half, when this took place, and there was no usable WiFi anywhere nearby back then - with a company-issued Nikon that didn't have any kind of flicker reduction.)

[Attached frames: 201509181100LR.JPG, 201509181101LR.JPG, 201509184395LR.JPG]


Apologies for the cropped and downsized blurry images. I was about 30 yards away, having just finished taking pictures of the band warming up for their halftime show, when I saw the play, a 50+ yard pass, developing towards the other side of the field.

You can see me in this frame he grabbed a split-second before my last frame above.

[Attached image: 55fcd8987795f.image.jpg]
 

AlanF

Desperately seeking birds
CR Pro
Aug 16, 2012
12,355
22,534
Wow - that shows what a pro can do. Remarkable reflexes and muscle memory.
 

Lee Jay

EOS 7D Mark II
Sep 22, 2011
2,250
175
Get thyself to ye olde camera store post haste! The R5/R6 EVFs are much improved over the older tech... I was a mirrorless holdout for my main body until I tried them, and I'm now sold.
I just tried an R6. Same old problems - blurry while panning, blown brights, laggy. Less crushed blacks though.

I fiddled with things and found that the EVF uses the same shutter speed as the final exposure will. This is a big part of the blurry-while-panning issue. It got better when I set a faster shutter speed (1/640 vs 1/125), but then the view was dark because the ISO was pegged, even in the brightly lit store.
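For what it's worth, a quick sketch of the exposure arithmetic behind that trade-off (the starting ISO is purely illustrative, not from the post):

import math

slow, fast = 1/125, 1/640
stops_lost = math.log2(slow / fast)   # how much less light the faster EVF feed gets
print(f"{stops_lost:.1f} stops")      # ~2.4 stops

starting_iso = 1600                   # illustrative value only
print(f"ISO {starting_iso} -> ~ISO {starting_iso * (slow / fast):.0f}")  # ~ISO 8192 to keep the EVF equally bright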

I'd still find it unusable for most of what I do.

Need to improve another order of magnitude at least.
 