I doubt this is even [CR1] but I guess it starts somewhere - 7D3 specs?

The original specs certainly seem to be a wish list.
Here's mine:
24 MP
No AA filter (if you "pixel peep", it is hard to see any pixel-level resolution in my 7DII images. My guess is that the AA filter "smudges" across two pixels, so the output looks more like a 10 MP sensor than a 20 MP one.)
Tilty-flippy screen
All AF points f/8 compatible

and I'd be happy, since I often find I'm using a 1.4x with the 100-400 for small birds (f/5.6 x 1.4 ≈ f/8, so f/8-compatible AF points matter).
 
haggie said:
My point with regard to the 7D Mark III is that a Dual Pixel sensor is not needed, because its primary audience will not need it.
Dual Pixel sensors are fantastic for video and, to some extent, for Live View shooting.
The primary purpose of the "7D" is photography of swift subjects, e.g. sports, aircraft in flight and birds in flight. This is never done in Live View; this is where the optical viewfinder is at its best.

I think you're confusing '7D users don't need it' with 'I personally don't need it'.

I think a lot more 7D users use the camera for general purpose photography (including Live View) and video than you think. I know of several professional cinematographers who use the 7D series for their video work.

Plus, I think we can all assume it's going to have a tilty-flippy screen. And with that, you pretty much guarantee that DPAF is essential.
 
unfocused said:
jolyonralph said:
...I think we all can assume it's going to have a tilty-flippy screen...

I'm curious what that assumption is based on. I'm neutral, so long as they don't mess with my buttons. But, I'm not sure why someone would assume that it will have a tilt or flip screen.

Because. ;)
 
Mikehit said:
haggie said:
Dual Pixel technology involves 2 separate photon wells. For AF purposes they are used as separate entities (by the camera's firmware - for obvious reasons in a way not made public by Canon). To get the image, these 2 separate photon wells are 'combined' to form one pixel. Again, this is done by the camera's proprietary firmware.
When discussing the image of a Dual Pixel sensor, it is useful to realize that the limit where diffraction becomes visible in an image from a digital camera is determined by the size of the photon well and the aperture in use. The smaller the photon well, the sooner diffraction will be visible in the image. And also, the smaller the aperture, the sooner diffraction will be visible in the image.
And therefore, because a Dual Pixel sensor has smaller photon wells as a direct result of the Dual Pixel architecture, softer images are unavoidable.
Other contributing factors may be the required computation to 'construct' 1 pixel from the data read from 2 separate photon wells. But that is just guessing because, like I just said, what the firmware does is a secret.

So from my own observation: softness in Dual Pixel sensors is a thing. And at least one technical reason for it is available.
It is for sure not a matter of "thinking", and it is also not a matter of comparing unsharp images from a Dual Pixel camera with sharp images from another camera (as a result of not performing AFMA or otherwise).

Diffraction is a factor of the lens - the sensor has nothing to do with it.


I admit that I have taken some large steps in my initial text in an effort to keep it short. In my text I differentiate between diffraction and the resulting effect that diffraction has on the image as captured by the camera's sensor.

Diffraction is a change in direction of waves (any wave!) as they pass through an opening or around a barrier.

In photography, diffraction is indeed caused by the properties of the lens, and it shows itself as a pattern in the projected image. For a given lens, the aperture setting is the only variable, and therefore the aperture setting determines the amount of diffraction that is 'created' in the image that the lens projects.

How the diffraction that is caused by the lens is recorded (captured) by the camera's sensor is a completely different matter. A sensor in a camera consists of photosensitive elements that capture the light; these may be referred to as "photosites" or "photon wells". I want to stay away from a definitions discussion about "pixels", "dots", "subpixels", etc. because that may just blur the understanding, but these are not the same as the pixels in an image you can process or view.

To be able to discuss the phenomenon of diffraction qualitatively and quantitatively, a flat wavefront entering an opening is usually assumed. For a given lens, the size of the resulting diffraction pattern depends on the aperture used. Now imagine that the size of the photosensitive element is tiny in comparison to the diffraction pattern. Then the sensor will not be able to capture details that would be possible if there were less diffraction – or if the sensor's photosensitive element were larger. This results in blurring, which means softer images from your camera('s sensor).

So you see, how the diffraction that is caused by the lens is recorded (captured) by the camera's sensor DOES depend on the size of the photosensitive element in the sensor.
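To put rough numbers on this (a minimal sketch; the 550 nm wavelength and the ~4.1 µm pitch for a 20 MP APS-C sensor are my own assumed values, and the formula is the standard far-field approximation, not anything specific to Canon):

```python
# Rough, illustrative comparison of Airy disk diameter vs. photosite size.
# Uses the standard approximation d = 2.44 * wavelength * f-number; real
# sensors (microlenses, Bayer CFA, AA filter) complicate this considerably.

WAVELENGTH_NM = 550  # green light, roughly the eye's peak sensitivity (assumed)

def airy_disk_diameter_um(f_number: float, wavelength_nm: float = WAVELENGTH_NM) -> float:
    """Diameter of the first Airy minimum, in micrometres."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

for f_number in (2.8, 5.6, 8, 11):
    d = airy_disk_diameter_um(f_number)
    # ~4.1 um pitch is typical for a 20 MP APS-C sensor (assumed value)
    print(f"f/{f_number}: Airy disk ~{d:.1f} um vs ~4.1 um photosite")
```

By f/8 the disk is already a couple of photosites wide on such a sensor, which is the regime the paragraph above is describing.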

This is why I wrote “When discussing the image of a Dual Pixel sensor, it is useful to realize that the limit where diffraction becomes visible in an image from a digital camera is determined by the size of the photon well and the aperture in use”. So I deliberately wrote about diffraction visible in an image from a camera’s sensor; and the above illustrates that the size of the photosensitive element is a factor there.
I never wrote that the physical phenomenon of diffraction is caused by the lens and the pixel size, as you suggest. You just mix it all up.

Mikehit said:
haggie said:
Dual Pixel technology involves 2 separate photon wells. For AF purposes they are used as separate entities (by the camera's firmware - for obvious reasons in a way not made public by Canon). To get the image, these 2 separate photon wells are 'combined' to form one pixel. Again, this is done by the camera's proprietary firmware.
When discussing the image of a Dual Pixel sensor, it is useful to realize that the limit where diffraction becomes visible in an image from a digital camera is determined by the size of the photon well and the aperture in use. The smaller the photon well, the sooner diffraction will be visible in the image. And also, the smaller the aperture, the sooner diffraction will be visible in the image.
And therefore, because a Dual Pixel sensor has smaller photon wells as a direct result of the Dual Pixel architecture, softer images are unavoidable.
Other contributing factors may be the required computation to 'construct' 1 pixel from the data read from 2 separate photon wells. But that is just guessing because, like I just said, what the firmware does is a secret.

So from my own observation: softness in Dual Pixel sensors is a thing. And at least one technical reason for it is available.
It is for sure not a matter of "thinking", and it is also not a matter of comparing unsharp images from a Dual Pixel camera with sharp images from another camera (as a result of not performing AFMA or otherwise).

Also, you are technically incorrect. The two wells in each pixel both lie under the same microlens and are summed to create one single pixel output, so as far as image construction is concerned there is no additional resolution, and diffraction is no more or less visible.

I already admitted that I had taken some large steps in my initial text.
But you are the one who is technically incorrect. You write “The two wells in each pixel both lie under the same microlens”.
However, whether or not the photosensitive elements share a micro lens has nothing to do with them capturing the effect of the diffraction caused by the lens. It is all about their size.

And if by writing “pixel output” you mean the pixels in the raw or JPG image from the camera, you may be even further off.
 
haggie said:
So you see, how the diffraction that is caused by the lens is recorded (captured) by the camera's sensor DOES depend on the size of the photosensitive element in the sensor.

I agree. But the two photodiodes lie under a common microlens and output their image signal in a single output, and so are acting as one unit - pixel size and (more importantly) pixel pitch in a 30MP sensor is the same with or without DPAF, so in that respect the way they record diffraction is identical. So as far as I can see, any softening of the image due to being DPAF will be due to other properties.


haggie said:
I never wrote that the physical phenomenon of diffraction is caused by the lens and the pixel size, as you suggest. You just mix it all up.

I know you did not say it directly, but I wanted to state the obvious to make it clear no assumptions were being made. It still surprises me how many people talk about high-MP sensors having more diffraction. They don't. But the misunderstanding arises from the fact that higher pixel-density sensors make it easier to see when viewing at 1:1.


haggie said:
However, whether or not the photosensitive elements share a micro lens has nothing to do with them capturing the effect of the diffraction caused by the lens. It is all about their size.

If they lie under the same microlens and output a single image signal, how do they 'record' separate images to be able to show diffraction?
Taking your logic, it seems to be a 60MP image, not a 30MP image.
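For what it's worth, the usual description of DPAF readout can be sketched like this (a conceptual toy in Python; the made-up signal values and the simple sum/correlate logic are my own illustrative assumptions, not Canon's actual firmware):

```python
import numpy as np

# Toy model of Dual Pixel readout: each photosite has a left and a right
# half-well sharing one microlens. Made-up numbers stand in for raw signals;
# the "right" half-image is offset by one sample to mimic defocus.
left  = np.array([0, 0, 10, 40, 90, 40, 10, 0, 0], dtype=float)
right = np.array([0, 0, 0, 10, 40, 90, 40, 10, 0], dtype=float)

# Image path: the two halves are summed into ONE value per photosite,
# so the image is sampled on the same grid as a non-DPAF sensor.
image_signal = left + right

# AF path: estimate the relative shift between the half-images by testing
# integer offsets and picking the best match (real PDAF is subtler).
def best_shift(a, b, max_shift=3):
    scores = {s: float(np.sum(a[max_shift:-max_shift] * np.roll(b, s)[max_shift:-max_shift]))
              for s in range(-max_shift, max_shift + 1)}
    return max(scores, key=scores.get)

print(image_signal)
print("estimated defocus shift:", best_shift(left, right))  # -> -1 sample
```

In this picture the two wells contribute separate data only to the AF path; the image path sees one summed value per microlens.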
 
jolyonralph said:
tron said:
I truly hope that they leave it at 20mp and give these pixels 5DIV quality...

If you're using the same pixels as on the 5DIV you'd only get a 12 megapixel sensor at APS-C size.
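(A quick sanity check on that number, assuming Canon's nominal 1.6x crop factor:)

```latex
% APS-C sensor area is (1/1.6)^2 of full frame, so at the 5D IV's pixel pitch:
30\,\mathrm{MP} \times \left(\tfrac{1}{1.6}\right)^{2} \approx 11.7\,\mathrm{MP}
```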


My guess is that Canon are going to put a newer, better revision of the 24MP sensor in the 7D III, which will, with fewer focus points etc., eventually work its way down to the rest of the range (90D, M5 Mark II, Rebels, etc. in approximately that order).
I said 5DIV quality. I do not recall saying 5DIV pixel size! (I meant to use the new 5DIV sensor technology that improved low-light IQ.)
 
Mikehit said:
haggie said:
So you see, how the diffraction that is caused by the lens is recorded (captured) by the camera's sensor DOES depend on the size of the photosensitive element in the sensor.

I agree. But the two photodiodes lie under a common microlens and output their image signal in a single output, and so are acting as one unit - pixel size and (more importantly) pixel pitch in a 30MP sensor is the same with or without DPAF, so in that respect the way they record diffraction is identical. So as far as I can see, any softening of the image due to being DPAF will be due to other properties.

haggie said:
I never wrote that the physical phenomenon of diffraction is caused by the lens and the pixel size, as you suggest. You just mix it all up.

I know you did not say it directly, but I wanted to state the obvious to make it clear no assumptions were being made. It still surprises me how many people talk about high-MP sensors having more diffraction. They don't. But the misunderstanding arises from the fact that higher pixel-density sensors make it easier to see when viewing at 1:1.

haggie said:
However, whether or not the photosensitive elements share a micro lens has nothing to do with them capturing the effect of the diffraction caused by the lens. It is all about their size.

If they lie under the same microlens and output a single image signal, how do they 'record' separate images to be able to show diffraction?
Taking your logic, it seems to be a 60MP image, not a 30MP image.

I wrote that I wanted to avoid the definitions discussion about “pixels”, “dots”, “subpixels”, etc. But apparently that is not possible. The concepts behind it, however, must be clear. Although these are not scientific definitions, the following can make this clear.
- A “Pixel” in a photograph is the smallest entity that can hold color (among other things).
- To make a single Pixel visible, multiple composing entities are used. E.g. on a monitor screen a pixel is formed using a Red dot AND a Green dot AND a Blue dot. In a printed photograph a Yellow dot AND a Magenta dot AND a Cyan dot are used (Black is also present). To summarize: to ‘make’ a single Pixel, multiple composing items are required (I just call them ‘dots’ here).
- When capturing an image, something similar takes place. A photon well by itself is not susceptible to color; it is just sensitive to the amount of light that hits it. A photon well can be made sensitive to only one composing color. This is achieved by placing a colored ‘filter’ above it, thus making each photon well sensitive to only a specific basic color (often, but not necessarily, Red, Green or Blue). This means that the output of a photon well is not a "pixel" as usually perceived by photographers. A photon well captures one of the multiple composing elements of what is to become (!) a pixel.
- The next step is where the camera’s firmware makes a computation (and by the way, this is NOT a straightforward average, due to the human eye’s sensitivity among other things). This constructs an entity you might call a “pixel”. Therefore, ONLY after this step can you talk about a “Pixel” in the usual sense. (See the sketch after this list.)
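A toy illustration of that last step (my own minimal sketch in Python; the made-up well values and the Rec. 601-style luminance weights are a standard textbook example of "not a straightforward average", not what Canon's firmware actually does):

```python
# Toy reconstruction of one RGB "Pixel" from a 2x2 Bayer block of photon
# wells (filter layout R G / G B). Real demosaicing interpolates across
# neighbouring blocks and is far more sophisticated than this.
bayer_block = {"R": 0.80, "G1": 0.55, "G2": 0.57, "B": 0.30}  # made-up well values

red   = bayer_block["R"]
green = (bayer_block["G1"] + bayer_block["G2"]) / 2  # two green wells per block
blue  = bayer_block["B"]

# Luminance is NOT a plain average of R, G and B: the eye is most sensitive
# to green, so green is weighted heavily (Rec. 601 weights, as an example).
luminance = 0.299 * red + 0.587 * green + 0.114 * blue

print(f"pixel RGB = ({red}, {green:.2f}, {blue}), luminance ~ {luminance:.3f}")
```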

Therefore, when you write “If they lie under the same microlens and output a single image signal" when talking about the photosensitive elements of a sensor, you completely misrepresent how it works.
And from this misrepresentation you then draw conclusions that are even more wrong... and attribute them to me!
When you write “Taking your logic, it seems to be a 60MP image, not a 30MP image”, you have missed both the distinction between "Pixel" and "photon well" (again, the notion of a “Pixel” does not exist at that stage in the process) and the concept of what Dual Pixel means after AF has taken place, when the image is recorded by the sensor and formed by the camera’s firmware.

Furthermore, although it is hard to find any information about a specific camera, it seems very unlikely that the micro lenses are one simple geometric shape positioned above the two photon wells that form what Canon in its PR calls “Dual Pixel” sensors. That also makes your assumption a bit risky.


PM 1: In the specialist area of sensor technology, the “photon well” is often called a “pixel”. But that should not be confused with what is meant by a “Pixel” in photography and graphics.

PM 2: On the internet there is also quite a lot of misunderstanding going around on the subject of diffraction. This often results in the misconception that diffraction has no real negative effect on images from a higher-resolution sensor.
A textbook on physics and on optics is the only way to avoid wrong interpretations and faulty simplifications, but that usually requires some background in mathematics. I googled to find this web page, which describes more eloquently what I just wrote (or at least meant to write :) ) and also offers some figures to illustrate it:
https://photographylife.com/what-is-diffraction-in-photography
 
Since Canon's FF 5D-4 30MP sensor obviously has less DR / more shadow noise than Nikon's D850 with its additional 15MP
http://www.imaging-resource.com/PRODS/nikon-d850/nikon-d850A.HTM#shooting1
and unless they've drastically improved their sensor technology in the last few months, it's hard to believe they can produce a 30MP APS-C sensor that's even close to "acceptable" DR standards.
 
Leigh said:
Since Canon's FF 5D-4 30MP sensor obviously has less DR / more shadow noise than Nikon's D850 with its additional 15MP
http://www.imaging-resource.com/PRODS/nikon-d850/nikon-d850A.HTM#shooting1
and unless they've drastically improved their sensor technology in the last few months, it's hard to believe they can produce a 30MP APS-C sensor that's even close to "acceptable" DR standards.
If your definition of "acceptable" is strictly "beating everything else to be #1 in the world", then no, of course they can't do that.
If your definition of "acceptable" is "enough that it covers all realistic uses", then yes, they totally can. 35mm cameras from 10 years ago and APS-C cameras from 6-7 years ago could do that.

That said, I do still think 30MP seems like inane wishful thinking and 24 is more likely, with an outside chance of 26 or 28. More resolution helps with no AA filter and can compensate for noise/DR once the image is scaled to regular printing/viewing sizes, but more pixels also mean it's harder to get the speed up, and the 7Ds are primarily about speed. One more fps matters more for this type of camera than two million more pixels.
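The noise-compensation point can be demonstrated numerically (a minimal sketch; the image size and noise level are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat grey scene plus per-pixel noise, at "high" resolution.
high_res = 0.5 + rng.normal(0.0, 0.05, size=(1000, 1000))

# Downscale 2x2 -> 1 by averaging blocks: uncorrelated noise drops by a
# factor of sqrt(4) = 2, which is why per-pixel noise matters less once
# images are compared at the same print/viewing size.
down = high_res.reshape(500, 2, 500, 2).mean(axis=(1, 3))

print(f"noise at 100%:     {high_res.std():.4f}")  # ~0.0500
print(f"noise downsampled: {down.std():.4f}")      # ~0.0250
```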
 
This causes me some intense agony.
I have had a faithful 70D for four years. Recently it has started to break down. For a little perspective: it was my first DSLR and has been my constant companion. It has taken me from shooting in full auto with crap composition to getting my first two shots published in Athletics Weekly last month. I shoot a lot of sports and action. It has approx. 110k shots on the shutter.

I was hoping it would survive until early next year, when the next-gen 90D/7D Mark III would be out and I could potentially move up a rung to the 7D line or stick with the 90D. Unfortunately, this week I have had constant Error 80 messages that I can't fix, the mount is starting to work loose (I hang a 70-200 off it most of the time), and the mode dial dropped off this morning.

I now have to decide what camera to get. Do I get a 70D/80D second-hand to keep going until the 7D3, or do I just go for the 7D2, which I would also love but which will be out of date next year?
 
markjsmccall said:
This causes me some intense agony.
I have had a faithful 70D for four years. Recently it has started to break down. For a little perspective: it was my first DSLR and has been my constant companion. It has taken me from shooting in full auto with crap composition to getting my first two shots published in Athletics Weekly last month. I shoot a lot of sports and action. It has approx. 110k shots on the shutter.

I was hoping it would survive until early next year, when the next-gen 90D/7D Mark III would be out and I could potentially move up a rung to the 7D line or stick with the 90D. Unfortunately, this week I have had constant Error 80 messages that I can't fix, the mount is starting to work loose (I hang a 70-200 off it most of the time), and the mode dial dropped off this morning.

I now have to decide what camera to get. Do I get a 70D/80D second-hand to keep going until the 7D3, or do I just go for the 7D2, which I would also love but which will be out of date next year?

Your suggestion of a used body sounds like a good idea until the 7D3 is out. But take heed: only Canon knows when the 7D3 will be out, so you might be waiting a while. As in, it could be late 2018 or even into 2019...
 
haggie said:
PM 2: On the internet there is also quite a lot of misunderstanding going around on the subject of diffraction. This often results in the misconception that diffraction has no real negative effect on images from a higher-resolution sensor.
A textbook on physics and on optics is the only way to avoid wrong interpretations and faulty simplifications, but that usually requires some background in mathematics. I googled to find this web page, which describes more eloquently what I just wrote (or at least meant to write :) ) and also offers some figures to illustrate it:
https://photographylife.com/what-is-diffraction-in-photography

Do you agree with the following, or does your knowledge of pixel technology cut across it? A point source of light is spread out by diffraction at the circular aperture of a lens to give a disk of light on the sensor, called the Airy disk. Two points of light will not be resolved if they are separated on the sensor by a distance less than the diameter of the Airy disk – the two disks of light will overlap to give a blur. The diameter of the Airy disk is directly proportional to the f-number of the lens and the wavelength of light, and is independent of the sensor. The diameter of the disk is exactly the same whether it falls on a 50MP sensor or a 20MP one. If a point of light gives a diffraction disk that covers 10 px of the 20MP sensor, it will cover approximately 16 px of the 50MP sensor. But just because it covers more pixels doesn't mean it blurs the higher-megapixel image more; it blurs it by exactly the same amount. What the diffraction disk does is waste the extra resolving power of the higher-megapixel sensor – it doesn't make it softer than the lower-megapixel sensor.
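(As a quick check of the 10 px vs 16 px figure, measuring the disk diameter in pixels:)

```latex
% Pixel pitch scales as 1/sqrt(MP), so the same Airy disk spans
% sqrt(50/20) ~ 1.58 times as many pixels across on the 50 MP sensor:
10\,\mathrm{px} \times \sqrt{\tfrac{50}{20}} \approx 10 \times 1.58 \approx 16\,\mathrm{px}
```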

I think that this is what Mike is arguing.
 
haggie said:
I wrote that I wanted to avoid the definitions discussion about “pixels”, “dots”, “subpixels”, etc. But apparently that is not possible.

Because the definition of a pixel is fundamental to the resolution, and therefore to the very basis of your premise that DPAF sensors are softer because they are more likely to show the effects of diffraction.

haggie said:
Therefore, when you write “If they lie under the same microlens and output a single image signal" when talking about the photosensitive elements of a sensor, you completely misrepresent how it works.

In what way am I misrepresenting how they work?


haggie said:
Furthermore, although it is hard to find any information about a specific camera, it seems very unlikely that the micro lenses are one simple geometric shape positioned above the two photon wells that form what Canon in its PR calls “Dual Pixel” sensors. That also makes your assumption a bit risky.
Given the lack of information, are you making an assumption and using that assumption to say I am incorrect?

haggie said:
PM 1: In the specialist area of sensor technology, the “photon well” is often called a “pixel”. But that should not be confused with what is meant by a “Pixel” in photography and graphics.
We seem to be talking at cross purposes.
As far as I can tell, the information to create the image comes from a photon well. In the DPAF sensor, two photon wells act as a combined unit to create one pixel output. Please tell me where I am incorrect.
So how do you define a 'pixel' in terms of a photographic sensor? You still have not explained that.
Are you saying a sensel and a pixel are not the same thing?
haggie said:
PM 2: On the internet there is also quite a lot of misunderstanding going around on the subject of diffraction. This often results in the misconception that diffraction has no real negative effect on images from a higher-resolution sensor.

A textbook on physics and on optics is the only way to avoid wrong interpretations and faulty simplifications, but that usually requires some background in mathematics. I googled to find this web page, which describes more eloquently what I just wrote (or at least meant to write :) ) and also offers some figures to illustrate it:
https://photographylife.com/what-is-diffraction-in-photography

I know what diffraction is, and I have never said it does not have an effect on the image. You seem to be making very general comments without actually saying anything pertinent to this discussion. That article about diffraction has nothing to do with why you think having two photon wells to a pixel (however you define it), as opposed to one photon well to a pixel, will enhance the effect of diffraction and therefore make the image softer – because that is what I believe you said originally. And you have not yet confirmed or denied that that is what you meant.
 
dolina said:
Anyone want to take a wager that this thread will still be active by the time the 7D3 is announced by September 2019 with drastically differing specs? :)

Go out and shoot!
Who knows, but discussions about diffraction will continue forever... MikeH is right, of course.
 
dolina said:
djack41 said:
Canon should concentrate on improving ISO and dynamic range performance. Packing 30mp onto a crop sensor seems counter to improving both.
They should collaborate with Google and Qualcomm to make cameras that can best flagship smartphones.

An iPhone 8 can do 4K@60fps.

Maybe they could attach a cellphone camera on top of the viewfinder bump just for 4K purposes ::)
 