More Specifications & Images of EOS 5D Mark IV

rrcphoto said:
justsomedude said:
unfocused said:
justsomedude said:
I'm just going off what was first reported by Canon Rumors about the new sensor tech. So, I'm not barking up anything... just repeating what was initially reported.

“The way Dual Pixel works, 30MP DP sensor means a 60MP dual pixel raw file with a new RRGGGGBB pattern (vs RGGB), so potentially better dynamic range, and maybe an improved debayering algorithm.”

That wasn't "reported"; it was a quote from a tweet by some random guy named "Thomas" on the Internet. Don't believe everything you read on the Internet, and don't assume that anything on a site with "Rumors" in its name is 100% accurate.

The actual quote from the original Canon Rumors blog post / spec list is as follows...

ever possible to no post-processing of the adjustment dual pixel RAW file (bad translation)
“The way Dual Pixel works, 30MP DP sensor means a 60MP dual pixel raw file with a new RRGGGGBB pattern (vs RGGB), so potentially better dynamic range, and maybe an improved debayering algorithm.”

Many, including those on this forum, assumed it was some type of technology aimed at improving DR, and it sparked some interesting conversation.

That theory has since been proven incorrect as more information on the Dual Pixel RAW has come out.

I'm not really sure why that progression of information/discussion is getting your panties in a twist.

of course you skip the (thanks Thomas) which linked the twitter comment...
::)

and most people dismissed that right away because it would have produced 3:1 images, and because of the CFA, what he was suggesting was essentially impossible.

Did you know that the word 'gullible' is not on dictionary.com? ;)
 
Upvote 0
PureClassA said:
tr573 said:
PureClassA said:
BEFORE the shutter. BEFORE the exposure is made. (Remember, shutters are electronic now; it's NOT the curtain.) Whether shooting stills or video in Live View mode, focus is achieved without the 61pt AF system. The mirror is up, the curtain is down, and the sensor is fully visible. The sensor has its own second focusing system that is employed in lieu of the 61 AF points we see when using the OVF. Dual Pixel AF enhanced the reliability of this sensor-based AF system by splitting pixels.

No, it does not. The sensor in this case, just like in all others, turns the light into electrons, and then the CPU decides what to do with that data. Sometimes the CPU uses that data to assemble an image, sometimes it uses it to determine phase difference for focusing, and sometimes it uses it to meter the scene.

PureClassA said:
Back to the original point, once the mirror and curtain are up and the sensor is exposed, will the live view focusing system of the sensor have to engage in order to use this new feature? My bet is yes. So it will probably be Live View function ONLY if you want DP RAW stills

You should not have to, because the image sensor is collecting the exact same data regardless of whether you are in live view or not, it's just what the CPU does with the data afterwards.

So then what exactly is telling my lens to focus in live view when I press the AF button? I'm selecting a point in live view. The sensor is feeding live data to a processor, but the sensor itself has now become a million different AF points. With Dual Pixel Auto Focus it has become even more accurate, and it also allows for active focusing during video. The DIGIC chip may be making the decisions to move the lens and achieve focus, but the pixels themselves are now being used as AF points. It's the same basic process as using 61pt AF; that data is fed to DIGIC as well. That's what I'm getting at. And if the sub-pixels are going to be read out separately, I can't see how you can do that without the DPAF pixels themselves being used as the AF points. I think we are largely in agreement and just saying it in different ways.

The pixels are always physically separated into two, so they always generate their own data independently of each other, whether you are using live view focus or not. If the CPU is making an image from the data, it has to be combined. If it is using it to determine phase difference for focusing, it does not. So it's all about what the software is programmed to do, not whether you were using live view.
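The two data paths described above can be sketched in a few lines of Python. This is purely an illustrative sketch, not Canon's firmware; all function names here are hypothetical.

```python
def to_image_pixel(left: float, right: float) -> float:
    """Imaging path: combine the two sub-pixel values into one pixel value."""
    return left + right


def phase_offset(left_row: list[float], right_row: list[float]) -> int:
    """AF path: find the shift (in pixels) that best aligns the two
    luminosity profiles; the sign tells the lens which way to move."""
    n = len(left_row)
    best_shift, best_err = 0, float("inf")
    for shift in range(-n // 2, n // 2 + 1):
        pairs = [(left_row[i], right_row[i + shift])
                 for i in range(n) if 0 <= i + shift < n]
        err = sum((l - r) ** 2 for l, r in pairs) / len(pairs)
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift

# In perfect focus the two profiles align (offset 0); out of focus they
# are shifted copies of each other, and the shift drives the lens.
```

Same sensor output in both cases; only the downstream calculation differs, which is the point being made above.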
 
Upvote 0
justsomedude said:
3kramd5 said:
justsomedude said:
rrcphoto said:
justsomedude said:
LetTheRightLensIn said:
"Dual Pixel RAW"

had hope it would be more to do with DR but, as I sort of guess, by the name they gave it, nope

I hope this camera performs great, but I'm suspicious the DR won't match Exmor and the video quality won't match Sony (much less that of any Nikon D820 or possible A7R III next year) so for those finicky about that stuff, we'll see....

I'm admittedly a bit of a sensor/shadow nit, and with the descriptions of "Dual Pixel RAW" coming out, I am now also concerned about low-light/shadow-recovery performance compared with Exmor.

*sigh*

I'm hoping for the best, and waiting for some good image samples before I pre-order.

C'mon Canon... you really gotta get this one right!

why? apparently the 1DX Mark II is perfectly fine in that regard.

This is not a 1DX2. And we still don't know what sensor the 5D4 is getting.

But what about DPAF worries you about shadow recovery? Are they related in any way?

I'm not worried about DPAF as it relates to shadow recovery. That was RRCPHOTO misunderstanding the previous comments by LetTheRightLensIn and myself. We (and I'm sure others) had hoped that the DPAF was a dual-ISO type technology, just hardware based, as that is what some of the initial leaks interpreted.

However, we now know that DPAF has nothing to do with dynamic range at all, so all of our hopes for some dramatic DR improvement now rest solely on the sensor itself. Whether or not the 5D4 can match the performance of Exmor is all we're getting at.

So would it be more correct to strike the "now" from your initial post, or replace it with "still?"

In other words, you were concerned with shadow recovery, and the random tweet and associated theories gave you temporary false hope?

Either way, I won't be pre-ordering. I'll probably buy one, and I don't expect Canon to have pipeline issues.
 
Upvote 0
cpreston said:
So, does this 30MP count for the sensor include the DPAF pixels. In other words, could this be more like a 20MP sensor with 10MP added through the DPAF sites?

It is not legacy OSPDAF. There are no DPAF-exclusive sites. Each and every pixel has two photodiodes: 30.4MP, 60.8 million photodiodes.
 
Upvote 0
cpreston said:
So, does this 30MP count for the sensor include the DPAF pixels. In other words, could this be more like a 20MP sensor with 10MP added through the DPAF sites?

No. Think of a 30MP DPAF sensor as a regular 30MP imaging sensor, with the difference that a majority of the pixels (in earlier DPAF sensors, the center ~80%) can also record phase information in addition to a luminosity value. This is internally implemented by actually having two independent photodiodes per pixel, but that's not really relevant; you can treat the pixels as black boxes. You just need to know that in addition to luminosity, they can sense phase difference that can be used to run the AF algorithm (and/or be stored in the output file for later use, in the case of the 5D4).
 
Upvote 0
3kramd5 said:
cpreston said:
So, does this 30MP count for the sensor include the DPAF pixels. In other words, could this be more like a 20MP sensor with 10MP added through the DPAF sites?

It is not legacy OSPDAF. There are no DPAF-exclusive sites. Each and every pixel has two photodiodes: 30.4MP, 60.8 million photodiodes.

Ahh, I see that the RAW file size in the specs indicates how the dual pixel RAW is being used.
 
Upvote 0
Sharlin said:
cpreston said:
So, does this 30MP count for the sensor include the DPAF pixels. In other words, could this be more like a 20MP sensor with 10MP added through the DPAF sites?

No. Think of a 30MP DPAF sensor as a regular 30MP imaging sensor, with the difference that a majority of the pixels (in earlier DPAF sensors, the center ~80%) can also record phase information in addition to a luminosity value. This is internally implemented by actually having two independent photodiodes per pixel, but that's not really relevant; you can treat the pixels as black boxes. You just need to know that in addition to luminosity, they can sense phase difference that can be used to run the AF algorithm (and/or be stored in the output file for later use, in the case of the 5D4).
I think a single dual pixel does not record phase information at all, just luminosity information independently in its two sub-pixels. The two photosites in a single pixel, however, gather light from different sides of the lens. So a single dual pixel cannot be used for autofocusing at all, because the only information available is two luminosity values. Autofocusing can only be achieved by looking at a horizontal row of multiple pixels and comparing (and, for focusing, aligning) the intensity values in a row of left and right sub-pixels, just like in an ordinary PDAF sensor. Meaning that, from my understanding, dual pixel autofocusing with horizontally separated sub-pixels will not work if there are only horizontal structures in the scene; it requires horizontal changes in intensity. So DPAF is essentially as if there were a whole lot of line-type PDAF sensors.
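That last point can be illustrated with a toy check (a hypothetical helper, not real camera code): a luminosity profile with no variation along the sub-pixel split direction gives the comparison nothing to align.

```python
def usable_for_af(row: list[float], eps: float = 1e-6) -> bool:
    """A left/right intensity profile can only drive phase-detect AF
    if it actually varies along the comparison direction."""
    return max(row) - min(row) > eps

# A purely horizontal structure yields a uniform row under a horizontal split:
assert usable_for_af([1.0] * 8) is False

# A vertical edge produces left-to-right variation the AF can align on:
assert usable_for_af([0.0] * 4 + [1.0] * 4) is True
```

This is the same limitation line-type PDAF sensors have: a horizontal line sensor needs vertical detail (i.e. horizontal intensity changes) to lock onto.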
 
Upvote 0
If the DPRAW feature is for micro-adjusting focus in post processing, then that could be a very big deal (depending on how much latitude there is). What a brilliant application for the Dual Pixel technology. I like it better than dual ISO for my photography. :P

Here's how I understand it:

  • There are two photo diodes per pixel.
  • Each photo diode serves one function: produce electrons/data that represent light.

That's it. That's all they do.

The question is, what can be done with that data?

Image Capture:

  • Combine the data of each diode pair into a value that represents one pixel.
  • Do whatever processing is done behind the Digic curtain.
  • Record image data.

Live View Auto-Focus:

  • Measure phase difference between diode pairs.
  • Adjust focus to bring the diodes under the focus square into phase alignment.*
  • Direction of focus adjustment is known from the phase difference, so no hunting is necessary.

* As focus is adjusted, diode pairs outside the plane of focus will shift more and more out of phase (the bokeh).

The simplified process is this:

1. Focus the lens.
2. Capture the data.
3. Process the data into an image.

The method of focus (Live View with DPAF versus viewfinder with the AF sensor) is relevant only to how the lens is adjusted to achieve focus. It does not change how data is captured by the sensor. What this DPRAW feature seems to do is store the separate values of each diode pair in the RAW file rather than combining them and storing only the sum (hence the nearly doubled file size).

Here's an analogy with arithmetic that illustrates how information is lost:

We know that 6+2 = 8. Easy. But if we save only the 8, we lose what the inputs were. Was it 4+4, 5+3, 6+2, 7+1 or 8+0?

With current cameras, the RAW data used to produce each pixel is like the 8 in the analogy. Each diode pair's output was combined to make a pixel.

The DPRAW data used to produce each pixel is like the 6+2. It can still be combined to produce a single pixel, but preserving the inputs could allow software to do something with that data long after the image is captured. From this rumor, it sounds like that could mean micro-adjusting focus in post-processing.
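The analogy maps directly onto a toy data model (hypothetical names and values, not an actual RAW format):

```python
# Conventional RAW: only the combined value survives per pixel.
conventional_pixel = 8            # was it 6+2? 4+4? we can no longer tell

# DPRAW: both sub-pixel values are preserved per pixel.
dpraw_pixel = (6, 2)

# The normal image value is still recoverable by summing...
assert sum(dpraw_pixel) == conventional_pixel

# ...but only the preserved pair lets later software reweight the two
# viewpoints, e.g. for a hypothetical focus micro-adjustment in post:
left, right = dpraw_pixel
adjusted = 0.75 * left + 0.25 * right   # biased toward one half-aperture
```

Storing two values per pixel instead of one is also exactly why the file size roughly doubles.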

COOL!

Of course, I'm just some guy on the internet and could be totally wrong. :P
 
Upvote 0
Here's how Canon explains DPAF, by the way:

"To perform phase detection on the image plane left and right photodiodes are read independently and the resulting parallax images are used to calculate the phase-difference."

More info here: http://www.canon.co.uk/for_home/product_finder/cameras/digital_slr/dual_pixel_cmos_af/

[Attached image: 70d_features_01f.jpg]
 
Upvote 0
Sounds like that would also mean that DPRAW would only work in Live View mode.

AFMA in PP (if I'm reading the description right...) would be nice indeed, but even more so if it worked in regular mode as well!
 
Upvote 0
yeahright said:
I think a single dual pixel does not record phase information at all, just luminosity information independently in its two sub-pixels. The two photosites in a single pixel, however, gather light from different sides of the lens. So a single dual pixel cannot be used for autofocusing at all, because the only information available is two luminosity values. Autofocusing can only be achieved by looking at a horizontal row of multiple pixels and comparing (and, for focusing, aligning) the intensity values in a row of left and right sub-pixels, just like in an ordinary PDAF sensor. Meaning that, from my understanding, dual pixel autofocusing with horizontally separated sub-pixels will not work if there are only horizontal structures in the scene; it requires horizontal changes in intensity. So DPAF is essentially as if there were a whole lot of line-type PDAF sensors.

Yeah, I concur. I was a bit imprecise. From a single DPAF pixel you just get two luminosity values. To get enough information to drive focus you do need a longer baseline than just a single DP pair, just like PDAF sensors are made of linear strips of photodiodes.
 
Upvote 0
Act444 said:
Sounds like that would also mean that DPRAW would only work in Live View mode.

AFMA in PP (if I'm reading the description right...) would be nice indeed, but even more so if it worked in regular mode as well!

No. This has been discussed for the last couple of pages. There's no reason to think that DPRAW would work in Live View only.
 
Upvote 0
I too hope Canon works together with Adobe to implement the dual pixel raw features in Lightroom and ACR. The convenience of using Lightroom over DPP is so great that it would feel like a huge nuisance to have to switch software just to make use of DP raw files.

And being a 7D2 owner, I can't help but dream about them implementing the feature in all previous DPAF cameras. At least the 1DX2 and the 7D2? Pretty please?
 
Upvote 0
tr573 said:
PureClassA said:
tr573 said:
PureClassA said:
BEFORE the shutter. BEFORE the exposure is made. (Remember, shutters are electronic now; it's NOT the curtain.) Whether shooting stills or video in Live View mode, focus is achieved without the 61pt AF system. The mirror is up, the curtain is down, and the sensor is fully visible. The sensor has its own second focusing system that is employed in lieu of the 61 AF points we see when using the OVF. Dual Pixel AF enhanced the reliability of this sensor-based AF system by splitting pixels.

No, it does not. The sensor in this case, just like in all others, turns the light into electrons, and then the CPU decides what to do with that data. Sometimes the CPU uses that data to assemble an image, sometimes it uses it to determine phase difference for focusing, and sometimes it uses it to meter the scene.

PureClassA said:
Back to the original point, once the mirror and curtain are up and the sensor is exposed, will the live view focusing system of the sensor have to engage in order to use this new feature? My bet is yes. So it will probably be Live View function ONLY if you want DP RAW stills

You should not have to, because the image sensor is collecting the exact same data regardless of whether you are in live view or not, it's just what the CPU does with the data afterwards.

So then what exactly is telling my lens to focus in live view when I press the AF button? I'm selecting a point in live view. The sensor is feeding live data to a processor, but the sensor itself has now become a million different AF points. With Dual Pixel Auto Focus it has become even more accurate, and it also allows for active focusing during video. The DIGIC chip may be making the decisions to move the lens and achieve focus, but the pixels themselves are now being used as AF points. It's the same basic process as using 61pt AF; that data is fed to DIGIC as well. That's what I'm getting at. And if the sub-pixels are going to be read out separately, I can't see how you can do that without the DPAF pixels themselves being used as the AF points. I think we are largely in agreement and just saying it in different ways.

The pixels are always physically separated into two, so they always generate their own data independently of each other, whether you are using live view focus or not. If the CPU is making an image from the data, it has to be combined. If it is using it to determine phase difference for focusing, it does not. So it's all about what the software is programmed to do, not whether you were using live view.

From what I've read of your conversation, you're both in agreement.

My view is that DPRAW will work using the camera's normal off-sensor phase-detect AF module to acquire focus, but will record values from the dual pixel sensor for adjustment in post. The difference between it and using dual pixel for AF is that, as tr573 says, for AF the data from the pixels is used to calculate a phase difference and drive the lens before a picture is taken. This compares with how I believe DPRAW will work, where the normal AF module will handle focus, except that when the picture is taken, the dual pixel values are stored for later.
 
Upvote 0
Sharlin said:
Act444 said:
Sounds like that would also mean that DPRAW would only work in Live View mode.

AFMA in PP (if I'm reading the description right...) would be nice indeed, but even more so if it worked in regular mode as well!

No. This has been discussed for the last couple of pages. There's no reason to think that DPRAW would work in Live View only.

I'm certainly hoping this is the case...anyway, we'll find out soon enough.
 
Upvote 0
naylor83 said:
And being a 7D2 owner, I can't help but dream about them implementing the feature in all previous DPAF cameras. At least the 1DX2 and the 7D2? Pretty please?

Well, there was the notion of a firmware upgrade for the 7D2 coming in the next few weeks. Would make sense to see this feature on all the newer DP devices?

At the very least I could see the 80D and 1DXii getting it.
 
Upvote 0
LoneRider said:
naylor83 said:
And being a 7D2 owner, I can't help but dream about them implementing the feature in all previous DPAF cameras. At least the 1DX2 and the 7D2? Pretty please?

Well, there was the notion of a firmware upgrade for the 7D2 coming in the next few weeks. Would make sense to see this feature on all the newer DP devices?

At the very least I could see the 80D and 1DXii getting it.

Depends on whether the sensor supports it. It may not.
 
Upvote 0
rrcphoto said:
LoneRider said:
naylor83 said:
And being a 7D2 owner, I can't help but dream about them implementing the feature in all previous DPAF cameras. At least the 1DX2 and the 7D2? Pretty please?

Well, there was the notion of a firmware upgrade for the 7D2 coming in the next few weeks. Would make sense to see this feature on all the newer DP devices?

At the very least I could see the 80D and 1DXii getting it.

Depends on whether the sensor supports it. It may not.

I don't see how the sensor could not. The dual pixels' ADC values have to enter the processor, right? Once there, any processing, including a new RAW file format, would be fair game, I would think. The only problem could then be throughput to storage, thus filling up the buffer faster.

If anything is limiting the feature, it would be the DIGIC processor and Canon's reluctance to push the feature to older cameras, no?

Just my guess.
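The throughput worry above can be put in back-of-envelope terms. All numbers below are made-up placeholders, not real 5D-series specs; the only point is that near-doubled frames drain the buffer much faster at a fixed card write speed.

```python
# Hypothetical figures for illustration only.
normal_raw_mb  = 36.0                # single-frame conventional RAW
dpraw_mb       = 2 * normal_raw_mb   # sub-pixels stored separately -> ~2x
write_mb_per_s = 100.0               # sustained card write speed
fps            = 7.0                 # burst rate
buffer_mb      = 1000.0              # camera buffer size

def shots_until_buffer_full(frame_mb: float) -> float:
    """Each shot adds frame_mb; the card drains write/fps between shots."""
    net_per_shot = frame_mb - write_mb_per_s / fps
    return buffer_mb / net_per_shot if net_per_shot > 0 else float("inf")

# With these placeholder numbers the DPRAW burst depth drops to well
# under half of the conventional-RAW depth.
```

So even if the DIGIC could format the data, the buffer and write path would set the practical limit on burst shooting.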
 
Upvote 0