Here are some crazy Canon EOS R1 specifications [CR0]

Regarding resolution, that's interesting because if you scale up the 90D sensor from APS-C to full frame, the result is 83MP...
But there's little reason to cram such high resolution into a sports/action camera. Maybe the rumor is conflating two different cameras... A mythical "R1" sports model and a theoretical "R5s" studio rig to compete with medium format... Or maybe Canon is actually trying to build a single camera that does both.
The rear screen specs make no sense at all. There's simply no need for that level of resolution in an approx. 3" screen that will be used under a very wide range of lighting conditions. The EVF specs might be true for competitive reasons, but some Sony A1 reviewers comment that they see little difference compared to less extreme EVFs.
High speed frame rates... At what point does a stills camera become a video camera and your photographs become screen grabs?
 
I don’t believe 8K is the new standard at all. It’s a novelty that people downscale from at best. Thanks to the R5, I think 8K is what you have to have in your tag line to get the camera to move as compared to other cameras. A handful of people might actually NEED 8K for something but I don’t know what it is.

Until they start hanging 8K TVs on the wall, I still see it as a niche. You still can’t get HQ 4K streams from any of the major services unless you pay more for it (most people don’t/won’t), and the 4K on YouTube isn't much better than their 2K, if you can tell the difference at all.

Don't confuse acquisition with delivery. 8K acquisition will render out better 4K. It will let you crop and pan in post. It will let the people doing effects create better effects.

Just like the BM 12K isn't about people watching 12K.
 
The new mirrorless flagship from Nikon is confirmed to be coming this year. It will have 8K video and is rumored to have a 60MP sensor. Sony’s Alpha 1 has a 50MP sensor and was designed to target the 1D series. Both can/will shoot 8K video.

The Sony A1 doesn't shoot 8K. It's doing the equivalent of UHD. Even the new "cinema" camera they released doesn't do 4K.
 

DBounce

I don’t believe 8K is the new standard at all. It’s a novelty that people downscale from at best. Thanks to the R5, I think 8K is what you have to have in your tag line to get the camera to move as compared to other cameras. A handful of people might actually NEED 8K for something but I don’t know what it is.

Until they start hanging 8K TVs on the wall, I still see it as a niche. You still can’t get HQ 4K streams from any of the major services unless you pay more for it (most people don’t/won’t), and the 4K on YouTube isn't much better than their 2K, if you can tell the difference at all.

4K DVDs are as big of a flop as BluRay was, if not bigger. I don’t know a single person that buys 4K DVDs. If I have to have something in 4K I just find a source and download it.

Until ISPs get rid of their data caps I can’t see real, clean, HQ high res becoming the standard over the crap they serve now. My ISP (I’m very choice limited because I live on an island) caps at 1.2TB per month which sounds like a lot but isn’t. NETFLIX garbage 4K can eat 7GB per hour and it’s terrabad.

No one is screaming for 8K streams, are they? I don’t even see people really crying for 4K. MOST people just watch what’s on and don’t think about whether they could count the person’s eyelashes or not.
That might be so, but I doubt Canon would ship a brand new flagship without 8K. And we all know that 8K had better not overheat.
 

DBounce

The Sony A1 doesn't shoot 8K. It's doing the equivalent of UHD. Even the new "cinema" camera they released doesn't do 4K.
The Sony A1 doesn’t shoot 8K? It shoots oversampled 8K. Check your facts:

8K UHD resolution is 7680x4320

A1 8K resolution:
UHD 8K (7680 x 4320) at 23.976p/25p/29.97p [200 to 400 Mb/s]
 
At first, this whole 21MP and 85MP thing makes no sense, but when you start thinking about it, it kind of does.
Let me explain.

On a normal 20MP sensor, you have 20 million diodes. Each pixel is covered with the Bayer pattern, which identifies every single pixel as either green, blue or red.

Since Canon introduced dual pixel technology, we have effectively had 40 million diodes on a "20MP" Bayer sensor. That is 2 diodes hiding behind each green, blue or red filter. The 2 diodes made dual pixel focusing possible by calculating the micro-contrast between each set of 2 diodes for a global "phase difference". Hence the name "dual pixel", because there are effectively 2 pixels behind each Bayer filter.

Now, since those dual pixels (diodes) were arranged in such a way that they were twice as tall as they are wide, it didn't make sense to read them as separate pixels for purposes of resolving the image. Pixels would have been twice as tall as they are wide.
[Image: diagram of the dual pixel sub-pixel structure]
This would have caused visible "stepping" or aliasing problems. Their only purpose was focusing. So a 40-million-diode sensor was still 20MP, since each Bayer site was counted as 1 pixel. In other words, the information from the 2 diodes was combined into 1 output pixel to keep everything square and proper.

But now, with the introduction of quad pixel technology, which further improves focusing, we solve the problem we had before with only 2 diodes behind each Bayer filter. You effectively have 4 diodes (pixels) behind each green, blue or red filter, arranged in a 2-by-2 square, each diode the same size. This means you have 2 options for how to read the information from the 80 million diodes in total. Either you read them as a 20MP sensor - the 4 diodes behind each red, green or blue filter constitute one pixel - or you read each diode (all 80 million of them) as an individual pixel and simply modify your debayering algorithms.

Now, these debayering calculations would be way more complex and more taxing on the processor (I think up to 16 times) if you decided to use all 80 million diodes as pixels instead of using only the 20MP resolution, but it's possible. Hence, shooting 20MP at 30 frames per second seems reasonable, but 20 frames at 80MP seems a little sketchy. I think 10 fps at 80MP would be quite the achievement with the processing power involved. Unless they throw 2 of the current DIGIC X processors into the R1 - who knows how much processing power that actually is... maybe enough for 20 fps at 80MP despite the heavy processing needed.
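
To make the two readout options concrete, here is a minimal NumPy sketch of the idea described above. The 2x2 sub-pixel layout and the binning-by-summation are assumptions for illustration, not a description of Canon's actual readout pipeline:

```python
import numpy as np

# Toy quad-pixel sensor: each Bayer site hides a 2x2 block of photodiodes.
# A sensor with H x W Bayer sites therefore has a (2H) x (2W) diode array.
H, W = 4, 6                              # tiny toy sensor (Bayer sites)
diodes = np.random.rand(2 * H, 2 * W)    # simulated per-photodiode signal

# Option 1: low-resolution readout. Bin the 2x2 block under each color
# filter into one pixel value; the output is H x W ("20MP" mode).
binned = (diodes[0::2, 0::2] + diodes[0::2, 1::2] +
          diodes[1::2, 0::2] + diodes[1::2, 1::2])

# Option 2: high-resolution readout. Keep every photodiode as its own
# pixel; the output is (2H) x (2W) ("80MP" mode), but the color pattern
# is now 2x2 same-color blocks, so the demosaic step has to be adapted.
full_res = diodes

print(binned.shape, full_res.shape)      # (4, 6) (8, 12)
```

Option 2 hands the demosaicer four times as many samples in a non-standard pattern, which is where the extra processing load in the frame-rate estimates above comes from.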
 

Bert63

Don't confuse acquisition with delivery. 8K acquisition will render out better 4K. It will let you crop and pan in post. It will let the people doing effects create better effects.

Just like the BM 12K isn't about people watching 12K.
I'm not confusing anything.

MOST people don't want to spend the cash that recording in 8K requires.

Record in 8K on the R5, then render it to 4K, then go record in 4K HQ and play them all side by side, and tell me that in today's world, with streaming solutions being what they are, it makes any sense at all.

Most people wouldn't know something was rendered from 8K unless you told them and explained what they should look for to tell - ESPECIALLY if they are streaming it via normal means onto a large 4K television - if they even have a 4K television. As of 2018, only around 31 percent of households had a 4K HDTV.

Less than 1 in 3.

On Amazon - which is a huge streaming service - you're lucky to get 720p clean, much less 1080p. 4K is like a striped unicorn unless you want to watch flowers bloom or a waterfall or one of their original 'woke' productions.
 

Bert63

That might be so, but I doubt Canon would ship a brand new flagship without 8K. And we all know that 8K had better not overheat.

We'll see. They've never been driven by anything other than their own vision when it comes to the 1DX line and that may hold true into the future. The big files aren't attractive when it comes to putting images on the wire and that's long been a driver for their flagship retaining a low-res (by comparison) option.

8K wasn't even a blip on the page when I was buying my R5. Aside from frame-stealing it's pointless AFAIC. The HQ 4K option looks as good as anything out there (IMO) and it's better than anything you can stream into your living room by a long shot.

It'll be a long time before 8K becomes mainstream as a desired choice for the masses. Hell, 4K isn't even there yet.
 
7680 × 4320
This is the resolution of the UHDTV2 format defined in SMPTE ST 2036–1,[39][40] as well as the 8K UHDTV format defined in ITU-R BT.2020.[41] It was also chosen by the DVB project as the resolution for their 8K broadcasting standard, UHD-2.[42] It has 33.2 million total pixels, and is double the resolution of 4K UHD in each dimension (four times as many total pixels) or four times the resolution of 1080p in each dimension (sixteen times as many total pixels).

 

SteveC

7680 × 4320
This is the resolution of the UHDTV2 format defined in SMPTE ST 2036–1,[39][40] as well as the 8K UHDTV format defined in ITU-R BT.2020.[41] It was also chosen by the DVB project as the resolution for their 8K broadcasting standard, UHD-2.[42] It has 33.2 million total pixels, and is double the resolution of 4K UHD in each dimension (four times as many total pixels) or four times the resolution of 1080p in each dimension (sixteen times as many total pixels).


There seem to be two different definitions of 4K and 8K, one is multiples of the 1920 width that no one calls 2K, and the other is multiples of 1024 or rather 2048, a "K" in computer speak. Hence the confusion here between 8K meaning 8192, and 8K meaning 7680. Looking around at wikipoo, nothing seems to indicate anyone is using 8192 pixel width for anything, though both versions of 4K seem to be in use in different contexts.

And of course the truly pedantic will state that K = 1000 and anything else is an abuse of the term, which is why computer people are encouraged to talk of kibibytes instead of kilobytes (and mebibytes, gibibytes and tebibytes, all powers of 1024 instead of 1000). This became necessary because hard drive manufacturers started advertising drive capacities in "gigabytes" meaning billions of bytes to try to make their drives look bigger. (Given that a sector is 512 bytes, it would be natural to use the 1024 base, but weasels will be weasels.)
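
As a quick worked example of the decimal-versus-binary gap described above (the drive size is just an arbitrary illustration):

```python
# A drive marketed as "2 TB" (decimal terabytes) expressed in binary units.
marketed_bytes = 2 * 1000**4           # 2 TB as the manufacturer counts it
in_tib = marketed_bytes / 1024**4      # the same byte count in tebibytes
print(f"{in_tib:.2f} TiB")             # ~1.82 TiB, i.e. roughly 9% "smaller"
```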
 
Why did they never do it with dual pixel?
dual pixel is:

RRGG
GGBB

So you end up with 6000x4000 if the sub-pixels are combined into one and it's treated like a Bayer array. If you don't combine them (the higher end bodies let you save a dual pixel raw where they aren't combined), you would effectively end up with a 12000x4000 image and would have to stretch the vertical out to 8000 pixels. With quad pixel AF, there's no need for that, as there are two sub-pixels horizontally and two vertically.
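
A short sketch of the geometry behind those numbers, using the 6000x4000 Bayer grid from the post above purely as an example:

```python
bayer_w, bayer_h = 6000, 4000        # Bayer sites on a "24MP" sensor (example)

# Dual pixel: each site splits into 2 side-by-side photodiodes, so reading
# every diode gives a 12000 x 4000 grid of half-width pixels. Making the
# pixels square again means stretching the height to 8000 (or halving the
# width), which is why the full readout is awkward to use as an image.
dual_w, dual_h = bayer_w * 2, bayer_h

# Quad pixel: each site splits 2x2, so the full readout keeps square pixels.
quad_w, quad_h = bayer_w * 2, bayer_h * 2

print((dual_w, dual_h), (quad_w, quad_h))   # (12000, 4000) (12000, 8000)
```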
 
In the dual pixel configuration, both photodiodes share the same microlens so you cannot really get any more resolution out of reading out each individually. I guess if the quad pixel configuration has a distinct microlens for each photodiode you could get more resolution, but then I'm not sure how that affects the phase information. In any case this rumor seems more like someone's wish list than reality.
They have to be under the same microlens for the phase information, and yes, it does result in more spatial resolution. On the higher end Canon bodies, you can save a dual pixel raw file where the sub-pixels aren't combined and extract them out, and on close inspection each sub-pixel is very clearly capturing distinct spatial information unique to its position. The bear is that there's no standard way to extract and combine the sub-pixels, and with dual pixel you end up with a picture that is way wider than tall and have to vertically stretch it, negating a bunch of the reason for doing it in the first place. With quad pixel AF, that's not so much the case.
 
Not according to Canon's description of Dual Pixel; they do not consider an individual photodiode to be a pixel. Specifically, they say:

[Attachments: excerpts from Canon's Dual Pixel description]

Now I am sure there is going to be another round of photodiodes-vs-pixels threads, but bear in mind the R5 has 47 million pixels and 94 million photodiodes, yet the marketing has never pushed that as anything other than a 45MP sensor.
You do realize that Canon's higher end bodies let you save dual pixel raw files where each sub pixel is saved separately, don't you?
 
Look, I don’t know what authority you purport to be, but according to the CTA, 7680x4320 is 8K resolution.
There's 8K (as consumers see it, i.e. 7680x4320) and DCI 8K, which is 8192x4320. DCI 8K, like DCI 4K (which is 4096x2160), is a studio mastering and digital projection format used in movie theaters. In the home, it's Full HD, UHD, and 8K, at 1920x1080, 3840x2160, and 7680x4320 respectively.
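
For reference, the consumer and DCI resolutions being contrasted in this thread, with a small snippet to print their pixel counts:

```python
# Common delivery/mastering resolutions and their total pixel counts.
formats = {
    "Full HD": (1920, 1080),
    "UHD 4K":  (3840, 2160),
    "DCI 4K":  (4096, 2160),
    "UHD 8K":  (7680, 4320),
    "DCI 8K":  (8192, 4320),
}
for name, (w, h) in formats.items():
    print(f"{name:8s} {w} x {h}  ({w * h / 1e6:4.1f} MP)")
```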
 
What is your question? I didn't see one on the post I responded to.
Why would Canon call a 90 million photodiode sensor a 45 million pixel sensor if that was how they were looking at it?

A 21MP quad sensor would have 84 million photodiodes, but according to Canon themselves, in its current format/definition, it would still only be a 21MP sensor.

Like I have said across threads now, it isn't me that is splitting hairs on the terminology, but we are going to have a whole load more threads on this if that is what they are doing.

Personally I never fully appreciated the distinction Canon have made, but they have, so now we might be looking at an interesting time of backpedaling and re-education on the finer points and definitions of pixels vs photodiodes.
 
Why would Canon call a 90 million photodiode sensor a 45 million pixel sensor if that was how they were looking at it?

A 21MP quad sensor would have 84 million photodiodes, but according to Canon themselves, in its current format/definition, it would still only be a 21MP sensor.

Like I have said across threads now, it isn't me that is splitting hairs on the terminology, but we are going to have a whole load more threads on this if that is what they are doing.

Personally I never fully appreciated the distinction Canon have made, but they have, so now we might be looking at an interesting time of backpedaling and re-education on the finer points and definitions of pixels vs photodiodes.
Probably because to date, all raw processing software (at least that I know of) only sees Bayer arrays, and that would be Canon's default output. Even now, with being able to save dual pixel raw, Canon only uses that info so you can do AF micro adjustments after the fact, not actually generate a file with more resolution from the two sub-pixels.

Going to quad pixel AF still allows very easy Bayer output, and if done, allows very easy spatial resolution bumps by not combining the sub-pixels, but it does significantly bump up the post-processing requirements. I suspect part of the reason Canon went to the CR3 file format over the CR2 format is to make it easier to store non-standard pixel arrays. You can save dual pixel raw files in CR2 files (like the 5DIV does), but it basically stores them as two Bayer-array images in sub-chunks. The CR3 format stores each color discretely in its own chunk, and the raw processor then has to read each color chunk, combine them into a Bayer array, and demosaic it. The CR3 format is a pretty big deviation from how Canon stores its sensor data in the CR2 format.
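
A toy sketch of the "per-color chunks recombined into a Bayer array" step described above; the chunk layout here is purely illustrative and is not a description of the actual CR3 container:

```python
import numpy as np

# Pretend the file stores four separate color planes ("chunks"):
# R, G (on red rows), G (on blue rows), and B, each at half resolution.
h, w = 4, 6                                   # Bayer mosaic size (toy)
r  = np.random.rand(h // 2, w // 2)
g1 = np.random.rand(h // 2, w // 2)
g2 = np.random.rand(h // 2, w // 2)
b  = np.random.rand(h // 2, w // 2)

# The raw processor interleaves the planes back into an RGGB mosaic,
# which can then be handed to a normal demosaic step.
bayer = np.empty((h, w))
bayer[0::2, 0::2] = r
bayer[0::2, 1::2] = g1
bayer[1::2, 0::2] = g2
bayer[1::2, 1::2] = b
```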

I also suspect Canon has very good reason to go quad pixel AF because it allows them to have more than 2 output gains. This is how they were able to get the DR increases and noise improvements in recent dual pixel bodies: each sub-pixel is actually 1 stop different from the other. The way they store it in CR2 files, again, is less than ideal, as they store the first Bayer array as they normally would with both sub-pixels combined, and the second Bayer array with just the output of the second sub-pixel.

With a quad pixel array, they'd have pretty good reason to store each sub-pixel by itself when saving quad-pixel files, as it would mean a lot more flexibility when generating a full color image. That, and they could have a quad gain structure where each sub-pixel had 1 stop more gain than the next, giving a combined 4-stop spread between the sub-pixels from which to generate an image. This would be how they could get to 15.5+ stops (if outputting a ~24MP Bayer array where all the sub-pixels are combined). They could keep a 12- or 14-bit ADC and have 4 gain outputs. If they stored each sub-pixel separately, they wouldn't even have to store 16 bits per pixel; they could still do 12 or 14 bits, then, when generating the full color image after the fact in their DPP software, store the resulting RGB as a 16-bit TIFF file. I wouldn't be surprised in the least if it was just a straight 12-bit ADC (for speed) and they just use the multi-gain to get the DR.
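
A rough sketch of the multi-gain idea: four readouts of the same sub-pixels, each one stop apart in gain, digitized by a 12-bit ADC and merged by preferring the highest-gain sample that hasn't clipped. The gain values, scaling, and merge rule are all assumptions for illustration, not Canon's actual pipeline:

```python
import numpy as np

ADC_MAX = 4095                               # 12-bit ADC full scale
scene = np.random.rand(4, 6) * 6.0           # toy linear scene values

# Four readouts with 1 stop more analog gain each (1x, 2x, 4x, 8x).
readouts = [np.clip(scene * (2.0 ** s) * 512, 0, ADC_MAX) for s in range(4)]

# Merge: for each pixel, take the highest-gain readout that did not clip
# and scale it back to a common reference. Dark areas benefit from the
# high-gain samples; bright areas fall back to the low-gain ones.
merged = np.full_like(scene, np.nan)
for s in reversed(range(4)):                 # start from the highest gain
    candidate = readouts[s]
    use = (candidate < ADC_MAX) & np.isnan(merged)
    merged[use] = candidate[use] / (2.0 ** s)

print(np.nanmax(merged / scene))             # ~512, the common scale factor
```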
 