I don't understand. 4:2:2 and 8-bit are forms of image compression, and I don't think these HDMI ports can produce higher color depth or finer chroma sub-sampling than that.
All HDMI handles 4:4:4, no sweat.
Higher versions of HDMI do Deep Color. I believe starting with HDMI 1.3, 10-bit and even 12-bit are supported. It's certainly possible that the 5D3 has an older HDMI port, maybe 1.1 or such, that doesn't support above 8 bits.
It is true that those are forms of image compression; I was referring to motion compression though, like h.264/MPEG-2-style compression.
When Canon says "uncompressed" they mean no frame or resolution compression, but there is still chroma compression.
Yeah, that is what I meant. No motion/frame compression.
Your HDMI signal from a computer is most likely compressed, since HDTVs use Rec.709, which is 8-bit, and there is no point in an 8-bit DNG.
I don't think Rec.709 has anything to do with bit depth; AFAIK it just sets the tone transfer function and primary locations. Most HDTVs don't quite match the Rec.709 primaries, most undershoot at least some of them a bit, although the assumed goal is that they will be at least somewhat reasonably close.
And setting the tone response curve on the set itself to the Rec.709 TRC looks awful, since movies are processed assuming the destination space will actually be more like gamma 2.2.
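To show what I mean about the two curves, here's a quick sketch (my own illustration, function names are mine) comparing the Rec.709 transfer function with a plain gamma 2.2 power law; they clearly disagree through the midtones, which is why displaying with the literal Rec.709 TRC looks off:

```python
# Compare the Rec.709 OETF with a plain gamma 2.2 encode at a few levels.
# The constants below are the standard Rec.709 values; whether any given
# TV implements exactly these is another matter.

def rec709_oetf(l):
    """Rec.709 opto-electronic transfer function (linear light -> code value)."""
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

def gamma22_encode(l):
    """Simple power-law encode with gamma 2.2."""
    return l ** (1 / 2.2)

for l in (0.01, 0.18, 0.5, 0.9):
    print(f"L={l:4}: rec709={rec709_oetf(l):.3f}  gamma2.2={gamma22_encode(l):.3f}")
```

At 18% gray, Rec.709 encodes to about 0.409 while gamma 2.2 encodes to about 0.459, so a set that decodes with the literal Rec.709 curve will render content mastered for gamma 2.2 visibly wrong.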
Virtually no HDTVs or monitors are more than 8-bit displays, although some have color engines internally running at up to at least 14 bits, so you have room to calibrate without banding, and the fancy wide-gamut ones with 3D LUTs and such can even shift color spaces and primary locations around while retaining clean saturation curves. There are a few that internally dither to 10 bits if sent a 10-bit signal (the NEC PA series, I believe; some claim that some of them may be 100% true 10-bit, but I'm not sure, I don't have a 10-bit video card to test), and a very few that I think can truly show 10 bits (HP DreamColor maybe? not sure). Some super fancy scientific and perhaps broadcast monitors may be true 10-bit or even 12-bit, but those are very esoteric.
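For what it's worth, the "internally dither to 10 bits" trick can be sketched like this (purely my own toy model, not anything dumped from a monitor's firmware): a 10-bit code that falls between two 8-bit steps gets shown as a mix of the two neighboring 8-bit codes, so the spatial average still matches the 10-bit value:

```python
# Toy model of dithering a 10-bit code down to an 8-bit panel.
import random

def dither_10_to_8(code10, rng):
    """Quantize a 10-bit code (0..1023) to 8 bits (0..255) with random dither."""
    base, frac = divmod(code10, 4)          # 4 ten-bit steps per 8-bit step
    # Bump up with probability frac/4 so the expected value equals code10/4.
    bump = 1 if rng.random() < frac / 4 and base < 255 else 0
    return base + bump

rng = random.Random(0)
samples = [dither_10_to_8(514, rng) for _ in range(100000)]
print(sum(samples) / len(samples) * 4)   # averages out to roughly 514
```

Real panels use ordered or temporal dither patterns rather than plain random noise, but the averaging idea is the same.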
Most video cards don't put out more than 8 bits, so nothing is getting compressed even in a bit-depth sense on HDMI out, and all these monitors support full chroma resolution. HDMI does 4:4:4, so nothing gets compressed there either (unless you specifically set the computer to subsample). In regular modes, though, many HDTVs take the 4:4:4 input and chop it to 4:2:2 or 4:2:0. Lots of them have PC modes and such where you can get around that and they retain 4:4:4 pathways.
When I say CR2 I mean RAW, which is not de-bayered in camera. sRAW and mRAW are not really raw: they have been de-bayered in camera to 4:4:4, then taken down to a smaller resolution by averaging the formed pixels, and then losslessly compressed into a CR2 container, but information has already been lost compared to a pure CR2 raw file. You may be right that the same process might have been used for these 1931 x 1088 14-bit RAW images Magic Lantern found, but it is unlikely.
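A toy version of that sRAW-style pipeline (my own sketch, not Canon's actual code): de-bayer to full-color pixels first, then shrink by averaging 2x2 blocks of the formed pixels. The averaging step is where information is lost, even though the container is losslessly compressed afterwards:

```python
# Average 2x2 blocks of an already-debayered image (rows of (r, g, b) tuples).
def downsample_2x2(img):
    out = []
    for y in range(0, len(img) - 1, 2):
        row = []
        for x in range(0, len(img[y]) - 1, 2):
            px = [img[y][x], img[y][x + 1], img[y + 1][x], img[y + 1][x + 1]]
            row.append(tuple(sum(c[i] for c in px) // 4 for i in range(3)))
        out.append(row)
    return out

img = [[(4, 0, 0), (0, 4, 0)], [(0, 0, 4), (4, 4, 4)]]
print(downsample_2x2(img))   # [[(2, 2, 2)]]
```

Note there is no way to get the four original pixels back from that single averaged one, which is the sense in which sRAW/mRAW are no longer raw.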
They have to have had something done to them, because we have a 22MP Bayer sensor and yet these are 1931x1088, and we know their FOV is way wider than such a huge crop factor would give, so there is no way they can be untouched. At the least they would have to have been de-bayered in some sense and then re-bayered, I'd think, which would seem strange. I haven't looked inside the files yet to see what is there.
Also, if that 1931x1088 is in Bayer format, then it's not really true 4:4:4 you can get out of it anyway, since Bayer format is not full chroma resolution by definition.
The images require a program that applies de-bayering to view them, and most people agree that video resolutions are achieved by line skipping (Canon has even admitted to it) or, in this case, by combining 'sensel' data in 3x3 blocks before any processing (de-bayering) is done.
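Here's a sketch of what "combining sensel data in 3x3 blocks before de-bayering" could look like, under the assumption that only same-colored sensels get averaged so the output stays a Bayer mosaic (this layout is my guess, not a dump of Canon's pipeline). Same-color sensels sit 2 apart on a Bayer grid, so a 3x3 bin of one color spans a 5x5 sensel neighborhood:

```python
# Bin a Bayer-mosaic raw (2D list of ints, height/width divisible by 6)
# 3x to 1 in each direction, averaging only same-color sensels, so the
# result is still a valid Bayer mosaic at one third the resolution.
def bin_3x3_bayer(raw):
    h, w = len(raw), len(raw[0])
    out = []
    for oy in range(h // 6):              # each output Bayer cell covers 6x6 input
        for sub_y in (0, 1):
            row = []
            for ox in range(w // 6):
                for sub_x in (0, 1):
                    y0, x0 = oy * 6 + sub_y, ox * 6 + sub_x
                    vals = [raw[y0 + 2 * i][x0 + 2 * j]
                            for i in range(3) for j in range(3)]
                    row.append(sum(vals) // 9)
            out.append(row)
    return out

# Flat test pattern: each Bayer color channel holds one constant value.
raw = [[(y % 2) * 10 + (x % 2) for x in range(6)] for y in range(6)]
print(bin_3x3_bayer(raw))   # [[0, 1], [10, 11]] — still an R,G / G,B layout
```

The point is that no de-bayer/re-bayer round trip is needed: averaging within each color plane keeps the mosaic intact while averaging 9 samples per output sensel, which is also where an SNR boost over plain line skipping would come from.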
Well, the non-5D3/1DX cameras certainly do a ton of line skipping. The 5D3 does vastly less or none, and thus it gets that near 2-stop SNR boost over the 5D2 from that alone; it is not tossing away all those extra samples.
Combining the blocks and matching them to adjacent ones is processing: it takes the initial Bayer setup and does all sorts of stuff to it.
Also, Panasonic has admitted that they "bin their pixels in 2x2 blocks before any processing is done"; then (like the sRAW files) they process them into YCbCr and down to the appropriate resolution before sending it to the h.264 encoder.
I think Canon said the C100 series uses 2x2 blocks too.
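For reference, the RGB-to-YCbCr step mentioned above looks like this with the Rec.709 matrix (this is the standard math; whether these cameras use exactly these constants internally is an assumption on my part):

```python
# Standard Rec.709 RGB -> 8-bit video-range YCbCr conversion.
KR, KB = 0.2126, 0.0722           # Rec.709 luma coefficients
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r, g, b):
    """Convert normalized (0..1) RGB to 8-bit video-range YCbCr."""
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return (round(16 + 219 * y),          # Y:  16..235
            round(128 + 224 * cb),        # Cb: 16..240
            round(128 + 224 * cr))        # Cr: 16..240

print(rgb_to_ycbcr(1.0, 1.0, 1.0))   # (235, 128, 128) — white
print(rgb_to_ycbcr(0.0, 0.0, 0.0))   # (16, 128, 128)  — black
```

Once the data is in this form the chroma planes can be subsampled separately from luma, which is what makes the 4:2:2/4:2:0 step possible at all.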
You might have to clip off the left and right side junk and send an offset address so it starts reading below the top junk. You wouldn't even have to do that if the HDMI block accepts an offset to get to each new line's address. I have no idea how it is set up; it might have something like that, so no clipping is needed at all. Considering how many modes it can put out, that might very well be the case.
Hmm, my point is that HDMI (especially the ones in these Canon DSLRs) doesn't support resolutions over 1920 x 1080, so the 2K image needs to be down-resed or cropped and positioned (like the Canon official firmware already does).
Yeah, you can't send the full DNG straight over HDMI, but you might not need to clip off the top/bottom/sides and then rewrite to the buffer and then send. The top and bottom don't matter, since you could just give it the address of where the image part starts instead of the address of the top left, and it will end at 1080 and won't read the bottom junk. The left and right sides might need to be clipped. But it is quite possible that there is an offset value that says to jump so-and-so many bytes to get to the next line, and it could be adjusted to automatically skip over the left and right junk. If you have programmed graphics hardware at the register level, such things may be familiar. I don't know how the system is set up in this case.
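A toy model of that offset/stride idea (pure illustration; I don't know the actual register layout of the camera's video path): the reader never touches the junk borders, because it starts at an offset into the flat buffer and jumps a full buffer-width stride between lines:

```python
# Read an out_w x out_h window from a flat buffer of width buf_w,
# using only a start offset and a per-line stride — no copying or clipping.
def read_window(buf, buf_w, start_x, start_y, out_w, out_h):
    lines = []
    addr = start_y * buf_w + start_x       # offset to the first active sample
    for _ in range(out_h):
        lines.append(buf[addr:addr + out_w])
        addr += buf_w                      # stride skips left + right junk
    return lines

# 6-wide buffer with a 1-sample junk border around a 4x2 active area:
buf = ([9] * 6 +
       [9, 1, 2, 3, 4, 9] +
       [9, 5, 6, 7, 8, 9] +
       [9] * 6)
print(read_window(buf, 6, 1, 1, 4, 2))   # [[1, 2, 3, 4], [5, 6, 7, 8]]
```

If the display engine accepts a start address plus a line pitch larger than the active width (hypothetical here, but common in graphics hardware), the junk never needs to be rewritten at all.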
Ah, you're right, a lot of the sharpness will be held in the luminance (duh), but not even the C100 and C300 support 10-bit HDMI, so I doubt they included a 10-bit HDMI port here.
Very probably true (although it's not impossible they use HDMI 1.2 parts, since so much stuff these days does; maybe it's mostly what is even made these days and they get better deals on big batches? No clue; quite possibly not, though).