« on: May 02, 2013, 08:15:20 AM »
The HDMI port on it can do 1920x1080. What is compressed about HDMI? You can hook a computer to a TV through HDMI and the computer signal going over HDMI is not compressed at all.
The HDMI probably can't handle 14 bits, so the bit depth would have to drop, to 10 bits at most I think and quite possibly 8 bits, but the spatial resolution could retain the full sharpness of the DNG stream.
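A quick sketch of what that bit-depth drop amounts to (my assumption: simple truncation of the least significant bits; the camera could also round or tone-map, which this doesn't show):

```python
# Hedged sketch: squeezing 14-bit raw samples down to 10-bit or
# 8-bit for an HDMI link by dropping the least significant bits.
# The cost is quantization precision, felt mostly in the shadows.

def requantize(sample14, out_bits):
    """Truncate a 14-bit sample to out_bits by dropping LSBs."""
    return sample14 >> (14 - out_bits)

print(requantize(16383, 10))  # 14-bit max -> 1023 (10-bit max)
print(requantize(16383, 8))   # 14-bit max -> 255 (8-bit max)
```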
(that said, 1.2.1 actually does appear to be a bit sharper than the old firmware, whether over HDMI or in the internal compressed recording, which is good)
I don't understand. 4:2:2 and 8bit are forms of image compression, and I don't think these HDMI ports can produce higher color depth or less chroma sub-sampling than that. When Canon says "uncompressed" they mean no frame or spatial compression, but there is still chroma compression. Your HDMI signal from a computer is most likely compressed too, since HDTVs use Rec.709, which is 8bit, and there is no point in an 8bit DNG.
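To make the "uncompressed but still 4:2:2" point concrete, here is a toy sketch (values and pixels are illustrative; the luma weights are the Rec.709 coefficients, and I'm only showing horizontal chroma decimation, not how a real encoder filters before subsampling):

```python
# Hedged sketch of what "uncompressed 4:2:2 8-bit" still throws away:
# full-resolution luma (Y) is kept for every pixel, but the chroma
# channels (Cb, Cr) are stored for only every second horizontal sample.

def to_ycbcr(r, g, b):
    """RGB -> YCbCr using Rec.709 luma weights (analog, unscaled)."""
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return y, cb, cr

row = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]
ycc = [to_ycbcr(*px) for px in row]

luma   = [y for y, _, _ in ycc]                 # one Y per pixel
chroma = [(cb, cr) for _, cb, cr in ycc][::2]   # one (Cb, Cr) per pair

print(len(luma), len(chroma))  # 4 luma samples, 2 chroma pairs
```

That halved chroma is why the perceived sharpness mostly survives: the eye is far more sensitive to luma detail than to color detail.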
1. DNG and CR2 are the same: both use the same lossless RAW compression algorithm, and neither is de-bayered. De-bayering is performed on your computer (usually automatically) by a viewer or editor that supports it.
But the Bayer sensor is 22MP and these are 2MP, so unless they are just reading a non-bayered mini-block, which they aren't, some sort of de-bayering must already be done. Look at sRAW and mRAW: they are not in the original complete Bayer state, and they are stored in a RAW file too.
When I say CR2 I mean RAW, which is not de-bayered in camera. sRAW and mRAW are not really raw: they have been de-bayered and sub-sampled at 4:4:4 in camera, then reduced to a smaller resolution by averaging the formed 'pixels', and then losslessly compressed into a CR2, but information has already been lost compared to a pure CR2 raw file. You may be right that the same process might have been used for these 1931 x 1088 14bit RAW images Magic Lantern found, but it is unlikely. The images require a program that applies de-bayering to view them, and most people agree that video resolutions are achieved by line skipping (Canon has even admitted to it), or in this case by combining 'sensel' data in 3x3 blocks before any processing (de-bayering) is done. Panasonic has also admitted that they "bin their pixels in 2x2 blocks before any processing is done" and then (like the sRAW files) process them into YCbCr and down to the appropriate resolution before sending the result to the h.264 encoder.
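The line-skipping idea above can be sketched in a few lines. Assuming an RGGB layout (an assumption; the actual readout pattern Canon uses isn't documented), the key point is that a Bayer CFA repeats every 2 sensels, so reading every 3rd row and column still yields a valid, smaller Bayer mosaic that can be stored raw and de-bayered later:

```python
# Hypothetical illustration: because an RGGB mosaic has period 2,
# sampling it with an odd stride (here 3, i.e. "line skipping")
# produces another valid RGGB mosaic, not a scrambled color pattern.

def bayer_color(row, col):
    """Color of the sensel at (row, col) in an RGGB mosaic."""
    return [["R", "G"], ["G", "B"]][row % 2][col % 2]

full = [[bayer_color(r, c) for c in range(12)] for r in range(12)]
skipped = [row[::3] for row in full[::3]]   # every 3rd row and column

for row in skipped:
    print(" ".join(row))
```

Each printed row still alternates R/G or G/B in RGGB order, which is why the skipped readout can go straight into a raw container, unlike the de-bayered, averaged sRAW pipeline.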
You might have to clip off the left- and right-side junk and send an offset address so it starts reading below the top junk. You wouldn't even have to do that if the HDMI hardware accepts an offset to get to each new line's address. I have no idea how it is set up; it might have something like that, so no clipping is needed at all. Considering how many modes it can put out, that might very well be the case.
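The offset-addressing idea can be sketched like this (all names are illustrative, and this is just address arithmetic in Python; I have no idea what the actual scan-out hardware supports):

```python
# Hypothetical sketch: scanning a cropped window out of a larger
# frame buffer using a start offset plus a per-line pitch, the way
# a display controller with programmable addressing could skip the
# "junk" borders without copying or clipping any data.

def crop_view(buf, buf_width, top, left, out_w, out_h):
    """Yield rows of an out_w x out_h window starting at (top, left)."""
    start = top * buf_width + left        # offset to first wanted pixel
    for row in range(out_h):
        base = start + row * buf_width    # pitch = full buffer width
        yield buf[base:base + out_w]

buf = list(range(8 * 8))                  # flat 8x8 "frame buffer"
window = list(crop_view(buf, 8, 2, 3, 4, 3))
print(window[0])   # first cropped row: [19, 20, 21, 22]
```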
hmm, my point is that HDMI (especially the ports in these Canon DSLRs) doesn't support resolutions over 1920 x 1080, so the 2K image needs to be down-resed, or cropped and repositioned (like the official Canon firmware already does)
cutting chroma resolution doesn't hit perceived resolution nearly that hard, the luminance resolution is still full
but yeah, obviously going over HDMI it would lose a bunch of DR, unless the port had 10bit support
Ah, you're right, a lot of the sharpness will be held in the luminance (duh), but not even the C100 and C300 support 10bit over HDMI, so I doubt they included a 10bit HDMI port here.