5D Mark IV HDMI 1.3 port ...what?

M_S

Fresh from the Twitter channel:

"@CanonProNetwork Why no 4K over HDMI out? Why HMDI 1.3? Why the huge crop factor? Why that old codec? Why UHS I? #5DMarkIV #AskCanon"

Their reply on twitter:
"@xxx @xxx the 5D IV only has a Full HD hdmi output as HDMI 1.3 chips only where available during the design phase"

Looking at the release dates on Wikipedia (https://de.wikipedia.org/wiki/High_Definition_Multimedia_Interface), HDMI 1.3 came out in 2006 and HDMI 1.4 in 2009. Even 1.4b has already been out since 2011.

I am a bit shocked now.
 
Does the launch of a new standard coincide with the availability of chips supporting that standard, or is there a lag? When chipsets first come out, do they generally have the appropriate size, power consumption and heat dissipation parameters to work in all devices?

For reference, CFast was announced in 2008. CFast 2.0 was announced in 2012, and two years after that only one marketed product used it (the Arri Amira).
 
neuroanatomist said:
Does the launch of a new standard coincide with the availability of chips supporting that standard, or is there a lag? When chipsets first come out, do they generally have the appropriate size, power consumption and heat dissipation parameters to work in all devices?

For reference, CFast was announced in 2008. CFast 2.0 was announced in 2012, and two years after that only one marketed product used it (the Arri Amira).

Normally there is no lag between the release of a spec and a working semiconductor chip. Just to make sure, I got in contact with the hardware developer at our company, and he confirmed exactly that. If there is any lag, it is one of availability, and that is in the region of a few months at most.
To quote Wikipedia again: "HDMI 1.4 was released on May 28, 2009, and the first HDMI 1.4 products were available in the second half of 2009"
So the statement from Canon is a very brave one.
 
And the bit about the 'working semiconductor chip' having sufficiently low power and heat dissipation requirements to work in, say, a weather-sealed camera body, without using too much power or causing overheating...is that also a given?

I'm not excusing Canon's statement, just pointing out that there may be more considerations at hand than 'a working chip'.
 
neuroanatomist said:
And the bit about the 'working semiconductor chip' having sufficiently low power and heat dissipation requirements to work in, say, a weather-sealed camera body, without using too much power or causing overheating...is that also a given?

I'm not excusing Canon's statement, just pointing out that there may be more considerations at hand than 'a working chip'.
Please see the revised post above. :)
 
M_S said:
Please see the revised post above. :)

Doesn't address my point at all. The reference for the products 'available in the second half of 2009' is "Silicon Image introduces First Products Incorporating HDMI 1.4 Features for DTV and Home Theatre Applications."

As you might imagine, the power consumption and heat dissipation requirements are quite different for a big AC-powered box with vent holes and heat sinks, compared to a small, battery-powered box that's weather-sealed.
 
neuroanatomist said:

Doesn't address my point at all. The reference for the products 'available in the second half of 2009' is "Silicon Image introduces First Products Incorporating HDMI 1.4 Features for DTV and Home Theatre Applications."

As you might imagine, the power consumption and heat dissipation requirements are quite different for a big AC-powered box with vent holes and heat sinks, compared to a small, battery-powered box that's weather-sealed.
Ok. Different thing, different use case. True. But we are speaking of a chip (HDMI 1.4) that came out in 2009. That is 7 years ago. Lots of time to get things done. The first recorder able to record 4K using that chip was the Shogun, which came out early/mid 2014. That's 2 years ago. The NX30 already had it in the body by then, and so did the A7. I really can't believe the story of the long design process. If that were true, we would get nowhere, and companies wouldn't have the chance to react to market demand.
 
HDMI 1.4 mandates 3D HD support as well as 100 Mb Ethernet and ARC, things which Canon probably didn't see a need for in the Mk IV. As Neuro said, lower power, less interference etc. is something you perhaps could not do cost-effectively until the past few years, and supporting the higher requirements of 1.4 meant they would have had to wait longer or compromise / add cost somewhere else.

Ultimately, there is no way Canon wanted to allow 4K over HDMI anyway, which 1.3 doesn't support and 1.4 supports only partially. Is their (Twitter) reply perhaps slightly disingenuous? ::) :-X
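To put rough numbers on the 1.3-vs-1.4 4K question: both versions cap the single-link TMDS clock at 340 MHz, so 1.4's 4K support came from newly defined video formats rather than a faster wire. A small sketch of the pixel-clock budget (standard published CEA-861 frame timings, 8-bit RGB assumed; nothing here is Canon-specific):

```python
# Back-of-the-envelope HDMI link check. Both HDMI 1.3 and 1.4 cap the
# single-link TMDS clock at 340 MHz; HDMI 1.4's 4K support came from
# newly *defined* video formats, not a faster physical link.
# Timings below are the standard CEA-861 totals (active + blanking).

MAX_TMDS_MHZ = 340  # TMDS clock ceiling for both HDMI 1.3 and 1.4

def pixel_clock_mhz(h_total: int, v_total: int, fps: int) -> float:
    """Pixel clock (MHz) for a frame of h_total x v_total at fps Hz."""
    return h_total * v_total * fps / 1e6

modes = {
    "1080p60": pixel_clock_mhz(2200, 1125, 60),  # 148.5 MHz
    "4K30":    pixel_clock_mhz(4400, 2250, 30),  # 297.0 MHz
    "4K60":    pixel_clock_mhz(4400, 2250, 60),  # 594.0 MHz
}

for name, clk in modes.items():
    verdict = "fits" if clk <= MAX_TMDS_MHZ else "exceeds"
    print(f"{name}: {clk:.1f} MHz -> {verdict} the {MAX_TMDS_MHZ} MHz TMDS limit")
```

By this arithmetic, 4K24/30 sits inside the clock budget that HDMI 1.3 already had; what 1.4 added was the defined 4K formats and their signaling, and 4K60 had to wait for HDMI 2.0's 600 MHz clock. Which fits the point above: which spec a camera implements is more about which formats the chosen chip advertises than about raw link speed.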
 
M_S said:
If that were true, we would get nowhere, and companies wouldn't have the chance to react to market demand.

I don't think that Canon has a good history of reacting to market demand, a trait not uncommon among conservative companies. They've clearly shown themselves to be good at anticipating and meeting market demand and, given their position as the market leader, in some cases driving market demand. It's clear from prior information that development cycles for high-end dSLRs are long, and certain things have to be locked down early on so the rest of the development activities predicated on them can proceed.
 
M_S said:
Ok. Different thing, different use case. True. But we are speaking of a chip (HDMI 1.4) that came out in 2009. That is 7 years ago. Lots of time to get things done. The first recorder able to record 4K using that chip was the Shogun, which came out early/mid 2014. That's 2 years ago. The NX30 already had it in the body by then, and so did the A7. I really can't believe the story of the long design process. If that were true, we would get nowhere, and companies wouldn't have the chance to react to market demand.

Adding to that: the Automotive Connection System (Type E) was especially designed for the automotive industry "to meet the rigors and environmental issues commonly found in automobiles, such as heat, vibration and noise." (quote from: http://www.hdmi.org/manufacturer/hdmi_1_4/hdmi_1_4_faq.aspx). This design was available in 2010, so heat shouldn't have been an issue, even in a closed body. The raw number-crunching is mainly done by the processor; the interface chip only passes the signals through. For me it all comes down to this: lazy and inconsistent development. Pushing 4K without implementing the interface standard, at whatever development stage, is not thinking the solution through.
Perhaps they saw their most pressing problem in the sensor, which, judging from some pics floating around the net, is a tiny bit better (cleaner image) at higher (12800+) ISOs.
 
M_S said:
Adding to that: the Automotive Connection System (Type E) was especially designed for the automotive industry "to meet the rigors and environmental issues commonly found in automobiles, such as heat, vibration and noise." (quote from: http://www.hdmi.org/manufacturer/hdmi_1_4/hdmi_1_4_faq.aspx). This design was available in 2010, so heat shouldn't have been an issue, even in a closed body. The raw number-crunching is mainly done by the processor; the interface chip only passes the signals through. For me it all comes down to this: lazy and inconsistent development. Pushing 4K without implementing the interface standard, at whatever development stage, is not thinking the solution through.
Perhaps they saw their most pressing problem in the sensor, which, judging from some pics floating around the net, is a tiny bit better (cleaner image) at higher (12800+) ISOs.

The heat the car industry is talking about is the heat the components need to withstand because of the heat the engine produces.

Neuro was talking about the opposite: you want low additional heat generated by the components within the camera. Heat = noise, and that isn't what you want....

4K is there because otherwise they would be seen as slipping well behind their competitors, but Canon doesn't believe in competition between their own ranges, so the Mk IV is unlikely to exceed the 1DX M2, and neither will encroach on the video capabilities of the Cinema range.

That's Canon's way, and hasn't changed much.
 
Or Occam's razor, they didn't use a better chip than they felt they needed (which isn't necessarily the same as what all customers might want).

Consider... Canon introduced Digic 4 in the 50D, back in 2008. They've moved on since: Digic 5 in 2011, Digic 6 in dSLRs starting in 2014, and even Digic 7 in a 2016 PowerShot. But the T6/1300D that launched this year still uses a Digic 4 chip.
 
M_S said:
Adding to that: the Automotive Connection System (Type E) was especially designed for the automotive industry "to meet the rigors and environmental issues commonly found in automobiles, such as heat, vibration and noise." (quote from: http://www.hdmi.org/manufacturer/hdmi_1_4/hdmi_1_4_faq.aspx). This design was available in 2010, so heat shouldn't have been an issue, even in a closed body. The raw number-crunching is mainly done by the processor; the interface chip only passes the signals through. For me it all comes down to this: lazy and inconsistent development. Pushing 4K without implementing the interface standard, at whatever development stage, is not thinking the solution through.
Perhaps they saw their most pressing problem in the sensor, which, judging from some pics floating around the net, is a tiny bit better (cleaner image) at higher (12800+) ISOs.

As far as electronics are concerned, a car is a continuously exploding greenhouse on wheels. Electronics designed to work in that environment probably do not meet the stringent power and heat dissipation requirements to work in a camera. Sure, these automotive chips -could- work in a small sealed 180F (~82C) box, but the sensor and other parts would not appreciate the noisy neighbors.

It's not to say it can't be done, but swapping out any chip/plug/connection/screw in such a sensitive and tightly integrated device is not something you do even a year before release. I would hazard a guess that the electronics and physical package of such a camera have been frozen for 2-3 years, and that they've been tweaking internals, codecs and firmware and working out all the gritty details of how to efficiently mass-produce it on existing assembly lines.
 
You gotta give Neuro some credit. He's fighting a tough battle here for Canon. Sometimes though, it is futile.

But this is one area where Canon cannot make an excuse. In the PC technology world, things are brought to market very quickly. Heat and power concerns? As if those aren't major factors in the ultra-competitive computer marketplace? Devices are battling for battery-life supremacy and smaller form factors. Ultrabooks, tablets, you name it. Canon's DSLR isn't some special case here. Computer manufacturers are counting every degree and designing every device to maximize performance and heat dissipation, to make smaller, lighter devices that can run just a few minutes longer than the competition. Devices are pushed to the very edge.


Now, if it had been only this one spec, we could give the benefit of the doubt. But when you take all the 4K-related features together as a whole, you clearly see a pattern of Canon intentionally crippling this capability. No other way about it.

4K is on the 5D4 simply to be on the spec sheet.

This is a message by Canon that they're not interested in competing with others on 4K. If you want good 4K, move up to the Cinema.

They're safe in making that bet, too, because competitors' 4K is not as great as it's made out to be, and they all suffer from issues such as overheating. If 4K were so great and dandy on the other brands, there would be NO complaints about Canon. Zero. Everyone would buy those other cameras and just ignore Canon.

Instead, there's widespread, industry-wide complaining about Canon regarding video. Why? Because these users are not getting what they need from the other brands, and they know that Canon could deliver if it wanted to. A lack of satisfaction. So they keep barking and barking up that tree.


When the other brands finally get their game together and put out a solid 4K filmmaking machine, Canon then *might* respond with better offerings. Until then, Canon is suggesting you pay to play, and upgrade to Cinema.
 
K said:
You gotta give Neuro some credit. He's fighting a tough battle here for Canon. Sometimes though, it is futile.

But this is one area where Canon cannot make an excuse.

Which is why I stated:

neuroanatomist said:
I'm not excusing Canon's statement, just pointing out that there may be more considerations at hand than 'a working chip'.

As pointed out above, one valid reason might be that they don't see a need for 1.4: 1.3 does everything they need, and it's likely cheaper. Same rationale as a Digic 4 still being used in the xxxxD line in 2016. To me that rationale makes the most sense... but it also makes the initial statement about timing a lie.
 
I think you're all missing the real reason for the 5D4 specs. When faced with the choice of a superior chip or an inferior one, they always choose the inferior one because their design teams only want to cripple the cameras and piss us all off. We all know their evil intent is to keep us from having a perfect video camera in a DSLR body, even though they could easily do it on a moment's notice. All they have to do is grab some of them fancy chips, plug 'em in, and sell the crap out of their cameras. They just don't want to.

I've preordered my 5D4, and now I'm going to preorder some tiny little crutches so it can work properly.
 
M_S said:
Fresh from the Twitter channel:

"@CanonProNetwork Why no 4K over HDMI out? Why HMDI 1.3? Why the huge crop factor? Why that old codec? Why UHS I? #5DMarkIV #AskCanon"

Their reply on twitter:
"@xxx @xxx the 5D IV only has a Full HD hdmi output as HDMI 1.3 chips only where available during the design phase"

Looking at the release dates on Wikipedia (https://de.wikipedia.org/wiki/High_Definition_Multimedia_Interface), HDMI 1.3 came out in 2006 and HDMI 1.4 in 2009. Even 1.4b has already been out since 2011.

I am a bit shocked now.

I'm a bit shocked that you'd think you would get design-level factual information from the nerd who handles their Twitter feed.
::)

Btw, all your reference material doesn't mean it's actually in DIGIC... just sayin'
 
Hypothetical question: would you rather have Dual Pixel AF or an HDMI 1.4 connection?

I ask because Dual Pixel AF may be consuming all of the high-speed digital IO lines of the chip Canon is using that could otherwise have supported 4K output. The reason it still has a CF card may be to utilize older digital IO lines that are not suitable for a high-speed serial IO device such as CFast.

I do not know the details of the 5D IV processor, and neither do you. Could it have had an HDMI 1.4 port? Likely, but what would you be willing to give up for it?

I still think they should have put a more powerful processor in there so that they could support clean 4K output, H.264 and a CFast card. But it is just my opinion that this would make the 5D IV sell even more.

All that said, the 5D IV looks like a great camera that will make many users very happy. I just wish it was smaller and had a flip screen.
 
tcmatthews said:
Hypothetical question: would you rather have Dual Pixel AF or an HDMI 1.4 connection?
Be patient, just wait 4 more years and see how Canon reacts in 2020. ;)
 