This thread is hilarious, watching everyone make mountains out of anthills (not even big enough to be a molehill).
Consider this. In South Australia, we just switched off our analogue TV signals for good. No more. Digital only: a new TV or a set-top box. So I went with my mum to a shop to buy a new TV for the kitchen, the one she listens to while she cooks. The old set was so old it didn't even have a composite video input, aerial only, so her options were a set-top box plus a VCR (or something else to modulate the video back to RF), or a new TV. And a new TV was cheaper.
Anyway, we get to the shop and start looking at the cheapest sets in a decent size. We see a nice Teac, 32" at half price for only $300. So I read the specs: "Full HD 1080i", it claims. I ask the salesman how it can be both "Full HD" and "1080i" at the same time. He explains that's just how it's marketed: "Full HD" means 1080 lines, p or i.
Further down the spec list I read "1344 x 768 Pixel Screen". Again, I ask the salesman how it can be "Full HD 1080, i or p" and have only 1344x768 pixels. He did look a bit sheepish for a minute, but came back with "well, the digital receiver can tune in to 1080i signals, but downscales them to 768 lines for the screen. If you wanted to, you could use the HDMI output to feed another screen for true 1080i display".
You know what? We bought it anyway. It was cheaper than anything else, beat her old TV by miles, and she wouldn't notice the difference anyway.
So who cares if Canon's $15k camera can do 4K video but their $500 one can't, or even their $3k one? Can you even play it back? If you could, do you have the editing power to turn it into something watchable? And then, can you distribute it on anything other than huge USB sticks or portable HDDs? I'm also not sure what's meant by "Canon DSLRs cannot even shoot true 1080p"; is it because they use 4:2:0 chroma subsampling instead of 4:2:2 or 4:4:4? People can hardly tell the difference between 768, 1080i, and 1080p. If you ask them, they'll say 1080p is better than 1080i; the ads have conditioned them to know that. Ask them to explain why, or what it means, or even to pick between the two side by side, and they won't be able to. I couldn't pick the 1344x768 screen from a 'real' 1080p screen next to it.
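For anyone puzzled by the chroma-subsampling jargon above, here's a back-of-envelope sketch of what 4:4:4 vs 4:2:2 vs 4:2:0 costs per uncompressed 1080p frame at 8 bits per sample. The numbers are illustrative arithmetic only, not any particular camera's actual spec:

```python
# Bytes per uncompressed 1920x1080 frame, 8 bits/sample, under the
# three common chroma subsampling schemes. Luma (Y) is always full
# resolution; only the two chroma planes (Cb, Cr) are reduced.
W, H = 1920, 1080
luma = W * H                  # full-resolution Y plane: 2,073,600 samples
frame_444 = luma * 3          # 4:4:4 - chroma at full resolution
frame_422 = luma * 2          # 4:2:2 - chroma halved horizontally
frame_420 = luma * 3 // 2     # 4:2:0 - chroma halved both ways

print(frame_444, frame_422, frame_420)
```

So 4:2:0 carries half the raw data of 4:4:4, which is why consumer codecs favour it; whether viewers can see the difference is exactly the argument in this thread.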
Here's a tip: Joe Public can't tell the difference either. Joe Public doesn't care. Joe Public just wants some pretty pictures to flash on a shiny box to distract him while he shovels nachos into his face. And the company that can deliver that to him easiest is the company that wins. Canon is that company, and Canon is winning, 10 years in a row it has been winning. If you're already winning a race, why stop and change your shoes?
This post isn't any more intelligent or knowledgeable than the others.
Your notion that the public cannot tell the difference between 720p and 1080p, or between interlaced and progressive, is just flat-out wrong. People can tell the difference. The average TV show is 720p, with a few channels broadcast in 1080i. The difference between 1080i and 720p is quite visible. Flip between the two versions of the same sports channel (usually sent on different subchannels), and the improvement with 1080i will be clear. Progressive scan is even better, and that is usually only realized with Blu-ray these days (although some in lucky areas might be able to get 1080p TV, not sure).
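A rough way to see why 1080i can look sharper than 720p even though it's interlaced: compare the raw pixel throughput. Assuming typical US broadcast rates (720p at 60 progressive frames/s, 1080i at 60 fields/s, i.e. 30 full 1920x1080 frames/s), a quick calculation gives:

```python
# Pixel throughput per second for the two common broadcast formats.
# Rates assumed here are the usual US (ATSC) ones; other regions differ.
pix_720p60 = 1280 * 720 * 60     # 60 full progressive frames per second
pix_1080i60 = 1920 * 1080 * 30   # 60 fields/s = 30 full interlaced frames/s

print(pix_720p60, pix_1080i60)   # ~55.3M vs ~62.2M pixels per second
```

1080i pushes more pixels per second and far more per frame, which is where the visible sharpness on mostly-static scenes comes from; 720p's advantage is motion, since every frame is complete.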
The quality of picture that you get out of a Blu-ray is unparalleled. That is also the primary reason why millions and millions of people every year spend big bucks on top-end Blu-ray players and high-end Full HD (1920x1080, progressive-scan capable) TVs, to the tune of thousands of dollars. People aren't just chasing a big TV...they are chasing crystal-clear, razor-sharp PICTURE. People know this, they talk about it on forums dedicated to it, and they constantly spend money upgrading TVs and other equipment year after year to maximize that quality. It isn't every one of the 140 million homes in the US doing this every year, but tens of millions of people do.
It's a JOKE to think people don't care about getting the kind of quality they expect out of the expensive gear they pay for. A 4K-capable video camera, paired with some 4K-capable video editing software, goes a long way towards making better videos. The "average" person who just wants to shoot home videos will pick up a camcorder. The guy who wants to make awesome, professional-quality sports videos of his buddies doing awesome tricks on their snowboards would LOVE to have 4K video at an affordable price!
Last, I've already said this in my last post, but I'll say it again. The point of having 4K video is not so you can BROADCAST 4K TV!! The point is the same as the reason you want a high-resolution 18-36mp camera even when you downscale your photos to .5mp web size: image quality. Downscaling normalizes noise, sharpens detail, eliminates small artifacts, hides cinematography "tricks" or chop...it enhances quality. It also gives you additional editing latitude, and the ability to use more advanced tools like Adobe Premiere to perform post-process image stabilization, panning smoothing, etc. You don't buy 4K to broadcast it at 4K. You buy 4K for downscaling. You buy 4K to maximize video IQ, and to improve your editing capabilities if you have the post-processing tools.
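The noise-normalizing effect of downscaling isn't hand-waving; it's basic statistics. Averaging a 2x2 block of pixels averages four roughly independent noise samples, cutting the noise standard deviation by about half. A minimal sketch, using a hypothetical 4K-sized frame of pure Gaussian noise rather than real footage:

```python
import numpy as np

# Hypothetical 4K-ish frame containing nothing but unit Gaussian noise,
# standing in for the random sensor noise riding on top of a real image.
rng = np.random.default_rng(0)
frame = rng.normal(0.0, 1.0, size=(2160, 3840))

def downscale_2x(img):
    """Downscale by 2 in each dimension via 2x2 block averaging."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

small = downscale_2x(frame)  # roughly 4K -> 1080p

# Averaging 4 independent samples reduces noise std by a factor of ~2.
print(frame.std(), small.std())
```

Real downscalers use fancier filters than block averaging, but the principle is the same: the noise partially cancels while the underlying detail survives, which is exactly the "4K for 1080p delivery" argument.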
Might not want to shoot your mouth off until you really know what you're talking about.