Hmm, I don't think many people participating in this thread do much cinematography. For everyone who has talked about 4K not being broadcast, 4K TVs not being mainstream, etc. as reasons why we don't have it in our everyday and even high end DSLRs...I think you are generally missing the point of high resolution video capture. It really isn't about the way you stream the video to your customers. It's about capturing as much detail as possible initially, for a number of reasons.
For one, 4k video, even 8k video, and 16k video if/when it ever arrives...is usually DOWNSCALED in post processing. Just like taking a high resolution still photo and scaling it down 2x or 4x, downscaling mitigates problems in the original video: it reduces noise, improves sharpness, and eliminates artifacts (hot pixels, frame tearing, etc.)
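Here's a minimal sketch of why downscaling reduces noise. Averaging each 2x2 block of pixels averages four roughly independent noise samples, which cuts the noise standard deviation about in half. The flat grey frame and sigma value below are made-up illustration numbers, not measurements from any real sensor:

```python
import random
import statistics

def downscale_2x(frame):
    """Average each 2x2 block of pixels into one output pixel."""
    h, w = len(frame), len(frame[0])
    return [
        [(frame[y][x] + frame[y][x + 1] +
          frame[y + 1][x] + frame[y + 1][x + 1]) / 4.0
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]

random.seed(42)
SIZE = 256
# A flat grey frame (value 128) with Gaussian sensor noise, sigma = 8.
noisy = [[128 + random.gauss(0, 8) for _ in range(SIZE)] for _ in range(SIZE)]

sigma_full = statistics.pstdev(p for row in noisy for p in row)
small = downscale_2x(noisy)
sigma_small = statistics.pstdev(p for row in small for p in row)

print(f"noise before downscale: {sigma_full:.2f}")
print(f"noise after  downscale: {sigma_small:.2f}")
# Averaging 4 independent noise samples halves the standard deviation,
# so sigma_small lands near sigma_full / 2.
```

The same math is why a 4K capture downscaled to 1080p looks cleaner than native 1080p from the same sensor technology.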
Second, having more pixels gives you more "room" to work with, provided you frame adequately. With 4k video, or better yet 8k video in the future...you can frame a bit wide, leaving a buffer for a variety of post-process corrections: smoothing hand-held panning, stabilizing jittery hand-held video, or just plain old cropping to cut out something at the corner or edge of the scene that shouldn't have been there.
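The size of that buffer is easy to put numbers on. A rough sketch, assuming the common UHD-style resolutions (DCI 4K at 4096x2160 would give slightly more horizontal slack):

```python
# How far a 1080p delivery window can shift inside a larger captured
# frame -- the pixel budget available for stabilization or recomposition.
TARGET_W, TARGET_H = 1920, 1080  # 1080p delivery

captures = {"2K": (2048, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
slack = {}
for name, (w, h) in captures.items():
    slack[name] = (w - TARGET_W, h - TARGET_H)
    print(f"{name}: {slack[name][0]} px horizontal, "
          f"{slack[name][1]} px vertical slack")
```

At 4K you can slide the 1080p window a full frame-width in either axis before running out of pixels; at 2K you have almost no margin at all, which is why stabilization in post eats into a 1080p-native image so visibly.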
In the end, the ultimate goal is still to produce a 1080p final video product...regardless of whether you have a 2k, 4k, or 8k RAW video source. Alongside that goal, though, is having crisper, clearer, less noisy, stabilized, extremely smoothly panning video of unparalleled detail and sharpness...AT 1080p.
To be honest, I am rather certain that, when 4k becomes commonplace and mainstream, little mainstream broadcast 4k TV content will actually be shot at 4k, even if the camera bins 8, 16, or 32 megapixels to produce it. I suspect that quality 4k programming will ultimately be shot with high end 8k cinematography equipment, for the same reasons we all want 4k video in our DSLRs now.
I think there are two fundamental reasons why we don't have 4k video in our DSLRs: For one, it is kind of a high end, prestigious thing, and it makes sense for companies competing in that arena to protect it. If we are really complaining about a $7000 camera not having 4k, then it isn't too much of a stretch to think someone could pick up a CinEOS that does 4k for $15k...one has to figure that if you're spending seven grand in the first place, you aren't just fooling around unless you are independently wealthy...so...$7k, $15k...what's the diff?
Second, it DOES take fairly high speed equipment to process 4K video frames at 30fps, let alone at any higher speed. A pair of DIGIC 5+ chips could handle the input, but you would have to REQUIRE high speed writeout as well. That complicates the issue...it creates a tech support nightmare for those who don't read manuals and neither understand nor care that the camera wasn't designed to support 4K video with a cheap 200x CF card from five years ago.
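A quick back-of-the-envelope on that writeout problem. The 12-bit Bayer readout below is an assumption for illustration, not any camera's published spec; the CF "x" rating is the standard 150 KB/s per x:

```python
# Rough data rate for uncompressed 4K capture, assuming a 3840x2160
# sensor readout at 12 bits per photosite (Bayer RAW) -- an illustrative
# figure, not a specific camera's spec.
WIDTH, HEIGHT = 3840, 2160
BITS_PER_PIXEL = 12
FPS = 30

bytes_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL / 8
mb_per_second = bytes_per_frame * FPS / 1e6

# CF speed ratings count 150 KB/s per "x", so a 200x card manages ~30 MB/s.
card_mb_per_second = 200 * 150e3 / 1e6

print(f"raw 4K/30p stream: {mb_per_second:.0f} MB/s")
print(f"200x CF card:      {card_mb_per_second:.0f} MB/s")
```

Even before compression, the raw stream outruns an old 200x card by an order of magnitude, which is exactly the "works only with the right card" support headache described above.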
One also has to figure that continuous high speed processing is going to produce a lot of heat. That has a whole host of implications: the need for better passive cooling, or even active cooling, of most electronic components; the potential for additional noise to creep in over time at all ISO settings unless the sensor is actively cooled; conforming to the various regulations around the world regarding battery design, power consumption, even limitations on allowed features in products of certain classes that lead to additional import or export taxes when those limitations are ignored; etc. etc.
I would put "The ability of TV broadcast stations to deliver 4K content" DEAD LAST on my list of reasons why we haven't seen 4k 30fps video in our DSLRs yet!
