Do you have a 4K display?

Do you have a 4K Television or monitor?


  • Total voters
    92
  • Poll closed.
My opinion:

4K is good for proofing, i.e., computer monitors, mainly because your eyes are right up near the screen and you have lots of great 4K content (your pics).

For movies, it's pointless, due to distance from the screen, diminishing returns with motion compression, and the fact that most content does not resolve beyond 1080p in detail even if encoded at 4K. You need a minimum of a 10 ft screen to see significant improvement over 1080p at normal viewing distances, per Joe Kane, who is an unbiased industry video expert.
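That rule of thumb is easy to sanity-check. Here's a quick back-of-the-envelope in Python, assuming the common 1-arcminute-per-pixel figure for 20/20 acuity and a 65" 16:9 panel (the screen size and acuity figure are my own illustrative assumptions, not Kane's exact numbers):

```python
import math

def max_resolving_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    """Farthest viewing distance (feet) at which a single pixel still
    subtends ~1 arcminute, a common rule of thumb for 20/20 acuity."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # panel width
    pixel_in = width_in / horizontal_px                      # pixel pitch
    return pixel_in / math.tan(math.radians(1 / 60)) / 12

for px, label in [(1920, "1080p"), (3840, "4K")]:
    print(f'65" {label}: pixels blend beyond ~{max_resolving_distance_ft(65, px):.1f} ft')
```

On a 65" set this works out to roughly 8.5 ft for 1080p and 4.2 ft for 4K, i.e., past a typical couch distance you can't even resolve the 1080p grid, let alone the 4K one.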
 
Upvote 0
David_in_Seattle said:
I'm debating pulling the trigger on a couple of new Dell 24" 4K displays, but I've been hesitant to do so since I've heard of issues with their displays running at 60 Hz. Other displays are currently out of the question since the company I work for gets a sweet discount on these monitors.

The added resolution would definitely help with the type of photo and video editing I do on the job.

Just as a warning, my friend bought a top-of-the-line Dell 32" UltraSharp 4K monitor. It had many stuck and dead pixels. He exchanged it for two replacements, both with lots of stuck and/or dead pixels. He eventually gave up and asked for a refund, though Dell in the end hooked him up for his troubles. :)
 
Upvote 0
dolina said:
9VIII said:
It's good to see that you're aware of the difference, a significant majority of the people I talk to are completely unaware. In my opinion it basically amounts to false advertising.
Bravo, I'm informed of the lies of the industry. Dude, get over it! It's just a marketing term to highlight a new feature that I am particularly thankful for.

Lower power consumption is _always_ welcome.

I'm just as happy as you are that the industry has switched to LED backlighting, it just should have been named differently.
 
Upvote 0
9VIII said:
dolina said:
9VIII said:
It's good to see that you're aware of the difference, a significant majority of the people I talk to are completely unaware. In my opinion it basically amounts to false advertising.
Bravo, I'm informed of the lies of the industry. Dude, get over it! It's just a marketing term to highlight a new feature that I am particularly thankful for.

Lower power consumption is _always_ welcome.

I'm just as happy as you are that the industry has switched to LED backlighting, it just should have been named differently.

I suspect the industry will be skipping right by "true" LED displays and heading straight for OLED displays. I don't think there is any way to market a "true" LED display (where there are discrete RGB LEDs for each and every pixel) such that the general public would understand the difference relative to an LED-backlit display (either edge-lit or a matrix with local dimming).

LG already has a 77" OLED TV (although it's curved, a feature I personally am not a fan of... I think it's just a gimmick). Samsung is supposedly readying an 80" OLED display which features an adjustable curvature (again, a feature I think is a gimmick).

At 80", standard 1920x1080 pixels are MONSTROUS, and there is no question such large screens could benefit from a factor-of-four increase in pixel count. I think 4K will do wonders for these large OLED screens... I just hope they end up flat at some point, as I'd prefer not to have some hulking curved monstrosity popping out of my wall, when the intent is to have a 1"-deep, beautifully flat screen sitting nearly flush and otherwise inconspicuous.
 
Upvote 0
I'd love to sell my 27-inch Dell U2711 now and get a 32-inch Sharp 4K display, but my 27-inch iMac probably couldn't drive it properly, and it wouldn't match in terms of resolution and screen size anyway.

I hope that in three years' time all the issues with 4K displays will be resolved for the largest 4K iMac I can get.
 
Upvote 0
I just got my 60" 1080p TV two years ago.
So I'm not going to upgrade anytime soon.
But that doesn't mean that the next guy shouldn't.

You don't have to believe me or the next person here that tells you that "to see the difference you need to sit very close."

Here read it on CNET: http://reviews.cnet.com/8301-33199_7-57610862-221/four-4k-tv-facts-you-must-know/

Yeah sure, you can see the difference on a Retina display, but really... you sit very close to it. Measure it; I'm sitting less than 3 feet from my computer screen. When I pull out my iPhone or a friend's iPad, it isn't more than 2-3 feet away.

You want to see the difference beyond 1080p? I guess wait for the 8K TVs?
 
Upvote 0
From the CNET article:
Larger TVs or closer seating distances make that difference more visible, as do computer graphics, animation, and games...

Like I was saying, if someone is looking at something inherently blurry, the resolution of it isn't going to matter.
I recently read that in the EU they're looking at including 100Hz (double the standard 50Hz PAL frequency) in the 4K broadcasting spec. The original NHK UHD spec also included a 120Hz refresh rate to help improve the image. I have to wonder if the archaic 24fps Hollywood standard frame rate isn't partly responsible for much of the negativity surrounding 4K?
 
Upvote 0
9VIII said:
From the CNET article:
Larger TVs or closer seating distances make that difference more visible, as do computer graphics, animation, and games...

Like I was saying, if someone is looking at something inherently blurry, the resolution of it isn't going to matter.
I recently read that in the EU they're looking at including 100Hz (double the standard 50Hz PAL frequency) in the 4K broadcasting spec. The original NHK UHD spec also included a 120Hz refresh rate to help improve the image. I have to wonder if the archaic 24fps Hollywood standard frame rate isn't partly responsible for much of the negativity surrounding 4K?

The archaic 24Hz Hollywood standard is changing as well. The recent Hobbit movies were shot at 48 frames per second. James Cameron is apparently shooting Avatar 2 and 3 (and however many more there may be after that) at 60fps. A 60Hz refresh rate fits well with 240Hz 3D BluRay playback as well. Several cable providers are already clearing bandwidth in order to deliver native content in 4K resolution (and I believe there may already be some 4K content distribution, with a 2K downgrade on those channels when 4K isn't available). It isn't just TVs that are moving forward into a new era of quality and resolution... the technology used to create and deliver the content we view on them is moving forward as well.

Nay-sayers are simply uneducated as to the big picture. It isn't just 4K TVs playing back ancient standard HD content (720p). It is 4K TVs playing back native 4K content, from TV and BluRay, as well as internet content delivery networks like Netflix (which has adopted Super HD for a lot of its content already, and is also working on preparing their system for delivering 4K content).

Even assuming one "only" watches 1080p content on a 4K TV, that 2K content is rendered with more pixels than its native resolution requires, and therefore it can still look better than on a native 2K device.
 
Upvote 0
jrista said:
The recent Hobbit movies were shot at 48 frames per second.

Have you seen the Hobbit at 48 fps?
It looks bad! You can see the make-up and you are able to discern the fake props.
I don't think 48 fps should be used in movies that use heavy costumes, make-up, props and CGI.
It possibly can be used in stuff like Silver Linings Playbook, but anything else...

The beauty of photographs at 4K, 5K and higher is that you can remove blemishes, soften the image, etc. in Photoshop. You can't do the same with video, unless you want to go through 48 frames for every second of footage and correct each one individually, which will take forever for a 2-hour movie.
 
Upvote 0
mkabi said:
jrista said:
The recent Hobbit movies were shot at 48 frames per second.

Have you seen the Hobbit at 48 fps?
It looks bad! You can see the make-up and you are able to discern the fake props.
I don't think 48 fps should be used in movies that use heavy costumes, make-up, props and CGI.

And 48 fps has what to do with resolution?

It looks bad! You can see the make-up and you are able to discern the fake props.

Because of 48 fps? ... I doubt that.
 
Upvote 0
mkabi said:
jrista said:
The recent Hobbit movies were shot at 48 frames per second.

Have you seen the Hobbit at 48 fps?
It looks bad! You can see the make-up and you are able to discern the fake props.
I don't think 48 fps should be used in movies that use heavy costumes, make-up, props and CGI.
It possibly can be used in stuff like Silver Linings Playbook, but anything else...

The beauty of photographs at 4K, 5K and higher is that you can remove blemishes, soften the image, etc. in Photoshop. You can't do the same with video, unless you want to go through 48 frames for every second of footage and correct each one individually, which will take forever for a 2-hour movie.


You can easily do that with film, and you don't have to go through each frame individually. And if you're making a film at 48 fps in such high resolution, I'm sure you've got the resources to check every frame.
 
Upvote 0
There is a huge distance between 480i and 1080p, but I find that mostly I'll watch DVD-quality upscaled video, which is tolerable. I'll watch satellite broadcast HD signals, which are fine but not really that impressive. I have a 3D TV, and I don't hate the glasses, and I don't mind the crud where they try to send debris out of the screen towards you, but the content isn't there.

I like basketball in 3D. Baseball and football aren't all that impressive.

So my issue is that I don't know there will be a clamor for content. If DirecTV comes out with ten channels, will that make Comcast and other cable providers respond in kind? So I say no thank you until they flood my screen with higher quality programming.

And honestly... with football we are still getting blah HD feeds. I think it is CBS who downgrades the image quality, and it is noticeable. Give me 1080p first for a few years before I even consider upgrading.
 
Upvote 0
jrista said:
James Cameron is apparently shooting Avatar 2 and 3 (and however many more there may be after that) at 60fps.

Oh thank goodness. The whole "48fps" thing always sounded like a halfway measure. I'm sure he'll get the job done right.


mkabi said:
jrista said:
The recent Hobbit movies were shot at 48 frames per second.

Have you seen the Hobbit at 48 fps?
It looks bad! You can see the make-up and you are able to discern the fake props.
I don't think 48 fps should be used in movies that use heavy costumes, make-up, props and CGI.
It possibly can be used in stuff like the Silver linings playbook, but anything else...

The beauty of photographs @ 4K, 5K and higher is that you can remove blemishes, soften the image, etc. with photoshop. You can't do the same with video, unless you want to go through 48 frames and correct it individually, which will take forever when it is a 2 hour movie.

People said the same thing about "HD" when it was introduced. They adapted well enough, and will do so again.

It's interesting that this fits my point perfectly: higher framerates make the image more detailed.
It's just like taking a picture with a faster shutter speed. If you want to capture detail in motion, you need a shorter exposure. They could make 24fps movies with a really fast shutter speed, but then the movie would look like a slideshow.
Applying that to 4K: the amount of blur you're allowed before it smears across multiple pixels on the display becomes that much smaller, so framerate does make a difference as you increase resolution.
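The shutter-speed analogy can be put in numbers. A rough sketch in Python (the five-second pan and the 180-degree shutter are my own illustrative assumptions):

```python
# Blur smear, in display pixels, for an object panning across the full
# screen width, assuming a 180-degree shutter (exposure = 1 / (2 * fps)).
def blur_px(horizontal_px, pan_seconds, fps):
    exposure_s = 1 / (2 * fps)                  # 180-degree shutter rule
    px_per_second = horizontal_px / pan_seconds # panning speed in pixels
    return px_per_second * exposure_s

for fps in (24, 48, 60):
    print(f"{fps} fps: {blur_px(1920, 5, fps):.1f} px at 1080p, "
          f"{blur_px(3840, 5, fps):.1f} px at 4K")
```

The same pan that smears 8 pixels at 1080p/24fps smears 16 pixels at 4K/24fps, which is why the extra resolution only pays off in motion when the frame rate (or shutter speed) goes up with it.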

The complaint about 48fps that I read most often is that it reminds people of a TV show. Live broadcasts have almost always been 60fps; if you put side-by-side recordings of a football game at 24fps and 60fps, I doubt anyone would prefer the 24fps version. Why should it be any different in movies?
 
Upvote 0
mkabi said:
I just got my 60" 1080p TV two years ago.
So I'm not going to upgrade anytime soon.
But that doesn't mean that the next guy shouldn't.

You don't have to believe me or the next person here that tells you that "to see the difference you need to sit very close."

Here read it on CNET: http://reviews.cnet.com/8301-33199_7-57610862-221/four-4k-tv-facts-you-must-know/

Yeah sure, you can see the difference on a Retina display, but really... you sit very close to it. Measure it; I'm sitting less than 3 feet from my computer screen. When I pull out my iPhone or a friend's iPad, it isn't more than 2-3 feet away.

You want to see the difference beyond 1080p? I guess wait for the 8K TVs?

How far away from your TV do you sit?

My TV is a couple of years old, but I have a Samsung 59" plasma, and I generally sit about 7-8 ft away from it... short living room.

 
Upvote 0
9VIII said:
<snip>
The complaint about 48fps that I read most often is it reminds people of a TV show. Live action has almost always been 60fps, if you put a side by side recording of a football game at 24fps and 60fps I doubt anyone would prefer the 24fps version. Why should it be any different in movies?
Well, we've been used to watching movies at 24fps for decades now. To most people's eyes, that blur is part of what makes a movie look cinematic.

Strangely enough, our brains have been trained so that what might be dismissed as old tech is actually what makes a picture look higher quality or "movie-like" to many people.
 
Upvote 0
dolina said:
My dentist wanted to do that with his HDTV in his office so his patients could watch TV while he mucks around in their mouths, but the contractor forbade it. :(

It won't be in HD, but maybe he can get TV headset goggles. Some dentists use those to let patients watch movies/TV during longer procedures like cavities and crowns.
 
Upvote 0
ajfotofilmagem said:
Just as a 36-megapixel picture with coarse compression does not look better than an 8-megapixel one with fine compression, video resolution is less important than the compression codec used. Currently, H.264 suffers very significant quality losses during the editing process. Yes, there are other video codecs that preserve more image quality, but let's be honest: who would be willing to record 4K video that generates files of 5 gigabytes per minute?

But more MP with a little more compression generally looks better than fewer MP with less compression.

(Of course, in some cases compression is already so over the top... you have 1920x1080 channels delivering about 720x640 worth of detail, with lots of macroblocking all over. At those super low bandwidths, more MP is a waste.)

And often, a given total bandwidth spent on compressed video looks better than that same bandwidth spent on uncompressed video.
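The 5 GB/min figure quoted above is easy to put in context with some quick arithmetic (the 10-bit 4:2:2 sampling and the 6:1 ratio are illustrative assumptions, not any specific codec's spec):

```python
def gb_per_minute(width, height, fps, bits_per_px, ratio=1):
    """Data rate in (decimal) GB per minute of footage."""
    bits_per_sec = width * height * bits_per_px * fps / ratio
    return bits_per_sec * 60 / 8 / 1e9

# Uncompressed UHD at 24 fps, 10-bit 4:2:2 (~20 bits/px on average):
print(f"uncompressed: ~{gb_per_minute(3840, 2160, 24, 20):.0f} GB/min")
# A modest ~6:1 intermediate-codec ratio lands near the 5 GB/min figure:
print(f"6:1 compressed: ~{gb_per_minute(3840, 2160, 24, 20, 6):.1f} GB/min")
```

Uncompressed UHD works out to roughly 30 GB per minute, so a 5 GB/min file already implies around 6:1 compression before any heavier delivery codec is applied.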
 
Upvote 0
Samsung HDTVs are really poorly built these days (actually, all of the brands are, though Samsung certainly isn't the best, and they have impossibly bad warranty service, criminally so in many cases).

And yeah when the panel goes, it usually doesn't make sense to fix it.

dolina said:
A tad out of topic but...

My 5yo 46-inch Samsung LCD TV's panel needs to be replaced. This happened after 24 hours attached to an IPTV box.

Parts and labour will cost me $850 and a week's wait.

Went TV shopping yesterday and my takeaway is that the most basic of 46-inch LED TVs can be had for $850. Add $100 and I get a 50-inch LED TV. Add $500 and I get a 60-inch LED TV.

Power consumption of the LED sets is a fraction of what I am getting with my LCD.

Smart TVs are nice if you don't have a smartphone, tablet or computer. Wish Apple would make one; I'd be more inclined to buy a solution from them.

Of course this isn't a 4K display. I was initially planning to wait 3-5 years before picking up one. In time for a slim Xbox One & slim PS4.

Now I'm back to my 8yo 32-inch Samsung LCD TV and 4yo 40-inch Samsung LCD TV, which is really sad considering we switched to HD cable this year.

=======================

Now for the 4K TV part.

I auditioned the following

LG 65LA9700 (65-inch LED)
Sony Bravia KD-65X9004 (65-inch LED)

Both look awesome with a JPEG at 2048px on the longest side, even zoomed in at 200%.

Playing a 1080p or 720p MP4 with a low bitrate looks like SD content.
 
Upvote 0