20/20 vision is by definition average. I'm a little better than that, but when I say that the difference between 1080p and 4K is blatantly obvious, I have confidence that it will be just as obvious for the average person reading this.
I also find the difference to be blatantly obvious.
I checked out the 4K TV display at an electronics store, and I was floored.
Unfortunately, at least as far as upgrading goes, my current set still works (and I hope it keeps working for a while).
The next one will have to be bigger... 63" isn't big enough. So glad I didn't buy a smaller set.
Viewers can't tell the difference at a normal viewing distance; you have to be close, like 5 ft or less. That's why video stores arrange them so that you will be close to the screen. At 10 ft, it makes no difference.
I'm betting the numbers those people use came from tests that do not represent what is possible with a computer monitor.
However they came up with those results, they're wrong.
It's easy to test the limits for yourself. Open a paint program and draw a straight line at a slight angle (make sure the program does not apply any smoothing; look really closely, or use a magnifying glass, to confirm that transitions from one column of pixels to the next happen without half-shaded pixels). See the jaggies. Now back away from your screen until the line blurs smooth. Note that a line at a 45 degree angle will look smooth sooner than one just a few degrees off vertical.
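If you'd rather script it than eyeball a paint program, here's a minimal sketch (Python with Pillow; the image size and amount of drift are arbitrary picks of mine). It writes pixels directly, so no anti-aliasing can sneak in:

    from PIL import Image

    def jaggies_test(width=1920, height=1080, drift=60, out="jaggies.png"):
        # Black canvas with a 1-pixel white line a few degrees off vertical.
        # Setting pixels directly guarantees hard column-to-column steps,
        # with no half-shaded pixels from smoothing.
        img = Image.new("RGB", (width, height), "black")
        for y in range(height):
            x = width // 2 + (y * drift) // height  # drifts 'drift' px from top to bottom
            img.putpixel((x, y), (255, 255, 255))
        img.save(out)

    jaggies_test()  # open the file full screen at 100% zoom, then back away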
In order for a screen to look perfectly smooth, I should not be able to see any stepping on a line at any angle.
On my 100PPI laptop screen I have to stand 9 feet away before jagged edges start to blur. At that distance any screen would need the same roughly 100 PPI to look smooth, so I should be using a 45" 4,000x2,000 screen. For a 60" TV I would want 5,400x2,700 resolution.
That's a minimum number; higher would be better to give a margin for error.
But wait, that's not actually the limit of what I can see. If I place one white dot (a single pixel, i.e. one full RGB cluster) on a black background, in a dark room, I can still see it from 18 feet away. At 20 feet it blurs into the image noise in my eyes. For a display to perfectly reproduce what I see when I look at something, it's going to have to match that level of detail. That would be 200PPI at 9 feet, or a 60" 10,800x5,400 screen.
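For anyone who wants to check that arithmetic, here's a quick back-of-the-envelope sketch (Python; it assumes the same roughly 2:1 aspect ratio as the resolutions quoted above, and the function name is just mine):

    import math

    def required_resolution(diagonal_inches, target_ppi, aspect=(2, 1)):
        # Pixels along the diagonal = diagonal size * pixel density,
        # then split between width and height by the aspect ratio.
        aw, ah = aspect
        diag_pixels = diagonal_inches * target_ppi
        w = diag_pixels * aw / math.hypot(aw, ah)
        h = diag_pixels * ah / math.hypot(aw, ah)
        return round(w), round(h)

    print(required_resolution(45, 100))   # ~(4025, 2012)  -> the 45" 4,000x2,000 case
    print(required_resolution(60, 100))   # ~(5367, 2683)  -> the 60" 5,400x2,700 case
    print(required_resolution(60, 200))   # ~(10733, 5367) -> the 60" 10,800x5,400 case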
Some might say that those numbers are unreasonable. As noted in the Displaymate article (and by me earlier), detail is as much about contrast as resolution. How often you would be able to take advantage of that level of detail depends largely on what type of content you're consuming. Games in particular are very good at producing high-contrast imagery, and the amount of detail in digitally produced content is inherently tied to your display. If we have cameras that produce images of similar resolution as well, I don't see a reason not to use matching monitors. For pictures and games I'll take all I can get. Movies, as noted, tend to look terrible to begin with. That application probably wouldn't be as demanding.
For all the practical reasons you normally hear people whine about, 8K sounds like a good stopping point until we figure out better ways to shoot images into your brain.