jrista said:
I find this resistance to improved technology incredibly strange...to the point where I simply don't believe it.
I feel the same way a lot of the time.
In the home theater market, it seems like most of "the experts" have been harping on about black levels and colour density for years, seemingly getting little attention from manufacturers.
When people like me say that we need roughly triple the current pixel density to match the resolution of human vision, flying in the face of "common knowledge", the usual reaction is negative. That only starts to change once manufacturers actually improve resolution instead of other things.
"The experts" then move from promoting their preferences to putting down the ideas of other people, It's pretty sad.
But that's actually fairly normal behaviour; culture changes in generational steps. In many areas of society you literally have to wait for the "old guard" to die off before new ideas can be taken seriously.
DominoDude said:
Since I see mentions of video and watching distances...
I had a great chart showing suitable viewing distances for various resolutions and screen sizes, but it's embedded on a site that is closely linked to very X-rated content, so I think we should skip that URL. It showed the farthest distance at which an eye with perfect vision could still resolve all the detail. In short, and as an example, it boils down to a 60 inch screen being best viewed from less than 10 feet if you want the eye to resolve all the detail in a 1080p movie.
Let's toss in some Wikipedia that is less prone to being X-rated -> http://en.wikipedia.org/wiki/Optimum_HDTV_viewing_distance#Human_visual_system_limitation
I'd say that for a family of 4-5 to watch and enjoy every minute detail of a high-res video snippet, they'd better huddle together really close. Thankfully the human brain is supposed to watch and enjoy the content, and it won't throw itself on the floor in a temper tantrum just because every single pixel isn't distinguishable.
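For what it's worth, the distance figure in that example falls straight out of the "one pixel per arcminute" rule, so it's easy to sanity-check. A minimal sketch of the arithmetic (Python, assuming a 16:9 panel; the function name and values are just illustrative):

```python
import math

# Quick sanity check of the "one pixel per arcminute" rule, using the
# example values quoted above (60" diagonal, 1080p; 16:9 is assumed).
def max_viewing_distance_inches(diagonal_in, horizontal_pixels, aspect=16 / 9):
    # Screen width from the diagonal of a 16:9 panel.
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_pitch_in = width_in / horizontal_pixels
    # Farthest distance at which one pixel still subtends one arcminute.
    one_arcminute_rad = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcminute_rad)

d = max_viewing_distance_inches(60, 1920)
print(f"{d:.0f} in  ~=  {d / 12:.1f} ft")   # about 94 in, just under 8 feet
```

That comes out at roughly 94 inches, a little under 8 feet, which is where the "less than 10 feet" recommendation comes from.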
Domino, I'm not commenting on you, but the Wikipedia article is a good example.
The first paragraph in the section titled "Human visual system limitation" states confidently that "one arcminute is seen as the threshold beyond which critical detail cannot be identified" and finishes off with "Sitting beyond these distances will result in a loss of detail".
But then there's an entire paragraph directly below it debating that claim.
When I enter debate on this subject, the first response from an "expert" is normally denial that there even is a debate.
When I test my vision using a high frequency grid (it's easy enough to make things like that with a computer), the results I get agree with the "one arcminute" limitation. That should be expected: it would be extremely hard for any system to correctly interpret an image made up of lines the same size as its photocells. Naturally all you end up with is noise.
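If anyone wants to try it, a pattern along these lines is all it takes (a rough sketch, not exactly what I used; the image size and line width are arbitrary):

```python
import numpy as np
from PIL import Image

# High-frequency test grid: alternating one-pixel black and white vertical
# lines. Viewed from far enough away that a line subtends about one
# arcminute, the whole thing collapses into flat grey, which is what the
# "one arcminute" figure predicts.
def grating(width=800, height=600, line_px=1):
    x = np.arange(width)
    stripe = (((x // line_px) % 2) * 255).astype(np.uint8)  # 0/255 square wave
    return Image.fromarray(np.tile(stripe, (height, 1)), mode="L")

grating().save("high_freq_grid.png")
```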
When I test my vision for vernier resolution, using a low frequency grid with a bit of aliasing, the limits are approximately three times higher.
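A target along the lines below is one way to do that: a bar in the top half and the same bar in the bottom half, shifted by a fraction of a pixel, with the sub-pixel shift spread across two columns of grey. Again just a sketch; the 0.3-pixel offset and the sizes are placeholder values:

```python
import numpy as np
from PIL import Image

# Vernier-style target: a vertical bar in the top half, and the same bar in
# the bottom half shifted sideways by a sub-pixel amount. The fractional
# shift is rendered by splitting the bar's coverage across two neighbouring
# columns (the "bit of aliasing" that makes an offset smaller than a pixel
# visible at all).
def vernier_target(size=600, bar_col=300, offset_px=0.3):
    img = np.full((size, size), 255, dtype=np.uint8)   # white background
    img[: size // 2, bar_col] = 0                      # top bar, aligned to one column
    frac = offset_px % 1.0
    lo = int(bar_col + offset_px)                      # bottom bar straddles lo and lo+1
    img[size // 2 :, lo] = int(255 * frac)             # darker column carries more of the bar
    img[size // 2 :, lo + 1] = int(255 * (1 - frac))
    return Image.fromarray(img, mode="L")

vernier_target().save("vernier_target.png")
```

If you can still tell which way the lower bar is displaced at a distance where the offset subtends well under one arcminute, that's the effect I'm describing.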
The "one arcminute" limit will still apply to fine, random texture, like the surface of concrete, but human vision is highly tuned to detect high contrast edges and motion, not wide, consistent texture. It's hard to say how much the "one arcminute" limit affects everyday vision.
I suspect part of the problem is that people underestimate the complexity of the human visual system. A while back Neuro recommended this book to me (http://www.amazon.com/Principles-Neural-Science-Edition-Kandel/dp/0071390111), and it has been nothing but a delight to read (well illustrated).
It has a few chapters that go over the eye in detail.
Then you have to factor in diminishing returns, which seems to be the biggest sticking point for most people, regardless of whether they agree with you.