Understanding 1080p Resolution in Displays
Now let's relate this to a video display. This isn't new, by the way. The National Television System Committee (NTSC) was formed in 1940, and the broadcast standard it produced was adopted by the Federal Communications Commission in 1941. Part of that standard accounted for the size of an image as it relates to the eye's ability to resolve the individual scanning lines of the display. In the past the general rule was, for best perceived picture quality, to keep the image's diagonal measure to no more than 1/5 of the seating distance. The advantage of HDTV (and EDTV, for that matter) is that we can sit closer, thereby enjoying a larger image. A movie theater screen subtends a viewing angle of 30 degrees or more and, with the introduction of HDTV and progressive-scan displays, so can home video!
Using a 50-inch plasma display as an example, the actual image measures approximately 44 inches wide by 25 inches tall, which yields the 50-inch diagonal of a 16:9 display. The pixel of an average 50-inch plasma set is about 0.8 mm square, which translates to approximately 0.03 inches. With a 0.03-inch pixel, a 44-inch-wide image requires about 1466 discrete pixels. To keep the display consonant with current resolution formats, the typical product will offer WXGA capability, which translates to 1366 x 768 (or possibly WXGA+ at 1440 x 900). For a 50-inch set to offer true 1080p resolution (and not merely 1080p compatibility), it will need a pixel about 0.023 inches in size: roughly a 25% reduction in pixel size, with an attendant increase in pixel density and manufacturing cost. Now that we've determined the size of the individual display elements, the remaining question becomes: how close must we sit to see individual pixels?
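The arithmetic above is easy to sanity-check. This short sketch uses the article's rounded figures (44 x 25 inches, a 0.03-inch pixel); the variable names are illustrative:

```python
import math

# Back-of-the-envelope pixel math for the 50-inch plasma example.
# All figures are the article's rounded values, not exact panel specs.

width_in, height_in = 44.0, 25.0
diag_in = math.hypot(width_in, height_in)   # ~50.6", close to the nominal 50"

pixel_in = 0.03                             # ~0.8 mm pixel pitch, rounded
pixels_across = int(width_in / pixel_in)    # ~1466 discrete pixels

# Pixel size needed for true 1080p (1920 pixels across 44 inches):
pixel_1080p_in = width_in / 1920            # ~0.023"
shrink = 1 - pixel_1080p_in / pixel_in      # ~24% smaller pixel
```

Note how 1466 horizontal pixels lands between the 1366 of WXGA and the 1920 of 1080p, which is why panels of this size typically shipped at WXGA-class resolutions.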
Assume an average-sized living room. We hang the plasma on the wall and position the sofa eight feet away. Now we get to do some math to determine the limits of our ability to see image artifacts based on resolution. Keep in mind this article is written in general terms, so you scientists out there don't need to stand in line to file corrections! Using trigonometry, we find that our 50-inch display subtends a viewing angle of about 28 degrees. We know this because half the image width is (roughly) 2 feet and the viewing distance is 8 feet. This creates a right triangle, and using the formula cos x = adjacent side length (8 feet) ÷ hypotenuse length (approximately 8.25 feet), where x is half the subtended angle, we find x = 14.04 degrees. Multiplied by 2, that gives our total viewing angle of about 28 degrees.
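The trigonometry can be verified directly. This sketch uses the same rounded inputs as the text (2-foot half-width, 8-foot distance):

```python
import math

# Viewing-angle check: 8 ft seating distance, half the 44" image width
# rounded to 2 ft, exactly as in the worked example above.
distance_ft = 8.0
half_width_ft = 2.0

hypotenuse_ft = math.hypot(distance_ft, half_width_ft)             # ~8.25 ft
half_angle = math.degrees(math.acos(distance_ft / hypotenuse_ft))  # ~14.04 deg
total_angle = 2 * half_angle                                       # ~28.1 deg
```

Using the exact half-width of 22 inches instead of the rounded 2 feet would give a slightly smaller angle (about 26 degrees), which doesn't change the conclusion.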
For this exercise, assume the eye resolves about 12 line pairs, or 24 individual lines, per degree of arc. (This is a conservative, real-world figure; theoretical 20/20 acuity is finer, at about one line per arcminute.) Now 28 degrees x 12 line pairs x 2 = 672 lines. This means we really can't see a display element (pixel) smaller than 1/672 of the image width. Our minimum resolvable element size is about 0.065 inches, or roughly twice the size of the pixels of the WXGA image! Put bluntly, from 8 feet away while watching a 50-inch plasma TV, the human eye is generally incapable of reliably distinguishing any detail finer than that shown on a true 720p display, at least by this conservative measure!
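The acuity arithmetic above (12 lines times 2 per degree, over a 28-degree viewing angle) can be checked in a few lines:

```python
# Minimum resolvable detail at the assumed acuity of 24 lines per degree
# (the article's 12 lines x 2). These are the text's rounded inputs.
lines_per_degree = 12 * 2
viewing_angle_deg = 28.0
resolvable_lines = viewing_angle_deg * lines_per_degree   # 672

image_width_in = 44.0
min_detail_in = image_width_in / resolvable_lines         # ~0.065"
```

At the theoretical one-line-per-arcminute figure the budget would instead be 28 x 60 = 1680 lines, which is why the conclusion hinges on using the conservative real-world number.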
Of course there are other factors that affect perceived image quality. The way color is handled, the latency of pixel illumination, motion artifacts, and the effects of the algorithms that fit the image data to the native resolution of the display (and more importantly the SOURCE) all play a part in a qualitative assessment of the image. It's safe to say, however, that increasing resolution and image refresh rate alone are not enough to provide a startlingly better viewing experience in a typical flat panel or rear projection residential installation.
So What's the Big Deal? Size!
Now, does this mean that 1080p is irrelevant in most of today's home theaters? Absolutely not! We've just used a singular example to explain why it may not be such an improvement for users with fixed width screens in a particular viewing arrangement. And in truth, this example likely fits the majority of today's home theater environments.
But what does 1080p offer? Two things: increased screen size and closer viewing distances. In particular, 1080p displays (coupled with true 1080p source content like HD DVD and Blu-ray) allow those using front projection systems to suddenly jump up to screen sizes of 100 inches or more - from that same 8-foot viewing distance. So while that 50-inch plasma may not look much different when playing 720p or 1080p content, your new front projector just allowed you to quadruple the size of your display. Hey, that's not bad! The added bonus is that much of the HDTV content available over the airwaves and through cable TV and satellite providers is transmitted in 1080i. 1080i content often looks fantastic on a 1080p display and allows it to make good use of the additional resolution.
Special thanks to Joseph D. Cornwall
Business Development Manager, Impact Acoustics
So this weekend I upgraded my receiver. I went from a Denon 3801 to a 3803; like I said, I don't run new stuff, and I wanted upconversion. I notice two differences: clearer audio and clearer, slightly brighter video (this really surprised me). I could never make out all of the audio in the Comcast Bengals commercial; I can now. So, with that said, I think the differences will become more apparent in the next few years. HDMI, despite being a spec for a few years, is in my opinion still young and only starting to come into its own. I think it takes the manufacturers and circuit designers a few iterations to really get it right. I think the differences I'm experiencing between the 3801 and the 3803 are perfect examples of this.
So, I think both sides of this topic are correct. I think with the right hardware people will see a difference.
Maybe I'm way off base, but that's my impression.
I'm apparently not allowed to post URLs.
Any idea where such a pattern can be found for testing my display?
We also have a brain that can compare the image from two slightly offset eyes and basically construct an image that is sharper and more detailed than what the individual eye can theoretically see. In fact, the brain will do it with one eye only, by constantly moving the eye around and constructing an image from the "snapshots" it has taken from different spots.
It's like the technique they use in amateur astronomy. You take one telescope, one standard webcam, and 50 photos of the moon, and let the computer calculate a high-resolution combination from the photographs. You can recover sub-pixel-size details by comparing the differences between the images - as long as they're not all taken from exactly the same spot, so there are differences between the pixels.
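A toy one-dimensional sketch of that "stack many offset frames" idea (often called shift-and-add) is below. This is not real astro-stacking software; the scene, sizes, and names are all illustrative, and real tools add alignment and deconvolution steps:

```python
import random

# Toy 1-D shift-and-add: a scene with detail finer than one sensor pixel
# is "photographed" many times with random sub-pixel offsets. Because each
# frame's offset is known, its coarse samples can be placed on a finer grid
# and averaged, recovering finer-than-pixel position information.

FINE = 8                 # fine-grid cells per coarse sensor pixel
PIXELS = 10              # coarse sensor pixels per frame
FRAMES = 200

def scene(x):            # "true" scene: a narrow bright bar at 4.3..4.5
    return 1.0 if 4.3 <= x <= 4.5 else 0.0

random.seed(1)
accum = [0.0] * (PIXELS * FINE)
hits = [0] * (PIXELS * FINE)

for _ in range(FRAMES):
    offset = random.random()                  # this frame's sub-pixel shift
    for p in range(PIXELS):
        # coarse pixel p integrates the scene over [p + offset, p + 1 + offset)
        val = sum(scene(p + offset + (k + 0.5) / FINE) for k in range(FINE)) / FINE
        cell = int((p + offset + 0.5) * FINE)  # fine cell under the pixel centre
        if cell < PIXELS * FINE:
            accum[cell] += val
            hits[cell] += 1

stacked = [a / h if h else 0.0 for a, h in zip(accum, hits)]
# Fine cells near the bar (around index 35) come out brighter than cells far
# from it, even though no single coarse frame pinpoints the bar's position.
```

The point is only that many slightly offset low-resolution samples, combined with knowledge of the offsets, carry more positional information than any one sample - the same trick the eye-plus-brain system is described as playing above.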
These theoretical figures don't really tell us what we can or cannot see when we set out to actually look at something.
One way of testing your personal viewing distance would be to display a moiré test grid on your LCD: a checkerboard pattern of alternating black and white pixels.
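If you can't find such a pattern ready-made, a few lines of Python can generate one. This writes a one-pixel checkerboard as a binary PGM file; the filename and the 1920x1080 size are assumptions, so substitute your panel's native resolution, and be sure to display the file at a 1:1 pixel mapping with no scaling:

```python
# Generate a 1-pixel black/white checkerboard as a binary PGM (P5) file.
# Assumed output name and size -- use your display's native resolution.
width, height = 1920, 1080

rows = []
for y in range(height):
    rows.append(bytes(255 if (x + y) % 2 == 0 else 0 for x in range(width)))

with open("checkerboard.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (width, height))
    f.writelines(rows)
```

Most image viewers open PGM directly; if yours doesn't, any converter (or a viewer that honors unscaled display) will do, as long as nothing resamples the pattern on the way to the screen.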
A proper LCD with proper scaling won't have any problems showing that grid exactly the way it is. Here's the trick: when you look at it and move your head back and forth, you start seeing halo-like objects. These are actually the moiré patterns of your own retina. Move your head back until you stop seeing them and the image turns a dull grey without any pattern.
That's where your real limit is. Though you might notice that if you look at the screen, you can still see the grid pattern in places. That's your two eyes and brain at work.
That's what I see, anyway.