“Let our rigorous testing and reviews be your guidelines to A/V equipment – not marketing slogans”

1080p and the Acuity of Human Vision

April 02, 2007

"1080p provides the sharpest, most lifelike picture possible."   "1080p combines high resolution with a high frame rate, so you see more detail from second to second."  This marketing copy is largely accurate.  1080p can be significantly better that 1080i, 720p, 480p or 480i.  But, (there’s always a "but") there are qualifications.  The most obvious qualification: Is this performance improvement manifest under real world viewing conditions?  After all, one can purchase 200mph speed-rated tires for a Toyota Prius®.  Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however!  In the consumer electronics world we have to ask a similar question.  I can buy 1080p gear, but will I see the difference?  The answer to this question is a bit more ambiguous.

Measuring Human Vision

To fully understand the implications of high resolution and high definition we must first explore the limitations of human vision.  The Dictionary of Visual Science defines visual acuity as "acuteness or clearness of vision, especially form vision, which is dependent on the sharpness of the retinal focus within the eye, the sensitivity of the nervous elements, and the interpretative faculty of the brain."  Simply put, our eyes have a resolution limit.  Beyond our ability to see it, increased image resolution is simply an academic exercise.  It can have no real part in improving the viewing experience.  Unlike hearing, our visual acuity is unambiguous and relatively simple to measure.

Vision is measured using a few different tools.  The most familiar is called the Snellen chart.  Using this tool an optometrist or physician would ask you, from a standardized distance of twenty feet (six meters in countries that use the metric system), to read the "letters" on the chart.  The smallest line that can be read accurately defines the acuity of vision, which is expressed in a quasi-fractional manner.  20/20 means that a subject can read the line that defines average vision from the prescribed twenty feet away.  20/10 means that same subject can read, from a distance of twenty feet, the line that a subject with "normal" vision could only read from ten feet.  20/10 vision is therefore twice as good as 20/20.   Similarly, 20/40 is half as good with the subject being able to read at twenty feet what someone with normal vision could read at forty. 
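For readers who like to see the arithmetic, here is a minimal sketch of the ratio logic just described. The helper function and its name are ours, purely for illustration; it is not a clinical measure of anything.

```python
# Sketch of the Snellen ratio arithmetic described above.
# snellen_score() is an illustrative helper, not a clinical tool.

def snellen_score(test_distance_ft: float, normal_distance_ft: float) -> float:
    """Relative acuity: 1.0 is 'normal', 2.0 is twice as good, 0.5 is half as good."""
    return test_distance_ft / normal_distance_ft

print(snellen_score(20, 20))  # 1.0 -> 20/20, average vision
print(snellen_score(20, 10))  # 2.0 -> 20/10, twice as good
print(snellen_score(20, 40))  # 0.5 -> 20/40, half as good
```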

The next part of the puzzle is applying this understanding to a video display or other image composed of discrete elements. Whether the eye can resolve an individual element depends on the angle that element subtends, which is directly proportional to the size of the element and inversely proportional to the viewing distance. This relationship is best expressed in degrees.

It's common knowledge that people have a finite field of view, which we normally think of in terms of its upper limit. Technically, this is the angular extent of the observable world that is seen at any given moment. Roughly put, we can see things that fall within a known angle with the apex roughly at our nose. Staring straight ahead, the average person has a stereoscopic field of view of about 100 degrees (not including peripheral vision, which extends the total field to nearly 180 degrees). In a similar manner we have a lower limit to our field of view. Scientists express this as an angle as well, but because that angle is less than a degree we have to use the language of engineering and describe this lower limit in minutes of arc.

Everyone knows from high school geometry that a circle contains 360 degrees (360°). For angles smaller than 1 degree we use arcminutes and arcseconds. An arcminute is equal to one sixtieth (1/60) of one degree. "Normal" visual acuity is considered to be the ability to recognize an optotype (a letter on the Snellen chart) when it subtends 5 minutes of arc. We can most certainly see objects below this level, as this describes only our ability to recognize a very specific shape. Taking this a step further, we find that the lower limit of "resolution" of average eyes equates to roughly half the limit of acuity, or about 2.5 arcminutes. In other words, the average person cannot distinguish two spots (pixels, if you will) separated by less than roughly 2.5 arcminutes of angle.
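To get a feel for what an arcminute means in physical terms, here is a rough sketch that converts a subtended angle into the size of an object at a given viewing distance. The helper is ours, and the 2.5-arcminute figure and 96-inch (8-foot) distance are simply the assumptions this article uses later on.

```python
# Sketch: how big is an object that subtends a given angle at a given distance?
import math

def size_subtending(angle_arcmin: float, distance_in: float) -> float:
    """Physical size (inches) of an object spanning angle_arcmin at distance_in."""
    angle_rad = math.radians(angle_arcmin / 60.0)          # arcminutes -> degrees -> radians
    return 2.0 * distance_in * math.tan(angle_rad / 2.0)   # chord of the subtended angle

print(round(size_subtending(2.5, 96), 3))  # ~0.07 in: smallest detail resolvable at 8 ft
print(round(size_subtending(5.0, 96), 3))  # ~0.14 in: a whole Snellen optotype at 8 ft
```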

Understanding 1080p Resolution in Displays

Now let's relate this to a video display. This isn't new, by the way. The original NTSC standard dates back to 1941, when the Federal Communications Commission adopted the recommendations of the National Television System Committee it had formed in 1940. Part of that standard accounted for the size of an image as it relates to the eye's ability to resolve the individual scanning lines of the display. In the past the general rule was, for best perceived picture quality, to have an image with a diagonal measure no more than 1/5 the seating distance. The advantage of HDTV (and EDTV, for that matter) is that we can sit closer and thereby enjoy a larger image. A movie theater screen subtends a viewing angle of 30 degrees or more and, with the introduction of HDTV and progressive-scan displays, so can home video!
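As a rough illustration of those two rules of thumb, the sketch below (our own numbers, assuming a 16:9 screen and an 8-foot seat) compares the old one-fifth-of-seating-distance guideline with a screen width that subtends a movie-theater-like 30-degree viewing angle.

```python
# Sketch comparing the old "diagonal <= 1/5 of seating distance" rule with a
# 30-degree horizontal viewing angle, for a 16:9 screen viewed from 8 feet.
import math

SEATING_DISTANCE_IN = 8 * 12  # 96 inches

# Old NTSC-era rule of thumb
old_rule_diag = SEATING_DISTANCE_IN / 5.0

# Diagonal of a 16:9 screen whose width subtends 30 degrees at that distance
width = 2.0 * SEATING_DISTANCE_IN * math.tan(math.radians(30 / 2.0))
diag_30deg = width * math.hypot(16, 9) / 16.0

print(round(old_rule_diag, 1))  # ~19.2 in diagonal under the old rule
print(round(diag_30deg, 1))     # ~59 in diagonal for a 30-degree viewing angle
```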

Using a 50-inch plasma display as an example, the dimensions of the actual image are approximately 44 inches wide by 25 inches tall, which yields the 50-inch diagonal measurement of a 16:9 display. The pixel size of the average 50-inch plasma set is about 0.8 mm square, which translates to approximately .03 inches. With a .03-inch pixel, a 44-inch-wide image accommodates roughly 1466 discrete pixels. To keep the display consonant with current resolution formats, the typical product offers WXGA capability, which translates to 1366 x 768 (or possibly WXGA+ at 1440 x 900). For a 50-inch set to offer true 1080p resolution (and not just 1080p compatibility) it needs pixels about .023 inches in size - roughly 25% smaller, with an attendant increase in pixel density and manufacturing cost. Now that we've determined the size of the individual display elements, the remaining question becomes "How close must we sit to see individual pixels?"
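Here is the same pixel arithmetic as a short sketch. The 0.03-inch pitch is the article's approximation for a typical plasma, and the small difference from the 1466 figure above comes only from rounding the width to 44 inches.

```python
# Sketch of the pixel-pitch arithmetic for a 50-inch 16:9 panel.
import math

DIAG_IN = 50.0
aspect = 16 / 9
height = DIAG_IN / math.hypot(aspect, 1)   # ~24.5 in tall
width = height * aspect                    # ~43.6 in wide

pitch = 0.03                               # ~0.8 mm pixel, per the article
pixels_across = width / pitch              # ~1453 pixels fit across (about 1466 if width is rounded to 44 in)
pitch_for_1080p = width / 1920             # pitch needed for a true 1920-pixel-wide raster

print(round(width, 1), round(height, 1))   # ~43.6 x 24.5 in
print(round(pixels_across))                # ~1453
print(round(pitch_for_1080p, 3))           # ~0.023 in
```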

Assume an average-sized living room. We hang the plasma on the wall and position the sofa eight feet away. Now we get to do some math to determine the limits of our ability to see image artifacts based on resolution. Keep in mind this article is written in general terms, so you scientists out there don't need to stand in line to file corrections! Using trigonometry, we find that our 50-inch display subtends a viewing angle of about 28 degrees. We know this because half the image width is (roughly) 2 feet and the viewing distance is 8 feet. These form a right triangle, and using cosine(x) = adjacent side (8 feet) ÷ hypotenuse (approximately 8.25 feet), we find x ≈ 14 degrees for half the image. Multiplied by 2, that gives our total viewing angle of about 28 degrees.
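The same triangle, worked as a quick sketch (the variable names are ours):

```python
# Sketch of the viewing-angle trigonometry: a ~44-inch-wide image
# (half-width ~2 feet) viewed from 8 feet.
import math

half_width_ft = 2.0
distance_ft = 8.0

hypotenuse = math.hypot(distance_ft, half_width_ft)             # ~8.25 ft
half_angle = math.degrees(math.acos(distance_ft / hypotenuse))  # ~14.04 degrees
# (equivalently: math.degrees(math.atan(half_width_ft / distance_ft)))

viewing_angle = 2.0 * half_angle                                # ~28 degrees
print(round(hypotenuse, 2), round(half_angle, 2), round(viewing_angle, 1))
```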

The practical resolution of our eyes works out to about 12 line pairs per degree - one line roughly every 2.5 arcminutes, or 24 individual lines per degree. Now 28 degrees x 24 lines per degree = 672. This means we really can't see a display element (pixel) smaller than 1/672 of the image width. Our minimum resolvable element size is about 0.065", or about twice the size of the pixels of the WXGA image! Put bluntly, from 8 feet away while watching a 50-inch plasma TV, the human eye is generally incapable of reliably distinguishing any detail finer than that shown on a true 720p display!
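And the final step of the arithmetic, as a sketch using the figures above:

```python
# Sketch of the resolution-limit arithmetic: ~24 resolvable lines per degree
# across a ~28-degree, ~44-inch-wide image viewed from 8 feet.

lines_per_degree = 24        # practical limit for average acuity, per the article
viewing_angle_deg = 28       # from the trigonometry above
image_width_in = 44.0

resolvable_elements = viewing_angle_deg * lines_per_degree   # 672
min_element_in = image_width_in / resolvable_elements        # ~0.065 in

print(resolvable_elements)        # 672
print(round(min_element_in, 3))   # ~0.065 in, about twice a 0.03-in WXGA pixel
```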

Of course there are other factors that affect perceived image quality.  The way color is handled, the latency of pixel illumination, motion artifacts and the effects of the algorithms that fit the image data to the native resolution of the display (and more importantly the SOURCE) all play a part in a qualitative assessment of the image.  It's safe to say, however, that increasing resolution and image refresh rate alone are not enough to provide a startlingly better viewing experience in a typical flat panel or rear projection residential installation.

So What's the Big Deal? Size!

Now, does this mean that 1080p is irrelevant in most of today's home theaters? Absolutely not! We've just used a single example to explain why it may not be such an improvement for viewers with a fixed-size screen in a particular viewing arrangement. And in truth, this example likely fits the majority of today's home theater environments.

But what does 1080p offer? Two things: increased screen size and closer viewing distances. In particular, 1080p displays (coupled with true 1080p source content like HD DVD and Blu-ray) allow those using front projection systems to jump up to screen sizes of 100 inches or more - from that same 8-foot viewing distance. So while that 50-inch plasma may not look much different when playing 720p or 1080p content, your new front projector just allowed you to quadruple the area of your display. Hey, that's not bad! The added bonus is that much of the HDTV content available over the air and through cable TV and satellite providers is transmitted in 1080i. 1080i content often looks fantastic on a 1080p display and allows it to make good use of the additional resolution.
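As a back-of-the-envelope illustration of why the extra pixels pay off at projection sizes, here is the same arithmetic applied to a hypothetical 100-inch screen at 8 feet. The comparison of 1280- versus 1920-pixel-wide rasters is our own example, not a measured result.

```python
# Sketch: pixel pitch of a 100-inch 16:9 projection screen viewed from 8 feet,
# compared with the ~2.5-arcminute resolution limit used throughout this article.
import math

distance_in = 96.0
diag_in = 100.0
width_in = diag_in * 16 / math.hypot(16, 9)   # ~87.2 in wide

# Smallest element the average eye resolves at this distance (~2.5 arcminutes)
limit_in = 2.0 * distance_in * math.tan(math.radians(2.5 / 60) / 2.0)   # ~0.07 in

pitch_720p = width_in / 1280    # ~0.068 in -> right at the visibility limit
pitch_1080p = width_in / 1920   # ~0.045 in -> comfortably below it

print(round(limit_in, 3), round(pitch_720p, 3), round(pitch_1080p, 3))
```

In other words, at 100 inches and 8 feet a 1280-pixel-wide image is on the verge of showing visible pixel structure, while a 1920-pixel-wide image still looks seamless - which is exactly the point of pairing 1080p with big screens and close seats.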

Special thanks to Joseph D. Cornwall
Business Development Manager, Impact Acoustics


About the author:

A sales and marketing professional, Joe holds degrees in Electrical Engineering and in Applied Business. He has been honored several times within the consumer electronics industry, being selected to serve as a judge for the prestigious Consumer Electronics Association "Mark of Excellence Awards" and having served on the Board of Directors of the Satellite Broadcasting and Communications Association.
