Do I Need 120 Hertz HDMI Cables?
One of the most common sorts of questions from our customers these days is some variant on this: "Do I really need a 120Hz HDMI Cable?" In consumer electronics stores across the country, consumers are being told that their new 120 Hertz displays will not work properly, or will not work optimally, without a cable designed for 120 Hertz. Let's address this question two different ways, beginning with the short answer and following with the long answer/explanation:
Q. Do I need a 120 Hz HDMI cable?
The Short Answer:
A. No. In fact, there's no such thing.
The Long Answer...
All right, here's the long answer. We'll start by talking about this whole "Hertz" thing to make sure we are all speaking the same language.
What the Heck is Hertz?
The Hertz is a unit of frequency, named after Heinrich Hertz, one of the pioneers of radio, who discovered what were once called "Hertzian waves." A Hertzian wave is a wave of electrical energy, and it can propagate invisibly through the air--it is, in other words, what we now call a radio wave. Hertzian waves alternate in direction, with the charge rapidly oscillating from positive to negative and back again. The unit "Hertz" is the measure of how often these waves cycle through their whole positive/negative swing, and this unit is used to measure not only radio waves, but any periodic wave, like the current in your electrical supply lines. In America, the power runs at 60 Hertz, so if you could watch the voltage on your incoming power, you'd see it swing up and down sixty times every second.
Hertz and the Golden Age of Television
Television originated in a non-solid-state, analog world. Anyone of a certain age in America remembers going down to the store with the parents and using a tube tester to try to figure out which of the several vacuum tubes in the television set had blown, because televisions relied upon a series of vacuum tubes to do all the work of bringing in a signal, extracting the video and audio, and delivering that to the screen and speakers. In this simpler, analog world, with no transistors to make electronics cheaper and more compact, one challenge in receiving a television signal was to figure out how to synchronize the television receiver to the television transmitter. If the television transmitter sent out a certain number of frames of video per second, and the television receiver was not running at quite the same speed--say, the transmitter was running 30 frames per second and the receiver was running at 29 and a half, the result was a mess. But a highly stable oscillator circuit fixed tightly to a reference frequency was not a cheap thing to include in every television set.
The solution to this problem was to use, as a frequency reference, the one rock-solid reliable reference frequency which every television owner had in the home: the AC line current. On each full cycle of the AC line, a television transmitter would send out one field of video, and every television set was designed to use that reference frequency and look for video to come in at just that rate. Since sixty frames per second was more than was needed to generate smooth action, the standard was made "interlaced," with one frame of video being composed of two fields, one of the "odd" lines and one of the even. Sixty Hertz current became sixty fields of video, or thirty frames per second.
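The timing relationship described above can be sketched in a few lines. This is just illustrative arithmetic, not anything from the article beyond the numbers it states:

```python
# NTSC-era timing: the 60 Hz AC line clocks one field per cycle,
# and two interlaced fields (odd lines + even lines) make one frame.
AC_LINE_HZ = 60          # US power-line frequency
FIELDS_PER_FRAME = 2     # interlaced: one odd field + one even field

fields_per_second = AC_LINE_HZ                       # one field per AC cycle
frames_per_second = fields_per_second / FIELDS_PER_FRAME

print(f"{fields_per_second} fields/s -> {frames_per_second:g} frames/s")
# 60 fields/s -> 30 frames/s
```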
Because the Hertz is a handy unit of frequency, it came to be applied to other phenomena with regular frequency, such as the rate at which frames or fields are fed to a television monitor. Accordingly, televisions, computer monitors, and the like often are said to have a refresh rate of some number of Hertz-- 30 Hertz, 60 Hertz, 72 Hertz, and so on.
Refreshing Crystal Light - No, Not the Drink
In that not-so-distant-past analog world of which we spoke above, the 60 Hertz wall current was used to time the movements of magnetic fields within a Cathode Ray Tube -- CRT for short -- which dragged a beam of electrons across our television screens to illuminate phosphors, making a television picture. But today, of course, there are new display technologies. One of these which is particularly relevant to our discussion here is the LCD, or Liquid Crystal Display.
In an LCD screen, instead of scan lines like we see on a CRT, there are individual, separately addressable pixels which are colored and lit by delivering electrical charges to them. LCD is, of course, not the only display that works in this fashion, the other principal type on the home theater market being the plasma screen.
Whether or not you've been around on this earth long enough to have replaced the vacuum tubes in your television set, you've certainly been around long enough to remember that when LCD monitors for home theater use began to hit the market, they were plagued with one very serious drawback: noticeable "latency." LCDs could not, for a couple of reasons, respond quickly to image changes, and the result was that in fast-changing areas of screen, transitions were slowed down; there might be a blur, for example, behind a moving object as the image of the object was being replaced by the image of the background. LCDs have of course become much better today than they once were in this respect. The latency issue is the reason you're not seeing a lot of non-LCD 120 Hertz displays--a plasma screen could be refreshed 120 times per second, too, but there's no compelling reason to do so.
One of the methods LCD manufacturers have used to defeat latency (and to do other things, too--notably, to aid in rendering 24 frame-per-second film) is to refresh the screen more frequently. The more overwriting, the less latency. Because existing American video sources generally run at 30 frames per second or at 60 frames per second, and because film is usually shot at 24 frames per second, 120 frames per second, being a multiple of each of these, makes a sensible choice for this faster refresh rate because it allows each frame of any of those video sources to be repeated on the screen a fixed number of times. Whether the display is being fed at 24, 30, or 60 frames per second, it can simply multiply the frames, refresh at the same 120-frame rate, and reduce latency in the image. And, because the refresh rate of a display is often spoken of in "Hertz," this gave rise to the "120 Hertz" display.
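The arithmetic behind that choice of 120 is simple: each common source rate divides evenly into it, so every incoming frame can be shown a whole number of times with no uneven cadence. A quick sketch:

```python
# 120 Hz is a common multiple of the usual source frame rates,
# so each source frame can be repeated an exact integer number of times.
REFRESH_HZ = 120

for source_fps in (24, 30, 60):
    assert REFRESH_HZ % source_fps == 0      # divides evenly, no judder
    repeats = REFRESH_HZ // source_fps
    print(f"{source_fps} fps source: each frame shown {repeats} times")
# 24 fps -> 5 times, 30 fps -> 4 times, 60 fps -> 2 times
```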
But What About the Source and the Cable?
The fact that the refresh rate of the display is 120 Hertz may be, for the reasons above, a great thing--less latency, smoother pulldown of 24 fps sources. But it does give rise to a misconception about how the system works, and this misconception is being used by some, unfortunately, to market high-priced HDMI cables.
Anybody familiar with HDMI will know that the bandwidth demand placed upon the cable is a function of the bitrate flowing through that cable. The bitrate, in turn, is a function of the resolution, frame rate, and color depth of the picture. The argument here is obvious enough: a 120Hz signal has double the frame rate of a 60Hz signal, and therefore needs double the bandwidth. That seems simple and straightforward, and it would be true, but for one thing: it's NOT. It rests on a critical, incorrect, unstated assumption.
The incorrect assumption here is that the new doubled refresh rate is transmitted over the cable. It's not. Your cable needs to handle the frame rate which passes through the cable, but it doesn't care what the frame rate at any other point in the process is. If the cable is carrying a 60 Hz frame rate, and the display doubles that to 120 Hz to refresh the screen twice as often, your cable only "sees" 60, not 120. The bandwidth demand placed on the cable has to do with the signal coming from the source and into the display--what the display may do with that signal internally, after it has passed through the cable, has nothing to do with the load on the cable. Nobody feeds video at 120 Hz, because it doesn't make any sense to do so--when the original content is not recorded at 120 frames per second, there's no gain to be had in sending each frame multiple times to the display, and it would make the sending and receiving chipsets costlier while making the whole interface less reliable due to the increased bandwidth demand placed on the cable. In fact, most (perhaps all) of the "120 Hertz" displays on the market cannot and will not accept an input signal with a 120 Hertz frame rate. Read that last sentence twice if you're still confused.
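A rough calculation makes the point. The figures below are illustrative only: they ignore HDMI blanking intervals and TMDS encoding overhead, so real link rates are somewhat higher, but the proportions hold:

```python
# Rough uncompressed video bitrate, ignoring blanking intervals and
# TMDS 8b/10b overhead (real HDMI link rates are somewhat higher).
def video_bitrate_gbps(width, height, fps, bits_per_pixel=24):
    """Approximate payload bitrate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

over_the_cable = video_bitrate_gbps(1920, 1080, 60)     # what sources actually send
hypothetical_120 = video_bitrate_gbps(1920, 1080, 120)  # what nobody transmits

print(f"1080p60 over the cable:  ~{over_the_cable:.2f} Gbit/s")
print(f"1080p120 (hypothetical): ~{hypothetical_120:.2f} Gbit/s")
```

Doubling the frame rate over the cable really would double the bandwidth; the point of the paragraph above is that the display's internal 120 Hz refresh never changes the rate the cable actually carries.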
Whether your display's internal refresh rate is 120 Hertz or some other rate, the signals coming in are running at frame rates determined by the sources of those signals. This typically means 30 Hertz for interlaced formats like 1080i, 60 Hertz for progressive formats like 720p or 480p, or 24 Hertz for certain players that support 1080p/24. Those signal frame rates, not your display's internal refresh rate, are what your cable must handle. If a salesman is trying to push that monstrously expensive "120Hz HDMI cable" into your hands--probably at a buck or more per Hertz--it's time to keep your wallet in your pocket.
by Kurt Denke
President, Blue Jeans Cable
Recent Forum Posts:
duffy0401, post: 502671
I bought a Samsung series 650 last night from a well known retailer…they tried to sell me one of these cables. I'm so happy I declined after reading this. I just figured I could find one cheaper; little did I know it didn't even exist.
This is the first time I am hearing this sales tactic but am not surprised. Good thing you didn't fall for the gimmick.
I want to know what consumers think about wire companies fraudulently marketing hertz limits in HDMI wire. My understanding is that all wire is capable of transmitting any hertz/frame rate. For example, if a DVD player outputs a 120Hz signal and the TV handles 240Hz, then the consumer's common sense tells them they would want at least a 120Hz HDMI cable to prevent loss of quality on a so-called 60Hz HDMI wire, and to avoid the TV having to up-convert from the lesser 60Hz wire, because the up-conversion introduces unnatural movements during the interpolating process, where fake frames are displayed to make up the difference in the lower hertz rate. I attached a few forum comments below that support my questions. Naturally, one would think that all rates, from the original camera filming frames to the DVD player/cable box output rate to the TV, need to be closely matched to get the best natural picture motion. Even though my TV can upscale any picture source up to 120 hertz, it's no good if the camera source is only 24 frames/sec, due to interpolation during the up-convert. I've tested different hertz rates on a 120Hz-capable TV with a 24Hz DVD that was played on a 60Hz PS3 and transmitted to the TV via a cheap, generic HDMI wire. The TV was set to up-convert to 120Hz, but an info box on the TV at the start of the movie indicated the input source was 60Hz, because that's what the PS3 was spitting out; the DVD was shot at 24 frames/sec, so the up-conversion looked unnatural. The up-conversion process is called interpolation, where the process displays fake frames between real frames based on a computerized formula--so I guess 4 fake frames for every real frame, because 120Hz/24Hz = 5 frames? The fake frames never existed during camera filming, so the computer can't properly predict the exact frames, thus leaving you with judder and unnatural movements of people and things in the movie.
I prefer the natural look, so I found more importance with 1080p pixels & higher contrast ratios with vivid colors and deep blacks.
You would turn it (higher Hz interpolation) off for a few reasons I will describe below.
When you turn on "tru motion," what your TV does is take every frame of information from the source and MAKE UP additional information (based on the information from the source) to produce 120 frames per second, even if the source is less than that. This is why the picture (to most people) is inconsistent, fake looking, and can randomly speed up/slow down. This is because your TV is MAKING STUFF UP. If you like the setting then that's fine, and it appears you do.
The wire companies are making it sound as if you need to buy wire that can allow a faster hertz rate, when really it's about the frame rate of the camera source. Some say a higher hertz rate is good for sports, and I agree, but only if it matches the camera rate, because I don't like interpolation with fake picture frames.
This “120 Hertz” thing really caught us by surprise at BJC. At first, we had just the occasional question about it, but lately it has been one of the most frequent questions we get on the phone and in e-mail. What's frustrating is the form it so often takes: usually something like “I need a 120 Hz compatible cable. Do you have any that are 120 Hz compatible? Your website doesn't say they are.” The B.S. vendors are laying this one on thick and heavy, and a lot of customers have already been convinced of the need before we see them.
For a time I considered putting something like "120 Hz compatible" into our descriptions, but frankly, that seemed wrong--it's like labeling your peanut butter "cholesterol-free" when all peanut butter, of course, is cholesterol-free. So the answer, of course, is to meet the question with an explanation. Unfortunately, sometimes that just leads to customers thinking we're being evasive. One can't always win the war against B.S., but one can try…
It would seem to me that this "120Hz Cable" garbage is an egregious example of consumer fraud. It would be delicious to see the heads of these companies in prison orange over this sort of balderdash.
People will never understand that this is what happens when film is shot at 24fps.
That's why film looks the way it does. So now you have all these technologies that change the original source and add fake frames to make it appear smooth.
It's really stupid.
wow…..great info. so i was duped into thinking that i'd be watching stuff at 120hz. i also noticed that when i turn on my ps3 or xbox it only shows at 60hz on my info box.
The info box may only reflect the signal going to the TV, not how it is displayed on it. If the TV can be set to display at 120Hz, then it will most likely do that to the incoming 60Hz signal.