“Let our rigorous testing and reviews be your guidelines to A/V equipment – not marketing slogans”

HDMI 1.3 and Cables Part 1: It's All in the Bitrate

by Clint DeBoer on August 22, 2007
HDMI 1.3 cables - It's all in the bitrate

Special thanks to Steven Barlow of DVIGear for technical contributions to this article

With the advent of HDMI v1.3 and 1.3a, consumers are starting to really get confused about cables and what they need to worry about when selecting a product that's going to be compatible with the new specifications. We interviewed Steven Barlow from DVIGear to get a handle on why this is a more complex issue for some, and a non-issue for others. He allowed us to assimilate much of what we discussed into this article you are reading now.

Before we get too far, it's important to understand a very significant term that relates to digital signals: "bitrate." Bitrate is the speed at which bits move through the cable system, AV receiver, DVD player, or whatever else sits in the chain. The home theater enthusiast tends to think in terms of resolutions, while the more advanced consumer electronics pro typically understands and expresses this through the combination of resolution and color depth. So a lot of times you'll hear bitrate-centric issues discussed more along the lines of 1080p being "better" than 1080i, or 1080p with 8-bit color requiring less than 1080p with 12-bit (Deep) color.

But from a digital engineering standpoint everything is bitrate. And with that, we begin our discussion. With respect to high-definition video quality and resolution, 720p or 1080i is the first "bus stop" at 742.5 Mbit/s. Those resolutions describe 742.5 million bits of information traveling through the digital pipeline each second. A lot of AV receiver chipsets handle these resolutions with no problems at all. ALL HDMI cables that aren't made in someone's garage will pass this level of bitrate for a considerable length.
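As a minimal sketch of where that 742.5 Mbit/s figure comes from (assuming the standard timing for 720p/60 and 1080i/60, a 74.25 MHz pixel clock, and HDMI/DVI's TMDS encoding, which turns each 8-bit value into a 10-bit symbol):

```python
# Sketch: per-channel TMDS bitrate for 720p/60 and 1080i/60.
# HDMI/DVI encode each 8-bit color value into a 10-bit TMDS symbol,
# so each data channel carries 10 bits per pixel clock tick.
PIXEL_CLOCK = 74.25e6     # Hz - standard 720p/60 and 1080i/60 pixel clock
TMDS_BITS_PER_PIXEL = 10  # 8b/10b TMDS encoding

bitrate_per_channel = PIXEL_CLOCK * TMDS_BITS_PER_PIXEL
print(f"{bitrate_per_channel / 1e6:.1f} Mbit/s per channel")  # 742.5 Mbit/s
```

The same pixel-clock-times-ten rule generates every per-channel bitrate quoted in this article.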

The next stop on the route to HD bliss was 1080p which, although thought of as the perfect format, was initially limited to the normal 8-bit color depth, yielding a bitrate of 1.485 Gbit/s (1485 Mbit/s). If you just did the math, this means that 1080p is twice the bitrate of 720p or 1080i. That should be a big deal to people looking to install a longer HDMI cable run in their home theater room. Taking this a step further, if you plan to utilize any PC resolutions that exceed 1080p, you're in for even more stringent cable requirements. 1920x1200/60Hz, for example, has a bitrate of 1.54 Gbit/s, and 1600x1200/60Hz requires a bitrate of 1.62 Gbit/s. If you're going to utilize HDMI or DVI over long distances, it's a good idea to verify that the manufacturer has certified the cable to handle your particular bitrate requirements. DVIGear, for example, has spec'd all of its SHR (Super High-Resolution) HDMI and DVI-D cables to handle up to 2.25 Gbit/s (the highest available bitrate for any products on the market today) at up to 7.5 meter lengths (without any EQs or active components). Their active cables can go up to 40 meters at this bitrate.

[Chart: bitrates for common resolutions]

Note: You may notice that the bitrates don't make sense at first glance, since the higher resolution has a slightly lower bitrate. This has to do with blanking time and active time. Active time is when the image is being written on-screen. Blanking time occurs while the scan skips over the areas you don't see (think of an old typewriter during a carriage return). This is all derived from old CRT scanning, where the beam had to blank while traveling back to the left before picking up the next line. Blanking time and active time add up to your total time; the longer the blanking time, the shorter the active time, and vice versa. 1600x1200 has a longer horizontal blanking time and therefore a shorter active time, so the data must be crammed into less time at a higher clock rate. 1920x1200 has a larger active time and a shorter blanking time, so it can be expressed with a lower bitrate. VESA determines all these standards and timings.
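To make the note concrete, here is a sketch using the total (active plus blanking) timing figures. The 1600x1200/60 numbers are the standard VESA DMT timing; the 1920x1200 numbers assume the VESA CVT reduced-blanking timing, which actually runs at roughly 59.95 Hz:

```python
# Sketch: total pixels (active + blanking) x refresh = pixel clock,
# and pixel clock x 10 TMDS bits = per-channel bitrate.
# Timing totals below are assumptions based on VESA DMT/CVT-RB tables.

def per_channel_bitrate(h_total, v_total, refresh_hz, tmds_bits=10):
    """Per-channel TMDS bitrate from total (active + blanking) timing."""
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * tmds_bits

# 1600x1200/60 (VESA DMT): 2160 x 1250 total pixels, 60 Hz exactly.
r_1600 = per_channel_bitrate(2160, 1250, 60)     # 1.62 Gbit/s

# 1920x1200 CVT reduced blanking: 2080 x 1235 total, ~59.95 Hz.
r_1920 = per_channel_bitrate(2080, 1235, 59.95)  # ~1.54 Gbit/s

# More active pixels, but much less blanking -> a lower overall bitrate.
print(f"1600x1200: {r_1600/1e9:.2f} Gbit/s, 1920x1200: {r_1920/1e9:.2f} Gbit/s")
```

Despite pushing more active pixels, the reduced-blanking mode ends up with the lower clock, which is exactly the point the note above is making.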

HDMI v1.3 - What Were They Thinking?

Before we get into HDMI v1.3, I believe it is important to note the state of things in the consumer electronics industry prior to the introduction of v1.3. At just 1.485 Gbit/s for 1080p at 8-bit color, many cable vendors' products fell to pieces after just 5 meters (and believe us, some didn't make it that far). Most major-name cable manufacturers were doing well to label their products, but a lot of imported pieces claimed 1080p compatibility at such ridiculous lengths that it was obvious the companies were not informed. Just as the industry seemed to be getting a handle on this concept, and custom installers were properly outfitting their installs with cables that would "stand the test of time," the powers that be over at HDMI Licensing, LLC, a wholly-owned subsidiary of Silicon Image (a principal founder of the HDMI standard), grew concerned about emerging technologies such as DisplayPort from VESA, a competing format that threatened HDMI's position. On June 22, 2006 they announced completion of the new spec to the industry, greatly extending HDMI's capabilities - but much of the new spec exists only on paper.

HDMI v1.3 can, per the spec, handle up to 3.4 Gbit/s per channel. Now, don't be confused by the inflated numbers used by the marketing people at HDMI.org - they express the bandwidth as 10.2 Gbit/s, which is simply the per-channel bitrate multiplied by the three color channels (RGB). While Silicon Image and HDMI Licensing, LLC said everyone was going to support 3.4 Gbit/s, they didn't exactly provide a lot of "support" for the first 6 months. For a long time there wasn't enough silicon to produce the transmitter and receiver chips needed to implement the new technology, so most manufacturers waited until 2007 to produce consumer electronics products with the v1.3 features. As silicon emerged from Silicon Image and a few other manufacturers in 2007, support was included for up to 2.25 Gbit/s - the bitrate associated with 1080p resolution at 12-bit color per channel. This doesn't reach the maximum theoretical HDMI 1.3 bitrate, but it is certainly a step up from 8-bit color, even if it falls well short of the 16-bit color the v1.3 spec claims to be able to handle.
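A minimal sketch of the per-channel versus aggregate arithmetic. The deep-color scaling shown is an assumption: the pixel clock (and so the bitrate) scales with bits-per-component divided by 8, which lands very near the 2.25 Gbit/s figure cited in this article:

```python
# Sketch: per-channel vs. "marketing" aggregate bitrates for HDMI v1.3.
MAX_PER_CHANNEL = 3.4e9  # bits/s per TMDS channel, per the v1.3 spec
CHANNELS = 3             # one TMDS data channel per RGB color component

aggregate = MAX_PER_CHANNEL * CHANNELS
print(f"aggregate: {aggregate/1e9:.1f} Gbit/s")  # 10.2 Gbit/s

# Deep color (assumed scaling): bitrate grows with bits-per-component / 8.
base_1080p = 1.485e9                 # 1080p/60 at 8-bit color, per channel
deep_12bit = base_1080p * 12 / 8     # ~2.23 Gbit/s per channel
print(f"1080p/12-bit: {deep_12bit/1e9:.2f} Gbit/s per channel")
```

The exact figure works out to about 2.23 Gbit/s per channel, which manufacturers commonly round up to a 2.25 Gbit/s rating.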

 

About the author:

Clint Deboer was terminated from Audioholics for misconduct on April 4th, 2014. He no longer represents Audioholics in any fashion.



Recent Forum Posts:

reznor11 posts on October 24, 2007 12:21
Part 2?

Clint DeBoer, post: 299935
Oh just wait until part 2 which goes into the rather unbelievable practices of one of the main companies behind HDMI. It will be absolutely eye-opening.

Any idea when part 2 will be posted?
mtrycrafts posts on August 25, 2007 22:50
Clint DeBoer, post: 301050
I believe those bitrates are x3 for the red, green, and blue color channels. Most engineers and manufacturers use the per color channel numbers of 8-bit, 10-bit, 12-bit, 16-bit while marketing people like the “big” additive numbers.

Yes, and as you indicated, why would you add them together if they are individual color channels? Each of the R, G, and B channels has its own wire, and one color's wire doesn't affect the bitrate flowing through its neighbors, does it?
So while the overall bitrate may be that high, each wire carries only its share, and there's no worry about capacity.
Clint DeBoer posts on August 25, 2007 11:45
mtrycrafts, post: 300157
Go down to the bottom of the page with the chart for bitrates. The numbers I quoted came from it and is the max with video and audio for Hi Def DVDs. So, if that is the max it can download, isn't that the bitrate in the HDMI cable?
I believe those bitrates are x3 for the red, green, and blue color channels. Most engineers and manufacturers use the per color channel numbers of 8-bit, 10-bit, 12-bit, 16-bit while marketing people like the “big” additive numbers.
MDS posts on August 24, 2007 19:57
Alamar, post: 300923
Being a newb I'm a little confused by that.

Are you saying that the older RPTV 1080i sets are not HDTV or are you saying that they are really basically 540P?

No, I'm saying there is no such thing as a 1080i TV. A fixed pixel HDTV has one single resolution and always scans progressively. If the TV has a native resolution of 1366 x 768 but can accept 1080i (1920 x 1080 interlaced) people often say they have a 1080i TV, but in reality it is a 768p TV that happens to accept a higher resolution as input.
Alamar posts on August 24, 2007 18:22
Clint DeBoer, post: 299935
Oh just wait until part 2 which goes into the rather unbelievable practices of one of the main companies behind HDMI. It will be absolutely eye-opening.

I know I wasn't the target of the comment but thanks for the heads-up. As a newb anything that I can find out about the real workings of what's going on is helpful.

************* To avoid the double post ******************

@MDS:
MDS
No HDTV is 1080i. All HDTVs are progressive scan. 1080i may be the highest resolution it can accept as an input but it will always scale whatever is input to its native resolution. So if this tv ‘upconverts’ to 1080p then it IS 1080p - its native resolution is 1920 x 1080p. It just cannot accept a native 1080p signal as an input.

Being a newb I'm a little confused by that.

Are you saying that the older RPTV 1080i sets are not HDTV or are you saying that they are really basically 540P?

******************************************************

MTRYCRAFTS
How does this mesh with the hi def DVDs max transfer rates of 36 or 54Mbits/s?
Isn't this the transfer rate going through the cables? A bit of difference between this and the Gbit rate in the article.

This isn't exactly right, but you could think of the data transfer rate on the Hi Def DVD as how fast the player can read from the disc. This is the 36 or 54 Mbit/s number that you mentioned.

The data on the disk is highly compressed / zipped. The signals that the Hi Def DVD player sends out are uncompressed which is why the Hi Def DVD player sends out MUCH MUCH more data than it actually reads.

Basically the Hi Def DVD is zipped and what your TV gets is unzipped. This explains the big difference in data rate and total data over time.
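The gap described in the posts above can be sketched with rough numbers. The uncompressed figure assumes 1080p at 60 frames per second with 24 bits per pixel (8-bit RGB); actual disc content is typically 1080p/24, so the real ratio varies:

```python
# Sketch: compressed disc read rate vs. uncompressed video output rate.
# Assumed figures: 1080p, 24 bits/pixel (8-bit RGB), 60 frames/s.
WIDTH, HEIGHT = 1920, 1080
BITS_PER_PIXEL = 24
FRAMES_PER_SEC = 60

uncompressed = WIDTH * HEIGHT * BITS_PER_PIXEL * FRAMES_PER_SEC  # bits/s
disc_max = 54e6  # max total disc read rate mentioned above, bits/s

print(f"uncompressed video: {uncompressed/1e9:.2f} Gbit/s")  # ~2.99 Gbit/s
print(f"compression needed: roughly {uncompressed/disc_max:.0f}:1")
```

Even at the disc's maximum read rate, the codec has to squeeze the video by well over an order of magnitude, which is why the "zipped vs. unzipped" analogy holds up.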