
8-bit, 10-bit, 12-bit… What's the Deal?

Bit depth determines the number of colors a display can reproduce. Increasing the bit depth of a display (and matching the source components and cabling to support it) greatly reduces the chance of color banding, where smooth color gradients cannot be reproduced accurately and break up into visible bands. To understand how this affects bitrate, consider the following (the short sketch after the list reproduces the same arithmetic):

  • 8-bit color = 2^(8 × 3) = 2^24 ≈ 16.7 million colors
  • 12-bit color = 2^(12 × 3) = 2^36 ≈ 68.7 billion colors
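
If it helps to see the arithmetic spelled out, here is a minimal sketch (Python, written for this article rather than taken from any HDMI specification) that reproduces the numbers above:

    # Total displayable colors for a given per-channel bit depth (RGB = 3 channels).
    def total_colors(bits_per_channel: int) -> int:
        return 2 ** (bits_per_channel * 3)

    for bits in (8, 10, 12, 16):
        print(f"{bits}-bit per channel: {total_colors(bits):,} colors")

    # 8-bit  -> 16,777,216            (~16.7 million)
    # 10-bit -> 1,073,741,824         (~1.07 billion)
    # 12-bit -> 68,719,476,736        (~68.7 billion)
    # 16-bit -> 281,474,976,710,656   (~281 trillion)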

Increasing the bit depth increases the number of reproducible colors exponentially. Now, remember that the old benchmark limit was 1.65 Gbit/s, much more than any home theater enthusiast needed for excellent resolution and performance. After all, at the time, these cutting-edge enthusiasts were only concerned with attaining 1080p at 8-bit color - something that was very new and available to only a handful of elite consumers. A year ago, most of these consumers didn't even know what 1080p was, and they almost certainly didn't concern themselves with achieving 12-bit color (let alone 16-bit). Still, 12-bit is nice and will likely result in reduced banding, especially during darker scenes in light-controlled environments.

If you take 1080p resolution at 12-bit color, the math comes out to 2.2275 Gbit/s. Fortunately for HDMI Licensing, LLC, there are now some chipsets out there that finally support that rate in volume. What has transpired is a gradual transition whereby consumer electronics manufacturers can choose to pay a premium (typically up to $1.50 more per chip) for these chips in order to enable Deep Color support and create a value-added product for consumers and dealers.
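
For the curious, here is roughly where that 2.2275 Gbit/s figure comes from. This is a sketch, not something spelled out in the article: it assumes standard 1080p60 timing with blanking (a 148.5 MHz pixel clock) and HDMI's TMDS encoding, which puts 10 bits on the wire for every 8 bits of video data, and it treats the result as a per-channel rate - consistent with the per-channel 1.65 Gbit/s ceiling mentioned above.

    # Back-of-the-envelope check on the 2.2275 Gbit/s figure (assumed timing).
    pixel_clock_hz = 2200 * 1125 * 60                        # 148.5 MHz for 1080p60 incl. blanking
    bits_per_channel = 12                                    # Deep Color
    tmds_clock_hz = pixel_clock_hz * bits_per_channel / 8    # 222.75 MHz
    per_channel_rate = tmds_clock_hz * 10                    # 10 wire bits per TMDS clock period

    print(f"TMDS clock:       {tmds_clock_hz / 1e6:.2f} MHz")
    print(f"Per-channel rate: {per_channel_rate / 1e9:.4f} Gbit/s")   # 2.2275

    # The old 1.65 Gbit/s ceiling is the same math at the 165 MHz TMDS clock
    # limit of earlier HDMI silicon: 165 MHz x 10 = 1.65 Gbit/s per channel.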

So the Cable Drama Continues, With a New Twist

Just as cable manufacturers were coming to grips with the fact that most HDMI cables didn't have the headroom to pass 8-bit 1080p over 10 meters without compensating electronics, they are now faced with a new problem: 1080p at 12-bit. The numbers have been crunched, but the testing phases are still underway as cable manufacturers send in their products to pass the new requirements. Of course, many cable manufacturers don't seek certification at all - in which case, buyer beware. There are also serious issues with the certification process itself, but we'll address those later.

Thanks to v1.3 and Deep Color, distances for passive copper cables have shortened yet again and, in a sense, the industry has been set back a year until the effects of this slow transition are fully understood. We used to say, "Don't put a crappy cable in your wall - it might not pass 1080p." Now we must say, "Watch out for 12-bit Deep Color (not to mention any future higher-resolution formats or bitrates) - that same cable might need some electronics to pass it properly." With cables longer than 3-5 meters, it is quickly becoming apparent that active components play a major role in maintaining signal integrity.

Dealers and custom installers are going to have to bone up on the cable electronics needed to migrate their clients to the latest technology - and learn which cables can handle this new bandwidth and over what distances. Active HDMI cable solutions from respected companies like DVIGear are now certified to work up to 40 meters at 1080p with 12-bit color. These active solutions remain the most popular method for long-run installations because fiber solutions are so expensive in comparison.

To put it bluntly, you don't want to get caught putting a cable in your wall (or worse, your client's wall) that won't be compatible with current and expected future technologies. Undoing a mistake like that could be very expensive, costing you a lot of time and energy. Many customers are being systematically shortchanged by installers who are only interested in meeting today's 720p and 1080i requirements. Don't let yourself or your clients be obsoleted too quickly!

 


Recent Forum Posts:

reznor11 posts on October 24, 2007 12:21
Part 2?

Clint DeBoer, post: 299935
Oh just wait until part 2 which goes into the rather unbelievable practices of one of the main companies behind HDMI. It will be absolutely eye-opening.

Any idea when part 2 will be posted?
mtrycrafts posts on August 25, 2007 22:50
Clint DeBoer, post: 301050
I believe those bitrates are x3 for the red, green, and blue color channels. Most engineers and manufacturers use the per color channel numbers of 8-bit, 10-bit, 12-bit, 16-bit while marketing people like the “big” additive numbers.

Yes, and as you indicated, why would you add them if they are individual colors? With RGB, each color has its own wire, and one color's wire doesn't affect the bit rate flowing through its neighbors, does it?
So while the overall bit rate may be that high, each wire is carrying its share and there is no worry about capacity.
Clint DeBoer posts on August 25, 2007 11:45
mtrycrafts, post: 300157
Go down to the bottom of the page with the chart for bitrates. The numbers I quoted came from it and are the max with video and audio for Hi Def DVDs. So, if that is the max it can download, isn't that the bitrate in the HDMI cable?
I believe those bitrates are x3 for the red, green, and blue color channels. Most engineers and manufacturers use the per color channel numbers of 8-bit, 10-bit, 12-bit, 16-bit while marketing people like the “big” additive numbers.
MDS posts on August 24, 2007 19:57
Alamar, post: 300923
Being a newb, I'm a little confused by that.

Are you saying that the older RPTV 1080i sets are not HDTV or are you saying that they are really basically 540P?

No, I'm saying there is no such thing as a 1080i TV. A fixed pixel HDTV has one single resolution and always scans progressively. If the TV has a native resolution of 1366 x 768 but can accept 1080i (1920 x 1080 interlaced) people often say they have a 1080i TV, but in reality it is a 768p TV that happens to accept a higher resolution as input.
Alamar posts on August 24, 2007 18:22
Clint DeBoer, post: 299935
Oh just wait until part 2 which goes into the rather unbelievable practices of one of the main companies behind HDMI. It will be absolutely eye-opening.

I know I wasn't the target of the comment but thanks for the heads-up. As a newb anything that I can find out about the real workings of what's going on is helpful.

************* To avoid the double post ******************

@MDS:
MDS
No HDTV is 1080i. All HDTVs are progressive scan. 1080i may be the highest resolution it can accept as an input but it will always scale whatever is input to its native resolution. So if this tv ‘upconverts’ to 1080p then it IS 1080p - its native resolution is 1920 x 1080p. It just cannot accept a native 1080p signal as an input.

Being a newb I'm a little confused by that.

Are you saying that the older RPTV 1080i sets are not HDTV or are you saying that they are really basically 540P?

******************************************************

MTRYCRAFTS
How does this mesh with the hi def DVDs max transfer rates of 36 or 54Mbits/s?
Isn't this the transfer rate going through the cables? A bit of difference between this and the Gbit rate in the article.

This isn't exactly right, but you could think of the data transfer on the Hi Def DVD as how fast the player can read from the disk. This is the 36-54 Mbit/s number that you mentioned.

The data on the disk is highly compressed / zipped. The signals that the Hi Def DVD player sends out are uncompressed, which is why the Hi Def DVD player sends out MUCH MUCH more data than it actually reads.

Basically the Hi Def DVD is zipped and what your TV gets is unzipped. This explains the big difference in data rate and total data over time.
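
As a rough back-of-the-envelope illustration of that zipped-vs-unzipped gap (the numbers below are assumed for the example, not taken from this thread): a ~40 Mbit/s compressed stream on the disc expands to roughly 3 Gbit/s of uncompressed 1080p60 video on the HDMI link.

    # Rough comparison of compressed disc bitrate vs. uncompressed HDMI video (assumed figures).
    disc_rate_bps = 40e6                          # within the ~36-54 Mbit/s range quoted above
    uncompressed_bps = 1920 * 1080 * 3 * 8 * 60   # active 1080p60 video, 8 bits per channel
    print(f"Uncompressed video: {uncompressed_bps / 1e9:.2f} Gbit/s")    # ~2.99
    print(f"Expansion factor:   ~{uncompressed_bps / disc_rate_bps:.0f}x")   # ~75x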