“Let our rigorous testing and reviews be your guidelines to A/V equipment – not marketing slogans”

The Insanity of the Loudspeaker / Subwoofer Shootout

November 10, 2010

A week doesn’t go by without readers asking us, on our forums or via email, to do a loudspeaker, receiver, or subwoofer product comparison or shootout.  For those unfamiliar with the lingo, a shootout is where you take a bunch of products from different manufacturers and compare them directly against each other.  Some also refer to this as a "faceoff".  I use the terms interchangeably, so forgive me if you find that happening within the confines of this article. 

While it may seem like a great idea to do this with products in theory, the reality is it’s a pain in the butt for three reasons:

  • Manufacturers hate product comparisons, especially when termed as "shootouts"
  • Readers often cry foul play
  • Product shootouts are a lot of work


Let's examine these three reasons more closely.

1. Manufacturers Hate Product Comparisons / Shootouts

If you’re a really, really big manufacturer, chances are you don't have time to notice, or care, whether a publication is using your review sample in a product shootout.  When you are that big, you tend to take the view that no publicity is bad publicity.  That is rarely the case in the A/V world, where most brands, especially speaker companies (with the exception of larger companies like Bose and Klipsch), are small businesses.  Their biggest fear is that any bad press can destroy their brand image, their sales, or both. 

So most of these manufacturers guard their products, being very cautious about to whom they will submit samples for review, and even more cautious about sending products for review to publications like ours that actually:

  • dissect their products and show inner pictures
  • discuss design philosophy
  • measure results

MOST manufacturers prefer reviews without ANY measurements at all.  They simply want a press-release-style review that focuses solely on their own talking points and product positives.  These very same companies almost NEVER submit their products to comparative shootouts.  This is especially true of companies whose strict in-house testing methodology rejects even industry-standard measurement practices for determining product performance.  Years ago Bose sued Consumer Reports over this very contention.  Bose ultimately lost the lawsuit… or did they?  Since then, Consumer Reports has never (to our knowledge) reported negatively on Bose product performance.  People don’t like to be sued, or even threatened with a lawsuit.

Here are some of the challenges we've faced with manufacturers in the past while trying to organize product shootouts:

Sue to Stop a Review, You Betcha?

Speaking of being sued, we were very recently threatened by email by a manufacturer who said they would litigate against us if we published a review of the product they had submitted to us.  They claimed that since their product was physically smaller than the others in the comparison, it was at a disadvantage.  This was despite the fact that we showed them their product was not the smallest, and that it actually measured well in our tests.  We also disclosed to them that their product, like everyone else's, was getting its own dedicated review, with the results of all the products in the shootout tabulated in a separate article as a quick reference for how each product measured using an exacting and consistent testing methodology.

This begs the question: why submit a product for review at all when we fully disclosed up front how we were going to review and test it?  It again boils down to the fact that nobody, especially a small A/V company, likes the possibility of not being the #1 product in a comparison.  Despite how hard the manufacturer pounded their chest for us not to publish the review, we told them to instead pound salt, keep a cool head, and await the results, which we were confident readers would be pleased with.

 

So Long and Thanks for All the Fish

Since we started organizing our most recent subwoofer shootout, some of the participants have asked either to drop out of it entirely, or to retest with a different product configuration or a more powerful amplifier, once we revealed to them that their product had design limitations preventing it from reaching the CEA distortion limits.  Some were expecting us to do the more simplistic in-room 1/3-octave measurements many of our competitors use, which make their products look better than they really are.  One in particular seemed to have NO clue how we would test despite being told in advance.  This put us in an unsettling predicament: having expended countless hours of product testing and paid our reviewer, were we simply to cut our losses, send the product back, and have no editorial coverage to show for it?  We put our foot down and declined to send products back unless they were defective.  We offered the manufacturers that wanted a retest the option of paying a small re-test fee to cover our expenses so we could pay our reviewer to retest the products. 

We can't tell you how much time and how many resources we've lost in the past to companies trying to use us as a free beta tester during a formal review, only to want to pull the plug after the review was completed because they didn’t care for the end results.  It reminded me of The Hitchhiker's Guide to the Galaxy, where seconds before the Earth was about to be destroyed, all of the dolphins in the oceans leaped out into space and said "so long, and thanks for all the fish". 

Groundplane Technique Not Accurate for Subwoofer Testing?

We actually had another manufacturer tell us that the ground plane technique is NOT a proven, accurate methodology for measuring subwoofer performance, insisting the only accurate way to measure a subwoofer is to stick it outside on a 90 ft pole or in an anechoic chamber.  They also felt that there is no proven standard way to measure subwoofer distortion that correlates to audibility, despite the fact that we declared our distortion measurements were not meant to do so.  They were unwilling to recognize that the industry has agreed upon the ground plane technique as an accepted methodology for testing subwoofers, along with the very specific distortion testing protocols mandated by CEA-2010, with which we are in compliance.  Not only did they decline to submit a product for review, they also pulled advertising from our site the same month.  This is despite the fact that Audioholics.com was one of their top traffic-referral sites as ranked by third-party measurement tools such as Alexa.com.  These are the things that make us laugh (bitterly) when our critics accuse us of review bias.

Editorial Note About Groundplane Technique by Paul Apollonio

At very low frequencies the wavelengths (the distance between successive pressure maxima) are much larger than the dimensions of the speakers creating the sound.  If you think of low frequency sound as a large bubble that envelops the speaker, you can picture the sound wave, known as 4-pi radiation (a full bubble).  It is easy to visualize why there is value in hoisting the speaker 90 feet in the air.  The expanding bubble will not reach the ground and return until the wave travels 90 ft down and 90 ft back (180 feet in total), which means that if the microphone is 1 meter from the speaker and 90 ft off the ground, the reflection is small enough not to be a significant source of measurement error.  This is a great way to measure full range loudspeakers, especially professional line arrays used to augment the sound in large stadiums.  It produces no floor gain for the low frequency energy.  It is not, however, a necessity for making comparative measurements on subwoofers.  Here is why.

If that same speaker, creating a huge enveloping bubble, is on the ground, then the bubble ends up being a hemisphere rather than a sphere.  The reason for this is quite simple.  The bubble cannot form pressure waves in the ground beneath it, so all the energy coming from the subwoofer now occupies half the space it would if you hauled it 90 feet into the air.  So we have the same power from the subwoofer, but in half the space (which adds 6 dB to our measurement), not for one of the subs, but for all of the subs tested this way.  This is called 2-pi radiation (4pi/2 = 2pi) and has been a standard way of measuring loudspeakers for many years; it was the preferred method of measurement at JBL back in the '80s.  The ground plane method has been written about by some of the most famous loudspeaker engineers in our business and is considered a gold standard.  It is incorporated into the CEA-2010 subwoofer measurement standard, and that is one of the reasons we are choosing to measure the subs in this very way. 
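For readers who want to check that figure, here is a minimal Python sketch of the arithmetic, assuming the usual pressure-doubling explanation (the in-phase floor reflection doubles the measured sound pressure relative to free space):

```python
import math

# Ground plane (2-pi) vs. free field (4-pi): with the microphone on the
# ground, the floor reflection arrives in phase with the direct sound,
# so the measured pressure doubles relative to free space.
pressure_ratio = 2.0
gain_db = 20.0 * math.log10(pressure_ratio)
print(f"Ground plane gain over free field: {gain_db:.1f} dB")  # ~6.0 dB

# The offset is identical for every subwoofer measured this way, so it
# cancels out of any comparison between units.
```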

The "inaccuracies" the manufacturer seems to fear about using ground plane technique to measure subs escapes us, since the range of a subwoofer is essentially 20Hz to 100Hz.  At 100Hz, the full bubble is (speed of sound in ft/sec divided by cycles/sec) or 1127/100 which is about 11.27 ft long.  At 20 Hz, it is 5 times that size, or about 56 ft long.  Because these dimensions are large compared to the size of the subwoofer boxes, the errors introduced by the ground plane technique are a minute fraction of the errors introduced by any listening room in the real world.  (Unless you listen in a 10,000,000 dollar anechoic chamber of course).  Since the microphone is right on the ground, the reflection is in phase with the direct sound.  This prevents errors in the measurements, and produces results that reflect and compare favorably with what our computer models tell us we should expect with that same speaker radiating into 4 pi space (free space).

Since most of us actually place our subwoofers on the floor, we think readings from the ground plane technique are a very good and accurate way to measure, and the technique is certainly NOT unfair to one subwoofer over another, as the same technique and measurement distance will be used for all the subwoofers measured. 

We believe the argument against using the ground plane technique for subwoofer measurements is a red herring to distract from the real reason for refusing to participate: fear of having product shortcomings exposed in a public way.  Audioholics is committed to bringing the truth to all its readers and goes to extraordinary efforts to treat all product manufacturers fairly, but we feel compelled to point out that fair treatment is less desirable to some than to others. 

For more information about Subwoofer Measurement Techniques, visit:

Subwoofer Measurement Techniques

Audioholics Subwoofer Measurement Standard

Speakers in a Comparison Should be the Same Size, Price and Enclosure Type

When we were gearing up to do our latest subwoofer shootout, I originally billed it as the "ultimate subwoofer shootout", implying it was a price-no-object, size-no-object comparison.  Manufacturers didn't like this.  Most wanted only typically sized and priced products to be compared.  We acquiesced and limited the shootout to products under a fixed price point, say $3k.  We also asked manufacturers to keep the size of each enclosure under 6 ft^3.  As time progressed, some manufacturers simply didn’t deliver product for whatever reason, or they asked our permission to submit a newer, and hence larger, model that still met our price-limit criterion.  You can imagine how this played out once competing manufacturers found out there would be larger subwoofers in the shootout, especially if their own product was pricier than the larger competing product.  It's virtually impossible to test identically priced and sized subwoofers across manufacturers, since product design philosophies and business models vary so much.  We recognize, however, that larger subs almost always have the advantage in output and extension, which will be clearly delineated in the product shootout. 

Is All Of This Fuss Really Necessary?

All of this fuss happened because we declared we would do a shootout, in addition to the individual reviews of each product, to compare product strengths and weaknesses.  These experiences are certainly invaluable lessons we've learned while attempting to organize product shootouts.

If I have one thing to say to the manufacturers that loathe product shootouts, I must borrow a line from FDR's first inaugural address, which sums it up best: "the only thing we have to fear is fear itself".  If you truly believe that your product offers good performance and value to consumers, let the chips fall where they may.  If you don't believe in your products, how are consumers supposed to?  Consumers don’t always pick the product that yields the highest SPL output or lowest distortion.  There are many reasons why people purchase a particular product, and absolute SPL performance doesn’t always dictate purchasing decisions, which is why detailed product assessments are such a necessity.  Why turn down a product review solely because the results will be compared and tabulated against other products?

2. Readers Often Cry Foul Play

Shortly after publishing any loudspeaker or subwoofer shootout, it's not uncommon to find a reader popping up on our forums or on AVS Forum claiming our product comparison didn’t have enough product variety.  We can certainly understand this criticism.  It's very difficult, and often impossible, to include every brand or product everyone wants to see in a shootout. 

Some readers don’t like seeing a product they own fare poorly against other products in a shootout.  They will vehemently defend their beloved product, even when the objective measurement data show it may not perform as well as others in the comparison.  

If the reader cries of “UNFAIR” are not due to a lack of variety, or to favoritism towards a product the reader owns, then (we are told) it's that our results favor manufacturers that either advertise with us or sell products in the Audioholics licensed E-store.  This is all despite the fact that we've pointed out numerous occasions where our advertisers' or E-store products were reviewed unfavorably, both in dedicated reviews and in shootouts.  

Fewer than 5% of the products we review are sold in the E-store.  If our critics actually spent a little time reading our reviews of products sold in the E-store, they would find many that aren't favorably reviewed.

Case in point: Yamaha is one of our biggest advertisers and E-store affiliates, yet we had no problem writing this article:

Trading Amplifier Quality for Features
(note: Yamaha actually revamped their model line as a direct result of this article)

Some not-so-favorable reviews of products sold at the E-store:

The critics also tend to ignore the fact that the Audioholics E-store is a separately run business entity that simply licenses our name from the editorial site.  We have little concern whether E-store products sell or not, as it has little to no effect on our bottom line.  In fact, to those critics who perceive a conflict of interest between the Audioholics licensed E-store and the editorial site, we highly encourage you to purchase from a competing e-tailer such as Crutchfield or OneCall.  Those are great places to shop for A/V gear.

In reality, we are more affected by a manufacturer dropping direct advertising because of an unfavorable review, or by a new manufacturer declining to sign up because it perceives our coverage of its products as too negative.  Fortunately, we now work with a third-party ad agency that sells roughly 70% of our ad space to larger companies outside the audio industry, which avoids this very conflict. 

We are not getting rich showing measurable product flaws or calling out companies attempting to pawn off $500 Blu-ray players at a 7X premium.  We do this because our mandate is to reveal the truth as we see it and let the economic consequences fall where they may.  Our integrity is our greatest asset.  Without it, we lose our loyal readership (YOU).  Readership is everything, and we are proud to tout the largest audited readership in the industry, with over 1 million monthly readers (source: Google Analytics). 

Some people tend to overlook these facts, or don't spend the time to dig deeper and realize their claims of product bias are unfounded, especially since we objectively measure each product we test in the same manner and also conduct both single-blind and sighted listening tests. 

3. Product Shootouts are A LOT of Work

It’s a heck of a lot of work for us to do these shootouts, and it's impractical for products as sophisticated as an A/V receiver.  When we gear up to do such comparisons, we go through a variety of steps to make them happen, such as:

  • Solicit manufacturers for product samples
  • Coordinate shipping multiple products to a centralized location
  • Organize the proper test gear to do measurements
  • Come up with a test plan that is executable and fair for all participants involved
  • Measure and interpret data fairly
  • Properly set up blind tests to reduce biases and ensure results are as accurate as possible
  • Write up the results, which becomes very labor-intensive as you start stacking up four or more products in a single shootout
  • Submit the results to manufacturers for peer review and hope they won't cry foul play, attempt to sue you, threaten to drop advertising, or claim the product was a prototype and ask to submit another sample for review before the results are published.

For more information on how we do blind and sighted tests, read:

Setting up a Speaker Comparison - the Right Way

Also something to watch out for in sighted and blind tests:

How to Skew a Blind Test

A typical blind shootout between four brands of speakers takes a full day of setup and testing and several days of data interpretation and write-up.  We almost never feel satisfied that our listening panel was large enough to ensure statistical accuracy, and our listening panel often finds the whole experience fatiguing after participating for a few hours.  It would obviously be better to spread such tests over the course of several days, but to what end? 

So Why Do We Do Shootouts?

That is actually a darn good question that I ask myself every time I volunteer to do one or solicit one of our reviewers to conduct one for our site.  I suppose it's curiosity to see just how much performance varies between brands at similar price points.  It is also important to us that our readers understand that brand name or price alone are not the best determinants of quality.

These shootouts are intended to be fun, with the obvious benefit of simultaneously covering a bunch of industry-relevant products among a panel of enthusiasts who can give our readers invaluable feedback on products they've tested and compared.  This is usually impractical for average consumers to do themselves, so it becomes a great way to get useful feedback on product performance from a centralized source. 

I thought it would be interesting for our readers to get a glimpse of some of the drama we deal with when organizing product shootouts, which explains why we do them so infrequently.  Next time someone requests a shootout on our forums, it comforts me that I can simply say "maybe" and then link them to this article for further reading.

In closing, we thank all of the manufacturers that submit products for us to review.  We doubly thank those manufacturers that don't fuss when we compare their products to the competition or when we find measurable performance issues and disclose them within the reviews.  It's important to acknowledge that without the manufacturers, we would be unable to conduct product reviews or operate this site, as it would be devoid of the revenue stream that pays our editorial staff. 

 

About the author:

Gene manages this organization, establishes relations with manufacturers, and keeps Audioholics a well-oiled machine. His goal is to educate about home theater and develop more standards in the industry to eliminate consumer confusion clouded by industry snake oil.

View full profile