http://www.audioholics.com - Audio & Speaker Cable Myths vs Truths Revealed.
Hugo: Gene, please tell us why there are so many crazy myths surrounding speaker and audio cables?
Gene: Well, you have to understand that the consumer audio industry markets to people who are not very technical in nature. That includes the reviewers at most AV magazines, who often blindly buy into the pseudoscience because they take what the manufacturers say as gospel.
Electronics and EE theory are often intimidating to untrained individuals, much as bodybuilding can be intimidating to someone who has never worked out with weights before.
The common audiophile is desperately looking for ways to improve and tweak the fidelity of their audio system. Thus, the power of suggestion is very strong, and many exotic cable manufacturers know this all too well. As you know, this isn't dissimilar to the supplements industry.
What many exotic cable manufacturers do is take what I call "engineering half-truths" and stretch them beyond what would even be considered believable as Star Trek physics, and thus the pseudoscience is born. Some of their theories are even more cockeyed than a Looney Tunes episode.
Hugo: Can you list some of the common pseudoscience and misconceptions?
Gene: One of the biggest misconceptions exotic cable vendors like to perpetuate is a problem with skin effect. Simply put, skin effect describes how the resistance and inductance of a cable change as a function of frequency. As frequency goes up, the skin depth decreases, so in a sense the conductor becomes less conductive at those frequencies, since the higher-frequency currents are mostly distributed toward the surface of the conductor. The reality of the situation is that even common 12 AWG speaker wire still has VERY low resistance at frequencies almost a decade above (200 kHz) the highest frequency humans can hear (20 kHz). Skin effect is a real problem RF engineers deal with all the time. Although it's measurable at audio frequencies, it's mostly negligible, hence why there is so little written about it from peer-reviewed sources when dealing with audio.
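The magnitude of the effect is easy to sanity-check. Here's a minimal Python sketch using copper's standard resistivity, the textbook skin-depth formula, and a crude annulus approximation for AC resistance (the function names and the approximation are mine, not from any cable vendor's literature):

```python
import math

RHO_CU = 1.68e-8           # resistivity of copper, ohm-meters
MU_0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def skin_depth(freq_hz):
    """Depth at which current density falls to 1/e of its surface value."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU_0))

def ac_dc_resistance_ratio(radius_m, freq_hz):
    """Crude annulus approximation: treat all current as flowing within
    one skin depth of the surface once skin depth < conductor radius."""
    d = skin_depth(freq_hz)
    if d >= radius_m:
        return 1.0  # skin depth exceeds radius: whole cross-section conducts
    annulus = math.pi * (radius_m**2 - (radius_m - d)**2)
    return (math.pi * radius_m**2) / annulus

r12 = 2.053e-3 / 2  # 12 AWG conductor radius, meters
print(f"skin depth @ 20 kHz: {skin_depth(20e3) * 1e3:.2f} mm")
print(f"12 AWG AC/DC resistance ratio @ 20 kHz: "
      f"{ac_dc_resistance_ratio(r12, 20e3):.2f}")
```

The skin depth at 20 kHz comes out to roughly half a millimeter, so 12 AWG wire does see a measurable AC resistance rise at the top of the audio band. But since the DC resistance is only about 5 milliohms per meter to begin with, even a ~40% rise adds just a few milliohms in series with an 8-ohm speaker, which is exactly the "measurable but negligible" point above.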
So, companies come up with elaborate ways of allegedly reducing the "skin effect" problem. We've measured many of these so-called "skin effect"-free cables, and the reality is many of them have higher DC resistance right off the bat because they use high-gauge (thinner) conductors. So even though they maintain a flatter resistance and inductance profile as frequency goes up, they still have HIGHER resistance than ordinary 12 AWG speaker cable even up to 100 kHz.
In our Speaker Cable Gauge article we show that the most important metric when dealing with speaker cables is DC resistance. The lower the gauge of the cable, the less resistance it will have. We even tabulate a recommended AWG based on the length of the run and the impedance of your speakers. This helps minimize insertion loss, which is DIRECTLY related to the cable's resistance. One should NEVER sacrifice low DC resistance to mitigate other alleged problems at audio frequencies for speaker cables.
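The arithmetic behind that kind of table is simple to sketch. The AWG diameter formula and copper resistivity below are standard; the function names are illustrative, and the loss model treats the cable as a pure series resistance in front of a resistive speaker load (a simplification, since real speaker impedance varies with frequency):

```python
import math

RHO_CU = 1.68e-8  # resistivity of copper, ohm-meters

def awg_diameter_m(awg):
    """Standard AWG formula: conductor diameter in meters."""
    return 0.127e-3 * 92 ** ((36 - awg) / 39)

def dc_resistance_ohm(awg, length_ft):
    """One-way DC resistance of a single conductor of the given length."""
    area = math.pi * (awg_diameter_m(awg) / 2) ** 2
    return RHO_CU * (length_ft * 0.3048) / area

def insertion_loss_db(awg, run_ft, speaker_ohms):
    """Loss from cable resistance in series with the speaker load.
    Round trip is 2x the run length (out and back conductors)."""
    r_cable = 2 * dc_resistance_ohm(awg, run_ft)
    return 20 * math.log10(speaker_ohms / (speaker_ohms + r_cable))

# A 50 ft run of 12 AWG into an 8-ohm speaker: roughly -0.17 dB
print(f"{insertion_loss_db(12, 50, 8):.2f} dB")
```

Running the same calculation for thinner wire (say 16 AWG over the same run) roughly doubles the loss, which is why the recommended gauge drops as runs get longer or speaker impedance gets lower.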
Other Crazy Pseudoscience:
Other crazy pseudoscience includes slapping a battery across the dielectric of the cable to allegedly reduce distortion by keeping the cable's dielectric "broken in." The battery itself doesn't conduct anything, but it sure looks pretty having a black LED-backlit box attached to your cables. The whole notion of cable break-in is yet another myth that has no basis in science, or logic for that matter. Music is an AC waveform, always changing in amplitude and phase. The idea that the cable somehow aligns itself to some optimal state is not only a fallacy, but an embarrassment: there are companies knowingly reporting this nonsense as engineering truth.
Similarly, the idea of a piece of wire, or even the dielectric for that matter, introducing non-linear distortion is complete hogwash. This is something that is immeasurable even with test equipment thousands of times more sensitive and consistent than the human ear.
Some companies even go so far as to convince their customers that it's beneficial to cryogenically freeze their cables prior to using them. They theorize that this will align the crystalline structure of the copper so that it will produce less distortion. What they fail to tell the customer is that any beneficial realignment that may have occurred will be nullified once the cable comes back to room temperature. I suspect they might see more benefit from having their cables soaked in kosher chicken fat and blessed by a Rabbi.
For the rest of the transcript, please go here: