When it comes to audio quality, there is no single scale or score that tells you how professional a recording sounds. It is mostly a matter of opinion. Oh sure, you can measure things like signal-to-noise ratio, or chart energy across a spectrum of frequencies. But at the end of the day, what matters is what sounds better, and people can only determine that with their ears. Often a bit of audio with a low signal-to-noise ratio will also sound bad. Likewise with audio that has lots of distortion. But will that always be the case?
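For the curious, "measuring" signal-to-noise ratio mostly just means comparing average signal power to average noise power on a decibel scale. Here's a rough sketch of that idea (using NumPy, with a made-up sine-wave "voice" and synthetic hiss standing in for a real recording):

```python
import numpy as np

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels, from raw sample arrays."""
    signal_power = np.mean(np.square(signal, dtype=np.float64))
    noise_power = np.mean(np.square(noise, dtype=np.float64))
    return 10 * np.log10(signal_power / noise_power)

# Toy stand-ins: a 220 Hz tone as the "voice", low-level random hiss as noise
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 44100)        # one second at CD-ish sample rate
voice = 0.5 * np.sin(2 * np.pi * 220 * t)
hiss = 0.005 * rng.standard_normal(t.size)

print(f"{snr_db(voice, hiss):.1f} dB")
```

That single number tells you how far the hiss sits below the voice, but, as argued above, it can't tell you whether the recording actually sounds good.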
Another common yardstick for audio quality is the quality of the gear. For instance, most folks will probably say that an expensive microphone will yield better audio quality than an inexpensive one, or that an expensive microphone preamplifier makes the difference. But will that, too, always be the case?
Let’s have you decide for yourself.
I have two clips of audio for you to listen to. Both are voice-over recordings. One was recorded with a $5.00 plastic PC mic plugged directly into the built-in sound card of a 6-year-old standard PC. The other was recorded with a $300 studio microphone fed through a $200 audio interface. So what we have are two clips of audio that on paper should be light-years apart in audio quality; essentially, one should sound 100 times better than the other, right? Well, since the gear cost 100 times more, it may not actually be 100 times better, but the difference should be really obvious, shouldn't it?
Ready to hear these two clips? Okay, open our home page at www.homebrewaudio.com and scroll to the middle of the page. You'll see two flash players. Spoiler alert: don't read the text BELOW the players, since that will give the answer away.
Now listen to each sample. They're only a few seconds long. Can you tell the difference between the $5 recording and the $500 recording? Regardless of the outcome, think about what this means for the whole expensive-gear-always-sounds-better argument.
We’d be interested in hearing what your results were. Leave a comment!