On Wed, 17 Jul 2002, jtalbert wrote:
> Actually, I have a very modest turntable at this time (under $300) and when I was finally able to get hold of The Mix on vinyl and compare it to the CD, to my ears, there was an astounding improvement. The CD version of The Mix is quite cold and almost lifeless in comparison. The sound of the vinyl had a much more natural stereo field with greater instrument separation and presence. I would honestly say the differences seemed more than trivial. And again, that is on a standard turntable with a total system cost probably under $2000.
Well, maybe one day I'll be able to afford one of these rare beasts and see for myself. That's still years off though.
> > I also like it when music is mastered direct to CD because it eliminates the annoyances that come with live performances - environmental noise, crowd noise, and mistakes.

> Mastering direct to disk would not eliminate any of those noises and flaws. Environmental noise has to be controlled before it ever hits the mixing board stage for any recording. Likewise, crowd noise will leak into ANY recording, direct to disk or not, if it is not controlled by the engineer. And of course, mistakes in the performance have nothing to do with the recording medium! :-)
What I meant was, using electronic instruments to produce the music, and keeping the music in electronic form (preferably digital) at every stage of the process of getting it onto a CD. That means a studio, no crowd, no microphones if possible, no environmental noise other than electrical transients (which should be eliminated anyway), and mistakes edited out in the computer.
> > I find this hard to believe, and I lack access to the equipment to test it. What we need is to hook a good spectrum analyser up to a vinyl system and a digital system and see which one more accurately reproduces the sound from the studio.

> Without the original studio final mixes, there would be no way of making this type of comparison accurately, especially when one takes into account that each medium will have been mastered slightly differently.
Use an artificial reference sound designed for testing purposes then.
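Something like this would do, at least for frequency response. A rough sketch in Python - the tone choices and file names are arbitrary, and it assumes mono captures of each chain's playback, recorded at the same sample rate and trimmed to the same aligned length:

    # Generate a multitone reference; cut it to a test lacquer and burn it
    # to a CD-R, then capture each chain playing it back.
    import numpy as np
    from scipy.io import wavfile

    rate = 44100
    t = np.arange(10 * rate) / rate                   # 10 seconds
    tones = [50, 125, 315, 800, 2000, 5000, 12500]    # test tones in Hz
    ref = sum(np.sin(2 * np.pi * f * t) for f in tones)
    ref *= 0.5 / np.max(np.abs(ref))                  # leave headroom
    wavfile.write("reference_multitone.wav", rate, (ref * 32767).astype(np.int16))

    def spectrum_db(path):
        # Windowed magnitude spectrum in dB, level-normalized.
        _, x = wavfile.read(path)
        x = x.astype(np.float64)
        x /= np.max(np.abs(x))
        return 20 * np.log10(np.abs(np.fft.rfft(x * np.hanning(len(x)))) + 1e-12)

    ref_db = spectrum_db("reference_multitone.wav")
    for name in ("cd_chain_capture.wav", "vinyl_chain_capture.wav"):
        dev = np.mean(np.abs(spectrum_db(name) - ref_db))
        print(name, "mean spectral deviation: %.2f dB" % dev)

Whichever chain deviates less is reproducing the reference more faithfully, at least in frequency response; noise and distortion would each need their own test signals.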
> > Digital clocks are highly accurate - more accurate than humans can detect. I can't imagine the source of the problem you're referring to, since everything in a digital system should be driven by a crystal oscillator, and all lags in the system should be constant.
> On the contrary, NOT all digital clocks have the same level of accuracy. Any flaw in the clock introduces jitter, which in the end can ruin a digital signal's clarity, separation, presence, frequency response and other properties. There's a reason why Apogee DACs are priced so much higher than consumer convertors, especially those found in less expensive CD players and cheap computer sound cards. The digital to analog convertor (and the analog to digital convertor when recording) is one of the most important links in the digital signal chain, right up there with bit resolution and sampling frequency.
> > I agree that the ADC and DAC are important, but I still find it hard to believe that a crystal oscillator would emit pulses with enough irregularity that it would be audible to humans.
> Again, the kind of timing "clock" referred to in digital audio has a completely different purpose. A bad digital clock can adversely affect the quality of the signal. It's not easily noticeable on its own, but when comparing with a convertor that has a *high* quality clock, the difference is remarkable.
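The effect is at least quantifiable. The usual rule of thumb is that rms clock jitter tj limits a full-scale tone at frequency f to roughly SNR = -20*log10(2*pi*f*tj) dB, so 1 ns of rms jitter caps a 20 kHz tone near 78 dB - worse than the roughly 96 dB theoretical range of 16-bit audio. A quick simulation (the rates and jitter figures below are just illustrative, not measurements of any real convertor):

    # Simulate random sampling-clock jitter on a 10 kHz tone at 48 kHz.
    import numpy as np

    fs, f = 48000, 10000
    t = np.arange(fs) / fs                        # one second of sample instants
    ideal = np.sin(2 * np.pi * f * t)
    for jitter_rms in (100e-12, 1e-9, 10e-9):     # rms clock jitter in seconds
        # Each sample is taken slightly early or late.
        t_err = t + np.random.normal(0.0, jitter_rms, size=t.shape)
        noise = np.sin(2 * np.pi * f * t_err) - ideal
        snr = 10 * np.log10(np.mean(ideal**2) / np.mean(noise**2))
        print("%6.0f ps jitter -> SNR %5.1f dB" % (jitter_rms * 1e12, snr))

    # Prints roughly 104, 84 and 64 dB, matching -20*log10(2*pi*f*jitter_rms).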
Well, this is another place where some impartial measuring equipment would help settle the issue.

-- 
/* Soleil */