Re: [ARSCLIST] Cassette obsolescence - digitizing standards
I'm sorry to say that I do not agree with you at all. Your
calculations look very impressive, but as you said, the examples
cited are totally made up.
And they still result in a fairly large error.
And there seem to be numerous factors not included in the equations: the
supposed huge increase in digital noise and relative distortion from the
original 16-bit file being upsampled to 24 bits; and one major
oversight, namely that dithering (or truncating) a 24-bit file down to
16 bits creates aliasing and quantization errors (i.e., dither noise).
First off, you can upsample a 16-bit file to a 24-bit file with
absolutely no increase in noise or relative distortion. If you find
otherwise, then you are either doing it wrong or using really poor software.
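A minimal sketch of why this is so, using NumPy (the sample values here
are hypothetical, but any 16-bit PCM data behaves the same): padding each
sample with eight zero low-order bits puts it at 24-bit scale, and
shifting back recovers every original value bit-for-bit.

```python
import numpy as np

# Hypothetical 16-bit samples spanning the full signed range.
samples_16 = np.array([-32768, -1, 0, 1, 32767], dtype=np.int32)

# "Upsampling" the bit depth: append eight zero LSBs (multiply by 256).
samples_24 = samples_16 << 8

# Shifting back recovers every original value exactly: no added noise,
# no added distortion.
recovered = samples_24 >> 8
print(np.array_equal(recovered, samples_16))  # True
```

The padded file carries the identical waveform; the extra bits only add
headroom for later processing.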
Second, the error caused by truncating or dithering would be
within 1 value, not 14, so again the point is moot. Would you prefer
to increase errors by up to 14 points (in my example) just to
avoid adding a possible 1-point error? Hardly what I'd call a sound
business practice when accuracy is indeed a necessity.
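To put a number on that 1-value bound, here is a hedged sketch with
hypothetical random 24-bit data (as a file might look after processing a
padded 16-bit transfer): truncating back to 16 bits by dropping the
eight low-order bits never moves any sample by as much as one 16-bit step.

```python
import numpy as np

# Hypothetical data: 10,000 random 24-bit sample values.
rng = np.random.default_rng(0)
samples_24 = rng.integers(-2**23, 2**23, size=10_000)

# Truncate to 16 bits by dropping the eight least significant bits.
truncated_16 = samples_24 >> 8

# The error, measured in 16-bit steps, is always less than one value.
error = samples_24 / 256.0 - truncated_16
print(np.abs(error).max() < 1.0)  # True
```

Dithering adds a comparably small amount of noise on purpose, precisely
to decorrelate this sub-LSB error from the signal.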
And this is regardless of what software or noise-shaping algorithms
are used. Why do all this to audio from a cassette that most likely
already suffers from noise/hiss problems? Oh, and last but not
least, my ears can pick up the other problems that this scenario
creates -- audible digital artifacts. The only way to avoid having
to deal with this mess is to do the job right in the first place and
make the original transfers in 24 bit. It might be best for the
original poster to invest a few dollars in obtaining an Alesis Masterlink.
One man's opinion.
Fort Lee, NJ