[QUOTE=hogger129;2765795]I don’t mean to rain on anybody’s parade, but 16-bit/44.1 kHz (CD quality) is already sufficient for the range of human hearing. 24-bit/96 kHz contains information that isn’t necessary for listening. While I would agree it is needed for mastering so as to keep the source as pure as possible, no one can hear a difference compared to the same thing at CD quality.
[B][U]I challenge anyone to find evidence that would prove otherwise.[/U][/B]
Besides, the format that Apple sells music in ([I]iTunes Plus[/I]), which is 256 kbps VBR AAC/M4A, is already a great format that is usually indistinguishable from CD quality at that bitrate. And odds are they would jack up the price for only a minor increase in sound quality.
I’m not an audiophile, but most new music and remasters today sound like absolute garbage because they don’t take care in mastering. I think that should be addressed before moving to huge files with sound you can’t even hear.[/QUOTE]
There never was any parade… at least not on my behalf.
In this case I do not really need to provide evidence; you just have to check your facts, that is all.
To cover the entire theoretical spectrum of the ear with some headroom for the anti-aliasing filter, the sample rate would have to be raised to 48 kHz.
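For the record, here is the Nyquist arithmetic behind that claim as a quick Python sketch (my own illustration; the 20 kHz hearing limit is the usual textbook figure, not a measurement):

[CODE]
# To represent a frequency f without aliasing, the sample rate must
# exceed 2*f (the Nyquist criterion). 48 kHz buys extra room above the
# nominal 20 kHz hearing limit for the anti-aliasing filter.
HEARING_LIMIT_HZ = 20_000

for rate_hz in (44_100, 48_000, 96_000):
    nyquist_hz = rate_hz / 2
    headroom_hz = nyquist_hz - HEARING_LIMIT_HZ
    print(f"{rate_hz} Hz -> Nyquist {nyquist_hz:.0f} Hz, "
          f"{headroom_hz:.0f} Hz of filter headroom")
[/CODE]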
I do not disagree with your other views; in fact I firmly agree, especially when it comes to 12 dB loudness-war remasters…
We do not need 4K screens either, but we would like to have them, just as I would like common audio hardware to process 24/192, while the fact is that most units fail above 24/96.
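And just to put numbers on the "huge files" point, the raw PCM data rates work out as follows (a back-of-the-envelope Python sketch; real files add container overhead or lossless compression):

[CODE]
# Raw PCM data rate in kbps: bits per sample * sample rate * channels.
def pcm_kbps(bits: int, rate_hz: int, channels: int = 2) -> float:
    return bits * rate_hz * channels / 1000

for bits, rate in ((16, 44_100), (24, 96_000), (24, 192_000)):
    print(f"{bits}-bit/{rate} Hz stereo: {pcm_kbps(bits, rate):,.0f} kbps")
[/CODE]

That is roughly 1,411 kbps for CD against 9,216 kbps for 24/192, i.e. more than six times the data for content beyond the audible band.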
When it comes to the theoretical facts, they speak for themselves. No matter how many samples we take each second and how many bits we use to store them, there will always be a difference to the constantly varying analogue signal: quantization maps a signal of effectively infinite resolution onto a finite set of levels, and in that process quantization errors occur. That is the current culprit, not solved yet…
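To make that concrete, here is a minimal Python sketch (my own toy example, assuming a plain uniform quantizer with no dither) that rounds a full-scale 997 Hz sine onto a 16- and 24-bit grid and measures the residual error:

[CODE]
import math

# Uniform quantizer: snap a sample in [-1, 1] onto a grid with
# step size 1 / 2**(bits - 1).
def quantize(x: float, bits: int) -> float:
    levels = 2 ** (bits - 1)
    return round(x * levels) / levels

RATE_HZ = 48_000
TONE_HZ = 997  # common test frequency; avoids cycling the same few phases

for bits in (16, 24):
    err_sq = sig_sq = 0.0
    for n in range(RATE_HZ):  # one second of the tone
        s = math.sin(2 * math.pi * TONE_HZ * n / RATE_HZ)
        e = s - quantize(s, bits)
        err_sq += e * e
        sig_sq += s * s
    print(f"{bits}-bit: quantization SNR ~ "
          f"{10 * math.log10(sig_sq / err_sq):.1f} dB")
[/CODE]

The error shrinks by roughly 6 dB per added bit, but it never reaches zero, which is exactly the difference to the analogue signal described above.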
No offence or anything