I still don’t believe it, and I’m searching for sources to back up my opinion.
But if we agree on one thing, namely that a PC with a decent CD-ROM drive can rip an exact image of an audio CD (you can copy an audio CD and compare the copy bit-wise with the original; you’ll see that they are identical), then a PC is the perfect CD audio playback device once you activate the “enable digital CD playback” option (or whatever it’s called exactly in English), because with that option activated, you’re doing a real-time rip (bit-wise identical) and sending the data straight to the sound card (still in real time). There won’t be any jitter left by the time the digital signal reaches the sound card, which of course has to be of high quality if it is to pass a clean analog signal to a decent amp. No jitter, problem solved.
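You can check the bit-identical claim yourself. Here’s a minimal sketch in Python (filenames and the two-rip setup are hypothetical) that compares two rips of the same track by hashing them; it assumes both rips were written as raw PCM, so there are no container headers that could differ even when the audio data is identical:

    # Minimal sketch: verify two rips of the same track are bit-identical.
    # Assumes both files are raw PCM dumps of the same track, so any
    # difference means at least one rip is not exact.
    import hashlib
    import sys

    def file_digest(path: str) -> str:
        """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        a, b = sys.argv[1], sys.argv[2]   # e.g. rip1.pcm rip2.pcm (hypothetical names)
        if file_digest(a) == file_digest(b):
            print("Bit-identical: the rip is exact.")
        else:
            print("Files differ: at least one rip contains read errors.")

If the digests match, the two rips agree down to the last bit, which is the whole basis of the argument above.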
And then I don’t see why a normal CD player shouldn’t be able to do the same thing. Why shouldn’t it be able to correct the effects of jitter if a PC can? Bit-wise identical copies are possible (try it), so you can definitely get a “perfect” digital copy of the original disc onto your PC; otherwise you couldn’t burn an identical copy of it, which you actually can. Once it’s saved to your hard disk, it is still identical, and when you play it back, the clock signal for playback is generated by the sound card itself, so every sample is played back at exactly the right moment -> no jitter -> perfect sound from the sound card.
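To illustrate that re-clocking argument, here’s a toy simulation (all numbers are made up for illustration; no real audio hardware is involved): samples arrive with jittery timing, a FIFO buffer discards that arrival timing, and the output side is driven purely by the sound card’s own fixed-rate clock.

    # Toy simulation: a FIFO buffer decouples jittery arrival timing
    # from playback timing, which is set by the DAC's own fixed clock.
    import random
    from collections import deque

    SAMPLE_PERIOD_US = 1_000_000 / 44_100   # nominal CD sample period (~22.68 us)

    # 1) Arrival side: each sample's arrival time wobbles around the nominal grid.
    arrivals = []
    t = 0.0
    for n in range(10):
        jitter = random.uniform(-2.0, 2.0)   # +/- 2 us of arrival jitter (made up)
        arrivals.append((n, t + jitter))
        t += SAMPLE_PERIOD_US

    # 2) FIFO: the jittery arrivals just fill a buffer; their timing is thrown
    #    away, only the sample values (here, their indices) are kept.
    fifo = deque(sample for sample, _ in arrivals)

    # 3) Playback side: the DAC clock ticks at exactly the nominal period, so
    #    every sample leaves the buffer at the mathematically correct moment,
    #    no matter how raggedly it arrived.
    for tick in range(len(fifo)):
        sample = fifo.popleft()
        print(f"sample {sample} played at t = {tick * SAMPLE_PERIOD_US:.2f} us")

The point of the sketch: as long as the buffer never runs empty, the output timing depends only on the local clock, and the jitter of the incoming data simply vanishes.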
But I can’t prove it yet, admittedly. I found a very detailed article by a scientist on how to measure jitter, but unfortunately he didn’t mention at all whether jitter has any audible impact on the sound…
I am aware that the hi-fi industry spreads lots of seemingly reasonable arguments for why this or that is better, and many of them are just b*sh if you take a closer look with the necessary background knowledge. Unfortunately, the jitter question is very complex, so I can’t disprove it. Yet.