The accuracy of the C2 information returned by the drive, used to detect read errors when ripping audio, is much debated in audio CD ripping forums.
After Andre Wiethoff's EAC and DAE Quality programs, the beta version of Nero CD Speed, used by CDRinfo with ABEX test CDs for their latest tests, confirms that most drives don't always set the C2 flag when a byte is misread.
Some drives (1, 2) passed the test with 100% accuracy, but the tests never went above 3500 errors per second, and other tests show that the accuracy can vary considerably at higher error rates.
On Hydrogenaudio, I wondered whether the C2 inaccuracy comes from drives switching to more efficient, but less accurate, error correction algorithms when they encounter damaged CDs, like Bobhere suggests in EAC for the C1 level, but at the C2 level.
I have no documentation about this, and rather thought, going by the CDRinfo writing quality article, that the efficiency of the C2 error correction depends on knowing the positions of the errors, which is provided by the previous error correction stages.
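If I understand the Reed-Solomon erasure-decoding principle correctly, the difference can be sketched with a little arithmetic (assuming the standard CIRC parameters, C1 = (32, 28) and C2 = (28, 24); the function name here is just illustrative, not from any real library):

```python
def correction_capacity(n: int, k: int) -> tuple[int, int]:
    """For an (n, k) Reed-Solomon code with minimum distance d = n - k + 1,
    up to (d - 1) // 2 errors at UNKNOWN positions can be corrected, but up
    to d - 1 erasures (errors at KNOWN positions) can be corrected."""
    d = n - k + 1
    return (d - 1) // 2, d - 1

# C2 stage of CIRC is assumed to be a (28, 24) Reed-Solomon code:
print(correction_capacity(28, 24))  # (2, 4)
```

So if the C1 stage flags the bad byte positions as erasures, C2 can in principle fix twice as many bytes per codeword (4 instead of 2), which would explain why C2 efficiency depends on the position information from the earlier stage rather than on a different algorithm.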
Can there be different C2 error correction mechanisms with different levels of accuracy and efficiency?