What is an acceptable level of C1 errors

Hey guys,

What is an unacceptable level of C1 errors on a disc with 0 C2 errors?

I’m just curious at what point a certain number of C1 errors affects playback and readability, etc…

Read Disc Grading System - BLER in my sig.

Most manufacturers set higher standards for themselves, striving for an average BLER of under 50.

There’s no such thing as a level of C1 errors that will start affecting a disc’s readability. By definition, C1 errors are correctable read errors. Look at the media tests to get an idea of what kinds of C1 figures are low/average/high, etc.

So in theory, the number of C1 errors should not make any difference on readability and playback?

Originally posted by djhelpme
So in theory, the number of C1 errors should not make any difference on readability and playback?
You bet! :wink:

An acceptable average C1 error rate is less than 10.

11–30: not too good, but you can live with it…

Over 30: take the remaining media back and change to another brand, or simply ask for a refund…

The best media (Taiyo Yuden, Ritek, or Mitsubishi AZO) attains an average C1 value of less than 1.

less than 1 - really :slight_smile:

There are no written standards on this (besides the BLER error rate).

But my personal standards are:

Very good: C1 average less than 1.0 and no spikes over 15 C1 errors.
Good: C1 average less than 2.0 and no spikes over 20 C1 errors.
Acceptable: C1 average less than 5.0 and no spikes over 50 C1 errors.

And in all cases zero C2 errors.

Anything higher than this I do not accept on discs that I want to use for archival.

On test discs/throwaways I do not really care, as long as there are no unreadable sectors…
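As a toy illustration, the personal grading scale above can be written as a small function. The thresholds are this poster’s own standards, not any written specification, and the function name and inputs are made up:

```python
def grade_disc(c1_average, c1_max_spike, c2_total):
    """Grade a scan by average C1 rate, worst single-sample C1 spike,
    and total C2 count, per the personal standards quoted above."""
    if c2_total > 0:
        return "reject"  # any C2 error disqualifies the disc for archival
    if c1_average < 1.0 and c1_max_spike <= 15:
        return "very good"
    if c1_average < 2.0 and c1_max_spike <= 20:
        return "good"
    if c1_average < 5.0 and c1_max_spike <= 50:
        return "acceptable"
    return "reject"

print(grade_disc(0.8, 12, 0))  # very good
```

Note that a single C2 error rejects the disc outright, matching the "in all cases zero C2 errors" rule.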

Hmm, I’m curious about C1 errors in particular. I burned some SVCDs @ 40x without C2 errors. I haven’t scanned them for C1 yet, but I know they didn’t play smoothly or without artifacts. Yet SVCDs burned at 32x with the same media play better, and although I haven’t fully tested the 40x-burned media (beyond knowing there are no C2 errors), I’m willing to bet that the C1 level is higher.

Is this perhaps an indication that higher C1 error counts may be seen at higher burn speeds, and if this is the case, that they do affect playback?

Is this perhaps an indication that higher C1 error counts may be seen at higher burn speeds, and if this is the case, that they do affect playback?

There’s no evidence that any level of C1 affects playback. C2 is another matter. I’ve seen C1 in the range of thousands (max) on discs that are perfectly readable.

Originally posted by djhelpme
Hmm, I’m curious about C1 errors in particular. I burned some SVCDs @ 40x without C2 errors. I haven’t scanned them for C1 yet, but I know they didn’t play smoothly or without artifacts. Yet SVCDs burned at 32x with the same media play better, and although I haven’t fully tested the 40x-burned media (beyond knowing there are no C2 errors), I’m willing to bet that the C1 level is higher.

How would you know that there aren’t any C2 errors if you didn’t scan them yet?

BLER (Block Error Rate) is defined as the number of data blocks per second that contain detectable errors, at the input of the C1 decoder. This is the most general measurement of the quality of a disc.
The “Red Book” specification (IEC 908) calls for a maximum BLER of 220 per second averaged over ten seconds.
Discs with higher BLER are likely to produce uncorrectable errors.
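As an illustration of the Red Book limit just quoted, here is a small sketch that checks per-second BLER samples against the maximum of 220 per second averaged over ten seconds. The function name and sample data are made up for the example:

```python
def passes_red_book(bler_per_second, window=10, limit=220):
    """Check that every 10-second moving average of BLER stays at or
    below the Red Book maximum of 220 errors per second.
    Assumes at least one sample."""
    if len(bler_per_second) < window:
        # Shorter scans: average over whatever samples we have
        return sum(bler_per_second) / len(bler_per_second) <= limit
    for i in range(len(bler_per_second) - window + 1):
        avg = sum(bler_per_second[i:i + window]) / window
        if avg > limit:
            return False
    return True

print(passes_red_book([5, 8, 3, 12, 7, 4, 9, 6, 2, 10]))  # True
```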

Nowadays, the best discs have average BLER below 10.
A low BLER shows that the system as a whole is performing well, and the pit geometry is good.
However, BLER only tells you how many errors were generated per second;
it doesn’t tell you anything about the severity of those errors.

Therefore, it is important to look at all the different types of errors generated.
Just because a disc has a low BLER doesn’t mean the disc is good.
For instance, it is quite possible for a disc to have a low BLER, but have many uncorrectable errors due to local defects.
The smaller errors that are correctable in the C1 decoder are considered random errors.
Larger errors like E22 and E32 are considered burst errors and are generally caused by local defects.
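The point that a low BLER alone does not guarantee a good disc can be sketched as a toy check. The function and its verdict strings are hypothetical, with E22 (burst) and E32 (uncorrectable) totals used as described in the text:

```python
def disc_quality(bler_avg, e22_total, e32_total):
    """A low BLER alone does not guarantee a good disc: burst errors
    (E22) and uncorrectable errors (E32) from local defects must also
    be absent."""
    if e32_total > 0:
        return "bad: uncorrectable errors despite low BLER"
    if e22_total > 0:
        return "suspect: burst errors from local defects"
    return "good" if bler_avg <= 220 else "out of spec"

# A disc can have an excellent average BLER yet still be bad:
print(disc_quality(3.0, 0, 5))  # bad: uncorrectable errors despite low BLER
```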

If you need to have high confidence that the disc will play on all players, you need to look at the pit geometry related signals like I11, I3, Asymmetry, and Push-Pull. If these signals are near optimum, there is a high probability that it will play on all players.

Q. What is the best speed to test at?

A. In general, it is advantageous to test at the fastest speed available to maximize throughput.
Also, some applications require that the disc work at higher speeds.
Really good discs will usually produce pretty much the same results at all speeds.
However, there are many situations where you may need to test at a slower speed.

For one thing, many discs will generate lots of serious errors at high speed, or not play at all.
Certain types of defects will be greatly magnified at higher speeds. In this case, you may wish to try re-testing at a slower speed.

This doesn’t necessarily mean that the disc is no good. CD-ROM drives will fall back to a lower speed if the disc is not readable at the current speed. They also use up to 10 re-tries to recover the data. In most applications, the user is completely unaware of this.

Generally if the disc performs well at high speed, it is a pretty good bet that it will work at lower speeds, although this is not always the case. Since CD-ROM drives are optimized for higher speeds, drive performance may be compromised at lower speed. Also, results can be affected by vibration, which varies with speed.

Source

AZMortal,

In that case I meant that I just used Nero CD Speed to verify the number of C2 errors, which are 0, but I did not use Kprobe to check the # of C1 errors that time.