Fast(er) skipping of errors


I’m doing some tests where I’ve intentionally made a disc unreadable with a marking pen. I’ve only marked approximately 15% of the surface of the disc, but at the rate the errors are skipped, it’ll take the whole day - if not days - to get the image dumped to the hard drive.

I’ve tried IsoBuster, which has a minimum error retry of one - and it doesn’t show much progress info either.
ImgBurn ignores errors, but at an extremely slow pace.

What program / setting can do this faster?

You can change the software and hardware retries in ImgBurn down to 0; then it’ll go as fast as the drive can handle.

AFAIK there are two types of retries that affect the pace:

  1. software retry amount
  2. hardware retry amount

Software retry is just a loop in the application code that re-reads a sector n times; since each of those reads also triggers the drive’s own hardware retries, the total number of attempts is effectively software retries × hardware retries.
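The interaction between the two retry types can be sketched like this (the `read_sector` callback and the retry count are hypothetical, purely to illustrate the multiplication):

```python
def read_with_retries(read_sector, lba, software_retries=3):
    """Try to read one sector, re-issuing the read up to
    software_retries times before giving up.

    read_sector is a hypothetical callback that returns the
    sector's bytes or raises IOError. Each call also triggers the
    drive's own internal (hardware) retries, so the total number
    of physical attempts is software_retries * hardware retries.
    """
    last_error = None
    for _ in range(software_retries):
        try:
            return read_sector(lba)
        except IOError as e:
            last_error = e  # the drive already retried internally
    raise last_error
```

So dropping both counts to their minimum means each bad sector fails after a single pass instead of dozens of physical attempts.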

The hardware retry amount makes a considerable difference, but it is a hardware limitation, so some drives won’t support changing it.

After setting both to 0 it will still take quite some time to rip the disc, but there is one more setting that will make it rip really fast: disabling the data (CD-ROM) sector ECC. BlindRead and CloneCD can both do this, but again it is a hardware limitation. Plextor drives with a Sanyo-type chipset support this; the newer Plextors are not Sanyo-based, so they don’t offer it.

Supported drives are older models like the Plextor 760, 755, 8432T, 716, and Premium.

If using CloneCD you can set the read error recovery options by right-clicking the drive. It won’t tell you at this point whether the drive supports it, but it will tell you in the log during the rip.

I’m not sure if these programs will be faster, but you could try DVDisaster and IsoPuzzle, which are also designed for recovering an image from a damaged CD/DVD.

DVDisaster lets you configure the number of sectors it should skip ahead after encountering a read error.
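The skip-ahead strategy amounts to something like the following sketch (this illustrates the idea only, not dvdisaster’s actual internals; `read_sector` and the skip size are assumptions):

```python
def rip_with_skip(read_sector, total_sectors, skip_ahead=128):
    """Rip an image, jumping skip_ahead sectors forward after each
    read error instead of grinding through a damaged region one
    sector at a time. Returns (image, unreadable_ranges).

    read_sector is a hypothetical callback that returns a sector's
    bytes or raises IOError on an unreadable sector.
    """
    image = {}       # lba -> sector data
    bad_ranges = []  # half-open (start, end) spans marked unreadable
    lba = 0
    while lba < total_sectors:
        try:
            image[lba] = read_sector(lba)
            lba += 1
        except IOError:
            end = min(lba + skip_ahead, total_sectors)
            bad_ranges.append((lba, end))  # whole span marked bad, not re-read
            lba = end  # skipped sectors can be retried in a later pass
    return image, bad_ranges
```

The trade-off is visible here: a large skip gets past a damaged region quickly, but every skipped sector is counted as unreadable until a later pass retries it.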

Thanks a lot for the replies!

I’ve done a few tests with dvdisaster: initially the app seemed to be the perfect match. Although the error-skipping itself wasn’t any faster, I found that the program can skip a configurable number of sectors every time it finds an unreadable sector, which improved the scan speed.

But still, it is extremely slow!
The whole experiment is about testing the abilities of PAR2: I made an ISO with a bunch of movies on it, ran a PAR2 creation for it, and then burned the image. I then took a marker and made some dots on the surface of the disc - to simulate damage.

The disc I’m testing now has a dot on it as big as the nail of my little finger (roughly). Even so, it seems that dvdisaster is reporting way too many errors. Take a look at the screen.

This is roughly one hour after starting the test. Initially, I thought that the illustration of the disc would actually show where the physical errors were - thus matching the size of the marking I made. This turned out to be untrue - as you can see in the image.
The parity recovery level is set to 15% - which means that (2 272 139 * 0.15) = 340 821 sectors (roughly) could fail before running out of parity blocks…
In this test, 529 408 sectors are reported as unreadable - which equals 23.3%. I ran out of patience.
I simply can’t see how almost one quarter of 4.7GB could fit under one small dot… What’s going on?
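For what it’s worth, the arithmetic in the post checks out (the figures are taken straight from the scan described above):

```python
total_sectors = 2_272_139  # sectors on the disc, per the dvdisaster scan
parity_level = 0.15        # PAR2 recovery level used when creating the set

recoverable = total_sectors * parity_level
print(round(recoverable))  # -> 340821 sectors covered by parity

unreadable = 529_408       # sectors reported unreadable in this test
print(round(unreadable / total_sectors * 100, 1))  # -> 23.3 (percent)
```

So the reported error count really is well past the 15% parity budget, which is what makes the result so surprising.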

Ultimately, it boils down to this point:
With thousands of DVDs archived, some are doomed to have errors. Let’s say 10% of the discs have some kind of unreadable error. Even with parity files for those discs, it would take years to scan them all! Obviously the discs would have corrupt files on them without the aid of PAR.

  • How can one do a quick scan of a corrupt disc?
  • Why does the disc seem to have more errors than what’s covered by the damage?

I’m closing this topic down, and have started a new thread:
This thread drifted a bit from what I intended it to be.
Thanks for the replies.