Thanks a lot for the replies!
I’ve done a few tests with dvdisaster: initially the app seemed to be the perfect match. Although the error-skipping itself wasn’t any better, I found that the program can skip a configurable number of sectors every time it hits an unreadable sector, which improved the scan speed.
But still, it is extremely slow!
The whole experiment is about testing the abilities of PAR2: I made an ISO with a bunch of movies on it, created PAR2 recovery data for it, and then burned the image. I then took a marker and made some dots on the surface of the disc to simulate damage.
The disc I’m testing now has a dot on it roughly as big as the nail of my little finger. Even so, dvdisaster seems to be reporting way too many errors. Take a look at the screenshot.
This is roughly one hour after starting the test. Initially, I thought the illustration of the disc would actually show where the physical errors were, matching the size and position of the mark I made. As you can see on the image, that turned out to be untrue.
The parity recovery level is set to 15%, which means that roughly (2,272,139 × 0.15) ≈ 340,821 sectors could break before running out of parity blocks…
In this test, 529,408 sectors are reported as unreadable, which equals 23.3%. At that point I ran out of patience.
I simply can’t see how almost a quarter of a 4.7 GB disc could fit under one small dot… What’s going on?
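To double-check I wasn’t misreading the dvdisaster output, here’s the arithmetic behind those two percentages as a quick sketch (the sector counts and parity level are the ones from my test above):

```python
# Sanity check on the numbers from the dvdisaster scan.
total_sectors = 2_272_139   # sectors in the burned ISO
parity_level = 0.15         # PAR2 redundancy used at creation time
unreadable = 529_408        # sectors dvdisaster reported as unreadable

# Maximum number of bad sectors the 15% parity can repair.
recoverable = round(total_sectors * parity_level)

# Fraction of the disc the scan claims is damaged.
damage_pct = unreadable / total_sectors * 100

print(f"parity covers up to {recoverable} bad sectors")       # ~340,821
print(f"scan reports {unreadable} bad = {damage_pct:.1f}%")   # 23.3%
print("still recoverable?", unreadable <= recoverable)        # False
```

So by the scan’s own numbers the disc is well past what 15% parity can repair, even though the physical mark covers only a tiny fraction of the surface.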
Ultimately, it boils down to this point:
With thousands of DVDs archived, some are doomed to develop errors. Let’s say 10% of the discs have some kind of unreadable sectors. Even with parity data for those discs, it would take years to scan them all! And obviously, without the aid of PAR2 the discs would simply have corrupt files on them.
- How can one do a quick scan of a corrupt disc?
- Why does the disc seem to have far more errors than the visible damage accounts for?