This post is rather long, but it should be very interesting to anyone seeking to know more about weak sectors. It will also raise quite a few questions. Here’s my lab work…
In my previous tests on max.iso (for those who don’t know, it’s a file with weak sectors; see the max.iso thread), one of my test results turned out to be wrong. I noticed that my result from trying to record the max.iso file as an audio CD was inaccurate and inconclusive.
Let me tell you what my test involved:
I took max.iso and made it a multiple of 2352 bytes by appending zero bytes to the end (a bit of simple maths to calculate how much to add). Then I made a cue sheet:
FILE "Z:\MAX.ISO" BINARY
  TRACK 01 AUDIO
    INDEX 01 00:00:00
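The padding step can be sketched in Python (the actual padding was done by hand in WinHex; the function name here is just for illustration). 2352 bytes is the raw sector size used for audio tracks, so the image length must be an exact multiple of it:

```python
SECTOR = 2352  # bytes per raw audio sector

def pad_to_sector(data: bytes, sector: int = SECTOR) -> bytes:
    """Append zero bytes so the length becomes a multiple of `sector`."""
    remainder = len(data) % sector
    if remainder:
        data += b"\x00" * (sector - remainder)
    return data

# Example: a 1000-byte blob grows to exactly one 2352-byte sector.
assert len(pad_to_sector(b"\xAA" * 1000)) == 2352
```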
Okay, I can’t use this with CloneCD to burn it, because CloneCD doesn’t accept cue sheets. So I used Daemon Tools to mount the cue sheet and CD image, used CloneCD to make a CloneCD image of it (and verified the main binary files, just to be careful), and finally burnt it to a CD-RW at 4x with CloneCD.
For writing: Plextor IDE 8432T, firmware 1.09 (latest)
For reading: Toshiba DVD SD-1502, firmware 1816 (latest)
Daemon Tools v3.10
WinHex (for padding of zeroes and comparison of files)
What was wrong with the test?
There were 3 problems:
My Plextor IDE 8432T drive (or the software; I’m not sure exactly what the cause is) doesn’t write the first few sectors of the image to the CD, so I always got a copy that was truncated at the start.
Strangely, my Toshiba DVD could not read the last few sectors of the copy.
Also, CloneCD v22.214.171.124 somehow produced a modified copy. I noticed this when I tested with a CD-ROM image containing no weak sectors, using the same technique. Settings I used for writing: AWS off, Don’t Repair Subs off, 8x writing.
Please, I don’t wish to get into trouble with Olli, so let’s not jump to any conclusions about CloneCD until we speak to him. Anyway, it’s not the issue here; the newest CloneCD probably has this fixed already, I’m not sure.
The problems I described are real, because I repeated the test 5 times over just to make sure.
A change of test strategy
Anyway, I decided to change my test strategy…
I used the same padding technique, but this time I added a further 2 seconds of sectors to the front of the padded max.iso image and another 2 seconds of sectors at the end.
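Since a CD plays 75 sectors per second, 2 seconds of padding is 150 sectors, i.e. 150 × 2352 = 352,800 zero bytes at each end. A minimal sketch of that step (again, the real work was done in WinHex; names are hypothetical):

```python
SECTOR = 2352             # bytes per raw audio sector
SECTORS_PER_SECOND = 75   # a CD plays 75 sectors per second

def add_two_second_pads(data: bytes, seconds: int = 2) -> bytes:
    """Prepend and append `seconds` worth of zero sectors."""
    pad = b"\x00" * (seconds * SECTORS_PER_SECOND * SECTOR)
    return pad + data + pad

image = b"\xAA" * (10 * SECTOR)      # stand-in for the padded max.iso
padded = add_two_second_pads(image)
assert len(padded) == (10 + 2 * 150) * SECTOR
```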
This time, I used Golden Hawk’s CDRWIN 3.8G to burn the cue sheet and image to the CD-RW, again at 4x, with CDRWIN’s raw mode off.
I read the CD-RW back with CDRWIN and the Toshiba DVD, then cut out the zero padding so that I could compare it with the original.
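The comparison amounts to stripping the known amount of padding off both ends of the read-back data and doing a byte-for-byte compare (a sketch; the actual comparison was done in WinHex):

```python
SECTOR = 2352
PAD = 2 * 75 * SECTOR    # two seconds of zero sectors at each end

def count_differences(readback: bytes, original: bytes) -> int:
    """Strip the padding from the read-back data and count differing bytes."""
    core = readback[PAD:len(readback) - PAD]
    diffs = sum(a != b for a, b in zip(core, original))
    return diffs + abs(len(core) - len(original))  # length mismatch counts too

# Simulated perfect read-back: padding + original + padding.
original = b"\x5A" * (4 * SECTOR)
perfect_read = b"\x00" * PAD + original + b"\x00" * PAD
assert count_differences(perfect_read, original) == 0
```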
Finally, the result: the comparison gave 3 differences. But then I thought about it and figured those might have been caused by jitter. So I re-read it, and this time the comparison gave 0 differences!!! I tried 4 times and all gave 0 differences!!!
Now I’m getting somewhere (eyebrow raised; this is strange). I would have expected differences. This establishes the following:
Weak sectors are perfectly writable, and the data is fully intact and readable, when you write them as an audio CD.
I took my tests further…
I ran the same tests on the posted SD2 images (see the ‘EFM Testing Images Mirror’ thread) that I found in this forum, and also on the famous generated non-writable weak.iso image posted by blackcheck (see the ‘Almost correct EFM recording’ thread).
All of them resulted in 0 differences when compared to the originals I started with.
So what is going on here?
It looks like the problem is not an EFM encoding problem. And don’t give me any bullshit about interpolation; interpolation cannot help here!! Do the maths: it won’t give you those exact bytes!!
To me, it looks like the writer, when fed those weak sectors, is probably truncating some bytes, which results in data CDs with sectors that differ from the original: in most cases unreadable sectors, because of a missing sync or header.