Let me try to tackle some of the good comments this thread has received.
AZImmortal: " to say that one drive is missing something that the other picks up is a conclusion that u can’t really jump to"
Oh, I perfectly agree. My bad choice of words.
What I meant was that the results differ.
It is impossible with this data to deduce which is more accurate.
What is interesting, however, is that c’t magazine has tested the Plextor Premium’s C1/C2 data and found it to be in line with results from calibrated, dedicated AudioDev test equipment.
This to me is a suggestion that perhaps Plextor results are more trustworthy. However, it’s far from being a proof, I admit that.
AZImmortal: “another conclusion u can draw from these tests might be that the lite-on is the superior reader and is able to correct more errors, so it’s hard to say what’s going on.”
The conclusions I can possibly draw (now forgetting the comparative AudioDev results for a while) are:
- Plextor is less accurate in reading the disc (tracking/focus/servo performance)
- Plextor is less capable in error correction (not up to the theoretical maximum of 4000 consecutive bits) in its CIRC circuitry
- KProbe, using LiteOn drives, reads the discs less accurately and misses some errors completely
- Some combination of the above three
Currently I cannot come up with any other explanations than these.
“maybe u should also try using nero cd speed’s cd quality test on both drives since both are capable of using this program, and then also throw in a third drive if u can for comparison against the other two”
I will do the CD Speed Scan Disc test next, but I don’t have a third drive to test with right now. I’ll see if I can solve that somehow.
“btw, u should note which drive u used to burn the discs for all of the tests.”
Actually, the file name of each image in my post states the burning drive, burning speed, media and its speed rating, testing drive, and test speed; I just forgot to point this out.
The first discs were burned with the LG4480b and the LiteOn48246S.
This shows that it’s not simply a matter of the burning drive reading its own discs better than another drive would, since there are differences even when the burning drive was neither of the test drives.
I will also try a third drive (Yamaha CRW3200) today and do more tests.
“At this point, we don’t know what either drive is reporting and how it’s being interpreted.”
However, regardless of that, given the differences of more than an order of magnitude, it is quite possible that one or both of the drives (plus the software used with them) is NOT to be trusted for media quality analysis at the C1/C2 level.
If this is true, then it throws a shadow of doubt over the media quality analysis we use here at CD Freaks and on other forums.
As such, I think this is an important issue, and more people than just me should try to get to the bottom of it. That means: HELP!
If we truly want a useful homebrew way of doing C1/C2 error count tests, then we need those tests to be as reliable, repeatable, and generalisable (across various reading drives) as possible.
My above tests (if correct!) do not, IMHO, inspire much confidence in the results we can currently achieve with homebrew methods.
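Repeatability, at least, is something we can measure with the homebrew tools themselves: scan the same disc several times on the same drive and look at the spread of the totals. A minimal sketch of that idea in Python, with entirely invented numbers:

```python
from statistics import mean, stdev

# Invented total C1 counts from five scans of one disc on one drive.
scans = [1520, 1480, 1610, 1455, 1575]

avg = mean(scans)
spread = stdev(scans)
cv = spread / avg  # coefficient of variation: relative repeatability
# A small cv (a few percent) would suggest scans on that drive are
# repeatable; a large one means a single scan is not comparable to
# another, even on the same hardware.
```

This only measures consistency of one drive with itself, of course; it says nothing about which drive is closer to the truth.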
“It’s also true that the normal variation between scans can account for considerable differences in results.”
Again, I agree completely. To restate my position: if the differences are more than an order of magnitude, swinging from no C2 to more than a few C2, and the time-based distribution of errors is uncorrelated from drive to drive, then I think the results are not very trustworthy (or, to be more precise, we don’t know which result, if either, is to be trusted).
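One homebrew way to put a number on “uncorrelated from drive to drive” would be to bin each scan’s C1 counts over the disc’s playing time and compute a correlation coefficient between the two drives’ bins. A minimal Python sketch (all counts invented; this assumes you can somehow export per-second counts from the scanning software into a list, which neither tool necessarily supports directly):

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Invented per-second C1 counts from two scans of the same disc.
plextor_c1 = [3, 5, 2, 8, 40, 6, 4, 3]
liteon_c1  = [1, 2, 1, 3,  2, 2, 9, 1]

r = pearson(plextor_c1, liteon_c1)
# |r| near 1 => the drives flag errors in the same places on the disc;
# r near 0  => the error distributions are essentially unrelated.
```

If the two drives were seeing the same physical defects, the error peaks should line up in time even if the absolute counts differ.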
If Plextor reports E22 as C2, then the KProbe drive would have to report E22 as C1 for there to be no C2 errors on the LiteOn scans (all other things being equal).
However, if that were the case, Plextor would count E11+E21+E31 as C1, while LiteOn would count E11+E21+E31+E22 as C1. While possible, I don’t think this is likely, because the LiteOn C1 levels are sometimes even an order of magnitude smaller than the Plextor ones.
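To make the two hypothetical conventions concrete, here is a rough sketch (all event counts invented; the Exy notation is the usual CIRC one, where E11/E21/E31 are first-stage events and E22 is a second-stage event) showing that folding E22 into C1 shifts the C1 total only by the small E22 count:

```python
# Hypothetical per-scan CIRC error-event totals (invented numbers).
events = {"E11": 1200, "E21": 300, "E31": 40, "E22": 5}

def c1_c2(events, e22_counts_as_c1):
    """Return (C1, C2) totals under one reporting convention."""
    c1 = events["E11"] + events["E21"] + events["E31"]
    c2 = events["E22"]
    if e22_counts_as_c1:
        # Hypothesis for the LiteOn/KProbe side: E22 folded into C1.
        return c1 + c2, 0
    # Hypothesis for the Plextor side: E22 reported as C2.
    return c1, c2

plextor_style = c1_c2(events, e22_counts_as_c1=False)  # (1540, 5)
liteon_style = c1_c2(events, e22_counts_as_c1=True)    # (1545, 0)
```

If anything, folding E22 into C1 would make the LiteOn C1 slightly *larger* than the Plextor C1, which is the opposite of what the scans show.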
I also agree that we are more than likely comparing apples and oranges.
My point, however, is not to try to say X is better than Y (though it may come to that later on), but to find out:
- Which method (if any) is trustworthy for us hobbyists to compare the media quality of CD-R discs at the C1/C2 error count level?
- How far-reaching are the conclusions or generalisations we can draw from a C1/C2 test (using either of the above hardware+software combos), and can they be indicative of read performance on most other drives? If not, it largely takes the meaning away from testing media quality for anything except personal use on the particular drive on which it was tested.
Again, I will say that I’m not an expert on the issue, I may have made mistakes, and I’m not here to start a childish flame war about which drive is better.
I want to determine a reliable way to test C1/C2 counts, and whether we can trust and generalise these results to a broader range of CD-ROM/CD-RW drives.
And I need help from those of you who own both a Plextor Premium and a LiteOn CD-RW drive (and have some extra time).