PlexTools vs LiteOn (Premium vs 48246S)


#1

Anybody have both of these drives?

Have you tried testing the C1/C2 levels (write quality of CD-R discs) of the same discs on both drives at the same speed (24x)?

If so, what kind of results are you getting?

I'm getting VERY different results testing the same discs on a Plextor Premium + PlexTools Pro 2.05 compared to a LiteOn 48246S + KProbe 1.1.14.

I see differences in avg/max C1 (more than an order of magnitude), in the presence of C2 errors, and in the distribution of C1 as a function of disc time.
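
To make that concrete, here's roughly how I summarise the per-second C1 logs from the two scans (a minimal Python sketch; the file names and the one-count-per-line format are just assumptions, since I copy the numbers out of each tool by hand):

[code]
# Summarise per-second C1 counts from two scans of the same disc.
# Assumes one integer count per line; the file names are hypothetical.

def load_counts(path):
    with open(path) as f:
        return [int(line) for line in f if line.strip()]

plex = load_counts("premium_plextools_c1.txt")
liteon = load_counts("48246s_kprobe_c1.txt")

for name, counts in (("Plextor/PlexTools", plex), ("LiteOn/KProbe", liteon)):
    total = sum(counts)
    print(f"{name}: total C1 = {total}, "
          f"avg = {total / len(counts):.2f}/s, max = {max(counts)}/s")
[/code]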

The results can be very different and I don’t know which one to trust.

Anybody else get significantly differing results?

regards,
Halcyon


#2

Some tests are posted in this thread at the CDR Info forums:

http://www.cdrinfo.com/forum/topic.asp?TOPIC_ID=10628

I'm starting to have doubts about the usefulness of C1/C2 testing with either KProbe or PlexTools - or possibly with both of them.

KProbe (on the LiteOn 48246S) seems to miss C2 errors and gives really low C1 error counts.

regards,
Halcyon


#3

These are two different drives, with different hardware and software being used for their tests, so to say that one drive is missing something that the other picks up is a conclusion that you can't really jump to. Another conclusion you can draw from these tests might be that the Lite-On is the superior reader and is able to correct more errors, so it's hard to say what's going on.

Maybe you should also try using Nero CD Speed's CD quality test on both drives, since both are capable of running this program, and then also throw in a third drive if you can for comparison against the other two.

BTW, you should note which drive you used to burn the discs for all of the tests.


#4

At this point, we don't know what either drive is reporting and how it's being interpreted. Neither testing tool is designed to test a drive, but rather to compare discs. If you want to compare the drives, you need a 3rd drive for reading and a 3rd tool for testing.

It's also true that the normal variation between scans can account for considerable differences in results. For example, KProbe skips a section when it encounters a read error, so the errors in that section are not counted. Discs that produce C2 level errors are apt to have very different results in consecutive scans. When C2 errors are present, I use CDSpeed to test and watch the read speed as much as the error rate.

The bottom line is that any disc that shows C2 level errors has failed the test, and a disc with relatively high C1 is highly suspect.

One possible interpretation of your posted scans is that the Plextor is a much less capable reading drive than the LiteOn.
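
(To put that pass/fail rule in concrete terms, here is a minimal Python sketch; the C1 threshold of 20/s is an arbitrary placeholder for "relatively high C1", not an established limit:)

[code]
# Pass/fail heuristic for one scan: any C2 fails outright,
# and a high average C1 rate makes the disc suspect.
# The 20 C1/s threshold is an arbitrary placeholder.

def judge_scan(c1_per_sec, c2_per_sec, c1_suspect_avg=20.0):
    if any(c2_per_sec):
        return "FAIL (C2 errors present)"
    avg_c1 = sum(c1_per_sec) / len(c1_per_sec)
    if avg_c1 > c1_suspect_avg:
        return f"SUSPECT (avg C1 = {avg_c1:.1f}/s)"
    return f"PASS (avg C1 = {avg_c1:.1f}/s)"

print(judge_scan([5, 3, 8, 2], [0, 0, 0, 0]))  # PASS (avg C1 = 4.5/s)
[/code]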


#5

Plextor discloses what the Q-Check C1/C2 Test reports in the PlexTools Professional help file.

[b]C1: indicates BLER, which is the number of E11+E21+E31.
C2: indicates the number of E22.
CU: indicates the number of E32.

These are the number of frames that have correctable or uncorrectable errors, counted per second.
[/b]

To my knowledge, KProbe's author, Karr Wang, has not disclosed exactly how measurements are reported. KProbe may be reporting E32 errors as C2 errors. If so, this would account for the absence of C2 errors in KProbe compared to PlexTools Professional Q-Check. C1 errors in KProbe might only include E11 + E21 (speculation), which would then cause the C1 error count to be lower in KProbe.
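
To illustrate how much the bucketing alone could matter, here's a minimal Python sketch applying the two mappings to the same raw per-second E-counts (the numbers are invented, and the KProbe mapping is purely the speculation above, not anything confirmed):

[code]
# Same raw frame error counts for one second, bucketed two ways.
# The Plextor mapping is from the PlexTools help file; the "KProbe"
# mapping is only the speculation in this post, not confirmed.

raw = {"E11": 14, "E21": 3, "E31": 1, "E22": 2, "E32": 0}

plextools = {
    "C1": raw["E11"] + raw["E21"] + raw["E31"],  # BLER
    "C2": raw["E22"],
    "CU": raw["E32"],
}
kprobe_guess = {
    "C1": raw["E11"] + raw["E21"],  # speculative: E31 not counted?
    "C2": raw["E32"],               # speculative: E32 shown as C2?
}

print(plextools)     # {'C1': 18, 'C2': 2, 'CU': 0}
print(kprobe_guess)  # {'C1': 17, 'C2': 0}  <- no C2 on the same data
[/code]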

My point is unless we know exactly what [b]both[/b] of these two different programs are measuring and how to correlate them (if possible), we are comparing apples and oranges.


#6

Let me try to address some of the good comments this thread has received.

AZImmortal: " to say that one drive is missing something that the other picks up is a conclusion that u can’t really jump to"

Oh, I perfectly agree. My bad choice of words.

What I meant was that the results differ.

It is impossible with this data to deduce which is more accurate.

What is interesting, however, is that c't Magazine tested the Plextor Premium's C1/C2 data and found it to be in line with results from calibrated, dedicated AudioDev test equipment.

To me this suggests that the Plextor results may be more trustworthy. However, it's far from proof, I admit.

AZImmortal: "Another conclusion you can draw from these tests might be that the Lite-On is the superior reader and is able to correct more errors, so it's hard to say what's going on."

The conclusions I can possibly draw (now forgetting the comparative AudioDev results for a while) are:

  1. The Plextor is less accurate in reading the disc (tracking/focus/servo performance).
  2. The Plextor is less capable in error correction in its CIRC circuitry (not up to the theoretical maximum of ~4000 consecutive bits).
  3. KProbe is less accurate in using LiteOn drives to read the discs and misses some errors completely.
  4. Some combination of the above three.

Currently I cannot come up with any other explanations than these.

AZImmortal:
Maybe you should also try using Nero CD Speed's CD quality test on both drives, since both are capable of running this program, and then also throw in a third drive if you can for comparison against the other two.

I will run the CD Speed Scan Disc test next, but I don't have a third drive to test with right now. I'll see if I can solve that somehow.

BTW, you should note which drive you used to burn the discs for all of the tests.

Actually, the file name of each image in my post states the burning drive, burning speed, media and its speed rating, testing drive, and test speed. I just forgot to mention this.

The first discs were burned with the LG4480b and the LiteOn 48246S.

This shows that it's not just a matter of the burning drive reading its own discs better, as there are differences even when the burning drive was neither of the test drives.

I will also try a third drive (Yamaha CRW3200) today and do more tests.

rdgrimes:
At this point, we don’t know what either drive is reporting and how it’s being interpreted

Completely agree.

However, regardless of that, it is quite possible (given the more than order-of-magnitude differences) that one of the drives, or both (plus the software used with them), is NOT to be trusted for media quality analysis at the C1/C2 level.

If this is true, then it throws a shadow of doubt over the media quality analysis we use here at CD Freaks and on other forums.

As such, I think this is an important issue, and more people than just me should try to get to the bottom of it. That means: HELP! :)

If we truly want a useful homebrew way of doing C1/C2 error count tests, then we need those tests to be as reliable, repeatable and generalisable (across various reading drives) as possible.

My above tests (if correct!) do not, IMHO, give a very confident view of the current results we can achieve with homebrew methods.

rdgrimes:
It’s also true that the normal variation between scans can account for considerable differences in results.

Again, I agree completely. To restate my position: if the differences are more than an order of magnitude, and swing from no C2 to more than a few C2, with the time-based distribution of errors being uncorrelated from drive to drive, then I think the results are not very trustworthy (or, to be more precise, we don't know which result, if either, is to be trusted).
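
(To check how uncorrelated the time-based distributions really are, a simple correlation over the per-second series would do. A minimal sketch, assuming both scans are exported as aligned per-second C1 counts:)

[code]
# Pearson correlation between two aligned per-second C1 series.
# A value near 0 means the drives disagree about WHERE on the disc
# the errors are, not just about how many there are.

def pearson(xs, ys):
    n = min(len(xs), len(ys))
    xs, ys = xs[:n], ys[:n]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

print(f"r = {pearson([0, 2, 9, 1, 0, 7], [3, 0, 0, 4, 5, 0]):.2f}")
[/code]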

Inertia,

If the Plextor reports E22 as C2, then the KProbe drive would have to report E22 as C1 for there to be no C2 errors in the LiteOn scans (assuming all other things being equal).

However, if that were the case, the Plextor would count E11+E21+E31 as C1 while the LiteOn would count E11+E21+E31+E22 as C1, so the LiteOn's C1 count should be higher, not lower. While possible, I don't think this is likely, because the LiteOn's C1 levels are sometimes even an order of magnitude smaller than the Plextor's.

I also agree that we are more than likely comparing apples and oranges.

My point, however, is not to say that X is better than Y (though it may come to that later on), but to find out:

  • Which method (if any) is trustworthy for us hobbyists to compare the media quality of CD-R discs at the C1/C2 error count level?

  • How far-reaching conclusions or generalisations can we draw from a C1/C2 test (using either of the above hardware+software combos), and can these generalisations be indicative of read performance on most other drives? If not, it rather takes the meaning away from testing media quality for anything except personal use on the particular drive on which it was tested.

Again, I will say that I'm not an expert on the issue, I may have made mistakes, and I'm not here to start a childish argument about which drive is better :)

I want to find out what a reliable way to test C1/C2 counts would be, and whether we can trust and generalise from these results to a broader mass of CD-ROM/CD-RW drives.

And I need help from those of you who own both a Plextor Premium and a LiteOn CD-RW drive (and have some spare time).

Cheers,
Halcyon


#7

Something else you might want to try is to burn one disc with each drive, and then scan each disc with both drives. This could give some idea about the relative burn quality of each drive.


#8

Halcyon
You may be missing the point to some extent. The scanning utilities are not intended to measure "drive quality". Ultimately, it doesn't matter what they measure, or even how accurately they do it. What matters is that they give us a reasonably consistent measurement for the purpose of A-B comparisons, so that we can make an informed choice about burn speeds and media type. It's more of a "pass-fail" measurement than a qualitative one. Where they differ mostly is with those marginal discs that have some C2 but not a lot. However, you can almost always identify a poor quality disc on any of these tools, regardless of how accurately the numbers are counted.


#9

rdgrimes:
You may be missing the point to some extent

Well, that may be true :)

However, I don't think the measurements I cite are supposed to give a useful indication of a drive's quality.

What I do think they ought to give (to have any practical value) is a rough indication of burn quality on a particular media.

Now, let's take the instance of KProbe + a particular LiteOn drive on particular media (I could use the Plextor as an example too; please don't get hung up on this).

I do believe that KProbe results can be a pretty good indication of the readability of that media on the particular drive on which it was originally tested with KProbe.

That is, if your burns test well in KProbe, then that media is probably good for that drive. Period.

However, I think I have shown that generalising from a single KProbe test on a particular LiteOn to ANY other drive (or to Plextor/PlexTools) might not be that accurate, and as such not that useful.

I think most people here and on other CD-R forums tend to think that IF it reads well in KProbe, then the disc has good burn quality and will read well in other drives as well.

That is, I can burn at XXX speed, and if it reads well in KProbe, then I can send the disc to my brother and he'll have no problems using it.

Of course, this "reads well in other drives as well" is dependent on the reading quality of those other drives (as you suggest and I also believe).

However, what I'm trying to find out is whether there is a useful homebrew method of measuring burn quality in such a way that it gives a rough estimate of the readability of that burn on a multitude of different drives.

I do believe that a disc that measures near-faultlessly on an AudioDev dedicated disc test station (or similar calibrated lab test gear) will read without problems in most CD drives (at least until somebody proves otherwise).

However, I am NOT convinced anymore that a disc that tests almost faultlessly with KProbe/LiteOn or PlexTools/Plextor will read without problems on a multitude of other drives.

Yes, drive read quality is a factor here; hence I'm interested in how far and wide we can generalise from a single burn test done on one drive using one particular piece of software.

Is LiteOn/KProbe a useful test of general burn quality (for drives other than the LiteOn it was tested on)?

Is PlexTools Pro/Plextor Premium a useful test of general burn quality (for drives other than the Premium it was tested on)?

Did I still miss something? If so, please be patient; I'm known to be thick-headed :)

Best regards,
Halcyon


#10

I think you are seeking the Holy Grail. The point I'm making is that you should use the test as a comparison, i.e. A-B: is disc "A" better than disc "B"? In this respect, yes, you can say that if disc A reads better (in any of the tests) than disc B, then disc A will have a much higher likelihood of reading well in other drives also.

Error rates are not really a measure of burn quality per se, but of readability. That distinction is important because there are several other variable factors in having a "good read". Those variables include the media type, read speed, and last but not least the reader itself. The error scans are only checking one thing: the disc's performance in the reading drive.


#11

I know burn quality <> error-free reading, and I'm not looking for the Holy Grail :)

I have several authoritative sources on my desk, and I understand the problems and the statistical nature of CD reading. Still, media are tested all over the world with varying methodologies and gear, and what is interesting IMHO is to try to improve the test methodology available to consumers and to find the practical limits of what conclusions can be drawn from it.

I'm just trying to establish whether generalisations are possible and to what degree, plus which test methodology is safest for producing comparative media burn quality results.

I totally agree that KProbe/PlexTools are useful as comparative results (even if not totally reliable at that).

But they are not necessarily (and this is how they are being used by many on almost all forums):

  • a guarantee of what is readable and what is not in another drive
  • a guarantee of what burn speed still produces reliable results

They are only comparative results of what is better than what, and even that might not carry over from one drive to another: i.e. the burn with the fewest errors on drive A might show more errors on drive B than another burn that produced more errors when tested on drive A.
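
To make that concrete, here's a minimal sketch of checking whether the A-is-better-than-B ordering survives a change of test drive (all the numbers are invented for illustration):

[code]
# Do two test drives agree on the RANKING of two burns?
# Total C1 per burn, per test drive; all numbers invented.

scans = {
    "drive_A": {"burn1": 1200, "burn2": 3400},
    "drive_B": {"burn1": 2900, "burn2": 2100},
}

for drive, totals in scans.items():
    best = min(totals, key=totals.get)
    print(f"{drive}: best burn is {best}")

# drive_A: best burn is burn1
# drive_B: best burn is burn2  <- the ranking did not carry over
[/code]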

Again, I'm not trying to shoot down anything or to find the 'perfect ultimate test setup that produces reliable results on an absolute level' or any such silliness :)

What I would like us to do together (I lack the resources to do this on my own) is to find out which combination produces the most generalisable results, and how far these readability results ("readability" being shorthand for the data still being readable, free of unrecoverable errors, at a decent speed) can be generalised. Or whether this is even possible at all.

Of course we can never know these limits precisely, but surely we can improve our current collective knowledge on the matter :)

regards,
Halcyon

PS: More tests to follow; I've been busy with other things. My current personal stance is that a combination (drive+software) that produces more errors in testing is a more conservative, and as such safer, estimate (even if a VERY ROUGH indicator) of how that burn will read in other drives. My personal belief is that the LiteOn gives far too low C1/C2 counts, and that burns which measure excellently in LiteOn/KProbe will not work as reliably in a multitude of drives as burns that pass the PlexTools/Premium combo tests. Then again, further test results might convince me otherwise.


#12

Why not just take things at face value? If one drive shows higher C1 error rates (on the same disc) than another drive, it's just not as good a reader. There is such a thing as over-analyzing. Terms like "guaranteed" and "totally reliable" don't apply here. If you see low C1 rates on any of the available tests, there's no evidence to suggest it's not a perfectly readable disc. Considering that current error rates on CD-Rs are lower than on manufactured discs, this seems like a sensible conclusion.


#13

Halcyon:

Since there are quite a few users who own both Plextor Premium and Lite-On drives, it’s strange that apparently no serious comparisons have been made. Thus your work could be highly useful.

However, before embarking on extensive tests, even apparently obvious things should be checked first, since otherwise considerable time could be wasted.

When I looked at your initial tests, my attention focused immediately on the issue of reading speeds. It seems that KProbe v1.1.14 may read at about half of the speed set by a user; see

http://club.cdfreaks.com/showthread.php?&postid=440797#post440797

http://www.cdrlabs.com/phpBB/viewtopic.php?p=71407#71407

On the other hand, it’s not clear what 10-24X CAV means for Q-Check; see

http://club.cdfreaks.com/showthread.php?s=&threadid=71951

Obviously, it would be useful to run tests with KProbe and Q-Check reading at comparable speeds, so such issues should be clarified first. Further, since KProbe and Q-Check may slow down when certain errors are encountered, it might be good to distinguish test results on discs where no C2 errors occur.
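
Since both tools count errors per second, the read speed directly scales the counts. Here's a minimal sketch of normalising to errors per sector so that scans made at different speeds become comparable (the 75 sectors/second at 1x comes from the CD specification; the rest is an assumption about what the tools log):

[code]
# Normalise per-second error counts by read speed. At 1x a CD is
# read at 75 sectors/s, so a 24x scan covers 1800 sectors each
# second while a 12x scan covers only 900. Raw per-second counts
# from scans at different speeds are therefore not comparable.

SECTORS_PER_SEC_1X = 75

def errors_per_sector(count_per_sec, speed_x):
    return count_per_sec / (SECTORS_PER_SEC_1X * speed_x)

# 36 C1/s at 24x and 18 C1/s at 12x are the same error density:
print(errors_per_sector(36, 24))  # 0.02
print(errors_per_sector(18, 12))  # 0.02
[/code]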


#14

For a truly objective comparison, both drives should be checked against a 3rd drive and a 3rd testing program, like WSES or CDDr. WSES controls read speed very well, as I recall.