Plextor Premium + PlexTools Pro vs LiteOn / KProbe - round 2 (WARNING: BIG!)

I did a bunch of burns and tests to gauge the differences the two drive/software combos produce when measuring C1/C2.

NB! The scales on the images/graphs change from one picture to another. Don’t compare them visually without calibrating for the differences in your head.

I first tested a couple of problematic media:

Yamaha CRW3200 burn at 8x on TDK Metallic 48x

LiteOn 48246S (using latest 52246S firmware) at 24x:

Looks ok, if not the best possible.

Plextor Premium (using firmware 1.02) at 24x:

Four times the average/total C1, but no difference in C2.

LiteOn 48246S burn at 16x on TDK Metallic 48x


Looks ok, not great.


About the same distribution, almost three times the average/total C1 count. No difference in C2.

Plextor Premium burn at 40x on TDK Metallic 48x


Again decent, but not good.


Same difference, almost three times as much C1, no difference in C2.

Yamaha CRW3200 burn at 24x on Platinum 48x


Mediocre or worse results, but no C2.


Again, three times as much C1 and a few C2 errors that are not on the LiteOn scan.

LiteOn 48246S (with 52246S firmware) burn at 52x on MMore 52x


Quite a bit better, but still not good. No C2 though.


The trend continues, almost three times as much C1, but no C2 difference this time.

Then I decided to try some more of the ‘good stuff’ (from this point on, Plextor Premium firmware 1.03 had been released and was active for burns and read tests):

Plextor Premium burn at 40x on Verbatim DatalifePlus (printable surface) 48x


Very nice results. A decent candidate for archival media (I would like better scratch resistance on the label side, but that’s another issue altogether).


Still a fair result, even though the C1 count has increased to almost 3.5 times as much as in the LiteOn test.

LiteOn 48246S (52246S firmware) burn at 52x on Verbatim DatalifePlus (printable surface) 48x


The same Verbatim disc, now burned on LiteOn at 52x. Still measures well, imho.


Again, c. 2.6 times as much C1 when measured in the Plextor Premium. Still ok though.

Plextor Premium burn at 32x on Plextor 48x


Similar results to the Verbatim DataLifePlus, although with a more constant error distribution. Very good result.


Again four times as much C1, but no difference in C2. Still good results, imho.

LiteOn 48246S (52246S firmware) burn at 52x on Plextor 48x


The same disc type that was burned on the Premium at 32x (due to PoweRec) is now burned at 52x on the LiteOn. It measures about the same as when burned on the Plextor. Very good, that is.


The Plextor didn’t like the burn nearly as much. The C1 count is almost seven times as high.

<phew>

Now, before I comment on anything myself and sway your thoughts elsewhere…

Any comments?

regards,
Halcyon

Why do you include the C2 graphs since they are empty (no errors)??? :confused:

I find it very difficult to absorb all the images; I think this sort of comparison begs for a graph of the numerical values without the images.

As for the results, I have a little trouble with your characterization of some of them as “not good”. I put more weight on the max value than on the total error count. The average count may be a good compromise, but based on the max counts, all of these discs are in very good shape.

Why C2? Out of habit, sorry for the waste of space. If you want to do them again without the C2 graphs, please go ahead :slight_smile:

My characterisation of burn quality (for a particular test drive) based on the C1/C2 results may be too harsh, yes, but it’s consistently based on the average C1 count and the lack of C2.

I only reported C1 counts to illustrate the differences between the tests (although they are naturally evident in the averages as well).

Summary of the above test results

  1. Plextor Premium with PlexTools Pro gives 2.5-6.7 times the number of C1 errors (also reflected in the avg C1/sec counts) compared to LiteOn 48246S with KProbe (see the sketch after this list).

  2. Sometimes (on media that is not the absolute best) Plextor Premium + PlexTools Pro report C2 errors while LiteOn + KProbe report none.
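
To be clear about how I derived the ratios in point 1, here is a minimal sketch; the totals are placeholders, not the actual scan values:

```python
# Minimal sketch of how the ratios in point 1 are computed. The totals
# below are placeholders, not values from the actual scans.

def c1_ratio(premium_c1_total: int, liteon_c1_total: int) -> float:
    """Ratio of total C1 counts: Premium/PlexTools vs LiteOn/KProbe."""
    return premium_c1_total / liteon_c1_total

# Hypothetical totals for the same disc scanned in both setups:
print(f"{c1_ratio(13400, 2000):.1f}x")  # 6.7x, the high end of the observed range
print(f"{c1_ratio(5000, 2000):.1f}x")   # 2.5x, the low end
```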

Any more comments?

regards,
Halcyon

Halcyon:

With the speed set at 24x, KProbe v1.1.14 is probably reading at 16x. Similarly, it’s not clear what 10-24X CAV means for Q-Check.

To clarify these issues, could you use a stopwatch to measure how long it takes for KProbe and Q-Check to read a given disc? For KProbe, the timing should start when you see the “Testing …” message at the bottom; similarly, for Q-Check you should try to skip the spinup.

It would be best to use discs without C2 errors, and to tell us their MSFs from Q-Check.
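
For reference, the arithmetic behind this request is simple. A small sketch, assuming standard CD addressing (75 sectors per second at 1x); the 74-minute disc in the example is hypothetical:

```python
# Sketch of why the MSF matters for the timing comparison, assuming
# standard CD addressing (75 frames/sectors per second at 1x).

def msf_to_sectors(minutes: int, seconds: int, frames: int) -> int:
    """Convert an MSF address to a total sector count."""
    return (minutes * 60 + seconds) * 75 + frames

def clv_read_time(minutes: int, seconds: int, frames: int, speed: float) -> float:
    """Expected full-disc read time in seconds at a constant `speed`x CLV."""
    return msf_to_sectors(minutes, seconds, frames) / (speed * 75)

# Example with a hypothetical 74:00:00 disc:
print(clv_read_time(74, 0, 0, 16))  # 277.5 s, i.e. about 4:38 at true 16x CLV
```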

Although I agree that a summary (as requested by rdgrimes and provided by Halcyon) is nice to have, I also appreciate having the original charts. Looking at them, I am actually struck by how similar the results are, even though they were measured on different drives using different test software.

Yes, the Premium error values are on average about 3 times the magnitude of the LiteOn results (and closer to about 2-2.5 times for the peaks). However, the structure of the measured errors is mostly very similar. The average error levels tend to rise and fall together, and the peaks line up very well.
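
One way to put a number on that structural similarity would be to correlate the two series after resampling them to a common length; a rough sketch (the input arrays stand in for exported per-second C1 data):

```python
# Rough sketch: quantify how well the two error curves' structure agrees,
# independent of the ~3x scale factor. The input arrays are placeholders
# for C1-per-second data exported from the two scans.
import numpy as np

def structural_similarity(premium_c1, liteon_c1):
    """Pearson correlation of two C1 series resampled to a common length."""
    n = min(len(premium_c1), len(liteon_c1))
    grid = np.linspace(0.0, 1.0, n)
    a = np.interp(grid, np.linspace(0.0, 1.0, len(premium_c1)), premium_c1)
    b = np.interp(grid, np.linspace(0.0, 1.0, len(liteon_c1)), liteon_c1)
    return float(np.corrcoef(a, b)[0, 1])

# A correlation near 1.0 would confirm what the stretched graphs show
# visually, since a constant scale factor does not affect the correlation.
```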

I have stretched several of the Premium graphs to match the size of the KProbe graphs in order to better illustrate this observation:

As KCK pointed out, there is a bug in KProbe 1.1.13 and 1.1.14 that causes it to test at one speed step lower than the one it is set to. Thus, when you set KProbe to 24x (which would be CAV on an LTR-48246S or LTR-52246S), you were actually testing at 16x CLV. See the following thread (started by KCK and inspired by Halcyon):

http://www.cdrlabs.com/phpBB/viewtopic.php?p=71407#71407

On my LTR-48246S, switching from a true 16x CLV read speed to a true 24x CAV (~11x-24x) read speed nearly tripled the average error measurement on the same disc. The total variation over the entire 4x-48x measurement range on my drive was about 5x. This just points out that there are many factors that affect the measured error levels, and read speed is one of them.
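
Purely as an illustration of what that implies: comparing scans made at different speeds would require per-drive correction factors measured empirically, along the lines of the entirely made-up factors below.

```python
# Hypothetical illustration only: the speed dependence is clearly not linear
# (16x CLV -> 24x CAV nearly tripled the average on my drive), so comparing
# scans made at different speeds would need empirically measured, per-drive
# correction factors such as these invented ones.

SPEED_FACTOR = {16: 1.0, 24: 2.9, 48: 5.0}  # avg C1 relative to 16x CLV (made up)

def normalize_avg_c1(avg_c1: float, test_speed: int) -> float:
    """Scale an average C1 reading back to its approximate 16x equivalent."""
    return avg_c1 / SPEED_FACTOR[test_speed]

# E.g. an average of 8.7 C1/s measured at 24x would correspond to ~3.0 at 16x:
print(f"{normalize_avg_c1(8.7, 24):.1f}")  # 3.0
```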

You might get better agreement if you test on KProbe 1.1.14 with the speed set at 32x (to give a true 24x CAV measurement) or you may not. But either way, given the good agreement on error distribution, I would not be so discouraged over a scale factor of generally 2.5-3.5 between the Premium and the LiteOn measurements. If we can pin down the cause and say definitively that it is due to a scaling error in one of the test suites or improved reading ability of the 6S drives, then great. If we can’t pin it down, then we can live with the difference and simply recognize that it exists.

I have always said that these tools are best used for doing a relative comparison of burning quality on various media burned at various speeds in one’s own drive and are not meant to supplant professional media testing equipment.

cfitz

Here is one example of a disc tested in a Premium and a LiteOn, both at 24x CAV reading, where the measurements matched up unbelievably well, both in structure and in average and peak values:

http://www.cdrlabs.com/phpBB/viewtopic.php?p=71511#71511

cfitz

KCK,

Thanks for that information. I had read your warning earlier, but I had already performed most of the tests by then. I will do the next tests at a speed that measures comparably in time. I will provide some data on the KProbe read speeds as soon as I can.

cfitz,

I completely agree on the time-based distribution of the data. The results are very similar for these burns. Elsewhere I have provided data from a test where even the time-based distribution is very poorly correlated between the two setups:

http://www.cdrinfo.com/forum/topic.asp?TOPIC_ID=10628

I think this goes to show that the two test setups do not always produce comparable results, but it doesn’t tell us which result (if either) is a more trustworthy indicator of the general error-free readability of the disc (issue 1).

> I have always said that these tools are best used for doing a relative comparison of burning quality on various media burned at various speeds in one’s own drive and are not meant to supplant professional media testing equipment.

Oh, I almost agree with that, with some minor constraints. I think I have grounds for this slight disagreement as well (I’m not just doing it out of the joy of argumentation, even if it may appear as such) :slight_smile:

To be as exact as I can be, considering the information available to me now and using my limited reasoning, the test results are good for:

Determining the comparative error-free readability of burned discs on the drive in which those discs were tested; for any other drive the results may or may not be similar (they may correlate when the discs are read in another drive, or they might differ very significantly).

If we want to use KProbe results in this and other forums as an indicator of general error-free readability on an unknown modern drive X (as they are used by almost everyone!), then we should at least have some data to support our decision to do so (issue 2).

This is what I’m after personally, because I think it is an important issue for archival and storage use of data that needs to remain readable for 10+ years in a variety of drives (issue 3). Of course, with long-term archival there are other issues beyond our testing capabilities today, but let’s not get hung up on them right now (that’s a completely different matter).

Also, I slightly disagree with the “are not meant to supplant professional media testing equipment” part as well, because c’t Magazine is the only source I know of that has actually compared Premium + PlexTools Pro C1/C2 data with dedicated/calibrated test gear (AudioDev) and found a reasonably good match between the C1/C2 results from the two setups.

As such, I think the Plextor Premium and PlexTools Pro might be a reasonable homebrew alternative for giving a rough indication of the general error-free read quality of a disc in a variety of drives. However, the results are far from conclusive yet, and this is why I raised the question for all the experts in this forum.

Friendly regards,
Halcyon

PS English is not my native tongue, and if I come across as combative or rude, please accept my humble apologies and be sound in your belief that it was not my intention. I want to get as deep into this matter as possible, and that may take a few reasoned disagreements, but that does not mean I’m not trying to understand others’ arguments or that I think I’m infallible myself :slight_smile:

EDIT: The incorrect reading speed issue seems to affect KProbe 1.1.14 (and perhaps earlier versions) and appears to be corrected from KProbe 1.1.16 onwards.

Here are the results of the tests KCK asked for regarding read differences and timings.

I first measured a Philips SilverSpeed 32x (Ritek Corp) disc, written with the LiteOn 48246S at 32x, on the Plextor Premium at 10-24x CAV:

The whole test took c. 4 min 36 s to complete, while the initialisation took c. 10 seconds (the lap time in the timer).

Then I measured the exact same disc on the LiteOn 48246S with KProbe set to 32x speed:

Time to complete the test was 4:50 and initialisation again c. 10 seconds.

I also tried the next higher read speed that the LiteOn 48246S supports in hardware: 40x resulted in a test time of 3:33 (initialisation of 10 secs).
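
As a quick sanity check on those timings, here is a back-of-envelope calculation. The disc length was not recorded, so a 74-minute disc is assumed purely for illustration:

```python
# Back-of-envelope check on the timings above. The disc length is not
# stated, so a 74-minute disc is assumed here purely for illustration.

def avg_read_speed(disc_minutes: float, test_seconds: float) -> float:
    """Average read-speed multiple implied by a full-disc test duration."""
    return disc_minutes * 60 / test_seconds

print(f"{avg_read_speed(74, 4 * 60 + 36):.1f}x")  # ~16.1x: Premium, 10-24x CAV
print(f"{avg_read_speed(74, 4 * 60 + 50):.1f}x")  # ~15.3x: LiteOn, KProbe at 32x
print(f"{avg_read_speed(74, 3 * 60 + 33):.1f}x")  # ~20.8x: LiteOn, KProbe at 40x
```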

I think KCK is right: with no C2 errors, the LiteOn test speed with KProbe is roughly equal to the Premium + PlexTools test speed if KProbe is set to test at 32x.

Now that the times are roughly the same, the total, average, and max C1 counts remain about 3 times as high in the Plextor Premium (+ PlexTools Pro) test as when tested on the LiteOn with KProbe.

regards,
Halcyon