Cdfreaks presents: CATS vs. Home-made scans


#1

The idea of this article is to compare disc quality scans obtained using professional tools (AudioDev’s CATS) with those obtained from standard PC drives. We will also discuss the correctness, accuracy and usefulness of these tools when it comes to disc quality measurement. Note that the comments and conclusions of this article reflect the views of CDFreaks and not those of AudioDev.

You can find it here: http://www.cdfreaks.com/article/202

Feel free to comment on the article in this thread.


#2

Thanks for your hard work on this, Jan70. I know you’ve been at this one for a long while.

It’s long been a popular activity at some forums to declare drive A as “better” for scanning than drive B, because it most closely resembles results obtained on a CATS machine. Unfortunately this is very flawed thinking, and I’m glad to see someone injecting a little reality into the discussion. So thanks for pointing out that the only scans that matter are the ones that the end user performs on his own drive, whatever drive that may be.

The only question that can be answered with any scanning is whether disc A is read better than disc B on the drive used. Any other conclusions are pointless and usually flawed. Making assumptions about the “quality” of a particular media or burner based on CATS scans is equally flawed and pointless.

All drives will agree that a truly crappy disc or burn is in fact crappy. Likewise, all drives will agree that a truly outstanding disc or burn is in fact good. Where the disagreements will occur is on those discs that fall somewhere in the “mediocre” category. Which is why we have been saying for a long time that the end user should buy the best media available to him and burn it at the optimum speed for his drive (lowest error rates). Working with mediocre media will drive one insane with scanning and re-scanning, trying to figure out whether a disc is good or not. Mediocre burners will have the same effect.


#3

Most of the credit for this article goes to Spath, for his knowledge and expertise!


#4

Thank you very much for doing this test! This is a very important comparison that I hoped the German magazine c’t would do (they have a wealth of CATS scans available) - but now that you’ve put the information on the web, it’s even better!

This illustrates the wealth of information that CATS scans can give in addition to what consumer drives will report, and it puts the results in proper perspective.

Thank you also for explaining the individual parameters measured. One question: I assume that the x-axis on the AudioDev scans is radial position in mm. Is that correct?

My personal conclusion is that consumer drives (particularly LiteOn) are more error tolerant than the calibrated Pulstec drive, which is normally good, but this has to be taken into account when interpreting error scans.

I will continue to scan every DVD I burn (and look at c’t tests with CATS results).

G


#5

Well, congrats on having survived working with Spath. :wink:


#6

Thanks for a good job, Jan70! It’s very helpful for a correct understanding of home-made scans.
One question: why has the Plextor PIF test (SUM1) been left out?


#7

Great article guys and very informative.

When I saw the wealth of results from CATS scanners, my first thought was “I want one of those”. I then very quickly concluded I couldn’t afford one.
What I found really interesting was that even with very good quality media (Taiyo Yuden), the burns were still not good enough to pass all the tests on the CATS units.

Regarding the “home user scans”: as stated in the article, although CATS and home scanning didn’t always agree on the quality scores, they did agree on the two most important factors, in that they both agreed when a disc was good and when a disc was bad. As a home user who has scanned a good number of discs, I’m glad all those hours spent scanning have some value.


#8

Very interesting, but no good examples of an “in the middle” situation…

The good are good, the bad are bad, and all platforms seem to have little difficulty showing it.

In some other tests, the “professional” and home tests have tended to mostly agree in shape but not numbers, with a good many instances where they agree on neither.

Does it also highlight the old suggestion that the LiteOn is a poor burner, but covers it up by being an excellent reader? It got to the end of discs that other drives found unreadable.


#9

I want a CATS too :)!!! I never knew it was that smart - this is the first time I’ve seen such a scan, and gee, it’s amazing how confusing the terms are. But thanks to you, I learned something… and as a final question: do you actually OWN a CATS machine?


#10

Great article guys. Lots of good knowledge on CATS lower level results. Also sane conclusions (what else would I expect from you).

I did a similar study myself a bit over a year ago (PlexTools/KProbe vs CATS on CD-R discs) and the results were very similar in terms of scan consistency, the drives’ varied ability to read really bad discs, etc.

I’m so glad you managed to do this for DVD discs; now we don’t have to rely on pure conjecture anymore.

If you have the possibility, please continue with similar article ideas in the future.


#11

First, congrats on the excellent article. :bow:
While the article is really good, I do have one complaint:
I didn’t like the way the definitions are made up at the end.

While I can see that you took your time to come up with something decent that people could understand, I believe you should have used the IUPAC definitions (the standard in the industry/scientific world) of accuracy (of measurement / of equipment) and precision, and used them in the right context.
I think this is a small shame compared with the rest of the article, which is really excellent, detailed and well written - something I don’t see every day.
In other words, the rest of the article put the quality bar very high and then I bump into this. OUCH.

Still, for people who don’t deal with measurements on a near-daily basis, the article doesn’t lose its quality; this is just a matter of completeness and correctness.

So next time, have a look around for existing scientific definitions you might use, instead of taking quite some time to come up with something decent yourself, which will do for most folks but just isn’t the same for the people who work in the field.


#12

An interesting article indeed which does show that home testing of recorded DVD media has its value (but also its limits) for end-users …


#13

Very nice article, thanks.


#14

I have never heard of these IUPAC conventions before, and certainly not in the optical storage industry, and this is why I defined my own terms. The only IUPAC I found by googling is related to chemistry; is that what you’re talking about?


#15

IUPAC is indeed an international organization that creates standards for chemistry (they are best known for their chemical standard numbering system).

I guess he was rather referring to international standardization systems such as ISO or EN (which is rather pan-European), which contain standards defining exact procedures for determining a measuring system’s parameters such as repeatability, precision and measuring limits. These parameters are mostly of interest in scientific (especially chemical) analysis… maybe that’s why he confused it with IUPAC (IUPAC makes use of these standards as well).


#16

very nice
Thanks Mike


#17

For the test DVD-R scanned on the CATS, which writer and which writer firmware were used?

Btw, interesting test!

Pitti


#18

Besides sending it to AudioDev before publishing, we also sent the article to the authors of the software tools, and they didn’t come up with the terms you are referring to either. I’m afraid that if they exist, they are hardly used in the field we are playing in.


#19

Praise The Almighty, I just made a wish for such a thread a few days ago in another thread, and it came true within twenty-four hours. I thank you and the moderators for doing so.

The comparisons between different devices in disc quality testing are most welcome and long awaited. This is an excellent article for amateurs. However, there is still room for improvement.

First, the average values reported by some of the testing programs are incorrect. PlexTools erroneously calculated the PIE average as SUM1 even though SUM8 was indicated on the graph. It is better to use PxScan in this respect at least. On the other hand, KProbe 2.4.2 miscalculated the PIF average as SUM8 even when SUM1 was chosen for scanning. To get correct results, KProbe 2.4.0 should be used instead. Although Nero CD-DVD Speed may also be used in this case, its PIF averages are also incorrectly calculated when the scanning interval is different from the sample length. Although average values are not the most important numbers in error scanning, it is preferable to be correct in an article aiming for excellence.
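To make the SUM1/SUM8 point more concrete, here is a minimal Python sketch of the two averaging conventions. It is an illustration only, not how PlexTools, KProbe or CD-DVD Speed work internally; the per-ECC-block counts are invented and the non-overlapping 8-block grouping is an assumption made for simplicity.

# Python sketch: why dividing by the wrong number of samples skews an "average"

def sum8_windows(per_block_counts):
    # Group per-ECC-block PI error counts into consecutive, non-overlapping
    # 8-block windows (the PIE/SUM8 reporting interval).
    return [sum(per_block_counts[i:i + 8])
            for i in range(0, len(per_block_counts) - 7, 8)]

# Hypothetical per-ECC-block PI error counts from a scan (8000 ECC blocks)
pi_per_block = [3, 1, 0, 2, 5, 0, 1, 2] * 1000

windows = sum8_windows(pi_per_block)
total_errors = sum(pi_per_block)

# PIE average in the SUM8 convention: total errors per 8-ECC-block window
pie_avg_sum8 = total_errors / len(windows)       # 14.0 per 8 ECC blocks

# The same total divided per single ECC block (SUM1 convention) comes out
# 8x smaller, which is the kind of mismatch described above for the PIE average
pie_avg_sum1 = total_errors / len(pi_per_block)  # 1.75 per ECC block

print(pie_avg_sum8, pie_avg_sum1)

The same factor of eight applies in the other direction when a per-ECC-block figure like PIF is averaged over 8-block windows.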

Secondly, the scanning speeds used in the cases of the LiteOn and the Philips drives gave the highest possible PIE/PIF counts for each of them, whereas the speed chosen for the Plextor drive gave the lowest counts. It is better to use 12X or 8X with the Plextors.
As I have stated in the thread mentioned above, I am also doing comparisons between CATS and some PC drives, and I hope that the same parameters are adopted to make it easier to read different reports.

I shall post more critical comments later if your team does not object.


#20

I second dakhaas on the usage of the terms he mentioned. In short, precision is the word to use instead of consistency, and accuracy means closeness to correctness. These terms are used not only by IUPAC, but also universally by other fields of science when talking about any type of measurement. You need only pick up some college textbooks containing a chapter about measurement for a refresher.
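To make that distinction concrete in scanning terms, here is a minimal Python sketch; all numbers are invented, with a CATS result standing in as the reference value.

from statistics import mean, pstdev

cats_pie_max = 96                       # hypothetical reference value from a CATS scan
drive_pie_max = [118, 121, 117, 120]    # hypothetical repeated scans on one PC drive

# Precision: how tightly repeated measurements cluster (what the article calls consistency)
precision = pstdev(drive_pie_max)

# Accuracy: how close the measurements are to the reference (bias vs the CATS value)
accuracy = mean(drive_pie_max) - cats_pie_max

print(f"precision (std dev): {precision:.1f}")      # ~1.6
print(f"accuracy (bias vs CATS): {accuracy:+.1f}")  # +23.0

A drive can be very precise (near-identical results on every rescan) and still not be accurate, which is exactly why the two terms shouldn’t be mixed up.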