Wonder if anybody here has seen my posting in the main forum (it was a lot of work to copy the results).
At least with firmware A100, the 4163B seems to be one of the worse 16x writers from a writing-quality standpoint. This was really surprising to me, as I bought this drive because of the good ratings and quality scans found on the internet.
C’t recommends 4x on MCC004, 4x on YUDEN000T02, 4x on RICOHJPNR02, 4x on MXLRG03, and so on. That’s enough, I think, to conclude that C’t and the AudioDev CATS DVD Pro are not reliable. The LG GSA-4163B was not the only drive with such wild ratings, and this issue of C’t wasn’t the first to publish such results. It was good to see C’t test results in the days of CD-R, though, when practically no consumer could do anything after burning a CD-R disc except try reading it in Windows Explorer or an audio CD player.
I haven’t seen AudioDev representatives posting on CDFreaks to explain how their devices work. I hope C’t editors can post here as well.
That’s a daring statement, that homemade quality scans are more reliable than scans from a multi-thousand-dollar system! I am going to translate c’t’s testing conditions for you if you are interested. Let me first say this here: C’t is a computer magazine with the highest thinkable reputation in Germany. To guarantee quality in this case, all measurements are done at AudioDev in Malmö, Sweden, for a lot of money, including measurements of mechanical disc quality in terms of AxialIN / RadialIN max, RRO (Radial Run Out), PPb (Push-Pull Signal) and PPb DV. I highly doubt that any other computer magazine provides such in-depth information combined with detailed explanations.
I haven’t seen AudioDev representatives posting on CDFreaks to explain how their devices work. I hope C’t editors can post here as well.
I’ll give you some email addresses later. Ask them. They are usually very responsive.
Thanks for your information, but I was already familiar with the things you said. As if multi-thousand-dollar equipment could beat multi-million-dollar equipment or something? Dare? How can you dare to imply C’t can be more reliable than CDFreaks for optical storage tests? If what you said is true, AudioDev in Sweden has done the tests. What’s so phenomenal about it? Prove that C’t is accurate and CDFreaks reviews are unreliable. Within 24 hours, I will place an order for one such piece of equipment from AudioDev on my own behalf, and I will publicly apologize for posting so many useless scans and opinions without knowing better. Nobody has done that yet.
How are you going to explain that MCC004 should be used at 4x? The rich C’t tested just two types of 16x media?
BTW, AudioDev is not a monopoly of C’t magazine. All major companies designing and manufacturing DVD writers have far more professional equipment for media quality tests.
Sorry, but I think most of the reviews we read online are very misleading. The best source of information is forums like this one, which give a much broader view.
I don’t think that AudioDev CATS system has any real benefit to the home user! I don’t play my DVDs on a CATS system and I don’t read my data backup DVDs on a CATS system. I am more interested in whether my burned DVDs will play in my DVD Player and whether my backups will keep my data safe!
I think to arrive at a conclusion as to whether a DVD burner is good or not, you would have to burn hundreds of the same media on tens of the same burner, and then test them in tens of each DVD-ROM and DVD player available, just to be sure you didn’t get one bad disc, one faulty burner, or one faulty player. Reviews are always based on one disc in one burner tested on one drive: too many variables!
Most people are happy if they get a DVD burner, find suitable media, and the burned discs work in their DVD players and their DVD-ROMs!
At the end of the day, places like this are where intelligent people come to find out how good a certain burner is, because the people here are using lots of the same burners with different readers and posting real-world results.
Just my views, but I find real-world tests to be far more important than some specially designed scanning machine that nobody uses to actually play their discs!
I believe most agree with you. Why else do we visit these web forums?
BTW, CATS tests at C’t are not that special. I suspect it’s more about marketing on the part of AudioDev. Again, these are all my guesses and opinions. If you believe otherwise, or know better, you should either correct me or ignore my posts.
I once thought CATS looked like a gigantic machine weighing tons.
Some sentences (from memory; the c’t issue is currently at my office at work and I’m at home) from that c’t article: “…a good quality scan (KProbe, Nero CD-DVD Speed) on a (consumer) drive only assures that this particular disc is well readable in this burner/reader…” and “…most burners are better readers than the AudioDev equipment. But AudioDev equipment is used as a measurement standard in the industry, and it could be that a DVD reader has problems reading a DVD with high (AudioDev) PI even if your burner does not have any problem with that DVD…”
Do you remember the threads about “…my dvd is not playable at a) sony playstation b) this dvd standalone and so on…”?
PS: The MSI burner (hardware identical to the BenQ 1620) was rated better than the BenQ because most of the media were burned with “conservative” settings, e.g. 12x (MSI) instead of 16x (BenQ), and/or 8x (MSI) instead of 12x (BenQ).
Before I bought that drive, I spent a lot of time on the Internet and on this forum as a guest to reach some conclusions. One of them was that the reviews were very contradictory.
One thing someone liked another one didn’t and so on.
No, it’s not used as THE standard. That’s why I said “marketing” on the part of AudioDev. AudioDev probably let C’t use their devices for free and wanted C’t to mention them.
From reading many of the controversial threads on CDFreaks, and other dedicated websites, it seems to me AudioDev CATS is just another reader.
It’s not totally true that a particular disc scanned in a SOHW-1633S with BS41 firmware or a SOHW-1653S with CS09 firmware can be read only in the same drive used for scanning. Quite a few consumers (by which I mean anyone who uses the drives, not necessarily those who are kept from using more professional equipment) are familiar with such things by now. Normally a disc showing good scan results in a SOHW-1213S/1613S/1633S/1653S reads quite well in almost any modern DVD-ROM drive and DVD-Video player. Same for the BenQ DW1620 and the Plextor PX-712A and PX-716A. What about the very smooth 16x CAV reading graphs that resulted from discs burned at 16x P-CAV in the GSA-4163B and read back in an NEC ND-3500A? The NEC ND-3500A is surely just another “consumer” drive, and does a consumer burn a disc to be tested on an AudioDev machine, or to be played back on a Pioneer, LG, Sony, Samsung, or Panasonic DVD-Video player, or a DVD-ROM reader drive produced by Lite-On, LG, BenQ, NEC, Toshiba, etc.? If anything, C’t’s CATS tests could have been far more biased than the broadly produced results posted on the CDFreaks forums. And C’t is more dedicated to general hardware than to DVD media tests, which explains why there were so few and such random tests. In addition, a conventional paper-printed magazine is too slow.
If anyone thinks only C’t or CDRinfo.com can use CATS tests for their reviews, that person is mistaken. Some other sites and individuals can make use of the same device as well, but it makes more sense to concentrate on CD-Speed and K-Probe tests (which doesn’t necessarily mean AudioDev is out of reach). Sanyo’s UM Doctor was also sold to companies, and the software was more expensive than any consumer drive. That was why K-Probe became popular, and later CD-Speed (for PIE/PIF tests).
Actually, discussions on media test methods should be on the media test software forum, though most posts related to the subject these days appear in the various sub-forums of the Recording Hardware forum.
There have been some existing threads on the question: AudioDev CATS vs. K-Probe vs. CD-DVD Speed vs. DVDinfoPro vs. CD Doctor vs. etc. and etc. My answers merely reflect some of the conclusive results from those.
Since you have the device, would you make some discs using different firmware versions and test them, to see whether the large discrepancies between the reports by C’t and by CDFreaks are due to the difference in the firmware used (A100 vs. A101)? I am considering buying the drive, and I have been relying more on the tests by C’t, as it is, to my knowledge, currently the only source using a professional machine for DVD quality testing.
K-Probe and CD-DVD Speed are also used by professionals. That’s a fact, not a guess.
C’t is not the only one that uses CATS for scanning disc quality. The results are so random and rare because of the high cost and the difficulty of using the devices. That means they cannot test discs frequently enough across various firmware versions, various discs, etc.
On the CDFreaks forum, quite a few people participate and post their own scan results. Different drives of each drive model. Different firmware versions. Different media IDs. Different speeds. Different sources. Different scan speeds. C’t’s CATS tests are no competition for that sort of community activity. Take my word for it: those who test media at the media and drive manufacturers are often impressed by the activities done right here.
And of course kaborka cannot answer you, because kaborka does not seem to have a CATS at all. Not even C’t seems to have one, according to kaborka. The discs have to be shipped to a place in Sweden to be tested, and it takes very long to prepare the discs, ship them, and wait for the scan tests to be completed and reported back to the client. Imagine if you wanted to test 100 types of media recorded in 10 different drives, which alone is 1,000 discs. Imagine also that you wanted them burnt at, on average, 2 different speeds each: say, 16x and 8x for MCC004, 4x and 8x for RITEKG05, 8x and 12x for YUDEN000T02, and so on. Imagine you wanted to test on average 2 firmware versions for each combination. That now makes 4,000 discs. Shipping 4,000 DVD discs internationally alone costs a great deal. The scan results are usually monopolized and kept for internal use only, not shared on the forum boards, though they are sometimes posted in reviews like C’t’s and on some online websites.
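Just to spell out the disc count above, here is the arithmetic as a tiny Python sketch. All the figures are the scenario's own assumed test-matrix sizes, not real data:

```python
# Disc count for the exhaustive CATS test matrix described above.
# Every figure here is the post's assumed scenario, not a real test plan.
media_types = 100   # different media IDs
burners = 10        # different writer models
speeds = 2          # average number of burn speeds per combination
firmwares = 2       # average number of firmware versions per combination

one_pass = media_types * burners         # 1,000 discs at one speed/firmware
total = one_pass * speeds * firmwares    # 4,000 discs to ship to the lab
print(one_pass, total)
```

The point of the multiplication is that each extra variable multiplies, rather than adds to, the number of discs that would have to be shipped and measured.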
I didn’t really want to post these quotes, but quite a lot of people seem to draw simplified conclusions without bothering to test and read for themselves.
Here’s just one of the posts in that single thread, posted by Mr. Spath, one of CDFreaks moderators.
> Nobody said they should show exactly the same. But such grave inconsistency cannot
> simply come from differences within the batches/models of drives. It has to do with
> KProbe itself, as Halcyon suggested, and only indirectly with the drives.
It seems you still think that all drives should report the same PI/PO errors for a given
disc : it’s not the case, and this is what Halcyon has experienced too and reported in
his first post. Results can be very different due to the hardware and there’s nothing
wrong with that. Also nobody said that Kprobe is buggy. It could be, but one cannot
prove it by comparing his PI/PO plots with other KProbe/CATS/Plextools plots from
other drives.
Many people seem to consider CATS as absolute PI/PO references without understanding
what these devices are about. Error scanning on a CATS is done by reading a disc in
a particular drive and collecting statistics on error correction, just like Kprobe
or Plextools does with your home drive. So a CATS gives you the PI/PO values measured
by a Pulstec drive, nothing more.
Finally error scanning is the least useful feature of a CATS (especially for drive
manufacturers) so don’t think that LiteON will spend their days making PI/PO plots
from these machines. You should not think either that c’t are experts because they
made CATS PI/PO plots ; the way they interpreted these plots to criticize Kprobe
would instead suggest the contrary.
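Spath's point that a CATS error scan is "just statistics on error correction" can be made concrete. Below is a rough sketch of the aggregation that any scanner, CATS or KProbe or Plextools, performs on raw parity-inner (PI) counts. The per-ECC-block counts are invented example numbers; the 8-block window and the 280 ceiling come from the DVD specification:

```python
# Sketch of PI-error aggregation common to CATS and consumer scanning tools.
# The raw per-ECC-block PI counts below are invented example values; the
# 8-ECC-block sliding window and the 280 ceiling are from the DVD spec.

def pi_sum_8(per_block_pi):
    """Sliding sum of PI errors over every 8 consecutive ECC blocks."""
    return [sum(per_block_pi[i:i + 8]) for i in range(len(per_block_pi) - 7)]

raw = [3, 1, 0, 2, 5, 1, 0, 4, 2, 1]   # hypothetical counts from one read pass
windows = pi_sum_8(raw)
worst = max(windows)                    # spec limit for PI Sum 8 is 280
print(worst, worst <= 280)
```

Whether those raw counts come from a Pulstec pickup or from your home drive's error-correction registers, the downstream statistics are the same; the numbers differ only because the reading hardware differs.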
Many thanks for spending so much time responding to my unsophisticated post, but I merely wanted to know whether the LG 4163B is a good writer. I know that AudioDev and other professional equipment is widely used in the industry for internal purposes. C’t is the only source that constantly publishes DVD quality test results using the AudioDev CATS system. I inadvertently misphrased that sentence. I read Spath’s post in your quote some time ago, and I agree with him, except that I feel his negative attitude towards C’t is too strong.
In my opinion, a disc of high writing quality should give low error rates when scanned by professional testing devices and by as many consumer readers as possible. I give more weight to the Pulstec devices, because they are pickier and more parameters are tested.
Prompted by your suggestion, I read test scans in this forum; I was surprised that there are so many already. It took me quite a while to go through them. Some of my thoughts after this reading:
Discs written using firmware version A100 also give very low error rates when scanned with consumer drives in most cases, so the situation may be more serious than I thought before, though it still cannot be excluded that writing quality has been improved in the newer versions.
The BenQ 1620 and LiteOn 1633S were used in most of the scans. The two drives, and many other DVD writers, show excellent capability in reading DVDs. Naturally, the PI/PO error rates they report are low in most instances. The data are pleasing but may at times be misleading. Therefore the data I have read still cannot eliminate the doubts cast by C’t.
I want a writer that will write discs regarded as high quality by as many readers as possible. So good scans from only one or two DVD writers that happen to be good readers are not enough. Faced with choosing between the LG 4160B and 4163B, I will now settle for the 4160B, which is highly rated by C’t and by others.
Although the Pulstec device is not “the” standard, it is essentially the de facto standard. Most likely LG also has some AudioDev or similar devices in use. It is up to LG to prove that the 4163B can maintain the excellence in writing quality achieved by the 4160B. It would be nice if you could pass the word to people at LG to put their own tests on their website. But you and other open-minded LG owners can also do something, if equipped and willing to. My suggestion is to use the LiteOn 167T DVD-ROM drive to scan discs burned with different firmware versions. The drive is to date the only one to give scan profiles quite close to those by Audiodev. In my previous post I thought many drives would serve the purpose, but reading the many scans in this forum has changed my thinking.
If you want, you can also see results from GSA-4163B when read in ND-3500A and scanned in PX-712A as well.
Although the Pulstec device is not “the” standard, it is essentially the de facto standard.
Do you really think that’s important here? What’s the point of someone having been to Mars when most people are living on Earth? It’s better for all of us to be richer and equipped with more technologies, but using existing technologies more widely is far more important in the real world. It’s basically a matter of philosophy rather than technology arguments. Whether AudioDev or CATS or C’t is good or not is beside the point. A God can know everything and we should worship him. But what if no one here can talk to him frequently enough?
The drive is to date the only one to give scan profiles quite close to those by Audiodev.
And most people don’t burn discs to be sent to AudioDev, but to use in their own homes. Lite-On DVD-ROM drives are not recommended for scanning.
At present the Plextor 712A is the second choice as a substitute for AudioDev, so I will be much interested in PI/PO error scans from it. In case you have not read it, please go to the forum about writing quality for choosing the proper speed for scanning.
If you say LiteOn DVD-ROM drives are not recommended for C1/C2 error tests, I will fully agree. As to PI/PO scanning, that is a different matter. In most cases, the PI/PO error rates reported by the LiteOn 167T and by the Plextor 712A are not much different from AudioDev’s. Moreover, among the three models of LiteOn DVD-ROM drives examined for similarity in PI/PO error rates to AudioDev’s results, the 163 and 165H varied with the discs used; only the 167T is consistent. Have you seen any hard evidence that the LiteOn 167T is not suitable for PI/PO error tests, or did you just echo what others have said?
The lack of adequate standards for DVDR drives and discs has created many problems for customers. That is why we are spending time debating which devices are suitable for testing DVD disc quality.