Is there such a thing as a perfect copy?

Hi there,
I’m quite new to all this, but I’ve done quite a bit of reading recently. I used CloneDVD with a NEC AD-5170A drive (capable of 8x speed with DL discs) to make a copy of a film on a Verbatim DVD+R DL (Singapore, ID: MKM 001) at 2.4x (the certified speed). When I checked this with Nero CD-DVD Speed, the Read test at maximum speed came back 100% good, but the C1/C2-PI/PO test at 5x (recommended for my drive in the respective forum tutorial) gave a result of 97% good. I made another copy at 1x which produced a better result (98.2% good, 1.7% damaged, 0.1% bad), but still not perfect.

My next move was to run the tests on the actual, brand new, original disc, as well as on a couple of older originals, none of which produced a reading of more than 98.7% good either. Finally, I checked every disc again using VSO Inspector (both surface scan and file test), which gave a result of 100% good for all. So, what I’d really like to know is:

Is Nero CD-DVD Speed too ‘sensitive’, is VSO Inspector too ‘insensitive’, or is there perhaps no such thing as a perfect copy (or original), meaning I should be happy every time I get a reading above 98% (or even lower) with the former program?

Thanks

Welcome to the forum :slight_smile:

If all files can be read without problems, then the copy is perfect.
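
If you want to check that yourself, here is a minimal sketch (in Python) of the idea behind a file test: try to read every file on the mounted disc from start to finish and report any that fail. The mount point /media/dvd is only an assumption for illustration, so adjust it to wherever your disc shows up; tools like VSO Inspector’s file test do essentially this, just more thoroughly.

[code]
# Crude "file test": try to read every file on the mounted disc and
# report any that cannot be read back. The mount point is an
# assumption; change it to wherever your disc is mounted.
import os

MOUNT_POINT = "/media/dvd"    # assumed mount point of the burned disc

def file_test(root=MOUNT_POINT, chunk_size=1024 * 1024):
    """Try to read every file under 'root' and report the ones that fail."""
    unreadable = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    while f.read(chunk_size):
                        pass              # we only care that reading succeeds
            except OSError as exc:
                unreadable.append(path)
                print(f"FAILED: {path} ({exc})")
    if not unreadable:
        print("All files could be read back - the copy is good.")
    return unreadable

if __name__ == "__main__":
    file_test()
[/code]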

There is something that many people don’t take into consideration: [B]all discs contain errors[/B], even pressed ones (i.e. the originals). But drives are able to correct these errors at two levels: in hardware (built into the drive itself) and in software (for example, the player you are using to watch a movie).

All that CD-DVD Speed does is count the errors on a disc and display them graphically, to give an idea of how well the burn was executed.

Even if a disc contains some errors, as long as the data are still readable, data integrity is preserved and the copy is reliable.

This is also why a scan alone is not enough to judge burn quality; a transfer rate test is also necessary. It can happen that a disc with an excellent scan is nevertheless unreadable.
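
For the curious, a transfer rate test is nothing more than reading the whole disc from beginning to end while watching the speed: slowdowns or read errors point at trouble spots. Below is a rough sketch (Python) of that idea; the device path /dev/sr0 is an assumption (Linux), and real tools like CD-DVD Speed of course do this with the drive’s own reporting rather than a simple script.

[code]
# Rough illustration of a transfer rate test: read the whole disc
# sequentially and report the throughput, so slowdowns or outright
# read errors become visible. Device path is an assumption (Linux).
import time

DEVICE = "/dev/sr0"          # assumed device path for the DVD drive
CHUNK = 1024 * 1024          # read 1 MiB at a time
DVD_1X = 1385000             # 1x DVD speed is roughly 1385 kB/s

def transfer_rate_test(device=DEVICE):
    total = 0
    start = time.time()
    with open(device, "rb", buffering=0) as disc:
        while True:
            t0 = time.time()
            try:
                chunk = disc.read(CHUNK)
            except OSError as exc:
                print(f"Read error after {total // CHUNK} MiB: {exc}")
                break
            if not chunk:
                break                       # reached the end of the disc
            total += len(chunk)
            if total % (256 * CHUNK) == 0:  # report every 256 MiB
                speed = len(chunk) / max(time.time() - t0, 1e-6)
                print(f"{total // CHUNK:5d} MiB read, "
                      f"current speed {speed / DVD_1X:.2f}x")
    elapsed = max(time.time() - start, 1e-6)
    print(f"Read {total / 1e9:.2f} GB in {elapsed:.0f} s, "
          f"average {total / elapsed / DVD_1X:.2f}x")

if __name__ == "__main__":
    transfer_rate_test()
[/code]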

Wow, didn’t expect such a quick response. Thanks a lot. :slight_smile:

I guess I should be happy then if the Nero C1/C2-PI/PO test gives a reading of above 98% good?

By “transfer rate test” do you mean the Read test that Nero CD-DVD Speed performs or another one, and what program would I need for this?

Also, how do VSO Inspector’s Surface Scan and File Test compare to the Nero ones? Should I use both programs just to feel more confident about the success of my burns, or, if just one is enough, which one?

A friend suggested that he wouldn’t trust a NEC drive to perform an accurate scan, and also that burning at 1x is just as bad as doing it too fast on my specific media, as I would risk overcooking the dye. Do you agree?

Sorry for the number of questions but now that I’ve found you it’s hard to resist. :wink:

Don’t worry about my “transfer rate test” question, I now know what it is.
Just noticed you’re Italian. I’m Greek, by the way. Thanks for your help.
Good night :slight_smile:

Yes, burning too slow can be just as bad as burning too fast. You can burn [B]quality[/B] media at its [B]rated speed[/B]; the exception is maybe going above 16x, which some people have had issues with, though dropping to 12x or 8x seems to solve the problem. You might get away with burning faster than the rated speed, but don’t go too far above it.

:cool: :cool:

Thanks for this.
I’ll have another go at 2.4x and run some more tests to see what happens.

Well, there’s no such thing as a perfect copy, that’s for sure. As long as your drive does not have much trouble reading the disc and your DVD player has no problems, it can be considered a good working burn. Since you have a fussy drive that can show problems when a disc isn’t burnt so well, doing a TRT is a good idea. If the TRT is smooth with no slowdowns, you can be fairly sure the disc is a decent burn.

Scanning opens up a big can of worms that most people can’t get their heads around. By that I mean most people forget that scanning isn’t for judging whether a disc will work in a player or not. IMO it’s more for determining whether the disc has any big unburnt spots on it or large amounts of problematic errors, whether a disc’s error levels rise over time (degrading discs), and other really basic stuff. People get the silly idea that if a media scans well, it is better than a media that scans a bit worse. This is foolish, since there’s no real proof of this at all. I’ve had discs that scan very well yet don’t work flawlessly in a player, and then I have discs that scan far worse but play back without any problem at all.

Hi guys. I’m back for a little while.
Thanks a lot for your advice. Have a look at the following thread, where the discussion seems to continue, to see the actual results of it. :wink:

http://club.cdfreaks.com/showthread.php?p=1799132#post1799132

I think there is some misunderstanding at play in this thread.

In the digital world, as far as [I]user data[/I] is concerned, and as long as no processing is applied to the data, either the copy [I]fails[/I] or [I]it’s perfect[/I]; there is no in-between.

The so-called “errors” referred to here are not user data errors, they are low-level errors.

Think of a text on a piece of paper. User data is the actual words and sentences, and low-level errors are like small artifacts in the letters, uneven lines, defects in the paper etc… If the text can be read despite the little defects, the message is intact and the copy will be perfect. A human could confuse one word with another because of small artifacts etc…, but because of the way digital data is handled (control algorithms etc…), this will never happen in the digital world: a successful 1:1 copy is always a perfect copy of the user data.

What an excellent example [B]Francksoy[/B]! :clap: I’m going to quote it in http://club.cdfreaks.com/showthread.php?p=1800438#post1800438 where you have actually already made your useful contribution. :slight_smile: It certainly puts things in perspective when interpreting scan tests or having unrealistic expectations of them.

I would say there is no such thing as a “perfect copy” you can create at home, but a “perfect content copy” is possible.

OK, I realise that what some refer to here as a “perfect copy” is actually a “perfect burn”… (i.e. without low-level errors).

Which is not the same thing. :disagree:

A DVD burn (or any other digital storage solution for that matter) without low-level errors is impossible, at least with the technology at hand. So a “perfect burn” is impossible.

This doesn’t mean that the copy is not perfect. What is copied is the user data and nothing else. Actually, a 1:1 digital copy is always perfect or it fails, as I mentioned in a post above. That’s what the low-level error correction systems included in all digital storage technology are all about. That’s what digital storage is all about.

You can copy a Word document a thousand times, onto DVDRs, CDRs, diskettes, hard drives, whatever, and it will always be a perfect copy of your original Word document (unless the copy fails). There will never be such a thing as an “inferior” copy of your Word document, in the sense that the words and sentences would be different! :wink:
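
This is easy to verify for yourself: hash the original file and the copy and compare the digests. The sketch below (Python) uses hypothetical file names; as long as the copy could be read back at all, the digests match bit for bit, no matter how many generations of copies you make or which media they sat on.

[code]
# Demonstration that a readable digital copy is bit-identical to the
# original: hash both files and compare. File names are hypothetical.
import hashlib

def sha256_of(path, chunk_size=1024 * 1024):
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

original = sha256_of("document.doc")             # the source file
copy = sha256_of("/media/dvd/document.doc")      # the burned copy

if original == copy:
    print("Perfect copy: every bit of user data is identical.")
else:
    print("The copy failed (or the file was modified in between).")
[/code]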

The quality of the [I]burn[/I] is a different thing, which is what our media/burn testing is about… if the low-level errors get too high, reading can get difficult or fail entirely. [B]The actual user data is still the same[/B], but it gets hard, or impossible, to retrieve.

Wrong. Some user data errors are embedded in the discs due to various causes. The matter has been discussed in:

http://club.cdfreaks.com/showthread.php?t=163379

Yes, discussed. Not proven nor even demonstrated from actual technical considerations, let alone some coherent description. Please provide a sound demonstration of how user-data errors can be embedded in a 1:1 digital copy, or stop talking nonsense…

A digital copy with user data errors is a FAILED digital copy. If you don’t get that, you don’t understand what the digital world is.

[Moderator’s note by DrageMester: Unproductive comment removed]

To all intents and purposes, because of the way binary works (0 or 1), a digital copy is either a perfect copy or it’s not. I feel this was already said, but not in such a plain way.

But then there are errors in reading that data, which can be down to the way your drive reads, the way the software/firmware reads, or something that gets in between, e.g. dust or a scratch on the media. This is a whole other ball game.

@muchin and Francksoy: Please try to be constructive in your discussion. The “You’re wrong” “No, you’re wrong” is not leading anywhere except to deletion of posts or perhaps closing of thread. :wink:

I believe you may be talking about different things.

Francksoy is talking about the payload data, which should never be different in a copy compared to the original, or the copy is corrupted and thus failed.

muchin is probably talking about raw data before applying the error correcting code and extracting payload data. Raw data is almost always damaged in some way on optical media. As long as there is sufficient ECC, the raw errors will be corrected when extracting the payload data, however.
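
To make that concrete, here is a toy sketch (Python) of the principle. Real DVDs use Reed-Solomon codes (the PI/PO errors that the scans count), which are far more sophisticated than the crude triple-repetition code used here; the point is only to show damaged raw data still yielding bit-perfect payload data.

[code]
# Toy illustration of ECC: the "raw" data on the disc gets damaged,
# yet the payload is recovered exactly. Real DVDs use Reed-Solomon
# codes, not this crude triple-repetition scheme.

def encode(payload):
    """Write each payload bit three times (the 'raw' data on the disc)."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    return [b for bit in bits for b in (bit, bit, bit)]

def damage(raw):
    """Flip every 7th raw bit, like scratches or a marginal burn.
    No bit-triple gets more than one flip, so every error is correctable."""
    return [bit ^ 1 if i % 7 == 0 else bit for i, bit in enumerate(raw)]

def decode(raw):
    """Majority-vote each bit triple, then rebuild the payload bytes."""
    bits = [1 if sum(raw[i:i + 3]) >= 2 else 0 for i in range(0, len(raw), 3)]
    return bytes(
        sum(bit << j for j, bit in enumerate(bits[i:i + 8]))
        for i in range(0, len(bits), 8)
    )

payload = b"The user data survives intact."
raw_on_disc = damage(encode(payload))
recovered = decode(raw_on_disc)

raw_errors = sum(a != b for a, b in zip(encode(payload), raw_on_disc))
print(f"Raw bit errors on the 'disc': {raw_errors}")
print(f"Payload recovered perfectly:  {recovered == payload}")
[/code]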

Of course we are. But my posts were quite clear on this: I was not talking about raw data. You are re-explaining this for a third time in a different way, and I can only hope it will help [B]muchin[/B] understand the point, and see that his input was rather surreal in this context.

Is there an actual difference between “user data” and what you call “payload data”? If so, I understand the misunderstanding and I apologize. If not, sorry but I think that my remark to [B]muchin[/B] was entirely legit, because my point was clear and his input was purely negative and not even discussing the actual point I made.

OK, I admit that the last paragraph of my previous post is off-topic and purely personal (in the bad sense), so you’re welcome to delete it if you feel like it.

No, I just used another term for the same thing in the hope that it would perhaps clear up any misunderstanding.

He has been rude not only to me, but also to members much senior to him, for example:

http://club.cdfreaks.com/showpost.php?p=1747834&postcount=10

so I intended to teach him a lesson.

[QUOTE=DrageMester]I believe you may be talking about different things.

Francksoy is talking about the payload data, which should never be different in a copy compared to the original, or the copy is corrupted and thus failed.

muchin is probably talking about raw data before applying the error correcting code and extracting payload data. Raw data is almost always damaged in some way on optical media. As long as there is sufficient ECC, the raw errors will be corrected when extracting the payload data, however.[/QUOTE]

Since Francksoy said “Yes, discussed. Not proven nor even demonstrated from actual technical considerations, let alone some coherent description” in response to my post, what he referred to as user data errors must have included raw data errors. I have followed the optical disc specifications in using the term “user data” to mean the raw data written onto optical discs. Even if what he said does not include raw data, there is still a mistake: it is possible to have in-between discs. For example, an audio CD containing numerous “payload” errors may still be playable without noticeable distortion.

Apparently I have to repeat myself:
Please keep this discussion on topic and constructive without any personal attacks. :cop:

I am not going to issue further warnings in this thread.

• One post deleted • Cleanup of older posts