2 Megabyte Buffer - Too Small

Most DVD drives today have very small buffers. Even the genuine Plextor PX-760A has only 2 megabytes. Why is that, if memory is so cheap? Video adapters now come with ridiculous amounts of memory, and even the Sound Blaster has 64 megs for god knows what purpose (it's NOT for the synthesizer). I certainly expected a Plextor to carry about 64 megs by now, just to stand out from the crowd.

I recall how the 8 megs in my Samsung SW-248B once saved me. I was burning a CD at 4x, forgot about it, and started playing around in the CDex CD ripper. Of course a blue screen resulted, but it said that "it may be possible to continue normally," and the burned disc was OK because the buffer didn't run out.

Today, when even a modest DVD speed of 4x (or should it be called 4y?) is comparable to the maximum speed of CDs, the buffer can be emptied by any processor-demanding operation, such as seeking in Extra High compressed APE files. I think a burner would really know how to put extra memory to good use.
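The "4x DVD is comparable to max CD speed" claim checks out if you plug in the standard 1x rates (150 kB/s for CD, 1,385 kB/s for DVD). A quick back-of-the-envelope sketch; the function names are mine, only the 1x figures are standard:

```python
# Standard 1x transfer rates (kB/s)
CD_1X_KBS = 150
DVD_1X_KBS = 1385

def rate_kbs(base_1x, speed):
    """Sustained transfer rate at a given speed multiplier."""
    return base_1x * speed

def buffer_seconds(buffer_kb, base_1x, speed):
    """How long a full onboard buffer lasts if refills stop entirely."""
    return buffer_kb / rate_kbs(base_1x, speed)

print(rate_kbs(DVD_1X_KBS, 4))    # 5540 kB/s at 4x DVD
print(rate_kbs(CD_1X_KBS, 52))    # 7800 kB/s at 52x CD -- same ballpark
print(round(buffer_seconds(2 * 1024, DVD_1X_KBS, 4), 2))  # a 2 MB buffer covers ~0.37 s
```

So at 4x DVD speed, a 2 MB buffer buys well under half a second before an underrun.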

The onboard buffer is only for hiccups on the bus, not for delays in the data stream caused by software activity.

Use ImgBurn and set your memory buffer to something crazy like 256 MB, and you'll be surprised how little even THAT helps when you're burning a disc at 12-18x.

Even if they made the onboard buffer something huge like 64 MB, it wouldn't do much good when you take a ~30 MB/s write rate into consideration.
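The arithmetic here is simple division; a minimal sketch (function name is mine) of how little time even a huge buffer buys at modern write rates:

```python
def underrun_window_s(buffer_mb, write_rate_mb_s):
    """Seconds of breathing room a full buffer gives once the host stops feeding data."""
    return buffer_mb / write_rate_mb_s

print(round(underrun_window_s(64, 30), 2))  # a huge 64 MB buffer at ~30 MB/s: ~2.13 s
print(round(underrun_window_s(2, 30), 3))   # the usual 2 MB buffer: ~0.067 s
```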

I mean, back in the day, before buffer underrun protection, the fastest write rates were ~300 kB/s. A lot has changed, and I'm pretty sure the only reason drives HAVE buffers anymore is for packet writing.

Do you believe a disc drive is able to perfectly resume writing after a buffer underrun occurs? This protection is like ECC: it can and will help preserve data, but nobody should rely on it.

Threads running at priority 15, such as purely real-time tasks like audio playback, will disturb the flow of data between memory and the disc drive, and a buffer in main RAM will be of zero help.

Well, here’s something to think about.

Even if you burn a disc from beginning to end with no interruptions, the burning process starts and stops several times to change speed zones. So the drive's ability to start and stop must be fairly evolved for writing strategies to actually INSIST on utilizing it.

And secondly, if you're THAT concerned about the quality of your burns, or about sacrificing that quality to buffer underrun protection artifacts, why not simply resist the urge to multitask while burning? It's not like the good ol' days, when burning a DVD (or a CD, back then) took half an hour. I'm sure you can find something to do for the 5-10 minutes while the disc burns.

And my original point was not that buffer underrun protection solves all of your problems, or that you should rely on it and not give a damn. My point was that regardless of how much onboard memory your optical drive has, with today's burning speeds it's going to afford you only a piddly amount of time for your system to catch back up and save the situation. Even the 32 MB you suggested earlier would only provide ~2 seconds of cover before you run into an underrun anyway.
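The ~2 second figure follows directly from the standard 1x DVD rate of 1.385 MB/s; a quick check (the helper name is mine):

```python
DVD_1X_MBS = 1.385  # standard 1x DVD transfer rate in MB/s

def cover_seconds(buffer_mb, speed):
    """Seconds a full buffer covers at a given DVD write speed multiplier."""
    return buffer_mb / (DVD_1X_MBS * speed)

print(round(cover_seconds(32, 12), 1))  # ~1.9 s at 12x
print(round(cover_seconds(32, 18), 1))  # ~1.3 s at 18x
```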

And lastly, most newer drives have the ability to slow the burn process when they see a buffer situation approaching. My $30 Samsung can even interface with Nero and reduce the burn speed when the buffer level starts depleting, so it RARELY becomes an issue.
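The logic behind that kind of buffer-aware throttling can be sketched in a few lines. This is a toy model only: the speed steps, thresholds, and function name are all invented here, not anything a real drive or Nero actually exposes.

```python
SPEEDS = [4, 8, 12, 18]  # hypothetical supported write speeds (x)

def adjust_speed(speed_idx, buffer_fill, low=0.25, high=0.85):
    """Return a new index into SPEEDS based on buffer fill level (0.0-1.0).

    Step down when the buffer falls below the low-water mark,
    step back up once it refills past the high-water mark.
    """
    if buffer_fill < low and speed_idx > 0:
        return speed_idx - 1  # buffer draining: slow down
    if buffer_fill > high and speed_idx < len(SPEEDS) - 1:
        return speed_idx + 1  # buffer healthy: speed back up
    return speed_idx

idx = 3                        # start at 18x
idx = adjust_speed(idx, 0.10)  # buffer nearly empty
print(SPEEDS[idx])             # -> 12, dropped one step
```

Slowing down costs a few minutes; an underrun, or a visible link artifact from underrun protection, costs the burn quality, which is why drives prefer the throttle.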

And this isn’t true, either.

This is what DMA was created for, and yet ANOTHER reason why onboard memory hasn’t increased over the years.

If that is the commonly accepted truth, as you put it, I won't press the issue any more.