We have at least one engineer from Lite-On lurking around [hello, Wind :)], so maybe we can get an informed opinion on this issue that has bothered some of us for a while.
Let's look at an example.
MAXELL 002 media at 12x. Forced and online hypertuning enabled, overspeed enabled. [Similar effects can be observed with other media types, with or without hypertuning and overspeed, and in 16x burns as well, but this is a typical case.]
The first image is a scan by the Lite-On 165P6S.
The second image is a scan by a BenQ DW1620.
The third image is the same BenQ DW1620 scan, but on a different scale.
I know the first scan by Lite-On looks good, but what is it about Lite-On burns that drives BenQ nuts when it reads them?
What happens between OPC stops that sometimes causes BenQ to return to sanity and report much lower jitter? [See the sections enclosed in green rectangles in the third image.]
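Just to pin down what I mean by a "return to sanity", here is a toy Python sketch that flags sharp downward steps in a jitter trace, the kind of cliff you can see at the left edge of each green rectangle. Everything here is made up for illustration: the threshold, the sample positions, and the jitter values are my assumptions, not output from BenQ's drive or any real scanning tool.

```python
# Flag positions where jitter falls abruptly between consecutive samples,
# i.e. where the BenQ reader suddenly starts reporting much lower jitter.

JITTER_DROP_THRESHOLD = 2.0  # percentage points; arbitrary illustrative value


def find_jitter_drops(samples, threshold=JITTER_DROP_THRESHOLD):
    """samples: list of (position_gb, jitter_percent), sorted by position.
    Yields (position_gb, jitter_before, jitter_after) at each sharp drop."""
    for (p0, j0), (p1, j1) in zip(samples, samples[1:]):
        if j0 - j1 > threshold:
            yield (p1, j0, j1)


# Made-up jitter trace with a drop near 2.3 GB, like the third image.
trace = [(2.1, 13.4), (2.2, 13.6), (2.3, 13.5), (2.35, 8.9), (2.5, 9.0)]
for pos, before, after in find_jitter_drops(trace):
    print(f"jitter drops {before:.1f}% -> {after:.1f}% at {pos:.2f} GB")
```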
Look closely at the 12x CLV section between 2.1 and 2.3 GB. High PIEs, high PIFs, high jitter reported by BenQ.
We make an OPC stop at 2.3 GB and make another run [at the same 12x CLV] to 2.6 GB. Wonderfully low PIEs, low PIFs, low jitter.
What happened at 2.3 GB that made the burn so much easier to read?
What happened at 0.9 GB, 2.3 GB, 2.6 GB, and 3.8 GB that made the five zones these points delimit alternate GOOD, BAD, GOOD, BAD, GOOD?
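To make that alternation concrete, here is a minimal Python sketch that splits a scan's samples at the suspected OPC-stop positions and averages PIE/PIF/jitter inside each zone. The OPC-stop positions come from the observations above; the sample values, field layout, and function names are my own invented illustrations, not the format any real scanning tool exports.

```python
# Split a scan into zones delimited by the suspected OPC stops and
# average the quality metrics per zone, to expose the GOOD/BAD pattern.

from statistics import mean

# Suspected OPC-stop positions (GB) taken from the scans discussed above.
OPC_STOPS = [0.9, 2.3, 2.6, 3.8]


def zone_averages(samples, stops):
    """samples: list of (position_gb, pie, pif, jitter_percent) tuples.
    Returns per-zone averages, where zones are delimited by `stops`."""
    edges = [0.0] + sorted(stops) + [float("inf")]
    zones = []
    for lo, hi in zip(edges, edges[1:]):
        in_zone = [s for s in samples if lo <= s[0] < hi]
        if not in_zone:
            continue
        zones.append({
            "range_gb": (lo, hi),
            "avg_pie": mean(s[1] for s in in_zone),
            "avg_pif": mean(s[2] for s in in_zone),
            "avg_jitter": mean(s[3] for s in in_zone),
        })
    return zones


# Toy samples mimicking the GOOD/BAD/GOOD/BAD/GOOD pattern (made up).
samples = [
    (0.5, 8, 0.2, 9.0),    # zone 1: good
    (1.5, 60, 3.0, 13.5),  # zone 2: bad
    (2.45, 7, 0.1, 8.8),   # zone 3: good
    (3.0, 55, 2.5, 13.0),  # zone 4: bad
    (4.0, 9, 0.2, 9.1),    # zone 5: good
]

for z in zone_averages(samples, OPC_STOPS):
    lo, hi = z["range_gb"]
    hi_label = "end" if hi == float("inf") else f"{hi:.1f}"
    print(f"{lo:.1f}-{hi_label} GB: PIE {z['avg_pie']:.1f}, "
          f"PIF {z['avg_pif']:.1f}, jitter {z['avg_jitter']:.1f}%")
```

If the alternation really does line up with the OPC stops this cleanly, that would point at whatever recalibration the drive performs at those points rather than at the media itself.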