I know, REALLY stupid question… But, for the life of me, I've used BDRB 1000+ times by now, and I almost always use the High Quality setting.
But in this particular case I have a Blu-ray that comes in at about 26 gigs, so I want to use 2-pass and a few of my usual x264 tweaks (Tune: film, for example). With only about 3 1/2 gigs to shave off, though, I don't want to go all out with a 2-pass High Quality encode. Sooooo, NOW I guess I'll have to show what an @ss I am and actually ASK: what is the difference between these two settings that I almost NEVER use…?
The thing has always struck me as SO dang ambiguous, because on one hand it calls one setting 'Good', which is very fast. Okay… Then it calls the next setting 'Better', which I would at first assume is a bit slower and thus 'Better', right? But then JDobbs throws in the comment that it happens to be 'FASTER'! What the hell…? How can it be BOTH 'Better' and at the same time 'Faster'…??? This has always confused the living heck outta me, so I just absolutely HAVE to come grovelling here to ask, sorry…
Now, my first guess is that since the 'Better (faster)' mode sits in between the 'Good (very fast)' and High Quality (slow) settings, it SHOULD logically be somewhat SLOWER than the 'Good (very fast)' setting, making it indeed 'Better', right? But then what the heck is it 'Faster' than…?!
Will someone [B][I]PLEASE[/I][/B] put me out of my misery; I’m gonna hang myself for sure, I swear!