When I’m making movies (NeroVision Express 6), there is an option for bitrate. What does this mean? Are the differences in bitrate really noticeable?
Basically, bitrate is the rate at which data is streamed — how many bits are used to encode each second of video, usually measured in kilobits per second (kbps). Whether a difference in bitrate is noticeable depends on the source material: if your source is 352x240 (NTSC VCD), you won’t notice any difference above about 2000 kbps. The optimal bitrate also depends on how “busy” the footage is: scenes with explosions, flames, smoke, or a lot of on-screen movement require a higher bitrate than static or slow-paced scenes.
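One practical consequence: since bitrate is data per second, it directly determines file size. A rough back-of-the-envelope sketch (plain Python, not part of any Nero tool; the numbers are illustrative):

```python
def estimated_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate file size in megabytes for a given average bitrate."""
    total_bits = bitrate_kbps * 1000 * duration_s  # kbps -> bits over the whole clip
    return total_bits / 8 / 1_000_000              # bits -> bytes -> megabytes

# A 90-minute video at 2000 kbps (about the ceiling mentioned above for VCD-resolution source):
print(round(estimated_size_mb(2000, 90 * 60)))  # ~1350 MB
```

So doubling the bitrate roughly doubles the file size, which is why encoders let you trade quality against disc space.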
See Wikipedia for a detailed explanation of the term.