Hi, not sure if this was the right sub-forum to ask this … but it seems like the closest one.
This is actually a two-part question relating to the same topic … system resources (CPU usage) during playback of video files.
The first concern relates to different video file types.
Certain file types offer better data compression than others (from personal experience, it would seem that old MPEG files were the least efficiently compressed, while the F4V Flash video format seems to be highly compressed).
Now, say you had TWO video files of the same exact video (meaning same length, same framerate, same resolution, etc.) … but one was encoded as an MPEG file and the other as an mp4 file (which offers much greater data compression). The mp4 file therefore takes up much less space than the MPEG file, despite the fact that they are essentially the same video.
When playing them in a media player on your computer … which one will require more CPU and which will use less? Or would they be essentially the same, considering the framerate and resolution are identical? Does the larger file size mean more system resources are needed to play back the video, or not?
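One way to frame the file-size side of this question: the size of the file determines the *bitrate* the decoder has to consume, but decode CPU cost depends far more on how complex the codec is than on the raw byte count. A quick back-of-the-envelope sketch (the file sizes and the 10-minute duration below are made-up illustrative numbers, not measurements):

```python
# Average-bitrate arithmetic for two hypothetical files of the
# same 10-minute video (sizes are made-up illustrative numbers).
DURATION_S = 10 * 60  # 10 minutes in seconds

def bitrate_kbps(file_size_mb: float, duration_s: int) -> float:
    """Average bitrate in kilobits per second: MB -> megabits -> kbit/s."""
    return file_size_mb * 8 * 1000 / duration_s

mpeg_mb = 600.0   # hypothetical older-MPEG encode
mp4_mb  = 150.0   # hypothetical mp4 (H.264) encode of the same video

print(f"MPEG: {bitrate_kbps(mpeg_mb, DURATION_S):.0f} kbit/s")
print(f"mp4:  {bitrate_kbps(mp4_mb, DURATION_S):.0f} kbit/s")
```

So the mp4 decoder reads a quarter of the bits per second here, yet typically burns *more* CPU per frame, because modern codecs like H.264 do much heavier computation (motion compensation, deblocking, etc.) to achieve that compression.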
The second concern relates to watching videos at their native size (1:1 pixel mapping) versus stretching them out to fullscreen.
Say you have TWO video files of the same exact video again (this time, same length, same framerate, and same file type). However, the difference this time is in the RESOLUTION. One video is rendered in 360p … while the other is in HD (720p). The HD file is exactly the same as the 360p video … except for the resolution.
Assuming your computer monitor has a native resolution of 720 vertical pixels … if you played back BOTH files in a media player, which would use more CPU (system resources): the 360p video in FULLSCREEN (meaning it is upscaled from 360 to 720), or the 720p video at its native resolution (which would fill the whole screen automatically anyway with 1:1 pixel mapping)?
In other words, which uses more CPU … having to upscale the 360p video so that it is now displaying in 720p … or simply playing back the 720p video (automatically fullscreen), which is larger in size and contains more data than the 360p video to begin with?
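A rough way to compare the two workloads is to count the pixels the decoder has to produce each second (the 16:9 frame sizes and 30 fps below are assumptions for illustration). The 720p file makes the decoder output four times as many pixels per second; upscaling the 360p output afterwards is a comparatively simple filter that is often handed off to the GPU:

```python
# Pixels per second the decoder must produce at each resolution,
# assuming 16:9 frames and 30 fps (both are illustrative assumptions).
FPS = 30

def pixels_per_second(width: int, height: int, fps: int = FPS) -> int:
    """Raw decoded pixel throughput for a given frame size and framerate."""
    return width * height * fps

p360 = pixels_per_second(640, 360)    # 360p
p720 = pixels_per_second(1280, 720)   # 720p

print(f"360p: {p360:,} px/s decoded")
print(f"720p: {p720:,} px/s decoded ({p720 // p360}x more)")
```

This only sketches the decode side; the actual CPU difference also depends on the codec and on whether scaling is done in software or on the graphics hardware.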
Seems to me that if watching the 360p video in fullscreen uses the SAME amount of CPU as watching the 720p video normally … then you're basically getting ripped off by watching a lower-resolution version of the same video, upscaled (and poorly), are you not? The 360p video would look poor (needless to say) in fullscreen, while the 720p would look perfect at its native resolution … and you'd still be straining your computer the SAME amount either way … total gyp. Or would playing the 360p video in fullscreen actually use MORE CPU … because now the computer has to figure out how to fill in the missing pixels (and again, quite poorly)?
So what’s the benefit (if any) of watching a lower-resolution version of the video? Simply that it conserves HDD space?
In addition, in the above example … when you stretch the 360p video out to fullscreen, which device is actually doing the upscaling? Is it the COMPUTER or the MONITOR that is doing the work? When you watch standard-definition video on an HDTV, 90% of the time the TV is the one doing the upscaling; on a computer, however, it's the computer and not the monitor that actually does the upscaling, correct?