Video file types and resolution

#1

Hi, not sure if this was the right sub-forum to ask this … but it seems like the closest one.

This is actually a two-part question relating to the same topic … system resources (CPU usage) during playback of video files.

The first concern relates to different video file types.
Certain file types offer better data compression than others (from personal experience, it would seem that the old MPEGs were the least compressed, while the f4v flash video format seems to be highly compressed).

Well, say you had TWO video files of the same exact video (meaning same length, same framerate, same resolution, etc.) … but one was encoded as an MPEG file and the other as an mp4 file (which offers much greater data compression).

So the mp4 file takes up much less space than the MPEG file, despite the fact that they are essentially the same video.
When playing them in a media player on your computer … which will require more CPU and which will use less? Or would they be essentially the same, considering the framerate and resolution are the same? Would the larger file size use more system resources to play back the video or not?

The second concern relates to watching videos at their native size (1:1 pixel mapping) versus stretching them out to fullscreen.

Say you have TWO video files of the same exact video again (this time, same length, same framerate, and same file type). However, the difference this time is in the RESOLUTION. One video is rendered in 360p … while the other is in HD (720p). The HD video file is exactly the same as the 360p video … except when it comes to resolution.

Assuming that your computer monitor has a native resolution of 720 vertical pixels … then if you played back BOTH files in a media player … which would use up more CPU (system resources) … the 360p video in FULLSCREEN (meaning it is upscaled from 360 to 720), or the 720p video at its native resolution (which would fill up the whole screen automatically anyway at 1:1 pixel mapping)?

In other words, which uses more CPU … having to upscale the 360p video so that it is now displaying in 720p … or simply playing back the 720p video (automatically fullscreen), which is larger in size and contains more data than the 360p video to begin with?

Seems to me like if watching the 360p video in fullscreen uses the SAME amount of CPU as watching the 720p video normally … then you’re basically getting ripped off by watching a lower-resolution version of the same video … upscaled (and poorly), are you not? The 360p video would look poor (needless to say) in fullscreen, while the 720p would look perfect in its native resolution … and you’d still be straining your computer the SAME amount in either case … total gyp. Or would playing the 360p video in fullscreen actually use MORE CPU … because now the computer has to figure out how to fill in the missing pixels (and again, quite poorly)?

So what’s the benefit (if any) of watching a lower-resolution version of the video? Simply that it conserves HDD space?

In addition, in the above example … when you are stretching out the 360p video to fullscreen view … which device is actually doing the upscaling? Is it the COMPUTER or the MONITOR that is doing the work? When you watch standard definition video on an HDTV, 90% of the time the TV is the one doing the upscaling … however, on a computer it’s the computer and not the monitor that actually does the upscaling, correct?


#2

So the mp4 file takes up much less space than the MPEG file, despite the fact that they are essentially the same video.
When playing them in a media player on your computer … which will require more CPU and which will use less? Or would they be essentially the same, considering the framerate and resolution are the same? Would the larger file size use more system resources to play back the video or not?

This depends in part on the codecs used within the mp4 file. The Xvid or DivX video codecs don't take much decoding power, whereas H.264 takes more. On modern computers, this is not much of a problem, but if you have an older computer, it can be. It all depends on the specifications of your hardware.
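As a side note, the container (.mpg, .mp4, .f4v) and the codec inside it are separate things, and it is the codec that determines decoding effort. If you want to check which codec a file actually uses, something like this works. It's a minimal sketch that assumes the ffprobe tool (part of FFmpeg) is installed and on the PATH; the file names are just placeholders.

    # Print the video codec inside each container file using ffprobe.
    import subprocess

    def video_codec(path):
        """Return the codec of the first video stream, e.g. 'mpeg2video' or 'h264'."""
        result = subprocess.run(
            ["ffprobe", "-v", "error",
             "-select_streams", "v:0",
             "-show_entries", "stream=codec_name",
             "-of", "default=noprint_wrappers=1:nokey=1",
             path],
            capture_output=True, text=True, check=True)
        return result.stdout.strip()

    for name in ["movie.mpg", "movie.mp4"]:  # placeholder file names
        print(name, "->", video_codec(name))

Two files with the same resolution and framerate can still need very different decoding effort, depending on what this prints.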

In other words, which uses more CPU … having to upscale the 360p video so that it is now displaying in 720p … or simply playing back the 720p video (automatically fullscreen), which is larger in size and contains more data than the 360p video to begin with?

Scaling the video will take a bit more processing, but it is insignificant on modern computer equipment. Lower-resolution videos are made to play on other equipment: handheld tablets, iPads, phones, etc. I generally recommend re-encoding at the original resolution when the intended playback device is a computer or a standalone DVD or Blu-ray player. That way, you don't run into aspect-ratio problems, where the video looks stretched in one dimension or the other.

The scaling is done in software, by the computer.
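If you want to see how small the scaling cost is compared to the decoding cost, one rough benchmark is to have FFmpeg decode the same file twice, once as-is and once through an upscaling filter, discarding the frames both times. This is just a sketch; it assumes ffmpeg is installed, and the input file name is a placeholder.

    # Time decode-only versus decode-plus-upscale, throwing the frames away.
    import subprocess
    import time

    def bench(args):
        start = time.perf_counter()
        subprocess.run(args, check=True,
                       stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        return time.perf_counter() - start

    decode_only = bench(["ffmpeg", "-i", "input_360p.mp4",
                         "-f", "null", "-"])
    decode_scale = bench(["ffmpeg", "-i", "input_360p.mp4",
                          "-vf", "scale=1280:720",
                          "-f", "null", "-"])

    print(f"decode only:      {decode_only:.2f} s")
    print(f"decode + upscale: {decode_scale:.2f} s")

On reasonably modern hardware the two times come out close together, which is the point: upscaling adds some work, but far less than decoding a genuinely higher-resolution stream would.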


#3

Well, I tried testing it out last night, and got the following results:

Playing a 360p video (25 fps) at its native size used an average of 25-35% CPU. When stretched out/upscaled to fullscreen, the CPU usage jumped a little, to about 35-45%.

However, playing a 720p video (also 25 fps), which is already practically at fullscreen in its native size (my monitor is 1366x768), required an average of 70-80% CPU usage, significantly higher than the 360p video even at fullscreen.

Just to clarify … the monitor does NOT do any upscaling at all, correct? It just gets fed the video? So CPU usage should definitively determine which requires more work?
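For anyone who wants to repeat this kind of test, a small script gives more repeatable numbers than watching Task Manager. This is a minimal sketch; it assumes the psutil package is installed (pip install psutil) and that playback is already running in the player.

    # Sample system-wide CPU usage once a second for ~30 seconds of playback.
    import psutil

    samples = [psutil.cpu_percent(interval=1) for _ in range(30)]

    print(f"average CPU: {sum(samples) / len(samples):.1f}%")
    print(f"peak CPU:    {max(samples):.1f}%")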


#4

Tell us the details of your computer. It will help us figure out what you are doing.

What are you playing the video with? Do you have a video card that is capable of taking some of the processing chores off the CPU? If you have such a card, and a player that can enable hardware decoding, your CPU use will drop like a stone.

For example, here is a website showing how to set up Media Player Classic Home Cinema using the built-in hardware decoding capabilities of many different Nvidia and AMD video cards: http://wiki.vuze.com/w/HD_Video_Playback_with_Hardware_H.264_Decoding

Note that the site is talking about H.264 playback.
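One quick way to see which hardware-acceleration methods are available on a given machine is to ask FFmpeg for its list. This is a sketch that assumes ffmpeg is installed; whether your player actually uses any of these still depends on the player's own settings.

    # List the hardware acceleration methods this FFmpeg build supports,
    # e.g. dxva2 on Windows or vdpau/vaapi on Linux.
    import subprocess

    result = subprocess.run(["ffmpeg", "-hide_banner", "-hwaccels"],
                            capture_output=True, text=True, check=True)
    print(result.stdout)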


#5

Some players, like a few editions of PowerDVD and ArcSoft TMT 3 that I've seen, have an easy-to-use option to enable hardware acceleration.

A free program called VLC will try to use video acceleration by default.


#6

Well, this computer is over three years old: AMD Athlon 64 1.6 GHz, 2 GB RAM, NVIDIA GeForce 6150SE, running Windows Vista. I run most videos in VLC media player.

I tested more videos out last night (all in VLC), and it seems like there is only a small jump in CPU usage when switching a lower-resolution video (240p or 360p) from its native size to fullscreen … but there is a large (noticeable) difference in CPU usage when playing a 720p video at all, whether native or fullscreen (about the same either way on my 1366x768 screen) … although this time it only required about 50-65% CPU on average. 1080p videos make an even larger jump (practically full CPU usage).


#7

NVIDIA GeForce 6150SE

Sorry, that's the 8200, rather. Just read it off the side of the computer. It wouldn't let me edit the post anymore.


#8

Usually the minimum recommended CPU for 1080p HD playback is a 2 GHz dual-core.

The GeForce 8200 is an integrated video chip that is supposed to have full HD support for 1080p video, with what Nvidia calls PureVideo technology. I don't have first-hand experience with this particular onboard video, however, and from the sounds of it, you are pushing the limits of what it can do.