High Definition screen modes and resolution explained

You have probably seen LCD and plasma TV sets in your local electrical retailer carrying the “HD Ready” logo. But what does this actually mean?

The TV’s screen must support the 720p and 1080i screen modes, and the set must also be HDCP compliant. The native resolution must be equal to or greater than 1,366x768 pixels. This means that 1080i signals will be downscaled to fit on the screen, as they carry more picture data than a 1,366x768 panel can display.
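To put rough numbers on that downscaling, here is a minimal Python sketch. It assumes simple proportional scaling and a made-up fit_to_panel helper; it is only an illustration of the arithmetic, not how any particular set’s scaler actually works.

```python
# Rough sketch of fitting a picture to a panel while keeping its aspect ratio.
# Real TV scalers use far more sophisticated filtering than this.

def fit_to_panel(src_w, src_h, panel_w, panel_h):
    """Return the largest size that fits the panel without distorting the picture."""
    scale = min(panel_w / src_w, panel_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# A 1,920x1,080 (1080i) signal on a 1,366x768 HD Ready panel is downscaled:
print(fit_to_panel(1920, 1080, 1366, 768))   # (1365, 768)
# A 1,280x720 (720p) signal on the same panel is upscaled instead:
print(fit_to_panel(1280, 720, 1366, 768))    # (1365, 768)
```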

A newer standard called “Full HD” has since been released. A Full HD set must support 1080p, which requires a native resolution of at least 1,920x1,080 pixels, so for the moment these displays tend to cost a bit more than a standard HD Ready TV.

What is the difference between an interlaced (i) and a progressive (p) screen mode?

Regardless of the resolution, the picture you see on your TV screen is made up of picture frames displayed one after another. Each frame is built from a number of horizontal picture lines.

Interlaced
The old analogue PAL TV system (used in Europe) uses 576i (interlaced). Each picture field contains 288 horizontal lines of picture data. Why 288? The term “interlaced” means that each complete frame of picture data takes two scans to display. The first scan draws the “odd numbered” horizontal lines (field 1), and the second scan draws the “even numbered” horizontal lines (field 2). PAL displays 50 fields per second, which works out to 25 complete picture frames per second.

So let’s see what that actually means when High Definition interlaced picture screen modes are used.

720i contains 360 horizontal lines of picture data per field (2 fields = 1 frame).
1080i contains 540 horizontal lines of picture data per field (2 fields = 1 frame).
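Putting that split into a couple of lines of Python (a toy illustration only, using the modes from the examples above):

```python
# Each interlaced frame is delivered as two fields, so a field carries half the lines.
def lines_per_field(lines_per_frame):
    return lines_per_frame // 2

for lines in (576, 720, 1080):
    print(f"{lines}i: {lines_per_field(lines)} lines per field, {lines} lines per frame")
# 576i: 288 lines per field, 576 lines per frame
# 720i: 360 lines per field, 720 lines per frame
# 1080i: 540 lines per field, 1080 lines per frame
```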

Progressive
Progressive is much easier to understand. Each picture frame contains all the available horizontal lines of picture data, so there is no need to break the frame into odd and even lines and display it as two half frames. This also means the complete picture frame rate is 50 frames per second, which results in a more detailed, flicker-free picture on your TV screen.

Now let’s see what that means when High Definition progressive screen modes are used.

720p contains 720 horizontal lines of picture data per frame.
1080p contains 1080 horizontal lines of picture data per frame.
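Here is a back-of-the-envelope comparison of how much picture data reaches the screen each second, assuming PAL-region timing of 50 fields (interlaced) or 50 frames (progressive) per second; a rough sketch only:

```python
# Lines of picture data delivered per second, assuming 50 fields/frames per second.
FIELD_RATE = 50

modes = {
    "1080i": (540, True),    # 540 lines per field, interlaced
    "1080p": (1080, False),  # 1080 lines per frame, progressive
    "720p":  (720, False),   # 720 lines per frame, progressive
}

for name, (lines, interlaced) in modes.items():
    complete_frames = FIELD_RATE // 2 if interlaced else FIELD_RATE
    print(f"{name}: {lines * FIELD_RATE:,} lines/s, {complete_frames} complete frames/s")
# 1080i: 27,000 lines/s, 25 complete frames/s
# 1080p: 54,000 lines/s, 50 complete frames/s
# 720p: 36,000 lines/s, 50 complete frames/s
```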

So what does all this add up to?
In practical terms, the more horizontal lines of picture data that can be displayed, the more detailed and smoother the picture will be. A 1080p frame contains roughly five times as many pixels as a standard-definition PAL frame (1,920x1,080 against 720x576).
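The pixel arithmetic behind that claim, assuming a 720x576 frame as the standard-definition PAL baseline:

```python
sd_pal = 720 * 576       # 414,720 pixels in a standard-definition PAL frame
hd_1080 = 1920 * 1080    # 2,073,600 pixels in a 1080p frame

print(f"Ratio: {hd_1080 / sd_pal:.1f}x")   # Ratio: 5.0x
```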

So those are the facts and figures.
Which display modes are supported by your HD TV?
How does your HD TV perform with HD content?
Would you consider upgrading to “full HD”?
Will you just buy “full HD” to start with?
Let’s hear your views.

I bought my 42" Samsung 1080i plasma display early this year. It was quite a big jump for me and my wallet given the price at the time, so I’m not sure I will upgrade to Full HD 1080p soon.

How does your HD TV perform with HD content?
The picture quality is just perfect for me, at least for HD content, but watching standard-definition channels from FIOS on this TV is bad! I can see pixelation.

Am I correct that plasma has better picture quality than LCD?
Also I read that 1080p is only good on a 46" or higher?

[QUOTE=zevia;1933427]
Am I correct that plasma has better picture quality than LCD?
Also I read that 1080p is only good on a 46" or higher?[/QUOTE]

Not really. There are equally good and bad examples of all display types. It’s not so much what’s in the screen as it is what’s behind it. Crappy electronics, unfortunately, are common. Many people find that rear-projection LCoS-type sets provide the best combination of large screen size, price and picture quality. As costs decline, plasma and LCD will probably become the norm, and rear projection may eventually die off. But your best bet is to preview all the types, and don’t rule out 1080i or even 720p sets either, until you see them. Ignore prices until you settle on a particular technology and brand. You will not see much, if any, difference between 720p and 1080p on screens smaller than 50" unless you sit VERY close.

IMHO, the fastest way to separate the good electronics from the crap is to view them while displaying DVD content, or even broadcast quality. If they can upscale and create a good image from that, you’re bound to be satisfied.

I personally prefer plasma, as plasma is faster at refreshing the display than LCD. Maybe plasma displays don’t have as high a contrast ratio as LCD, or are not as bright, and of course they use more energy than an LCD display. But I just prefer the way the picture looks on a plasma display.

I also agree with what CDan says: there are good and bad displays. Having said I prefer plasma, I would rather have a good LCD panel than a bad plasma panel.

Many manufacturers selling Full HD TVs here in the UK offer them from 37 inches and above. I can’t say whether that is the size at which you start to notice a difference, or whether, at the moment, it simply takes a display of that size to accommodate a 1,920x1,080 pixel resolution.

Don’t underestimate 24p and 100Hz (50/60Hz). :wink:

Hi Dee-27
I think your explanation of interlaced and progressive needs a bit of clarification, so I will see if I can explain it a bit better. In the interlaced scanning system, the group of 288 (PAL) lines you refer to is known as a “field”, and fields occur 50 times a second. When you add the two fields together you get a “frame” of 576 (PAL) lines, and frames occur 25 times a second. The same applies to 720i and 1080i: there are 720 lines and 1080 lines respectively, but complete frames occur 25 times a second. There is no resolution difference between 1080i and 1080p, as they are both 1,920x1,080. “But” 1080p will look a lot better than 1080i because of the reduced flicker and the lack of distortion on motion in the picture, since all the lines are scanned at once.

Thanks for the clarification [B]oldtecho[/B].
Am I correct in saying that with an interlaced display, only half of the available lines that make up a full frame are displayed at a time, hence the flicker with interlaced?
So, for example, for 720i, two “fields” of 360 lines that make up a frame are displayed sequentially?
And if this is the case, I need to change the term “frame” to “field” in some instances of my explanation?

Hi Dee-27
You are correct in saying that with an interlaced display, only half of the available lines that make up a full frame are displayed at a time. In your case of 720i, the first field (field 1) displays lines 1-3-5-7-9 and so on up to line 719 in a 50th of a second. The second field (field 2) displays lines 2-4-6-8-10 and so on up to line 720 in the next 50th of a second. Then, when the two fields are interlaced together (hence the term “interlaced”) on your display to form one frame, you see 720 lines in a 25th of a second, giving you flicker.
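If it helps to see that as a toy sketch (Python, with line numbers standing in for real picture data; purely illustrative, not how a TV actually stores anything):

```python
TOTAL_LINES = 720
lines = list(range(1, TOTAL_LINES + 1))      # lines 1..720 of one complete frame

field1 = lines[0::2]   # odd-numbered lines 1, 3, 5 ... 719, shown in the first 1/50th s
field2 = lines[1::2]   # even-numbered lines 2, 4, 6 ... 720, shown in the next 1/50th s

# Weave the two fields back together to form one complete frame (1/25th of a second).
frame = [None] * TOTAL_LINES
frame[0::2] = field1
frame[1::2] = field2

assert frame == lines
print(len(field1), len(field2), len(frame))  # 360 360 720
```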

Many people who first watch non-HDTV on their new high-def televisions are disappointed by how it looks. But it’s not the television’s fault. The single most important ingredient in picture quality is the source, and lower-quality standard-def TV, especially compared to HDTV, looks bad. The difference is often compounded by the fact that HDTVs are bigger and sharper than regular TVs and thus highlight the flaws of low-quality sources even more. No matter how nice an HDTV you get, standard-def TV will still look a lot worse, at least compared to DVD and high-def.

Let’s just keep it this simple, or else we could get into some serious stuff that would only create a lot of confusion for people.

:cool::cool:

Is it necessary to have HD content to have the HD experience?

[QUOTE=pradeep_rad;1951819]Is it necessary to have HD content to have the HD experience?[/QUOTE]

If you have a device with a hardware upscaler built in you can have that HD experience too.
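To give a rough idea of what a scaler is doing, here is a minimal nearest-neighbour sketch in Python. The upscale_nearest function is purely illustrative; real hardware upscalers use much better filtering.

```python
def upscale_nearest(image, out_h, out_w):
    """Stretch a picture by mapping each output pixel back to the nearest source pixel."""
    in_h, in_w = len(image), len(image[0])
    return [
        [image[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# A tiny 2x2 "picture" stretched to 4x4 just repeats its pixels - no new detail appears.
small = [[1, 2],
         [3, 4]]
print(upscale_nearest(small, 4, 4))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

That is why it only gets you a “near HD” experience: the scaler fills in the extra pixels, but it can’t invent detail the source never had.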

I’m going to set my 50" plasma to display in 720p instead of the 1080i mode it’s set at…Should look better …It already looks great, i.e. no noticeable flickering … but I’ve gotta see if displaying the 720 lines scanned at once makes any significant difference…What you guys/gals think?.. has anyone tried and compared on your sets?..

[QUOTE=t0nee1;1951878]I’m going to set my 50" plasma to display in 720p instead of the 1080i mode it’s set at…Should look better …It already looks great, i.e. no noticeable flickering … but I’ve gotta see if displaying the 720 lines scanned at once makes any significant difference…What you guys/gals think?.. has anyone tried and compared on your sets?..[/QUOTE]It should depend on the native resolution of the set, but other factors have to be considered too. Does the set support 100Hz, for example?

Our lounge can’t accommodate a large set. We have a 32 inch set that supports 1080i. On that set, 720p certainly looks better.

I would certainly try 720p, you may be surprised by the result. :slight_smile:

[QUOTE=chef;1951854]If you have a device with a hardware upscaler built in you can have that HD experience too.[/QUOTE]

Um, not really. Only 1080 content should be considered “HD”. Upscaling 480 to 1080 is not HD.

[quote=pradeep_rad;1951819]Is it necessary to have HD content to have the HD experience?[/quote]If you have an HDTV, I would say “near HD experience” and I can confirm that at least in my setup.

I still have a DVD player connected via component, upscaled to 1080i, and it’s a huge difference compared to watching 480i on my previous setup.

[QUOTE=t0nee1;1951878]I’m going to set my 50" plasma to display in 720p instead of the 1080i mode it’s set at…Should look better …It already looks great, i.e. no noticeable flickering … but I’ve gotta see if displaying the 720 lines scanned at once makes any significant difference…What you guys/gals think?.. has anyone tried and compared on your sets?..[/QUOTE]

You should see some difference especially with fast motion.

:cool::cool:

Thank you for the explanation. I am new to this and am very interested in getting myself educated in this. People talk about 1080i and 1080p and I nod as if I understand but I had no idea:o.

I think I will need to reread it a few times though.

Hopefully the next time my BIL brings up this subject, I will be able to add to the conversation rather than just nodding :).

I bought a Panasonic flat-screen LCD 1080p HDTV last year, and just bought a Panasonic Blu-ray player to go with it about a week ago. Incredible! The sound and screen are just amazing to me. We buy a new movie each and every week just to see how much better it gets. Unbelievable!

[QUOTE=platinumsword;1952077]You should see some difference especially with fast motion.

:cool::cool:[/QUOTE]

[QUOTE=t0nee1;1951878]I’m going to set my 50" plasma to display in 720p instead of the 1080i mode it’s set at…Should look better …It already looks great, i.e. no noticeable flickering … but I’ve gotta see if displaying the 720 lines scanned at once makes any significant difference…What you guys/gals think?.. has anyone tried and compared on your sets?..[/QUOTE]
As was noted, 1080i can and does produce artifacts/anomalies with fast motion (thin horizontal lines that seem to appear and disappear, for one) that you won’t see at 720p. FOX and ESPN originally chose 720p over 1080i for this reason.
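A toy way to picture those motion artifacts (Python, with “#” marking a small object that moves between the two field captures; purely illustrative):

```python
WIDTH = 12

def line(obj_pos):
    """One picture line: a 2-pixel-wide object sitting at obj_pos."""
    return "".join("#" if obj_pos <= x < obj_pos + 2 else "." for x in range(WIDTH))

# Field 1 lines catch the object at position 3; field 2 lines are captured
# 1/50th of a second later, by which time it has moved to position 6.
woven_frame = [line(3) if y % 2 == 0 else line(6) for y in range(6)]
print("\n".join(woven_frame))
# ...##.......
# ......##....
# ...##.......
# ......##....
# ...##.......
# ......##....
```

The woven frame shows the “combed” edges on the moving object, which is exactly the kind of thing a progressive mode avoids.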

I enjoyed reading this post and replies, just what I was looking for as I’m about to upgrade to a 1080p monitor and most of this 1080p stuff was going over my head. Thanks for the information.