Some insight into LCD TV technology

I’m struggling to find trustworthy and relevant information about LCD displays.

For example, are LCD displays supposed to output 60Hz (as the manuals say) or 59.94Hz?
And do they only output that one rate, or can they deal with others? (i.e. 50Hz, 24Hz, 23.976Hz?)

Why does a standard like 60000/1001Hz exist (for carrying the chroma signal in NTSC), while there doesn’t seem to be a 50000/1001 for PAL?

I think that’s all for now. Thank you!

Let’s start with this.

[B]Why 59.94 Hz? (Bob Myers, KC0EW, Hewlett-Packard Co.)

I recently received some mail asking where the NTSC 59.94 Hz field rate came from in the first place. Thinking that this might be a topic of general interest, I’ve decided to post a short discussion of this here - hope no one minds!

Before the NTSC color encoding system was added to the U.S. TV standard, television WAS at 60.00 Hz; it was set at this rate to match the power line frequency, since this would make interference from local AC sources less objectionable (the “hum bars” would be stable in the displayed image, or - if the TV rate wasn’t exactly locked to the line - at least would move very slowly). Actually, in some early systems, the TV vertical rate WAS locked to the AC mains!

A problem came up, though, when trying to add the color information. The FCC had already determined that it wanted a color standard which was fully compatible with the earlier black-and-white standard (there were already a lot of TV sets in use, and the FCC didn’t want to obsolete these and anger a lot of consumers just to add color!) Several schemes were proposed, but what was finally selected was a modification of a pixel-sequential system proposed by RCA. In this new “NTSC” (National Television Standards Committee) proposal, the existing black-and-white video signal would continue to provide “luminance” information, and two new signals would be added so that the red, green, and blue color signals could be derived from these and the luminance. (Luminance can be considered the weighted sum of R, G, and B, so only two more signals are needed to provide sufficient information to recover full color.) Unfortunately, there was not enough bandwidth in the 6 MHz TV channels (which were already allocated) to add in this new information and keep it completely separate from the existing audio and video signals. The possibility of interference with the audio was the biggest problem; the video signal already took up the lion’s share of the channel, and it was clear that the new signal would be placed closer to the upper end of the channel (the luminance signal is a vestigial-sideband AM signal, with the low-frequency information located close to the bottom of the channel; the audio is FM, with the audio carrier 4.5 MHz up).

Due to the way amplitude modulation works, both the luminance and the color (“chrominance”) signals tend to appear, in the frequency domain (what you see on a spectrum analyzer), as a sort of “picket fence” pattern. The pickets are located at multiples of the line rate up and down from the carrier for these signals. This meant that, if the carrier frequencies were chosen properly, it would be possible to interleave the pickets so that the luminance and chrominance signals would not interfere with one another (or at least, not much; they could be separated by using a “comb filter”, which is simply a filter whose characteristic is also a “picket fence” frequency spectrum). To do this, the color subcarrier needed to be at an odd multiple of one-half the video line rate. So far, none of this required a change in the vertical rate. But it was also clearly desirable to minimize interference between the new chroma signal and the audio (which, as mentioned, is an FM signal with a carrier at 4.5 MHz and 25 kHz deviation). FM signals also have sidebands (which is what made the “picket fence” pattern in the video signals), but the mathematical representation isn’t nearly as clean as it is for AM. Suffice it to say that it was determined that, to minimize chroma/audio mutual interference, the NTSC line and frame rates could either be dropped by a factor of 1000/1001, or the frequency of the audio carrier could be moved UP a like amount. There’s been (and was then) a lot of debate about which was the better choice, but we’re stuck with the decision made at the time - to move the line and field/frame rates. This was believed to have less impact on existing receivers than a change in the audio carrier would.

So, now we can do the math.

We want a 525 line interlaced system with a 60 Hz field rate.

525/2 = 262.5 lines/field. 262.5 x 60 Hz = 15,750 Hz line rate.

This is the rate of the original U.S. black-and-white standard.

We want to place the color subcarrier at an odd multiple of 1/2 the line rate. For technical reasons, we also want this multiple to be a number which is fairly easy to generate from some lower multiples. 455 was selected, and

    15,750 x 455/2 = 3.58313 MHz

This would’ve been the color subcarrier frequency, but now we apply the 1000/1001 correction to avoid interference with the audio:

    60 x 1000/1001 = 59.94005994005994.......

The above relationships still apply, though:

    262.5 x 59.94... = 15,734.265+ Hz
    15,734.265+ x 455/2 = 3.579545+ MHz

And so we now have derived all of the rates used in the current standard.[/B]
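If anyone wants to double-check the arithmetic in the quoted derivation, here’s a minimal Python sketch; the constants (525 lines, 455/2 multiple, 1000/1001 correction) are taken straight from the post above:

```python
# NTSC rate derivation, following the quoted post step by step.
from fractions import Fraction

field_rate_bw = Fraction(60)            # original black-and-white field rate, Hz
lines_per_field = Fraction(525, 2)      # 525-line interlaced -> 262.5 lines/field

line_rate_bw = lines_per_field * field_rate_bw          # 15,750 Hz
correction = Fraction(1000, 1001)                       # audio-interference fix

field_rate_color = field_rate_bw * correction           # ~59.9400599 Hz
line_rate_color = lines_per_field * field_rate_color    # ~15,734.2657 Hz
subcarrier = line_rate_color * Fraction(455, 2)         # odd multiple of half the line rate

print(float(line_rate_bw))      # 15750.0
print(float(field_rate_color))  # ~59.9400599 Hz
print(float(line_rate_color))   # ~15734.2657 Hz
print(float(subcarrier))        # ~3579545.45 Hz, i.e. about 3.579545 MHz
```

Using exact fractions makes it obvious that every rate in the color standard is just the 60 Hz black-and-white numbers scaled by 1000/1001.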

Which of my questions is that supposed to answer?

This can be a very complex subject to put into layman’s terms.

I was trying to give you an answer as to where 60Hz came from and how we ended up with 59.94Hz. That rule still applies.

Your LCD will display the appropriate source, given that it has the ability to.

I’m kinda guessing at what your angle is. Can you tell me what your objective is, instead of just giving me refresh rates and frame rates?

Some more info for ya.

[B]What’s the difference between 59.94fps and 60fps?[/B]


[B]Video Frame Rates and Display Refresh Rates for Beginners[/B]


[B]Television Standards - formats and techniques[/B]


I have some background knowledge, since I do a lot of video cleaning, restoration, etc. in AviSynth.
I’m just trying to be pragmatic and straightforward with my questions, which are mostly spec related. I had read many posts and places (like AVSForums, etc.) before posting here, including your big quoted chunk. But neither in your post nor in the given links did I read anything about the output refresh rates LCD TVs are capable of.

I’m just asking whether LCD TVs run at whole-integer rates (60, 100, 120, 240 Hz) instead of the old common spec of 59.94Hz. I’m just wondering what the outcome is of playing interlaced 29.97fps (59.94 fields/s) content on a 60Hz TV.
That drags me to my second question: can LCD TVs adjust their refresh rate?
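To make the 59.94-vs-60 question concrete, here’s a back-of-the-envelope Python sketch (my own numbers, not from any TV spec) of what the 1000/1001 mismatch would amount to if a display really free-ran at exactly 60Hz instead of locking to the source:

```python
# How fast a 59.94 fields/s source and an exactly-60 Hz display drift apart,
# and how often that drift adds up to one whole frame period (forcing the
# display to repeat or drop a frame to stay in sync).
source_rate = 60000 / 1001       # 59.94... fields per second
display_rate = 60.0              # hypothetical display running at exactly 60 Hz

drift_per_second = display_rate - source_rate    # ~0.0599 Hz of mismatch
seconds_per_extra_frame = 1 / drift_per_second   # time to accumulate one full frame

print(drift_per_second)          # ~0.0599 Hz
print(seconds_per_extra_frame)   # ~16.68 s -> one repeated frame every ~16.7 seconds
```

If the TV instead locks its refresh clock to the incoming signal, no repeat is needed at all; this sketch only shows the worst case of two free-running clocks.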

And the last question is more out of curiosity than anything else: how was color handled in PAL without affecting the frame rate? (The explanation from your big quoted post, but for PAL’s case.)

Edit: Just found the answer to my last question

When television came into existence, the cathode ray tube (CRT) display technology was used. CRT design at the time used a multiple of the power line frequency which is 60Hz in North America and 50Hz in Europe. The National Television System Committee (NTSC) standard developed in the United States in the 40’s and 50’s actually ended up slowing down the CRT refresh rate by 0.1% to 59.94Hz to compensate for some distortion that occurred due to the nature of the way NTSC transmits information. Later, the Phase Alternate Line (PAL) standard was developed in Europe and had no such anomaly.

I just posted a question on this site.
He seems to know his stuff, and there’s a lot of valuable information there.

Nice link, it shows some more relevant information.