Wow, lots of misinformation on this thread.
The key piece of information in gheberf’s question is that the issue arises when the camera pans quickly or there is rapid motion in the image. EVEN IF YOU HAD A PERFECT DIGITAL CONNECTION directly to the original source, you would STILL experience this distortion. It has nothing to do with cables, interfaces, component quality, etc. Your high-end equipment is displaying exactly what the broadcaster is delivering - that is, a distorted picture.
Why, you ask, is my cable company delivering a distorted picture? You need to go back to the original days of HDTV, when the “Grand Alliance” of electronics manufacturers designed the original specifications for HDTV. In order to fit a high-def digital picture into the same bandwidth used by a low-def analog NTSC picture, the digital data is compressed – the same way an mp3 file is compressed. The standard for over-the-air HDTV was MPEG-2, which is formally known as “Generic coding of moving pictures and associated audio information.” Satellite and cable broadcasters mostly use MPEG-4, which provides even more compression – and even more degradation of picture quality in rapid panning and fast motion.
The way these standards work, the broadcaster does not actually transmit 60 full frames each second – that would take too much bandwidth. When they designed the standard, they noticed that almost all the pixels in a digital frame were the same as they were in the last frame. Instead of using up bandwidth to retransmit the same information, they only transmit the complete, full-detail image when there’s a scene cut, or periodically if there are no scene cuts. After that initial full image, for each succeeding frame they only transmit the pixels that actually changed from the previous image. The rest are assumed to be “same as last time.”
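If you want to see the idea in miniature, here’s a toy sketch in Python. This is not real MPEG syntax (actual encoders work on 16x16 blocks with motion vectors and residuals, not single pixels) – it just shows why “send only what changed” is such a huge win; all the function and variable names here are made up for illustration.

```python
# Toy sketch of inter-frame coding (illustrative only, not real MPEG):
# send the full first frame, then only the pixels that changed.

def encode_delta(prev_frame, curr_frame):
    """Return a list of (row, col, value) for pixels that differ."""
    changes = []
    for r, row in enumerate(curr_frame):
        for c, value in enumerate(row):
            if prev_frame[r][c] != value:
                changes.append((r, c, value))
    return changes

def apply_delta(prev_frame, changes):
    """Rebuild the next frame from the previous one plus the changes."""
    frame = [row[:] for row in prev_frame]
    for r, c, value in changes:
        frame[r][c] = value
    return frame

# A static "studio" shot where only a small region moves between frames:
frame1 = [[0] * 8 for _ in range(8)]
frame2 = [row[:] for row in frame1]
frame2[4][3] = frame2[4][4] = 9   # the only pixels that changed

delta = encode_delta(frame1, frame2)
print(len(delta))                 # 2 changed pixels instead of all 64
assert apply_delta(frame1, delta) == frame2
```

Two values on the wire instead of sixty-four – that ratio is the whole reason digital HDTV fits in an analog channel.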
This works great for something like the network news, where the only part of the image that moves is the broadcaster’s mouth. Instead of resending a picture of the whole studio 60 times each second, they “paint” the studio once, then resend the broadcaster’s mouth 60 times each second. There are also codes in the specification to save bandwidth if the camera pans or zooms – for example, for a pan in a news studio, they only transmit the “new” part of the picture, and the TV is instructed to shift the “old” part of the picture over.
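The pan trick can be sketched the same way. Again this is a cartoon of the idea (real codecs call this motion compensation and do it per block, not per whole frame; the names below are my own): the decoder reuses the old picture shifted over, and only the newly exposed edge has to be transmitted.

```python
# Toy sketch of motion compensation for a horizontal pan (illustrative):
# the decoder shifts the old frame left and only the newly exposed
# right-hand column of pixels needs to be transmitted.

def pan_left(prev_frame, new_column):
    """Shift every row left by one pixel and fill in the new right edge."""
    return [row[1:] + [new_column[r]] for r, row in enumerate(prev_frame)]

prev = [[1, 2, 3, 4],
        [1, 2, 3, 4],
        [1, 2, 3, 4]]
new_col = [5, 5, 5]        # the only data that has to be sent
curr = pan_left(prev, new_col)
print(curr[0])             # [2, 3, 4, 5]
```

One column of data instead of a whole frame – which is why a smooth pan across a static studio costs almost nothing.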
The problem arises when there is lots of motion on the screen – like sports, or (my favorite pet peeve) ocean waves. In those cases, there are just too many changed pixels to transmit them all in the limited amount of time allocated to each frame. Rather than slowing down the frame rate, which would make for jerky motion, the encoder reduces the effective resolution – i.e., instead of full 720-line detail, the portions with heavy motion get the equivalent of 480 or 240 or 120 lines.
If you look closely, you will notice that in the scenes that bother you, what appears as “distortion” is actually a bunch of perfect squares that are 2x2 or 4x4 or 8x8 or 16x16 pixels. On your 51" screen, there are about 40 pixels per inch. But in the rapidly moving parts of the picture, the detail steps down to the equivalent of 20 or 10 or 5 pixels per inch. On a high-quality big screen, 5 pixels per inch works out to a fifth of an inch per pixel. So you will definitely see those big squares in the “blurry” parts.
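You can check that arithmetic yourself. Assuming a 16:9 panel at 1920x1080 (the 51-inch size comes from the original question; the resolution is my assumption), the numbers come out close to the figures above:

```python
import math

# Rough arithmetic behind the "big squares."
# Assumes a 16:9, 1920x1080 panel; the 51-inch diagonal is from the question.
diagonal_in = 51
width_in = diagonal_in * 16 / math.hypot(16, 9)   # ~44.5 inches wide
ppi = 1920 / width_in                             # ~43 pixels per inch

block = 16                                        # one 16x16 macroblock
block_size_in = block / ppi                       # ~0.37 inches across
print(round(ppi), round(block_size_in, 2))
```

So a single 16x16 block spans more than a third of an inch of screen – easily big enough to see from the couch.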
Unfortunately, there’s not much you can do – except, ironically, you might get a better picture from an over-the-air antenna than from a cable box. The FCC requires that over-the-air signals be transmitted in MPEG-2, so that everyone’s HDTV will be compatible with the broadcast. But cable companies and satellite broadcasters are free to use a lower-bandwidth MPEG-4 signal with their own MPEG-4 cable boxes/receivers. So, ironically, you will probably see less of the offensive distortion if you switch back to the bunny ears. If you go with a rooftop antenna, that will be the best.
What goes around, comes around.