As far as I’m aware, all over-the-air HD broadcasts are 1080i. However, while some consider 1080i inferior to 1080p, both can produce the same 1920 x 1080 resolution picture. The difference is that 1080i is interlaced, like 480i and 576i standard definition. The reason 1080i video tends to look inferior is that most broadcasters reduce the bitrate to cram more channels into each multiplex or transponder, whereas Blu-ray discs, which use 1080p, don’t have this bandwidth constraint and thus use a much higher bitrate than over-the-air broadcasts. Like standard-definition channels, some HD channels may be just 1280 or 1440 pixels across. For example, BBC HD is 1440 x 1088 pixels, interlaced.
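To see why bitrate matters more than the nominal resolution, a rough bits-per-pixel comparison helps. This is a minimal sketch; the bitrates below are assumed round-number illustrations (a crowded broadcast mux slot versus a typical Blu-ray stream), not measured values.

```python
# Illustrative comparison of average compressed bits available per pixel
# for a low-bitrate 1080i broadcast versus a Blu-ray stream.
# The bitrate figures are assumptions for illustration only.

def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average compressed bits per pixel per frame at a given bitrate."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

# Assumed figures: ~8 Mbps for a squeezed broadcast channel, ~30 Mbps for Blu-ray.
broadcast = bits_per_pixel(8, 1920, 1080, 25)   # ~0.15 bits/pixel
bluray = bits_per_pixel(30, 1920, 1080, 24)     # ~0.60 bits/pixel

print(f"Broadcast: {broadcast:.2f} bits/pixel")
print(f"Blu-ray:   {bluray:.2f} bits/pixel")
```

With roughly four times as many bits to spend on each pixel, the Blu-ray encoder has far more headroom before compression artifacts appear, even though both streams are nominally "1080".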
Blu-ray discs and other HD sources (e.g. Full HD downloads) typically use 1080p. The advantage here is that the TV does not need to deinterlace the picture, eliminating any risk of interlacing artifacts showing. However, a drawback with progressive video is that 24p motion will typically appear more jumpy, whereas with an interlaced source motion can be updated at each field, effectively doubling the motion rate.
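The field/frame relationship can be sketched with a few lines of code: an interlaced signal carries alternate scanlines in separate fields, and the TV's deinterlacer has to recombine ("weave") them. This is a toy model with frames as lists of scanlines, not a real video pipeline.

```python
# Toy model of interlacing: a frame is a list of scanlines. A progressive
# frame is split into two fields (even and odd lines), which a deinterlacer
# later weaves back together.

def split_fields(frame):
    """Return (top_field, bottom_field): even- and odd-numbered scanlines."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Recombine two fields into one progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.append(t)
        frame.append(b)
    return frame

frame = [f"line {i}" for i in range(6)]
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame  # weaving a static picture is lossless
```

Weaving is only lossless when both fields show the same instant in time; with true interlaced motion the two fields are captured 1/50 or 1/60 of a second apart, which is where combing artifacts (and the need for smarter deinterlacing) come from.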
To overcome this issue with progressive video such as 24p, most high-end TVs use motion compensation to interpolate in-between frames. This is why you’ll see TVs claiming high refresh rates like 100Hz or even as high as 200Hz. So besides 1080p, another thing worth checking is the TV’s refresh rate. If the refresh rate isn’t mentioned, the TV likely does not have motion compensation.
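The idea of synthesizing in-between frames can be sketched very simply. Real TVs estimate motion vectors per block; the naive linear blend below is only an illustration of inserting interpolated frames to raise a 24p source toward a higher refresh rate, with pixels modeled as plain numbers.

```python
# Naive frame interpolation sketch: blend linearly between consecutive
# frames to synthesize in-between frames. Real motion compensation uses
# motion-vector estimation, not a plain blend; this only shows the idea.

def blend(frame_a, frame_b, t):
    """Interpolate between two frames at fraction t (0 <= t < 1)."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

def interpolate_sequence(frames, factor):
    """Insert factor-1 blended frames between each consecutive pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for k in range(factor):
            out.append(blend(a, b, k / factor))
    out.append(frames[-1])
    return out

src = [[0, 0], [4, 8]]                    # two source frames
doubled = interpolate_sequence(src, 2)    # adds one blended frame between them
```

A plain blend like this produces ghosting on fast motion, which is exactly why real sets spend silicon on motion-compensated interpolation rather than simple averaging.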
In my opinion, unless you use a Blu-ray source or are lucky enough to have HD broadcasts with high bitrates, the difference between playback on a 720p display and a 1080p display is not going to be great, regardless of the size of the TV. For example, a low-bitrate 1080i broadcast is going to look poor (maybe not even HD-like) no matter how good the TV.