WHY YOUR HDTV ISN’T HI-DEF ENOUGH: The Real Story of HDTV Standards—There Aren’t Any.
To qualify as hi-def, a signal must have 720 horizontal lines of progressively scanned pixels (720p), 1080 lines of interlaced pixels (1080i), or 1080 lines of progressively scanned pixels (1080p, which nobody even broadcasts yet). But there's a whole lot more to the quality of digital television than the number of pixels present. After all, 1080 lines of poor-quality pixels may technically be "high-definition," but that doesn't mean the picture looks very good.
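As a rough illustration of what those labels mean in raw pixel counts, here is a short sketch (not from the article) that assumes the standard frame sizes of 1280x720 for 720p and 1920x1080 for 1080i/1080p:

```python
# Hypothetical illustration: pixels per frame for the common HD formats.
FORMATS = {
    "720p":  (1280, 720),   # progressive scan: every line drawn each frame
    "1080i": (1920, 1080),  # interlaced: alternating half-frames (fields)
    "1080p": (1920, 1080),  # progressive scan at full 1080-line resolution
}

for name, (width, height) in FORMATS.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels per frame")
```

The point of the comparison: 720p delivers roughly 900,000 pixels per frame and the 1080 formats just over 2 million, but as the article notes, the count alone says nothing about how good those pixels look.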
One of the most important factors in determining picture quality is bit rate, or how much video and audio data is being sent down the pipe for each program. The technology behind digital television relies heavily on digital compression: the ATSC specifies that digital TV use the MPEG-2 compression standard, which is also used on DVDs, although some satellite broadcasters use the more efficient MPEG-4 advanced video coding (AVC) standard. These compression technologies are necessary in order to deliver a large number of channels to consumers. Without these codecs, an uncompressed HD video stream could require as much as 1 gigabit per second of data capacity—that's 52 times the capacity of the average broadcast channel. With compression, the same stream can be shrunk by a factor of 50 or more. But compression is often used overzealously, and picture quality suffers as a result.
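The bandwidth math behind those figures can be sketched out quickly. The assumptions below are mine, not the article's: a 1920x1080 stream at 30 frames per second with 8-bit 4:2:2 chroma sampling (about 16 bits per pixel on average), and a 19.39 Mbit/s payload for one ATSC broadcast channel.

```python
# Back-of-the-envelope check of the "1 gigabit per second" and "52 times" claims.
WIDTH, HEIGHT = 1920, 1080
FRAMES_PER_SEC = 30
BITS_PER_PIXEL = 16            # assumed: 8-bit luma + shared 8-bit chroma (4:2:2)
ATSC_CHANNEL_MBPS = 19.39      # assumed: usable payload of one ATSC channel

uncompressed_mbps = WIDTH * HEIGHT * FRAMES_PER_SEC * BITS_PER_PIXEL / 1e6
print(f"Uncompressed 1080-line stream: ~{uncompressed_mbps:.0f} Mbit/s")
print(f"Ratio to one ATSC channel:     ~{uncompressed_mbps / ATSC_CHANNEL_MBPS:.0f}x")
# -> roughly 1,000 Mbit/s (about 1 Gbit/s), around 50x one broadcast channel,
#    which is why MPEG-2 or MPEG-4 compression is unavoidable for broadcast HD.
```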
Many people are already familiar with this data-size/fidelity tradeoff from their experiences with digital music: heavily compressed MP3 files may take up less hard drive space, but they sound muffled and unsatisfying. The same is true for video. When an HD signal is over-compressed, it may have the same total number of pixels, so it still technically counts as HD, but the picture is often marred by blocky, pixelated noise and other image artifacts.
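To make the tradeoff concrete, here is a small illustrative calculation of how bit rate translates into data size per minute. The specific bit rates are my own examples chosen for illustration, not figures from the article:

```python
# Hypothetical examples of the bit-rate/size tradeoff for compressed audio and video.
def size_per_minute_mb(bitrate_kbps: float) -> float:
    """Megabytes consumed by one minute of audio or video at a given bit rate."""
    return bitrate_kbps * 1000 * 60 / 8 / 1e6

for label, kbps in [("MP3 at 64 kbit/s (heavily compressed)", 64),
                    ("MP3 at 320 kbit/s (lightly compressed)", 320),
                    ("HD video at 8 Mbit/s (aggressively compressed)", 8000),
                    ("HD video at 19 Mbit/s (a full ATSC channel)", 19000)]:
    print(f"{label}: ~{size_per_minute_mb(kbps):.1f} MB per minute")
```

Squeezing more channels into the same pipe means pushing each one toward the low end of that scale, and the artifacts described above are the visible cost.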
Read the whole thing.