Karen Brown is one of my favorite press/analyst people and she started a conversation at Motorola that deserves a post. Karen asked the Motorola “encoder geeks” how to tell if the video on your TV screen is a high-definition feed or an upconverted standard-definition feed.
First, one executive pointed out that even HD streams can vary significantly in quality depending on the original source material and the bitrate used to encode the video. Even weather can affect HDTV, as evidenced by the wet camera lenses during this year’s Super Bowl.
As far as upconverted SD streams are concerned, there’s no easy way to tell what you’re looking at other than, well, looking at it. HD should be recognizable to the eye simply because of its crisper resolution. Upconverting adds in “extra lines” with data that wasn’t there before, so a good eye should be able to spot the difference. However, you can’t run a technical test on the content: if the video was upconverted before encoding, the stream arrives at the TV as an HD-resolution signal, so the set will treat it as HD even though the original source was SD.
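For the curious, here’s a minimal sketch of what “adding extra lines” means in practice. This isn’t Motorola’s actual algorithm (real upconverters use much fancier filters); it just scales a 480-line frame up to 1080 lines by blending neighboring source lines, which is why no real detail is gained.

```python
def upconvert_lines(frame, target_height=1080):
    """Scale a frame (a list of scanlines) to target_height lines
    using simple linear interpolation between existing lines."""
    source_height = len(frame)
    scaled = []
    for y in range(target_height):
        # Map the output line back to a fractional position in the source.
        pos = y * (source_height - 1) / (target_height - 1)
        top = int(pos)
        bottom = min(top + 1, source_height - 1)
        weight = pos - top
        # Blend the two nearest source lines -- no new detail is created.
        line = [(1 - weight) * a + weight * b
                for a, b in zip(frame[top], frame[bottom])]
        scaled.append(line)
    return scaled

# Example: a tiny 480-line "frame" of flat gray becomes 1080 lines.
sd_frame = [[128] * 720 for _ in range(480)]
hd_frame = upconvert_lines(sd_frame)
print(len(hd_frame))  # 1080
```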
If SD content wasn’t originally created in widescreen format, you can usually tell when it’s been upconverted because the picture has to be adjusted to fit HD displays with a 16×9 aspect ratio. This is done by cutting off part of the picture, creating black curtains on either side, or stretching the image horizontally. Today, however, a lot of SD content is already delivered in widescreen format (DVDs and letterboxed movies), so looking for picture distortion is not a reliable indicator of upconverted SD video.
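For the numerically inclined, here’s a rough sketch of the arithmetic behind those three options when a 4×3 picture meets a 16×9 screen. The dimensions are just illustrative (a common 640×480 source and a 1280×720 display), not anything specific to Motorola’s gear.

```python
SRC_W, SRC_H = 640, 480      # 4:3 standard-definition picture
DST_W, DST_H = 1280, 720     # 16:9 high-definition display

# Option 1: black curtains -- scale to full height, pad the sides.
scale = DST_H / SRC_H
scaled_w = SRC_W * scale                # 960 pixels wide
curtain = (DST_W - scaled_w) / 2        # 160-pixel black bar on each side
print(f"curtains: {scaled_w:.0f}px wide, {curtain:.0f}px bars left/right")

# Option 2: crop -- scale to full width, cut off the top and bottom.
scale = DST_W / SRC_W
scaled_h = SRC_H * scale                # 960 pixels tall
trimmed = (scaled_h - DST_H) / 2        # 120 pixels lost top and bottom
print(f"crop: {scaled_h:.0f}px tall, {trimmed:.0f}px trimmed top/bottom")

# Option 3: stretch -- scale width and height by different factors,
# which is what distorts faces and circles.
print(f"stretch: {DST_W / SRC_W:.2f}x horizontal vs {DST_H / SRC_H:.2f}x vertical")
```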