Saturday, December 24, 2011

The Nuts and Bolts of High-Definition


So I’m sure everyone has heard of HD television by now, and most people have either purchased one or are planning to in the future. But what I am growing more aware of is that although people have heard of high-definition, and they want it, few people understand the nuts and bolts of what it is and why they should have it. I am going to try to make sense of all this and maybe clear up the mud a bit.
It all starts with the movie directors, yup, the Hollywood types. They are crazy about video and very passionate about creating a moment, capturing it, and then bringing the experience to you exactly the way they saw it. That only happens when they can control what you see, and thus video standards were born.
Through intense research and testing it was discovered that the best and most immersive experience for moviegoers could be achieved by sitting 1.5 to 2x the screen height away from the screen. From there they went about figuring out the best resolution for enjoying a movie at that distance. The resolution they came up with is called 4k, and technology has been slowly catching up to that standard. A screen with 4k resolution ensures that a person with 20/20 vision will not be able to see the little dots that make up the image from the best seating distance of 1.5x the screen height.
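If you like numbers, here is a quick back-of-the-envelope sketch (my own illustration in Python, assuming a 16:9 screen, not anything pulled from the official standards) of what that 1.5 to 2x screen-height guideline works out to for a given screen size:

    import math

    def seating_range(diagonal_inches, aspect_w=16, aspect_h=9):
        """Return the (closest, farthest) seating distance in feet for the
        1.5x-to-2x screen-height guideline, assuming a 16:9 screen."""
        # Get the screen height from the diagonal and the aspect ratio.
        height = diagonal_inches * aspect_h / math.hypot(aspect_w, aspect_h)
        return (1.5 * height / 12, 2.0 * height / 12)

    # Example: a 100-inch diagonal screen.
    near, far = seating_range(100)
    print(f"Sit roughly {near:.1f} to {far:.1f} feet away")  # about 6.1 to 8.2 feet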
All this is great for going to the movies and being immersed in a screen as big as your house, but is it relevant to watching movies or TV at home? Well, yes… and no. This optimum 4k technology is slowly becoming available for home use in projectors for home theater applications and in some TVs that have not been released in the US yet, but the cost is astronomical in comparison to mainstream television prices. For the cheapest 4k projector and screen on the market, you are looking at spending around $10-$12k. And that is down quite a bit from four years ago, when the only projector offering 4k resolution was $180k. The more common high-definition resolutions available in televisions and projectors today are 720p and 1080p, with 1080p being the higher of the two.
To understand the different resolutions, we will start with the standard-definition televisions that dominated the home watching experience until the late 90s. The resolution standard for those TVs is called 480i. All these numbers may start to get confusing, but they do have a purpose. Resolution is based on the number of little dots used to make up the picture. For example, in every 480i TV, there are 307,200 little dots, or pixels, that make up the entire image that you see. There are 640 columns of pixels and 480 rows. This may seem like a lot, but it’s only about 3.5% of the resolution in a full 4k image.
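That percentage is easy to double-check yourself. Here is the arithmetic in Python (I am using 4096x2160 for 4k, the digital-cinema flavor of the format; the consumer 3840x2160 variant would shift the number slightly):

    sd_pixels = 640 * 480           # 480i: 307,200 pixels
    cinema_4k_pixels = 4096 * 2160  # digital-cinema 4k: 8,847,360 pixels

    print(f"{sd_pixels:,} pixels in a 480i image")
    print(f"{sd_pixels / cinema_4k_pixels:.1%} of a full 4k image")  # about 3.5%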
Standard 480i televisions usually come in an aspect ratio called 4:3, which means for every four pixels across there are three pixels down. This was the standard aspect ratio for movies years ago, but since then they have added more to the screen by going to a 16:9 ratio. This is why you see black bars at the top and bottom of a movie on a standard-resolution TV.

The little “i” at the end of 480i stands for interlaced, which describes how the image on the screen is drawn. To keep scrolling lines from showing up on the TV, only every other row of pixels is drawn during each scan cycle. This means that at any given moment you actually only see 240 of the 480 rows of pixels. That standard definition is starting to look quite dismal. The term high-definition came about when the little “i” turned into a little “p” for progressive scan. Progressive scan means that every row of pixels is drawn on every scan cycle, basically doubling the definition you can see on the screen. And that quickly advanced to a more defined resolution of 720p.
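A quick way to picture interlacing (just a toy sketch in Python, not how any real TV circuit works): each scan cycle draws only the even rows or only the odd rows, so a 480-row image arrives as two alternating 240-row fields.

    rows = list(range(480))     # the 480 rows of a 480i image
    even_field = rows[0::2]     # rows drawn on one scan cycle
    odd_field = rows[1::2]      # rows drawn on the next cycle

    print(len(even_field), len(odd_field))  # 240 240 -- only half the rows at any moment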
Along with the 16:9 standard for televisions and projectors came the higher 720p resolution, which puts a grand total of 921,600 pixels (1280 columns by 720 rows) on the screen. This higher resolution, along with being able to display every row of pixels at once, made a huge difference in the quality of video. With the adoption of these high-definition televisions came the need for high-definition content, and for a while the only way to get HD content without up-converting 480i movies was from cable, satellite, and over-the-air broadcast. Most TV networks started producing and broadcasting 720p content to be displayed on the newer 720p high-definition TVs. A few short years after “High-Definition” was adopted came “Full High-Definition,” or 1080p, displays.
With the introduction of Blu-ray discs came full 1080p content to be shown on these slightly superior displays, and 1080p televisions became the must-have. There is not much difference to be seen between 720p and 1080p on a screen smaller than 42”, but as TVs get larger, seating distances get closer to that optimum 1.5 to 2x screen height, making resolution ever more important. A 1080p image is approximately one quarter of the 4k resolution established above and produces a great image for a normal movie or TV watching experience at a distance over 3x the screen height.
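For the curious, here is how the common formats stack up against that 4k target, in the same back-of-the-envelope Python style (again treating 4k as 4096x2160):

    formats = {
        "480i/p": (640, 480),
        "720p":   (1280, 720),
        "1080p":  (1920, 1080),
        "4k":     (4096, 2160),
    }

    cinema_4k = 4096 * 2160
    for name, (width, height) in formats.items():
        pixels = width * height
        print(f"{name:>7}: {pixels:>9,} pixels ({pixels / cinema_4k:.1%} of 4k)")

Run that and 1080p comes out at roughly 23% of 4k, which is where the “approximately one quarter” above comes from.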
What this boils down to is that resolution is less about having the newest and greatest technology available and more about where you sit. If you are going for a theatrical experience and want to be fully engulfed in the movie, you would want to sit 10 feet away from a 163” screen with a full 4k display. Most people do not have the money or the room for that, but a 60” screen viewed from 8 feet away is completely reasonable for a great experience on a 1080p TV. And if all you are going to do is watch television in your living room from 12 feet away, a 720p display will be perfect. It’s all about the smallest pixel size you can see from the viewing distance you are at.
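If you want to put a number on “the smallest pixel size you can see,” a common rule of thumb (my assumption here, not something measured for this article) is that 20/20 vision resolves about one arcminute of angle. A pixel that subtends less than that is effectively invisible from your seat, and a little Python makes the distances above fall right out:

    import math

    def pixels_disappear_beyond(diagonal_inches, horiz_res, vert_res,
                                acuity_arcmin=1.0):
        """Distance in feet beyond which a single pixel subtends less than
        about one arcminute -- a common stand-in for 20/20 visual acuity."""
        width = diagonal_inches * horiz_res / math.hypot(horiz_res, vert_res)
        pixel_size = width / horiz_res                       # inches per pixel
        angle = math.radians(acuity_arcmin / 60)             # arcminutes -> radians
        return pixel_size / (2 * math.tan(angle / 2)) / 12   # convert to feet

    print(f"{pixels_disappear_beyond(60, 1920, 1080):.1f} ft")  # 60-inch 1080p: ~7.8 ft
    print(f"{pixels_disappear_beyond(60, 1280, 720):.1f} ft")   # 60-inch 720p: ~11.7 ft

Those two numbers line up nicely with the 8-foot and 12-foot seating examples above.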
