Does anyone know if a 720p signal looks as good on a 1080p monitor as it can on a 720p monitor? I figure some 'artificial' processing is required to display 720p on a TV with 1080 lines. (i.e., what is the TV displaying on the additional 360 lines, which don't have any information?)
I'm wondering this because, if you prefer to watch 720p NFL over a Blu-ray movie, perhaps you might as well save your money and buy the 720p screen...
Right now most networks are showing 720p and some cable networks are showing 1080i. Until 1080p becomes more standard, you'll have to decide what you want to buy.
You can get sets with 1080p right now, but you have to decide whether it's worth it. If you've got the money, that's up to you; if not, stick with what's current. 1080i and an HDMI cable (a high-quality cable, though) would be the best bet for a perfect picture.
Well Peter, I'm no expert, but from what I understand the TV uses an upscaling system, turning roughly every two lines of the 720p picture into three to reach the panel's 1080 lines. Again, I am not an expert, but that is what happens, to the best of my understanding.
Yes Matt, I believe you are correct about the upscaling. So now your picture quality revolves around the quality of the processor in your TV that performs this upscaling (and not just the quality of the screen itself). The processor needs to fill in pixels that don't exist in the source, by calculating what ought to be there (or something like that?). If you just had a 720p screen, it could simply display the signal without requiring any additional processing. I'm a big believer in reducing the amount of processing/filtering/etc. to keep a true picture. So I suppose I'm really wondering: if you supply a 720p signal, can a 1080p screen (with all that processing) possibly end up with a picture as good as a 720p screen just showing the picture in its native resolution?
Perhaps a Sony screen can do this, but I doubt an LG could? (as an example)
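To make the "inventing lines" idea concrete, here's a minimal sketch of what a scaler does, assuming plain linear interpolation. Real TV processors use much fancier filters (bicubic, motion-adaptive, etc.), so take this as the principle rather than what any particular set actually runs:

```python
# Minimal sketch of how a scaler might interpolate 720 lines up to 1080.
# This is simple linear interpolation; real TV processors are fancier,
# but the basic idea of inventing in-between lines is the same.

def upscale_lines(lines, target_count):
    """Linearly interpolate a list of scanline values to target_count lines."""
    src_count = len(lines)
    out = []
    for i in range(target_count):
        # Map output line i back to a fractional position in the source.
        pos = i * (src_count - 1) / (target_count - 1)
        lo = int(pos)
        hi = min(lo + 1, src_count - 1)
        frac = pos - lo
        # Blend the two nearest source lines.
        out.append(lines[lo] * (1 - frac) + lines[hi] * frac)
    return out

# 720 source lines become 1080 output lines. Only about 1 in 3 output
# lines lands exactly on a real source line; the other two are blends.
source = list(range(720))          # stand-in for real scanline data
scaled = upscale_lines(source, 1080)
print(len(scaled))                 # 1080
```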
720p displays are really 1366 x 768 panels. 1080i is 1920 x 1080. The catch is that hi-def sat and cable is not 1920 x 1080, meaning not true hi-def; it runs about 1240 x 1080. So 720p will look fine on a 1080i display, as there is minimal upscaling. Also, your viewing distance matters: at 10 feet, the human eye cannot resolve the resolution difference.
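That 10-foot claim is checkable with a little trigonometry. Here's a rough sketch assuming the common 1-arcminute rule of thumb for 20/20 vision; the 42-inch screen size is just an example I picked, not anything from the thread:

```python
import math

# Rough check of the "at 10 feet you can't see the difference" claim,
# assuming 20/20 vision resolves about 1 arcminute (a rule of thumb).

def pixel_arcminutes(screen_diagonal_in, horiz_px, vert_px, distance_ft):
    """Angular size of one pixel, in arcminutes, for a 16:9 screen."""
    # 16:9 screen: width = diagonal * 16 / sqrt(16^2 + 9^2)
    width_in = screen_diagonal_in * 16 / math.hypot(16, 9)
    pixel_in = width_in / horiz_px
    distance_in = distance_ft * 12
    angle_rad = 2 * math.atan(pixel_in / (2 * distance_in))
    return math.degrees(angle_rad) * 60

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    arcmin = pixel_arcminutes(42, w, h, 10)
    visible = "resolvable" if arcmin >= 1.0 else "below the eye's limit"
    print(f"{name}: {arcmin:.2f} arcmin per pixel -> {visible}")
```

On a 42-inch screen at 10 feet, both come out under 1 arcminute per pixel, which is consistent with the claim; on a bigger screen or closer seat the numbers change.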
I am not under the impression that 1080i uses 1920x1080. My understanding is that 720p and 1080i are roughly equal to each other and require the same native screen resolution (~1200 x ~760). After working through all the complex math, this is supposedly correct (so I am told; I was never a math whiz).
This is why all screens of ~1200 x ~760 can display both 720p and 1080i, but not 1080p. For some reason 1080i only requires 768 vertical lines in order to resolve itself correctly. It is only 1080p that requires 1920x1080, as that format draws all 1080 lines simultaneously.
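For what it's worth, here's the raw pixel-rate arithmetic people usually point to when they call 720p and 1080i "roughly equal". Note this is about data throughput per second, not about how many lines a panel needs to show a format natively:

```python
# Raw pixel throughput of the common broadcast formats, to put the
# "720p and 1080i are roughly equal" idea in concrete terms.

formats = {
    "720p60":  (1280, 720, 60, 1.0),   # 60 full progressive frames/s
    "1080i60": (1920, 1080, 60, 0.5),  # 60 fields/s, each half the lines
    "1080p60": (1920, 1080, 60, 1.0),  # all 1080 lines, 60 times/s
}

for name, (w, h, rate, fraction) in formats.items():
    pixels_per_sec = w * h * rate * fraction
    print(f"{name}: {pixels_per_sec / 1e6:.1f} megapixels/s")

# 720p60:  55.3 megapixels/s
# 1080i60: 62.2 megapixels/s  (similar rate, hence the comparison)
# 1080p60: 124.4 megapixels/s (double 1080i -- the expensive one)
```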
When you say 'true high def', do you mean the 'actual high def' signals that are being broadcast... like what Fox, ESPN, etc. are actually putting out? Are all these networks actually broadcasting in 1920x1080 right now, or is this just a theoretical possibility?
Currently the compression rate is very high even for 720p or 1080i, just to get down to the ~19 Mbps the broadcast standard allows. 1920x1080 cannot squeeze into 19 Mbps without severe compression (so I was told). This is why broadcast high-def signals will not likely see 1080p for a while. (1920x1080 is equal to 1080p... is it not?)
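The back-of-the-envelope math supports that. A rough sketch, assuming 8-bit 4:2:0 video (about 12 bits per pixel uncompressed, an assumption on my part) and the ~19.39 Mbps ATSC broadcast pipe:

```python
# How much compression it takes to squeeze each format into the
# ~19.39 Mbit/s ATSC pipe, assuming 8-bit 4:2:0 source video
# (roughly 12 bits per pixel before compression).

ATSC_MBPS = 19.39
BITS_PER_PIXEL = 12  # 8-bit luma + chroma subsampled 4:2:0

formats = {
    "720p60":  1280 * 720 * 60,
    "1080i60": 1920 * 1080 * 30,   # 60 fields/s = 30 full frames of data
    "1080p60": 1920 * 1080 * 60,
}

for name, pixels_per_sec in formats.items():
    raw_mbps = pixels_per_sec * BITS_PER_PIXEL / 1e6
    ratio = raw_mbps / ATSC_MBPS
    print(f"{name}: raw {raw_mbps:.0f} Mbit/s -> needs ~{ratio:.0f}:1 compression")
```

720p and 1080i both come out needing roughly 35:1 compression to fit; 1080p60 would need about double that, which is why it stays off the air.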
If that is the case, how are they keeping the 16:9 aspect ratio when they broadcast? All our high-def channels are coming across in a nearly 16:9 ratio, hence the shape of the screen and the black bars generated on our wide monitors when we are forced to watch standard def in 4:3.
1240:1080 is nowhere near 16:9, which is the whole reason behind TVs having either 1920x1080 or 1280x768 pixel layouts. If the broadcast is in 1240x1080, why does the TV display the channel in a nearly 16:9 layout?
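One explanation I've seen is that broadcast pixels don't have to be square: a horizontally squeezed raster can still be displayed as 16:9 if each pixel is shown wider than it is tall. A quick sketch of the arithmetic (1440x1080 is a documented example from HDV/HDCAM; I'm not claiming that's what any particular cable or sat carrier uses):

```python
# A stored raster narrower than 16:9 can still display as 16:9 if the
# pixels are non-square. The "pixel aspect ratio" is just the stretch
# factor the display applies to each pixel.

from fractions import Fraction

TARGET = Fraction(16, 9)  # the display aspect ratio we want

for w, h in [(1920, 1080), (1440, 1080), (1280, 720)]:
    storage_ar = Fraction(w, h)
    # Pixel aspect ratio needed so the picture displays as 16:9.
    par = TARGET / storage_ar
    shape = "square pixels" if par == 1 else "pixels stretched on display"
    print(f"{w}x{h}: storage AR {storage_ar} -> pixel AR {par} ({shape})")
```

So a 1440x1080 stream (storage ratio 4:3) displays as 16:9 once each pixel is drawn 4/3 as wide as it is tall; the same math would apply to whatever width the carriers really use.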
Mind you, I have no idea why some plasma TVs are 1024x768 or 1280x768 or 1365x768. Perhaps it's just related to the physical size of the plasma pixel and how big the screen is...
Unrelated. You can change the aspect ratio of whatever you are viewing on almost every modern display, of any technology, by pushing the aspect ratio button on your remote control. This has nothing to do with broadcast resolution.
Yeah, but when you change the aspect ratio using your TV set, you distort the image. Chicks' butts look wider when you stretch the screen, and some pixels are dropped around the edges when you zoom. I don't see how that has anything to do with displaying the image in its proper aspect ratio.
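The distortion is easy to quantify: stretching a 4:3 picture to fill a 16:9 screen makes everything a third wider. In code:

```python
# How much wider everything gets when 4:3 content is stretched to 16:9.

from fractions import Fraction

stretch = Fraction(16, 9) / Fraction(4, 3)
print(stretch)                      # 4/3: every shape is drawn 4/3 as wide
print(f"{float(stretch) - 1:.0%}")  # 33% horizontal distortion
```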
What can I say? The HD signal from cable companies and satellite companies is not full HD; it's 1240 x 1080. It really is that way. True HD is 1920 x 1080, which is basically only available on hi-def DVDs as far as consumer content goes. It doesn't have anything to do with aspect ratios and display sizes. It is what it is.