Unregistered guest | Everyone here seems knowledgeable and friendly, so I feel comfortable posting a total noob question. Here it is: how do I get a decent (clear) picture on my HDTV (16:9)? I don't want to use S-Video anymore.

Here is my equipment list:
TV -- Philips 34PW9819 34-inch 16:9 HDTV CRT with component and DVI-HDCP 1080i inputs. http://bizrate.lycos.com/marketplace/product_info/details__cid--11520000,pid--11206302.html
PC -- Abit a17 mobo -- 1 GB Corsair DDR400 XMS memory -- eVGA 128 MB AGP Nvidia GeForce 5700 Ultra (DVI) -- Antec True Power 430 -- "Da ZAHLMAN RESAHRATAH" liquid fanless cooling system -- Ahanix D4 case.
Software -- MCE 2005, PowerStrip, latest Nvidia ForceWare drivers.

I am currently connected from PC to TV with the S-Video cable, but I'm sure you all know that the picture sucks. DVDs, even with Nvidia's special codec for MCE, look grainy (compared with my component-input Samsung DVD player), and the desktop environment is downright unbearable. I've tried all manner of custom timings in PowerStrip to get the overscan in check. Currently my desktop extends vertically off the screen by at least 10 percent, and the sides are underscanned by five inches on each side.

Am I polishing a turd here? Is the real problem that I'm not using the 1080i component inputs on my TV? And how do I get component out of my Nvidia 5700 to the TV? I've seen ATI cards with dongle options to convert the signal, but I don't want to buy yet another expensive card.

I would also like to figure out why it is not possible to run the DVI output of my card into the DVI input of the Philips TV. When I try this hookup, I get terrible flickering and distortion at every resolution, including the native resolution of the TV.

Please help the noob! I will do whatever I have to do -- I just want to get this HTPC working for web browsing, DVD watching, etc. TV reception/HD broadcast decoding is not important right now. |
Bronze Member Username: Arrow224Post Number: 22 Registered: Mar-05 | Get a DVI cable, and use that. It will give you HD res, and full widescreen. |
Anonymous | Yeah, I basically have the same question. For whatever reason, info on Nvidia DVI connections seems limited, and I'm never sure whether they are HDCP-compliant. I believe that's why you're having the problem with flickering and distortion. |
Unregistered guest | When you feed a non-HDCP signal to an HDCP-compliant input, you will get no picture whatsoever! Flickering and distortion mean your vertical/horizontal timing values are out of whack. You mentioned PowerStrip: bring up the custom values, find out what the native resolution of your display is, and set them to that. That gives you a baseline you can tinker from to perfection. Good luck. |
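For reference, the standard published timing for 1080i at 60 Hz fields is the CEA-861 one below. These numbers come from that standard, not from the Philips manual, and whether this particular set accepts them on its DVI input is an assumption -- but they are the usual baseline to enter as PowerStrip custom values before you start tinkering:

```text
# 1920x1080 interlaced, 60 Hz fields (CEA-861), pixel clock 74.25 MHz
Horizontal: active 1920, front porch 88, sync width 44, back porch 148  (total 2200)
Vertical:   active 1080, front porch  4, sync width 10, back porch  31  (total 1125)
Sync: +hsync +vsync, interlaced

# Same timing written as a modeline:
# "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 interlace +hsync +vsync
```

Sanity check: 74.25 MHz / 2200 pixels per line = 33,750 lines/s, and 33,750 / 1125 lines = 30 frames/s, i.e. 60 interlaced fields per second.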