I'm fixing to get a budget rear-projection HDTV. I plan on having 3 devices hooked up to it, and it has 2 component inputs + 1 HDMI input. Aside from the usual attractions of typical HD programming, I am also looking forward to having a huge hi-res desktop from my computer. I would prefer to use the HDMI input with my video card to help text and whatnot stay legible that extra little bit.
My video card has a DVI-I port, yet all the adapter cables I find are for DVI-D ports. From looking at the pin arrangements, the only difference I can see is that the DVI-D connector lacks the 4 analog (RGB) pins. It seems to me that wouldn't prevent a DVI-D cable from plugging into the DVI-I port and carrying the digital signal. Does anyone have experience hooking a computer up via DVI/HDMI? Would I have to get some sort of converter box? Will only newer DVI cards work, or will any card work as long as it supports the requisite resolutions and scan rates?
Most new HDTVs will only accept a DVI-D signal. My ATI video card had a DVI-I output, and even though the DVI-to-HDMI cable fit, nothing appeared onscreen. I added an NVIDIA video card with a DVI-D output and it works great. You can also get a DVI-I-to-component adapter; they support 1080i, 720p, and 480p and run about $20.
To eliminate overscan, I run my video card at 1200x666. It looks great: text is easy to read and nothing gets cut off. I use PowerStrip to achieve this resolution.
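If anyone's curious where a number like 1200x666 comes from, here's a rough sketch of the math. The native mode and the overscan percentages below are just assumptions (my guess is a 1280x720 native mode with roughly 6-8% overscan per axis); measure your own set and plug in your numbers:

# Sketch: deriving an overscan-safe desktop resolution.
# ASSUMPTIONS: native mode 1280x720, ~6-8% overscan per axis.
native_w, native_h = 1280, 720
overscan_w, overscan_h = 0.0625, 0.075

safe_w = int(native_w * (1 - overscan_w))   # 1280 * 0.9375 = 1200
safe_h = int(native_h * (1 - overscan_h))   # 720  * 0.925  = 666
print(f"{safe_w}x{safe_h}")                 # -> 1200x666

You then feed that figure into PowerStrip (or whatever custom-resolution tool your driver offers) as the desktop resolution.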
What nVidia card did you get? All the newer ATI cards specifically say they're HDMI compliant and advertise a TMDS processor. Is that what your card has? Is it AGP or PCI Express? All of those new ATI cards are PCI Express only, and I don't have PCI Express and won't for a long time.
I really don't care how fast the card is, as long as it outputs a digital signal. I'll fall back on a 9600XT with that component adapter if all else fails, but I'd prefer the digital interface if I can get it working.