12 August 2006 @ 02:52 pm
Just on the off-chance that anyone knows about this :)

If I get a graphics card that has only DVI outputs & use a converter to hook it up to my monitor (which only has D-sub/analogue VGA inputs) are there any performance penalties to doing it that way? I.e. should I be making D-sub output one of my criteria for a graphics card or is it irrelevant?
Current Mood: curious
Current Music: The White Stripes "The Denial Twist"
peagles on August 12th, 2006 02:20 pm (UTC)
Nope, no performance penalties as far as I'm aware.
John: 2003jarel on August 12th, 2006 02:24 pm (UTC)
No visual quality issues?
peagles on August 12th, 2006 02:32 pm (UTC)
Only those inherent to VGA.

I did a bit of testing when I got a video card with 1 x DVI and 1 x VGA output - there was no difference that I could detect between using the VGA output or using the DVI output with a DVI-VGA converter bolted on.
Margaretpling on August 12th, 2006 02:39 pm (UTC)
That's good to know, thanks :)

The main reason I was asking is that it looks like the cheapest & most expensive 7600GT cards come with D-sub, but the mid-range ones that I'm most likely to get don't.
Robdreema on August 12th, 2006 02:45 pm (UTC)

The only thing you'll notice is that once you've started using a screen with a DVI connection and you look at a VGA one, you'll think the VGA one looks kinda blurry...
Andrewsagima on August 14th, 2006 08:51 am (UTC)
I have 6 computers here at work with multiple screens; each has at least one screen attached via a DVI converter, and that screen looks a little worse/blurrier next to the ones that are just on the standard VGA output. I'm not sure you'd notice with just one screen.

Martin Atkins: normalmart on August 19th, 2006 03:04 pm (UTC)

There are two kinds of DVI socket. One (DVI-D) has just the digital signal present, while the other (DVI-I) has both the digital and the analogue (VGA) signal present on a few extra pins. Most DVI-only graphics cards use the latter, and those little adapter thingies really just wire the analogue output up to the right VGA pins so that your VGA monitor will work.

Therefore the performance/quality penalty depends on how good your graphics card is at producing the VGA output. I don't think you'll really notice any problems, since I assume most DVI cards will use the same chips as the VGA ones and just wire it up to a DVI socket instead of a VGA one.

(Of course, since I'm posting this a week after your original question you've probably made your purchase by now. ;))
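[To make the "just wires the analogue pins through" point above concrete, here's a minimal sketch of the routing a passive DVI-I to VGA adapter does. The pin numbers come from the published DVI-I and VGA (DE-15) pinouts as I understand them; the dictionary and function names are just illustrative, not from any real driver or tool.]

```python
# Illustrative sketch: a passive DVI-I -> VGA adapter contains no electronics,
# it simply routes the analogue pins of the DVI-I socket to the matching
# pins on the VGA (DE-15) plug. Digital (TMDS) pins are left unconnected.
DVI_I_TO_VGA = {
    "C1 (analogue red)":     "VGA pin 1 (red)",
    "C2 (analogue green)":   "VGA pin 2 (green)",
    "C3 (analogue blue)":    "VGA pin 3 (blue)",
    "C4 (analogue h-sync)":  "VGA pin 13 (h-sync)",
    "8 (analogue v-sync)":   "VGA pin 14 (v-sync)",
    "C5 (analogue ground)":  "VGA pins 6-8, 10 (grounds)",
}

def adapter_routes(dvi_pin: str) -> str:
    """Return the VGA pin a given DVI-I pin is wired to by a passive adapter."""
    return DVI_I_TO_VGA.get(dvi_pin, "not connected (digital-only pin)")
```

[So the quality you get through the adapter is exactly the quality of the card's analogue (RAMDAC) output - the adapter itself adds nothing.]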

Margaretpling on August 19th, 2006 10:04 pm (UTC)
Thanks for the info :)

Not got the card yet, I'm buying all new bits so I've not quite made all the decisions yet ... ordering it this week, hopefully :)