This question comes up often. My contention has been that a quality VGA output is nearly equal to DVI. I recently put that idea to the test on my Dell 1905s and an Nvidia Quadro NVS 280... which, as you know, has a dual-output dongle that, with the proper cable, can drive either VGA or DVI. I have both cables, hence the test.

1. One drawback to DVI is that you always(?) lose some of the monitor's adjustments, presumably because controls like contrast and auto-adjust act on the analog signal and don't apply to a digital input. I don't know whether it's the same on every monitor, but I lost Contrast, Autotune, and a few others.

2. I would rate the DVI and VGA displays as equal... even, *identical*. Much to my surprise, I couldn't tell any difference even with a magnifying glass.

I know not all VGA is high quality. The one case where I've heard the most complaints is ATI cards with both DVI and VGA outputs: many times the VGA is noticeably less sharp than the DVI. I suspect that's because ATI is really in the gaming market... not multi-monitor workstation display... and reluctantly added a cheap, lesser-quality VGA output to those cards for marketing reasons. However, not all ATI VGA is poor. I also run two ATI Xpert128 single-head VGA cards alongside my NVS 280, and their display quality is equal.

After the test, I removed the DVI cable in favor of VGA.