Digital vs. analog display support...

Discussion in 'Hardware' started by alanack, Mar 3, 2011.

  1. alanack

    I've got a Matrox G550 PCIe video card I use to drive two 18" monitors. I'm thinking of replacing these with a 24" and a 19", which the Matrox card will support, but only with an analog signal; it will not drive a monitor at the 24"'s resolution over a digital connection. Just wondering what differences, if any, I might expect if I go ahead with this. Thanks.
     
  2. Suggest getting a more modern video card... your G550 is old and slow. Recommend an Nvidia Quadro NVS series card, a 290 or 295. Pick one up used on eBay for $30 or less.

    Passmark score for your G550 = 34
    Passmark score for NVS 295 = 206

    "206" isn't fast, but "34" is barely a crawl. (I used Matrox cards years ago, but they were so slow they wouldn't run my screen saver.)

    The 290 is supposed to come with a DMS-59-to-dual-DVI cable (it looks like a "splitter", but it's not). The 295 is supposed to come with two DisplayPort-to-DVI adapters. Make sure you get the cable/adapter(s) with the card.
     
  3. alanack

    Thanks. I will at some point in the not-too-distant future, but in the meantime...
     
  4. Fair probability that your old G550 won't support the native resolutions of newer monitors.
     
  5. alanack

    The card will support the resolution of the 24", as I stated above. Still hoping for an answer to this very simple question...
     
  6. VGA usually looks different from DVI, and usually worse. Hard to say exactly what the difference would be.

    So, plug in the monitor... and if you don't like how it looks over VGA, get a new video card that will run DVI.
     
  7. What is the resolution that you spoke of?
     
  8. alanack

    The Samsung P2480L 24" monitor has a native resolution of 1920 x 1080. The Matrox card will support 2048 x 1536 on the primary display and 1600 x 1200 on the secondary, but only in analog.
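    A rough way to double-check what the driver actually offers before buying: the Python/ctypes sketch below (assuming a Windows machine, which only post #10 suggests) calls the standard Win32 functions EnumDisplayDevicesW and EnumDisplaySettingsW to list every mode each display device reports and checks whether 1920 x 1080 is among them. The Win32 calls are real; the script itself is just an illustration, not something from this thread.

```python
# Sketch only: ask the video driver which modes it offers on each display
# device, so you can see whether 1920 x 1080 is actually on the list.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class DISPLAY_DEVICEW(ctypes.Structure):
    # Win32 DISPLAY_DEVICEW: one entry per adapter/output the driver exposes.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Win32 DEVMODEW (display variant); full layout so dmSize matches the API's expectation.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD), ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD), ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", ctypes.c_long), ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", wintypes.DWORD), ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short), ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short), ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD), ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD), ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD), ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD), ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD), ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD), ("dmPanningHeight", wintypes.DWORD),
    ]

def list_modes(wanted=(1920, 1080)):
    """Print each display device, the largest mode it offers, and whether `wanted` is offered."""
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break                                   # no more adapters/outputs
        modes = set()
        j = 0
        while True:
            dm = DEVMODEW()
            dm.dmSize = ctypes.sizeof(dm)
            if not user32.EnumDisplaySettingsW(dev.DeviceName, j, ctypes.byref(dm)):
                break                               # no more modes for this device
            modes.add((dm.dmPelsWidth, dm.dmPelsHeight))
            j += 1
        if modes:
            print(dev.DeviceName, "-", dev.DeviceString)
            print("  largest mode offered:", max(modes))
            print("  %dx%d offered:" % wanted, wanted in modes)
        i += 1

if __name__ == "__main__":
    list_modes()
```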
     
  9. I recently hooked up a couple of old identical 19" monitors to an Nvidia GT 430 card. The sharpness with VGA was substantially poorer than with DVI; replacing the VGA cable with an HDMI-to-DVI cable made a significant improvement to image quality.

    I reckon the GT 430 is a decent, inexpensive (and faster) alternative to the NVS cards. Power consumption is only 10 watts at idle and under 40 W for flat-out gaming. One DVI and one HDMI port allow for attaching two digital monitors.
     
  10. I think the difference may be minimal.

    I have a box that contains 4 x EVGA 8400 GS cards. Each card has one DVI output (digital) and one VGA output (analog). I hooked up 8 x 24" monitors to this box: four of them with VGA cables, two with DVI cables, and for the last two I used DVI-to-HD15 adapters and long VGA cables to run to the monitors. I was concerned at first about the display quality on the monitors using the VGA cables. These monitors are mounted side by side, and I really can't tell any difference, whether displaying charts or some of the demo images that come with Windows 7. I am running all 8 monitors at 1920 x 1080.
     