Using SVGA extension cable, monitor info lost

Discussion in 'Hardware' started by Bolimomo, May 6, 2014.

  1. Dear monitor and video experts:

    I can't quite figure this out...

     From my 8400 GS video card (SVGA out) to my Samsung 23" SyncMaster monitor, using the vendor-provided 4' SVGA cable... no problem. Windows 7 recognized the monitor, loaded the right driver, and offered the full 2048x1152 resolution (the highest the panel supports).

     But I moved the monitor a little farther from the computer, and the 4' cable is not quite long enough. So... I added a 6' SVGA extension cable to extend the original 4' cable (SVGA connectors on both ends, one female and one male). Under this setup, Windows 7 doesn't recognize the monitor type. It loads a generic monitor driver, and the highest resolution available is 1600x1050.

     So some information is being lost when the signal goes through the extension cable. I thought those cables were all the same.

     My question is: these are analog signals (SVGA). How does the monitor's model ID get passed from the monitor's circuitry to the video card over those analog wires? And is there any reason the SVGA extension cable doesn't pass that information along? (Is it missing some wires compared to a regular male-to-male SVGA cable?)
     
  2. Thanks for the link, TJ. That's educational.

     I figured from this that I need one longer VGA cable (male to male) rather than going through an extension cable. I just didn't think some vendors would plug the 4 "unused" holes in the VGA spec. :eek:
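
     For anyone else who lands here: the "model ID" travels over the DDC lines of the VGA connector (data on pin 12, clock on pin 15, +5V on pin 9), a slow digital I2C channel that rides alongside the analog RGB signals. The monitor answers on that channel with a 128-byte EDID block containing its vendor ID, product code, and supported modes. If an extension cable leaves those pins unwired, the PC never receives the EDID and Windows falls back to a generic monitor with conservative resolutions. Below is a rough sketch (not from this thread) of how that block decodes, written for a Linux box where the driver exposes the raw EDID under /sys/class/drm/*/edid; the path is an assumption and the field offsets follow the standard EDID 1.x layout.

         import glob

         def decode_edid(blob: bytes):
             """Decode a few fields from a 128-byte EDID base block."""
             if len(blob) < 128 or blob[0:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
                 raise ValueError("not a valid EDID header")
             if sum(blob[:128]) % 256 != 0:
                 raise ValueError("EDID checksum failed")

             # Bytes 8-9: manufacturer ID, three 5-bit letters packed big-endian ('A' = 1).
             raw = (blob[8] << 8) | blob[9]
             vendor = "".join(chr(((raw >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

             # Bytes 10-11: product code, little-endian.
             product = blob[10] | (blob[11] << 8)

             # Bytes 54-71: first detailed timing descriptor = the preferred mode.
             d = blob[54:72]
             h_active = d[2] | ((d[4] & 0xF0) << 4)
             v_active = d[5] | ((d[7] & 0xF0) << 4)
             return vendor, product, (h_active, v_active)

         # Read whatever EDID blocks the kernel has cached for each connector.
         for path in glob.glob("/sys/class/drm/card*-*/edid"):
             with open(path, "rb") as f:
                 blob = f.read()
             if not blob:
                 print(f"{path}: no EDID received (e.g. DDC pins not wired through)")
                 continue
             vendor, product, (w, h) = decode_edid(blob)
             print(f"{path}: vendor {vendor}, product 0x{product:04X}, preferred {w}x{h}")

     An empty EDID file for the VGA connector is exactly the symptom you'd see with a cheap extension cable that skips the DDC pins: the analog picture still works, but the identification channel is dead.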
     
  3. Cheap manufacturing... I'd assume that 99% of people would never notice the difference, so the vendors save on tooling, dies, copper (the wire inside the cable) and the fittings/connectors... Sadly, it's the way of the world these days.
     