March 7, 2013, 09:13 AM
Traciatim
Multiple Video Cards, Separate Monitors: How Does It Work?

I have two monitors. One is plugged into my GTX 670 as my main display; the second is plugged into my onboard graphics (HD 4000), which I just use for general 2D Windows stuff like voice chat software, forums, and reading PDF files. I originally set it up this way because I didn't want the second monitor having any effect on the main monitor's performance, even a small one. I also wanted access to Quick Sync without using Virtu, which I'm not sure will end up working. That plan was dashed when I was playing with different settings in some games and flipped to windowed mode: I noticed I can place a game window so it sits half on each monitor, yet the frame rate doesn't seem to be affected.
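
(For anyone reproducing this setup, a quick way to confirm which adapter actually drives which display is to walk the Win32 display list. Here's a minimal Python/ctypes sketch; Windows only, and it only uses the standard EnumDisplayDevices API and DISPLAY_DEVICE struct, nothing game-specific.)

[code]
# Minimal ctypes sketch (Windows only): walk the Win32 EnumDisplayDevices
# list and print which adapter each active display output belongs to.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
user32 = ctypes.windll.user32

i = 0
while True:
    dd = DISPLAY_DEVICE()
    dd.cb = ctypes.sizeof(dd)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
        break
    if dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        # prints e.g. "\\.\DISPLAY1: NVIDIA GeForce GTX 670"
        print(f"{dd.DeviceName}: {dd.DeviceString}")
    i += 1
[/code]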

That raised a few questions. Primarily, how does this even function? Is the 3D data being processed and rendered by both cards, or does the primary card's frame buffer get copied to main memory for display by the second card, or does the second card just address the main card's memory and use it like an overlay? I can't really wrap my head around the data flow well enough to even guess. Does it work differently if two dedicated cards are used (say, my 670 and my old 5770)?
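
To make my second guess above concrete, here's a toy Python/NumPy sketch of the data flow I'm imagining. The names and mechanism are purely illustrative (no real driver or Direct3D calls); it's just the "copy through main memory" hypothesis spelled out:

[code]
import numpy as np

W, H = 1920, 1080  # assumed per-monitor resolution

# GPU A (the 670 in my setup) holds the full window in its VRAM,
# even the half that will end up on the other card's monitor.
gpu_a_vram = np.zeros((H, 2 * W, 4), dtype=np.uint8)

def render_frame():
    # Stand-in for the actual 3D work; all of it happens on GPU A.
    gpu_a_vram[:] = np.random.randint(0, 256, gpu_a_vram.shape, dtype=np.uint8)

def present():
    # Left half stays in GPU A's VRAM and is scanned out directly.
    left = gpu_a_vram[:, :W]
    # Right half is read back over PCIe into system RAM...
    staging_in_system_ram = gpu_a_vram[:, W:].copy()
    # ...and uploaded into GPU B's VRAM; GPU B only displays it,
    # it never runs any 3D work in this model.
    gpu_b_vram = staging_in_system_ram.copy()
    return left, gpu_b_vram

render_frame()
present()  # per frame: one extra read-back + upload, zero 3D load on GPU B
[/code]

If that model is right, the cost of spanning the window would be the extra copy bandwidth per frame, not extra rendering work, which might explain why the frame rate looks unaffected.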

I couldn't find any benchmarking data that tests setups like these, so how does this affect performance? Can anyone point to benchmarks comparing both monitors on one card, the secondary on a second card, and the secondary on the onboard graphics? I haven't been able to locate any good resources. I'll probably end up testing it myself, but I'd really like to read up on the topic beforehand.
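
If I do test it myself, the rough plan would be: run the same canned sequence under each configuration, capture frame times (a FRAPS-style CSV of cumulative milliseconds is assumed below), and compare the logs with a small script like this. The file names are hypothetical placeholders for the three setups:

[code]
import csv

def summarize(path):
    # Assumes a two-column log: frame number, cumulative time in ms.
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = [float(r[1]) for r in rows[1:]]            # skip header row
    deltas = [b - a for a, b in zip(times, times[1:])]  # per-frame ms
    avg_fps = 1000.0 * len(deltas) / (times[-1] - times[0])
    worst = max(deltas)
    print(f"{path}: {avg_fps:.1f} avg FPS, worst frame {worst:.1f} ms")

# One log per configuration to compare (hypothetical file names):
for log in ("both_on_670.csv", "secondary_on_5770.csv",
            "secondary_on_hd4000.csv"):
    summarize(log)
[/code]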