Today we have the 4GB version of the Gigabyte GeForce GTX 680 on hand for some triple-monitor benchmarking. Designed to handle extreme resolutions, the card will be compared against a 2GB GTX 680 and the Radeon HD 7970 GHz Edition in an effort to discover who makes the best GPU for multi-monitor gaming...
Shortly after the initial GeForce GTX 680 launch, Nvidia's board partners began releasing special 4GB versions featuring twice the frame buffer capacity. The idea behind the extra memory is to eliminate bottlenecks caused by memory shortages at high resolutions and thus improve performance.
Nvidia faced a similar situation with the second generation of the Fermi architecture, used by the GPU codenamed GF110. The GeForce GTX 580, based on the GF110, was configured with just 1536MB of memory, while competing AMD products were fitted with a larger 2048MB frame buffer. At the maximum single-monitor resolution of 2560x1600 Nvidia was at a slight disadvantage, though on one monitor it was not especially noticeable. The real problem became evident when gaming across multiple monitors at resolutions that maxed out the 1536MB frame buffer. When testing games such as Battlefield: Bad Company 2, we found that the Radeon HD 6990 was 19% faster than the GeForce GTX 590 at 5040x1050, while at 7680x1600 the margin jumped to 160% in AMD's favor. Nvidia later released a 3072MB version of the GeForce GTX 580 to address this, though we never had a chance to test how the extra frame buffer impacted performance.

Although the GeForce GTX 680 shipped with a larger 2048MB frame buffer, matching that of the previous-generation Radeon HD 6900 series, games have become even more demanding and can easily max out 2GB of memory at extreme resolutions. Again, AMD seemed better prepared, fitting its current-generation Radeon HD 7900 series with a 3072MB frame buffer as standard.

For those gaming at 2560x1600, the more expensive 4GB cards will offer nothing over the 2GB version of the GeForce GTX 680 in the way of performance. To exceed the 2GB frame buffer, gamers need to really push the envelope by playing at resolutions such as 7680x1600. That means purchasing over $3,000 worth of LCD monitors, though we should point out that three 2560x1440 monitors are considerably more affordable at just over $2,000.
Still, whether you plan to game at 7680x1600 or 7680x1440, it goes without saying that you are going to need one hell of a graphics card, if not two or three. With the new Gigabyte GeForce GTX 680 4GB (GV-N680OC-4GD) on our test bed, we plan to find out whether that extra memory can be used effectively and how the card compares to the Radeon HD 7970. With that said, let's check out Gigabyte's offering in more detail...
ProX
Finally! I have been waiting for a good review of a 4GB card. Pretty much everything I have come across so far only tested at 2560x1600. Nice results.
Carl M Bland
Hello, I've been gaming on my three 30" Samsungs for quite some time, but when AMD started using Mini DisplayPorts on the 6990s I got lost and have been ever since. I would greatly appreciate some in-depth advice/photos of how you are getting that resolution off just one 7970. It has been a problem in my gaming life for too long and I am glad I stumbled onto your review. Sincerely, Carl M Bland
Steven Walton Posts: 104 Joined: 2010-02-08
Hi Carl, you need to use a pair of the Mini DisplayPort outputs and one of the DVI ports.
GregM Posts: 1 Joined: 2013-03-21
Hi! Looking for some help! I currently max out the 1.5GB on my two 580s at 2560x1600 very quickly in Skyrim with high-res texture mods installed, and then the game begins to jerk at times even at high frame rates. I'm assuming I would also pass the 2GB limit easily! So it's not just multi-monitor setups that need more video memory, I would assume. Would it be that texture memory is just as important depending on what you are running? TIA












