
NVIDIA GeForce GTX 690 Review

SKYMTL
HardwareCanuck Review Editor

Taking Image Quality to the Next Level (pg.2)


In this section we revisit a number of games tested earlier in this review and push their in-game settings to the highest possible levels. All other methodologies remain the same.

Shogun 2: Total War

NV-GTX-690-67.jpg


The Elder Scrolls: Skyrim

NV-GTX-690-73.jpg


Wargame: European Escalation

NV-GTX-690-79.jpg


The Witcher 2

NV-GTX-690-85.jpg
 

Surround / Eyefinity Multi Monitor Performance


Both NVIDIA and AMD now offer single GPU multi monitor output options for some truly immersive gaming. However, spanning a game across three or more monitors demands a serious amount of resources, which makes this a perfect test for ultra high-end solutions.

While all solutions have the ability to implement bezel correction, we leave this feature disabled in order to ensure compatibility. The benchmarks run remain the same as in normal testing scenarios.
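
To put the extra workload into perspective, here's a quick back-of-envelope pixel count. The spanned resolution below is an assumption (a typical 3 x 1920x1080 Surround / Eyefinity layout); refer to the charts themselves for the exact settings used in each test.

```python
# Back-of-envelope pixel counts: why spanning three monitors is so demanding.
# The 3 x 1920x1080 layout is an assumption for illustration only.
single_1080p = 1920 * 1080          # 2,073,600 pixels
single_1600p = 2560 * 1600          # 4,096,000 pixels
surround     = (3 * 1920) * 1080    # 6,220,800 pixels

print(round(surround / single_1080p, 1))   # ~3.0x the pixels of one 1080p monitor
print(round(surround / single_1600p, 1))   # ~1.5x the pixels of a 2560 x 1600 monitor
```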



Batman: Arkham City

NV-GTX-690-37.jpg


Battlefield 3

NV-GTX-690-42.jpg


Crysis 2

NV-GTX-690-46.jpg


Dirt 3

NV-GTX-690-58.jpg
 

Surround / Eyefinity Multi Monitor Performance (pg.2)





Metro 2033

NV-GTX-690-62.jpg


Shogun 2: Total War

NV-GTX-690-68.jpg


The Elder Scrolls: Skyrim

NV-GTX-690-74.jpg


Wargame: European Escalation

NV-GTX-690-80.jpg


The Witcher 2

NV-GTX-690-86.jpg
 

Multi Monitor + 3D Vision Testing


3D Vision hasn’t been covered here on Hardware Canucks in quite some time, but with the GTX 690’s ability to display high framerates across multiple displays, our testing just wouldn’t have been complete without this section. Honestly, the chance to drive 3D Vision Surround from a single card was an opportunity that just couldn’t be missed. An extreme card deserves extreme tests, right? The results you see below were achieved with three Acer GD235HZ monitors, each running at 1080P and hooked up to a single GTX 690 through dual link DVI cables.
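
As a rough illustration of the load this setup generates: 3D Vision renders a separate image for each eye, so the pixel throughput of an already tripled Surround resolution roughly doubles again. A minimal sketch of that arithmetic, assuming the three 1080p panels described above:

```python
# Rough rendering-load arithmetic for 3D Vision Surround with three 1080p panels.
# Stereoscopic 3D draws one view per eye, so per-frame pixel work roughly doubles.
single_1080p = 1920 * 1080        # one monitor
surround     = 3 * single_1080p   # three spanned monitors
surround_3d  = 2 * surround       # left-eye + right-eye views per displayed frame

print(surround_3d // single_1080p)   # ~6x the pixel work of a single 1080p screen
```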

NV-GTX-690-93.jpg

NV-GTX-690-94.jpg

NV-GTX-690-95.jpg

NV-GTX-690-96.jpg

Impressive, isn’t it? While AMD’s 3D effort is mired in the world of “open source” oblivion, NVIDIA has forged ahead and they now have a single card that can push stereoscopic 3D signals to three monitors, all while games are running at maximum detail settings. The performance penalty is massive but with a few small tweaks, all of these titles should deliver playable framerates.

Plus, with new 3D Vision profiles being released on a regular basis, most of the newest games will have at least some form of support quickly. If you have the money for a GTX 690, you should at least consider this type of awe-inducing setup.
 
PCI-E 2.0 vs. PCI-E 3.0 with the GTX 690


When the GTX 680 was first launched, some assumed that its performance would be somewhat curtailed on anything but a PCI-E 3.0 slot. NVIDIA had other ideas since their post-release drivers all dialed its bandwidth back to PCI-E 2.0 when used on X79-based systems. The reasons for this were quite simple: while the interconnects are built into the Sandy Bridge-E chips, Intel doesn't officially support PCI-E 3.0 through their architecture. As such, some performance issues arose in rare cases when running two or more cards on some X79 systems.

This new GTX 690 uses an internal PCI-E 3.0 bridge chip which allows it to avoid the aforementioned problems. But with a pair of GK104 cores beating at its heart, bottlenecks could presumably occur with anything less than a full bandwidth PCI-E 3.0 x16 connection. This could cause issues for users of non-native PCI-E 3.0 boards (like P67, Z68 and even X58) that want a significant boost to their graphics but don’t want to upgrade to Ivy Bridge or Sandy Bridge-E.
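
For reference, the theoretical gap between the two interface generations is easy to work out from the published per-lane rates, and it comes out to roughly a factor of two:

```python
# Theoretical per-direction bandwidth of a x16 slot for each PCI-E generation.
# PCI-E 2.0: 5 GT/s per lane with 8b/10b encoding    -> 500 MB/s per lane
# PCI-E 3.0: 8 GT/s per lane with 128b/130b encoding -> ~985 MB/s per lane
lanes = 16
gen2_lane_mb_s = 5e9 * (8 / 10) / 8 / 1e6      # 500 MB/s
gen3_lane_mb_s = 8e9 * (128 / 130) / 8 / 1e6   # ~984.6 MB/s

print(round(lanes * gen2_lane_mb_s / 1000, 2))   # 8.0 GB/s for PCI-E 2.0 x16
print(round(lanes * gen3_lane_mb_s / 1000, 2))   # ~15.75 GB/s for PCI-E 3.0 x16
```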

In order to test how the GTX 690 reacts to changes in the PCI-E interface, we used our ASUS X79WS board which can switch its primary PCI-E slots between Gen2 and Gen3 through a simple BIOS option. All testing was done at 2560 x 1600 in order to eliminate any CPU bottlenecks.

NV-GTX-690-91.jpg

While some of you may not have been expecting these results, they prove that anyone with a PCI-E 2.0 motherboard won’t have to run out for a last-minute upgrade. We hardly saw any variance between the two interfaces and in the few cases where there was a discernible difference, it was well within the margin of error. Skyrim is the odd game out but we’ll get to that later.

Let’s explain what’s happening here since it all centers around the complex dance between the CPU, GPU and their interconnecting buses. At higher framerates, a ton of information is passed through the PCI-E interface as the GPU calls on the processor to fetch more frames, which can potentially saturate a lower bandwidth PCI-E bus. At higher resolutions and image quality settings like the ones used in the tests above, the GPU becomes the bottleneck, so it calls for fewer frames from the CPU and the PCI-E interface becomes less of a determining factor in overall performance.
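
To make that concrete with purely illustrative numbers (the per-frame traffic figure below is an assumption, not something we measured), the interface only starts to look tight once framerates climb very high:

```python
# Purely illustrative: how much CPU-to-GPU traffic does a given framerate imply?
# The 30 MB/frame figure is an assumption for the sake of the example.
mb_per_frame  = 30
gen2_x16_gb_s = 8.0     # GB/s, theoretical PCI-E 2.0 x16 from above
gen3_x16_gb_s = 15.75   # GB/s, theoretical PCI-E 3.0 x16 from above

for fps in (60, 120, 300):
    demand_gb_s = mb_per_frame * fps / 1000
    print(fps, demand_gb_s, demand_gb_s > gen2_x16_gb_s, demand_gb_s > gen3_x16_gb_s)
# Only the 300 fps case (9 GB/s) pushes past a Gen2 x16 link, which is why a
# CPU-bound, high-framerate title is where PCI-E 3.0 shows a sliver of a gain.
```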

This brings us to our next point: users may still encounter a bandwidth bottleneck, but only when the CPU has to send large batches of frames off to the GPU, something that doesn’t typically happen at higher resolutions. We’d normally see that kind of situation when the GPU is operating at ultra high framerates. This is why Skyrim, which still seems oddly CPU bound at 2560 x 1600, sees an ever so slight benefit from PCI-E 3.0.

Naturally, higher clocked processors can throw out more frames, which is why overclocking your CPU can result in a potential PCI-E bottleneck. But once again, with a GPU as powerful as the GTX 690, a situation like this will only happen at framerates so high that it won't cause any noticeable in-game performance drop-offs.

In order to put these conclusions into context, we’ve repeated some of the tests below at lower resolutions / IQ settings. The three games chosen were the only ones that displayed a clear difference after multiple benchmark run-throughs.

NV-GTX-690-92.jpg

As you can see, there is a falloff here that’s beyond our margin of error but even in this case, the GTX 690 with an overclocked processor is able to push such high framerates that a few percentage points won’t make one lick of difference. With a lower clocked CPU, the gap would be even less.

So let’s sum this up: while PCI-E 3.0 can make a minor on-paper difference in some situations, it certainly isn’t needed to ensure your GTX 690 is the fastest graphics card on the block. If you have a PCI-E 2.0 motherboard with a decent processor, an upgrade to a Gen3 interface really isn't necessary.
 
Temperatures & Acoustics / Power Consumption

Temperature Analysis


For all temperature testing, the cards were placed on an open test bench with a single 120mm 1200RPM fan placed ~8” away from the heatsink. The ambient temperature was kept at a constant 22°C (+/- 0.5°C). If the ambient temperature rose above 23°C at any time throughout the test, all benchmarking was stopped. For this test we use the 3DMark Batch Size test at its highest triangle count with 4xAA and 16xAF enabled and loop it for one hour to determine the peak load temperature as measured by GPU-Z.

For idle tests, we let the system idle at the Windows 7 desktop for 15 minutes and record the peak temperature.
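
For anyone who wants to replicate the measurement, GPU-Z can log its sensor readings to a text file while the benchmark loops. Below is a minimal sketch of pulling the peak value out of such a log; the column name and log format are assumptions that depend on the GPU-Z version used.

```python
import csv

# Minimal sketch: find the peak GPU temperature in a GPU-Z sensor log.
# The "GPU Temperature [°C]" column name is an assumption; the actual header
# depends on the GPU-Z version and which sensors are enabled.
def peak_temperature(log_path, column="GPU Temperature [°C]"):
    peak = float("-inf")
    with open(log_path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            value = (row.get(column) or "").strip()
            try:
                peak = max(peak, float(value))
            except ValueError:
                continue  # skip repeated header rows or blank fields
    return peak

print(peak_temperature("gpuz_sensor_log.txt"))  # peak load temp after the one hour loop
```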


NV-GTX-690-87.jpg

The temperatures displayed by the GTX 690 are nothing short of incredible. Remember, the cooling system has to deal with 300W of output in addition to the heat produced by the memory modules, PWM stages and the PLX bridge chip. It looks like NVIDIA’s heavy investment in a high end cooling solution is paying dividends.


Acoustical Testing


What you see below are the baseline idle dB(A) results for a relatively quiet open-case system (specs are in the Methodology section) sans GPU, along with the results for each individual card in idle and load scenarios. The meter we use has been calibrated and is placed at seated ear-level exactly 12” away from the GPU’s fan. For the load scenarios, a loop of Unigine Heaven 2.5 is used in order to generate a constant load on the GPU(s) over the course of 20 minutes.

NV-GTX-690-47.jpg

Not only do the two cores run at cooler temperatures than the single chip on a GTX 680 but NVIDIA’s fan design and high end shroud materials are able to keep noise levels down as well. This is actually one of the quietest dual GPU cards we have tested and while the fan does spin up after a few minutes under load, it shouldn’t be noticeable under normal gaming conditions.


System Power Consumption


For this test we hooked our power supply up to a UPM power meter that logs the power consumption of the whole system twice every second. In order to stress the GPU as much as possible, we once again use the Batch Render test in 3DMark06 and let it run for 30 minutes to determine the peak power consumption, while letting the card sit at a stable Windows desktop for 30 minutes to determine the peak idle power consumption. We have also included several other tests.

Please note that after extensive testing, we have found that simply plugging a power meter into a wall outlet or UPS will NOT give you accurate power consumption numbers due to slight changes in the input voltage. Thus, we use a Tripp-Lite 1800W line conditioner between the 120V outlet and the power meter.
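
The load and idle figures in the chart are simply the peaks of those logged samples. A minimal sketch of that reduction, assuming the meter exports one wattage reading per line (the UPM meter's actual export format may differ):

```python
# Minimal sketch: reduce a 30-minute, twice-per-second power log to its peak.
# Assumes one wattage reading per line in a plain-text export; the UPM meter's
# real export format may differ.
def peak_watts(log_path):
    with open(log_path) as f:
        readings = [float(line) for line in f if line.strip()]
    # 30 minutes at 2 samples per second is roughly 3,600 readings
    return max(readings)

print(peak_watts("load_run.txt"))  # peak system draw during the 3DMark06 loop
print(peak_watts("idle_run.txt"))  # peak draw while idling at the Windows desktop
```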

NV-GTX-690-88.jpg

Power consumption has always been a problem area for dual GPU cards but NVIDIA seems to have tackled it quite well. By using a pair of slightly downclocked GK104 cores and integrating efficient components like a 2oz copper PCB and a high end PWM design, the GTX 690 is actually the most miserly ultra high end card ever created. Even two HD 7970 cards look horribly inefficient by comparison.

If you are running a GTX 690 with an overclocked Ivy Bridge or Sandy Bridge processor, you shouldn’t need anything more than a good quality 700W power supply.
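
That recommendation lines up with a rough power budget. The component figures below are assumptions for illustration (the GTX 690's 300W board power plus generous allowances for the CPU and the rest of the system), not measured values:

```python
# Back-of-envelope PSU budget using assumed, not measured, component figures.
gpu_board_power = 300   # W, GTX 690 board power
cpu_overclocked = 150   # W, generous allowance for an overclocked quad core
rest_of_system  = 100   # W, motherboard, memory, drives and fans

total_estimate = gpu_board_power + cpu_overclocked + rest_of_system
print(total_estimate)         # ~550 W estimated peak system draw
print(700 - total_estimate)   # ~150 W of headroom on a quality 700 W unit
```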
 

Conclusion


NVIDIA’s goal for the GTX 690 was to make the fastest graphics card of all time and they’ve succeeded beyond most people’s expectations. After the experience with Fermi and G80 before it, many predicted this round of GeForce products would be more of the same: hot running, power hungry cards sporting huge, inefficient cores. What we got instead was Kepler, an architecture which marks a huge turnaround for the company’s direction and exemplifies how the missteps of one generation can be rectified in its successor. The result is a product that follows in the GTX 590’s footsteps by moving the dual GPU market further away from the noisy, lackluster cards of previous generations.

The GTX 690’s list of accomplishments is impressive to say the least but none of these milestones can measure up to the performance numbers it achieves. By leveraging the Kepler architecture’s inherent efficiency, NVIDIA was able to combine two fully enabled GK104 cores onto a single PCB and has set the bar so high that AMD will have a hard time regaining the performance lead. To hit optimal power consumption numbers, the Base Clock of both cores was reduced, but thanks to Boost clocks and the PLX chip's low latency interface, the real world difference between the GTX 690 and a pair of GTX 680s in SLI is nonexistent. Indeed, the GTX 690 represents a quantum leap over the last generation of dual GPU cards and is so powerful that it would be a complete waste to buy for anything below 2560 x 1440.

NV-GTX-690-90.jpg

When looking over the charts, it is hard not to mention the HD 7970 Crossfire solution. It scaled remarkably well in most situations and does have several merits, mostly due to a lower overall price, but lingering driver problems prevent it from becoming a viable alternative. While the GTX 690 had a small misstep in Shogun 2 (as have all NVIDIA cards since the latest patch), the dual AMD cards presented scaling issues in Skyrim, black screens in The Witcher 2: Enhanced Edition, poor minimum framerates and odd flickering around the periphery in Metro 2033. While the two HD 7970s’ performance may look good on paper, we just can’t recommend investing $920 in them until AMD’s driver team makes some serious upgrades. With that being said, the HD 7970’s wider memory interface does tend to make a difference in some scenarios, particularly when gaming at multi monitor resolutions.

NVIDIA made huge strides in the power consumption and acoustical departments, two areas that high level gamers don’t typically care about, but we’ll take them nonetheless. It truly is amazing to see such a tame personality out of a card that runs right alongside two GTX 680s.

Putting money and features aside for a moment, some will question whether a card like the GTX 690 even has a place in today’s market. We think it does, particularly from a marketing standpoint. But questions have to be raised about a dual GK104 card launching at a time when retailers are still struggling to deliver GTX 680 cards. This is a true halo product that will elevate NVIDIA’s standing in the eyes of gamers but many of their customers are still waiting on GTX 680 backorder lists.

The GTX 690 may be the fastest graphics card around but it is also completely unaffordable for anyone without very deep pockets or a nice trust fund. Yet for those that can afford it, the GTX 690 is the pinnacle of uncompromising graphics card design. It delivers unheard-of performance and stretches our expectations of what’s possible in the GPU market.

240463923aca1f6b.jpg

Note: Due to the quick nature of this review, we decided to skip overclocking results and instead concentrate upon an expanded testing suite. Stay tuned for OC performance!
 