A Closer Look at the GTX 690
The GTX 690 has been a true mystery up to this point. There have been plenty of rumors about it, with some far-fetched claims around the web pointing towards the mythical “GK100” core finally making an appearance. That won’t be happening in the GeForce market anytime soon, if ever, but what we have here is something truly interesting: a pair of fully enabled GK104 cores mounted onto a single PCB.
The specifications of this card are quite straightforward: take a pair of GTX 680 cards, combine them and you have a GTX 690... more or less. Each GK104 uses 1536 CUDA cores alongside 128 Texture Units and 32 ROPs for a total of 3072 cores, 256 TMUs and 64 ROPs.
The only difference between the two setups is the slightly lower speed at which the GTX 690 operates. Instead of a 1006MHz Base Clock and 1058MHz Boost Clock, it makes do with slightly more pedestrian 915MHz / 1019MHz speeds. On paper, this should cause a discrepancy between the performance of two GTX 680s in SLI and the GTX 690. However, there are some additional factors which should allow this new flagship product to nearly match the performance output of two individual GK104-based cards. We’ll go into further detail about this in our full review.
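For readers who want to see the spec math spelled out, the totals and the on-paper clock deficit above can be sketched in a few lines of Python (the variable names are our own, purely for illustration):

```python
# Per-GPU GK104 resources as quoted in the specs
CUDA_CORES_PER_GPU = 1536
TMUS_PER_GPU = 128
ROPS_PER_GPU = 32

# Quoted Base / Boost clocks for a GTX 680 versus the GTX 690
BASE_CLOCK_680_MHZ, BOOST_CLOCK_680_MHZ = 1006, 1058
BASE_CLOCK_690_MHZ, BOOST_CLOCK_690_MHZ = 915, 1019

# Doubling up for the dual-GPU card
total_cores = 2 * CUDA_CORES_PER_GPU  # 3072
total_tmus = 2 * TMUS_PER_GPU         # 256
total_rops = 2 * ROPS_PER_GPU         # 64

# On-paper clock deficit versus two GTX 680s in SLI, in percent
base_deficit = (1 - BASE_CLOCK_690_MHZ / BASE_CLOCK_680_MHZ) * 100
boost_deficit = (1 - BOOST_CLOCK_690_MHZ / BOOST_CLOCK_680_MHZ) * 100

print(f"{total_cores} cores, {total_tmus} TMUs, {total_rops} ROPs")
print(f"Base Clock deficit: {base_deficit:.1f}%, Boost deficit: {boost_deficit:.1f}%")
```

The Boost Clock gap works out to under 4%, which is part of why the GTX 690 can plausibly stay so close to a GTX 680 SLI setup.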
On the memory front, things are absolutely identical to the GTX 680. Each core benefits from 2GB of GDDR5 memory operating at a blazing 6Gbps across a 256-bit interface. The GTX 690 also supports quad SLI.
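Plugging those memory numbers into the standard bandwidth formula (effective data rate times bus width, divided by eight bits per byte) confirms the per-GPU figure matches the GTX 680:

```python
# Per-GPU memory bandwidth from the quoted specs
DATA_RATE_GBPS = 6      # effective per-pin GDDR5 data rate
BUS_WIDTH_BITS = 256    # memory interface width per GPU

bandwidth_gbs = DATA_RATE_GBPS * BUS_WIDTH_BITS / 8
print(f"{bandwidth_gbs:.0f} GB/s per GPU")  # prints "192 GB/s per GPU"
```

Note that because each core has its own dedicated 2GB pool, the two 192 GB/s figures don’t simply add up into one usable total the way the core counts do.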
With slightly lower processor operating frequencies, power consumption is actually quite reasonable at 300W. Considering the HD 6990 had a threshold of 375W and even NVIDIA’s own GTX 590 hit 369W, it looks like the efficient Kepler architecture is once again flexing its muscles. Plus, for those overclockers among you, this card supposedly has a good amount of extra headroom as well.
There’s no denying the fact that the GTX 690 looks like it’s worth a thousand bucks. NVIDIA has designed it so there isn’t one bit of plastic used in its construction. The heatsink shroud is fabricated from cast aluminum and injection molded magnesium alloy, which happens to be a great conductor of heat while acting as a noise insulator to further reduce the fan’s acoustical profile. Speaking of the fan, it is outfitted with a high end bearing design and its airflow is carefully directed in order to optimize cooling efficiency while minimizing noise-causing restrictions.
There are two areas above the fin arrays that may look open in the picture above, but they’re actually covered in a heat-resistant clear polycarbonate. Finally, NVIDIA has also included a laser-cut GeForce GTX logo on the 690’s side that uses LEDs to glow green, and it should be modifiable through software from the likes of EVGA, MSI and ASUS.
By cracking apart the heatsink we see a pair of cores separated by an SLI bridge chip. Since the older NF200 bridge chip is only rated for PCI-E 2.0 speeds, we’re guessing that NVIDIA went with a Gen 3 PCI-E interconnect from PLX or another vendor.
In terms of power supply, the GTX 690 uses a robust 10-phase all digital PWM design for the cores themselves, while the memory will likely receive its own dedicated distribution grid. This is hooked up to two 8-pin PCI-E connectors which, when combined with the PCI-E slot’s available power, should give this card a good amount of overhead for overclocking. All of these components sit on top of a 10-layer PCB which is impregnated with 2 ounces of copper for high efficiency power delivery.
The heatsink assembly is really something to behold and the picture above doesn’t do it any justice. It is made up of two individual vapor chambers topped with a nickel plated fin stack.
NVIDIA has waited two and a half years to get to this point and, for all intents and purposes, it looks like they’ll achieve their goal of releasing the fastest graphics card in the world. Indeed, it looks like the GTX 690 is about to set the bar so high that AMD will have little chance of catching up, even if they do manage to unveil their rumored New Zealand dual GPU product. We can’t wait to show you what this thing is capable of... but that will have to wait for a few more days.