NVIDIA GTX 660 2GB Review

Author: SKYMTL
Date: September 12, 2012
Product Name: GTX 660 2GB

Under the GTX 660’s Heatsink

Note: we used EVGA’s GTX 660 SC for this section since it uses a reference PCB design. However, the heatsink on the reference card will be different from the one you see below.

Removing the heatsink from this card is extremely easy, which bodes well for anyone installing an aftermarket cooler. Just be aware that there are a few small overhangs that tend to snag the PCB so don’t force anything or you’ll likely crack the plastic casing.

Instead of a somewhat cheap looking heatsink, EVGA’s card uses a large cast aluminum assembly with a copper contact plate and secondary cooling areas for the VRM modules and GDDR5 memory. This is all tied to a dense fin array directly fed by a copper heatpipe, making it much more substantial than anything we’ve seen on reference GTX 660 Ti and GTX 670 cards. EVGA tells us this is a custom design intended to distribute heat evenly while keeping all of the components at optimal temperatures.

The component layout on the GTX 660 is different from previous cards as well: its 4+2 phase PWM is pushed to the rearmost area of the PCB, while the memory modules are loosely spaced around the core’s periphery. Speaking of the GK106 core, at 214mm² it is substantially smaller than the 294mm² of NVIDIA’s GK104, and it does without the large IHS of previous generations.
