|by Michael "SKYMTL" Hoenig | January 28, 2008|
ATI Strikes Back
So this is how the new ATI lineup looks with the addition of the HD3870 X2, not including the other mid- and low-end cards which went through a soft release a few days ago. Personally, I think it is amazing to see how far we have progressed in the last few months in terms of the price/performance ratio of both ATI and Nvidia GPUs, and in the whole scheme of things, I think this card is priced perfectly. Instead of going out to buy a pair of HD3870 cards, people will be looking very seriously at the HD3870 X2 since it seems to offer all the benefits of two cards in one simple package. Unfortunately, as stated in the intro, it looks like upon release the price for this card here in Canada will end up being a bit higher than that of a pair of HD3870 cards.
The concept of the HD3870 X2 centers around a pair of ATI’s successful HD3870 512MB cards connected by what amounts to an internal Crossfire bridge chip/switch which handles communication between the two GPUs. Even though shoehorning a pair of GPU cores onto one PCB is not a new concept (remember 3Dfx?), ATI is the first company to do it in recent GPU history. Don’t confuse this implementation with Nvidia’s late 7950GX2, which was basically two cards on two separate PCBs communicating over an SLI bridge; ATI has gone a different route and installed both GPU cores and their associated bridge chip on a single PCB.
This card is a beast. Where the HD3870 was a scalpel that quietly cut away at the mid-priced market, the HD3870 X2 announces itself like a swift kick in the nuts. It is the length of an 8800GTX, it has a massive turbine of a fan, and it holds 1GB of GDDR3 memory operating at 1800MHz. Interestingly, both cores on our engineering sample operate at a higher speed (825MHz) than a stock HD3870, whose core runs at 777MHz. Reports indicate that these cards will be available at different core speeds ranging from 777MHz all the way past 825MHz.
Some of you may also be wondering why ATI decided to go with slower-clocked GDDR3 memory for this card instead of the GDDR4 used on the HD3870 512MB cards. While I would have liked to see GDDR4 for both its bandwidth and power-saving benefits, it was not to be. My guess is that GDDR3 was chosen to lower the overall cost of the card rather than using the more expensive GDDR4.
When push comes to shove, the HD3870 X2 is still two HD3870 cards working in Crossfire, and all of the inherent problems of a multi-GPU environment may still pop up. I hope the ATI driver team has been burning the midnight oil to iron out all of the kinks associated with Crossfire gaming. Let’s be honest: one powerful single-chip card is always better than two lower-end cards running in Crossfire or SLI, but if the drivers are up to snuff we should actually see an improvement over separate cards.
A Quick and Dirty Rundown of Quad Crossfire
Some of you may hear the term Quad Crossfire or Crossfire X and shiver in dread from the memories of Quad SLI. Somehow, Nvidia never really got the concept of running a pair of 7950GX2 cards off the ground, but ATI is determined to make it work with the HD3870 X2. They have been working long and hard to get the drivers working properly, but we will not be able to test that claim in this review.
Quad Crossfire is supposed to work on “select” motherboards, which means it should work on boards with AMD’s 790FX chipset or any other Crossfire-compatible motherboard with dual x16 PCI-E slots. That means people with motherboards sporting Intel’s X38 chipset, or those planning on buying one of the much-delayed X48 boards, should be able to run a quad Crossfire configuration. On the other hand, it looks like you won’t be able to run two of these cards on boards sporting the P35 chipset. You may notice that I say “should”. This is because I haven’t tried any of these configurations myself, so until I see people in the community getting it to work, I will take ATI’s claims with a grain of salt.