
Palit Radeon HD 4870 X2 2GB Video Card Review

Temperatures & Acoustics / Power Consumption

Temperature Testing


For this heat test we went about things a bit differently, since there is currently no program available which monitors and logs temperatures for the new ATI cards. So, for this test we loaded the card with 3DMark’s Batch Size Test at the highest triangle count to put a constant, high load on both GPUs for one hour while keeping two instances of GPU-Z’s Sensors tab open. Right before the one-hour test finished, we quickly Alt + Tabbed to the desktop and noted the peak temperature of each core. While this may not be the most scientific method, we are reasonably sure of its accuracy, so take it as you will.
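For anyone who wants to repeat this at home, GPU-Z can also log its sensor readings to a CSV file, which skips the Alt + Tab dance entirely. The sketch below pulls the peak of every temperature column out of such a log; note that the column names in the example are assumptions for illustration, not GPU-Z's exact labels, so adjust them to match your own log.

```python
import csv
import io

def peak_temps(log_text):
    """Return the peak value of every temperature column found in a
    CSV sensor log (any column whose header contains 'Temperature')."""
    reader = csv.DictReader(io.StringIO(log_text))
    peaks = {}
    for row in reader:
        for name, value in row.items():
            if name and "Temperature" in name:
                try:
                    reading = float(value)
                except (TypeError, ValueError):
                    continue  # skip blank or malformed cells
                peaks[name] = max(peaks.get(name, reading), reading)
    return peaks
```

Feed it the full one-hour log and it reports the hottest reading each core ever hit, which is exactly the number we were trying to catch by eye.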

HD4870X2-65.jpg

Idle temperatures on this card are actually very good for a dual-chip card, but it is very easy to tell which of the cores is further from the fan. While we expected to see some extremely high temperatures, it was surprising that the hotter of the two cores never broke the 88C barrier after a whole hour of high load. The cooler of the two cores was well within the norm due to its proximity to the torrents of cool air being moved by the fan. Overall, this isn’t too bad a result, but we can see what effect the PCB layout and heatsink design have on this type of two-core card: one core will always run significantly hotter than the other.

Something else that should be brought up is that this card gets very, VERY hot on every surface, so make sure you let it cool down for at least 10 minutes before even thinking of touching it. It also outputs waves of heat, so be prepared to sweat like a pig if your room isn’t well ventilated. Indeed, by holding a temperature probe 2” away from the exhaust grille we measured a whopping 52C. This is great for those Canadian winters but it seriously sucks in the dead heat of summer.


Acoustical Properties


If you have sensitive hearing you may want to look the other way instead of buying this card. While it is not as loud as the GTX 280 cards we have had in our possession, the HD4870 X2 does make its presence known unless your game audio is turned up past medium. It isn’t an annoying sound but rather a loud “whoosh” accompanied by a blast of hot air. Since the air within the heatsink needs to move over two separate fin assemblies while maintaining its speed, the single fan naturally has to spin up to some pretty high RPMs. If you have a case with a perforated side panel (the Gigabyte Aurora 570 I use is a perfect example), you may want to think about blocking it off if you insist on absolute silence.

This kind of performance always has its tradeoffs, and as has always been the case with high-powered cards, that tradeoff shows up in the acoustical footprint.


Power Consumption


For this test we hooked up our power supply to a UPM power meter which logs the power consumption of the whole system twice every second. In order to stress the GPU as much as possible we once again used the Batch Size Test in 3DMark06, letting it run for 30 minutes to determine the peak load power consumption and then letting the card sit at a stable Windows desktop for 30 minutes to determine the peak idle power consumption.

Please note that after extensive testing, we have found that simply plugging a power meter into a wall outlet or UPS will NOT give you accurate power consumption numbers due to slight changes in the input voltage. Thus we use a Tripp-Lite 1800W line conditioner between the 120V outlet and the power meter.
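Once the meter's log is pulled off, finding the peak draw is straightforward. The sketch below assumes a plain list of wattage samples taken twice per second (the UPM's actual log format will differ) and uses a short rolling average so that a single noisy sample from the meter doesn't get reported as the peak.

```python
def peak_watts(samples, window=4):
    """Peak system draw from a power-meter log sampled twice per second.
    A rolling average over `window` samples (4 samples = 2 seconds)
    filters out one-off spikes before taking the maximum."""
    if len(samples) < window:
        raise ValueError("need at least one full window of samples")
    best = 0.0
    for i in range(len(samples) - window + 1):
        avg = sum(samples[i:i + window]) / window
        best = max(best, avg)
    return best
```

With a steady 300W log this simply reports 300W, while a lone 400W blip in the middle of a 100W idle trace gets averaged down instead of being mistaken for the true peak.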


HD4870X2-61.jpg

The first thing that is sure to catch your attention is that the idle power consumption of the HD4870 X2 is dramatically lower than that of the Crossfire setups, to the point where it hovers below that of dual HD4850 cards. It seems ATI has finally got PowerPlay working in this card at least, so hopefully they will get it working across the rest of the lineup as well.

What completely stunned me was that peak full-load power consumption actually decreased a bit compared to a pair of HD4870 cards, even though this card has 1GB more memory in total. This left me scratching my head, but it is possible that the single-PCB layout is more efficient than two separate cards on two separate PCBs connected by a Crossfire bridge. Thinking something was amiss, we ran the test twice with the same result, so it seems our assumption of a more efficient design could very well be correct.

Since these power consumption figures leave the system memory and the processor largely untouched, real-world draw will be higher; we would recommend that you use a good 700W or higher power supply with an 8-pin PCI-E connector if you are planning on using this card.
 

DFI Motherboard Owners….Take Note


Unfortunately, it has come to our attention that the fix listed below is not working for some people, while others have claimed it worked. Thus, until we either hear from DFI regarding this issue or the community is given a fix, we recommend you AVOID DFI X38 and X48 motherboards if you intend to buy this card.

You may not believe it, but we had much, much more planned for this review; unfortunately, our DFI motherboard conspired to haunt us again and again. Aside from the fact that our LanParty DK X38 T2R mysteriously gave up on our very slight memory overclock, it refused to cooperate when we installed the HD4870 X2 into it. An ASUS X48 P5E Deluxe worked without a hitch, as did a Gigabyte EP35 and an ASUS P5Q Pro, so we knew it wasn’t the chipset. We had also read of the same thing happening to owners of the HD3870 X2 when pairing that card with certain DFI boards.

Basically, what happened was that we installed the card into the board and the BIOS splash screen would not display, even though the LED indicators told us everything was booting fine. There just wasn’t any video signal coming through the card, so we initially deduced that there was an issue with the X2’s display output. Well, we popped in the second HD4870 X2 card and, lo and behold, had the same issue. Somewhere within the ensuing flurry of BIOS flashes, the motherboard decided enough was enough and finally died, leaving us without a board and with a deadline approaching.

To cut a long story short, we brought in another DK X38 T2R along with a beastly LanParty LT X48 T2R and found that both new boards had the same video display issue, so we immediately knew it was the board itself and not the card. After searching through the DFI beta site, we found a BIOS that completely resolved the incompatibility between the motherboard and the HD4870 X2.

At this time we have only experienced problems with DFI’s X48 and X38 boards. The P35 and P45 series seem to be fine but if you are having problems with either of these chipsets, please post in our HD4870 X2 comment thread and we will try to sort things out.

Below we have listed the motherboards which may be affected by this problem, along with their corresponding beta BIOS files. You should use the BIOS file we indicate or the latest one available.

LANPARTY DK X38-T2R: X38D725

LANPARTY DK X38-T2RB: X38D7251

LANPARTY DK X48-T2RS: X48DD725

LANPARTY LT X38-T2R: X38AD725

LANPARTY LT X48-T2R: X48AD725

LANPARTY LT X48-T3RS: X48GD725

LANPARTY UT X48-T2R: X48CD725

LANPARTY UT X48-T3RS: X48BD725

If your motherboard is not listed here, please visit the main page of the DFI beta BIOS site and look for your motherboard in the drop-down menu.

Use beta BIOSes at your own risk…they are betas for a reason, folks.

The main problem we can see with this is that if you are building a new computer and do not have access to another graphics card, you will be in a fair bit of trouble, since you will have no way to update the BIOS if you can’t get a video signal. So, if you are planning on buying a DFI board and the HD4870 X2, we recommend you pick up a cheap PCI-E graphics card just in case you run into the display problem that seems to pop up on these boards. On the other hand, if you are upgrading to the HD4870 X2 on an existing system, update your BIOS before installing your new card.
 

Conclusion


Before I go on with this conclusion, let’s get one thing straight: the HD4870 X2 2GB is the absolute, undisputed king of the single-card hill and probably will be for some time. It steamrolled over every single one of Nvidia’s offerings like a runaway freight train and made the GTX 280 look like nothing more than an also-ran in the grand scheme of things. But just remember: this isn’t a card for those of you with 19” or 20” LCD monitors. It is for the privileged few with massive monitors of the beastly 24” and 30” variety, along with those of you who want e-penises the size of the Eiffel Tower. Does it perform up to expectations? Hell yes, and then some.

When it comes to overall performance, there is nothing like watching this dual-GPU monster rip through the framerates, and seeing scores like these will make even the most jaded hardware reviewer giggle like a giddy schoolgirl. I could go on and on about how well this card did in our tests, but the benchmarks shown here today more than speak for themselves. There is nothing like having a card that costs LESS than an HD4870 Crossfire setup but in most cases performs BETTER at higher resolutions…and with beta drivers at that. That leads us to another positive point about the HD4870 X2: its price. For once we see a performance-crown holder which does not command an astronomical admission fee. It also provides the perfect escape for those of you who don’t have a pair of high-powered PCI-E 2.0 slots and would rather stick to a single-card solution without breaking the bank. Sorry folks, no matter how impartial we try to sound, there is no hiding the fact that we love this card and what it can do for the right system.

Notice I said this card needs the right system? I mean it. The HD4870 X2 does not deserve to be put in a non-overclocked system with an under-24” monitor. If you are playing at a slightly lower resolution, there is a very real possibility of this card being severely CPU-limited if you don’t overclock the balls off your processor. Just remember this when you are searching the internet for more information about this card; in many cases, a stock processor can and will hamstring ATI’s new wunderkind. Heck, even our overclocked QX9770 at nearly 4GHz showed it wasn’t up to the task of keeping pace with this card in a game or three.

As with all good things, there are a few stumbling points with this card, and many of them spring from either the unpredictability and general characteristics of a dual-GPU setup or the beta drivers. The first thing we should point out is that this card outputs a load of heat which will be dumped right back into your immediate environment. While playing your favorite game in your skivvies may not be an issue for many of you, the noise the HD4870 X2’s fan puts out in order to keep temperatures under control may have some of you cringing. It is nothing a good sound system won’t drown out, but if you put value in a quiet system, this may not be the right card for you…or at least not until someone releases an aftermarket cooler for it.

Other than that, the only area where we see the need for improvement is the driver department, since at times this card is pretty far ahead of an HD4870 Crossfire setup but at others it falls behind by a fair margin. We usually see this with beta drivers on new ATI hardware, so we wouldn’t be surprised to see this fixed by the time the WHQL driver comes out. There also seem to be scant few games out there that can take advantage of the 2GB of memory at the settings we tested, but that could all change as more games come out with high texture-memory requirements. I could also mention the lack of proper warranty support from the majority of ATI partners, but I have beaten that dead horse past death’s gate and back again.

Just remember, when push comes to shove this IS a dual-GPU solution, and it carries all the limitations of Crossfire along with it. So, while CF support in nearly every game seems good right now, there may be games in the future which will not have the proper profiles when they are first released.

Final mention has to be given to the power consumption of the Palit HD4870 X2, since it is one area where we have to commend ATI. It seems they have finally gone ahead and released a driver that properly implements the PowerPlay features which have been promised for some time now. While this may be of little solace to those of you looking at the overall power consumption and rolling your eyes, performance per watt is extremely high nonetheless. That being said, hopefully PowerPlay will be enabled for the whole ATI family in the next WHQL driver update.

All in all, the Palit Radeon HD4870 X2 is a heck of a card which combines bleeding-edge performance with a surprising amount of performance per dollar. It is all bundled up into a product which once again puts ATI’s name at the very top of the extreme performance category. Even though it is not without its minor imperfections, it embodies the new spirit of ATI, which is once again on the road to offering enthusiasts what they want most: performance like no other.



Pros:

- All the power of HD4870 Crossfire and then some on a single card
- Amazing high resolution performance
- Surprising value @ approx. $550
- Good idle power consumption


Cons:

- Acoustics at high load may be too much for some
- 2-year warranty


240463923aca1f6b.jpg


Thanks to Palit and ATI / AMD for making this review possible

http://www.hardwarecanucks.com/forum/video-cards/9282-palit-hd4870-x2-review-comment-thread.html
 