
XFX Radeon HD 5830 1GB GDDR5 Review


SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,840
Location
Montreal
HD5850-17.jpg

XFX Radeon HD 5830 1GB GDDR5 Review




Product Number: HD-583X-ZNFV
Manufacturer’s Product Page: Click Here
Price: Approx. $260 USD / $270 CAD
Warranty: Double Lifetime



It feels almost odd to have a week in which ATI isn’t launching a new card into the market but lo and behold, it seems we have fallen into a spot of relative calm. Let’s call it the calm before the storm since by the end of next week the whole graphics card market could be shaken up by the release of NVIDIA’s supposed juggernaut: the GF100. Until that time however, ATI’s products still reign supreme in nearly every price category.

Back when it was first released, the HD 5830 1GB wasn’t exactly the most well-loved product we have ever reviewed here on Hardware Canucks. To be honest with you, it was universally lambasted for being far too expensive for the performance it delivered. There were other teething problems with the HD 5830 such as high power consumption and a gargantuan size that was more akin to the HD 5870 than to other mid-range cards. We held out hope at the time that prices would eventually level out as the excitement surrounding the launch died down, and to a certain extent this has actually happened. Over the last week or so we have seen some sales bringing the HD 5830’s price down to around the $230 USD mark, which actually represents a decent value for your money.

Before the HD 5830 vaulted into the public’s perception, there was a vast chasm in price and performance between the $160 HD 5770 and the $320 HD 5850. ATI needed to offer something to potential customers who saw the HD 5770 as unsuitable for their gaming needs but couldn’t justify spending more than $300 on a higher end card. Simply put, the HD 5830 is supposed to give those people a safe middle ground.

In this review we will be looking at the XFX version of the HD 5830 1GB GDDR5. Basically, no two HD 5830 cards from different manufacturers are alike, and XFX has decided to go with a unique cooling solution even though their card sticks to stock clock speeds. In addition to having one of the more interestingly designed cards currently on the market, XFX also boasts what can arguably be called the best warranty coverage among ATI’s board partners. They offer what is called a Double Lifetime Warranty that protects not only the original purchaser of the card but also anyone who buys it second hand. Naturally, this adds quite a bit of value to XFX’s video cards and is something you can’t get when purchasing from any other company.


XFX-HD5830-15.jpg
 

A Closer Look at the HD 5830’s Market Placement



HD5830-82.jpg

As we mentioned in the introduction, the HD 5830 is tailor-made to slot into the substantial price gap between the HD 5850 and the HD 5770. It is also destined to replace the outgoing HD 4890 1GB. Unlike the HD 5770, which used a smaller 1.04 billion transistor derivative of the HD 5850’s and HD 5870’s 2.15 billion transistor core, this new card uses that same enthusiast-grade Cypress core, redubbed “Cypress LE”. However, as you can see from the specifications above, the HD 5830 uses a highly cut down version of ATI’s high end architecture. This is basically done by taking cores that failed the binning process for HD 5850 and HD 5870 cards and shaving off some SPs, ROPs and texture units in order to make a product that will fit into a more mainstream price segment.

HD5830-17.jpg

The result of these die cuts is a video card that does bridge the gap between high end and mainstream products, but at the cost of a significant amount of rendering horsepower. Many people were hoping that this card would have something along the lines of 1280 Stream processors enabled but ATI seems to have gone a bit wild here and ended up cutting 320 for a total of 1120 SPs. The ROPs and texture units also went under the knife, with the ROPs in particular being cut down to half of their count in the HD 5850. This can and will have a significant impact on overall rendering performance despite the fact that the HD 5830 has its core clocked 75MHz higher than its bigger brother. Memory speeds stay the same however.
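The trade-off between the higher core clock and the missing units can be roughly quantified. Here is a back-of-envelope sketch, assuming the commonly quoted reference figures (1120 SPs, 16 ROPs and an 800MHz core for the HD 5830 versus 1440 SPs, 32 ROPs and 725MHz for the HD 5850) — treat these numbers as assumptions from the spec table above:

```python
# Back-of-envelope theoretical throughput, assuming reference specs:
#   HD 5830: 1120 SPs, 16 ROPs, 800MHz core
#   HD 5850: 1440 SPs, 32 ROPs, 725MHz core
# Each SP performs one multiply-add (2 FLOPs) per clock;
# each ROP writes one pixel per clock.

def shader_gflops(sps: int, core_mhz: int) -> float:
    return sps * 2 * core_mhz / 1000.0

def pixel_fillrate_gpix(rops: int, core_mhz: int) -> float:
    return rops * core_mhz / 1000.0

print(shader_gflops(1120, 800))      # 1792.0 GFLOPS (HD 5830)
print(shader_gflops(1440, 725))      # 2088.0 GFLOPS (HD 5850)
print(pixel_fillrate_gpix(16, 800))  # 12.8 Gpixels/s (HD 5830)
print(pixel_fillrate_gpix(32, 725))  # 23.2 Gpixels/s (HD 5850)
```

Under those assumptions the shader deficit versus the HD 5850 works out to roughly 14%, while the pixel fill rate deficit is closer to 45% — which is why the halved ROP count worries us far more than the clock bump reassures us.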

HD5830-18.jpg

When compared to the outgoing HD 4890 on the other hand, this new card looks to hold an edge even though it has been theorized that the HD 5000-series’ SPs don’t work as well as those on the HD 4000 series. With more texture units and the same number of ROPs yet slightly lower core clock speeds, it seems like the HD 5830 1GB should hold a slight edge in some situations. Or at least we hope…

All of this cutting results in a card that is priced right at the midpoint between the HD 5850 and HD 5770 while targeting the now-discontinued GTX 260 216 and GTX 275 cards in terms of overall performance. Will this approach appeal to this card’s target audience? Only time will tell.
 

Focusing on DX11



It has been a hair under three years since the release of Windows Vista and with it the DirectX 10 API. In that amount of time, a mere 33 DX10 games were released. That isn’t exactly a resounding success considering the hundreds of titles released in that same time. Let’s hope DX11 does a bit better than that.

HD5870-109.jpg

DX11 is focused on taking the lessons learned from the somewhat inefficient DX10 and shaping them into a much more efficient API which will demand less system resources while being easier to develop for. In addition to the usual 3D acceleration, it will also be used to speed up other applications which in the past have not been associated with the DirectX runtime. This may be a tall order but with the features we will be discussing here, developers have already started using DX11 to expand the PC gaming experience. It is an integral component in Windows 7 and according to Microsoft, will also be adopted into Windows Vista through a software update.

Let’s scratch the surface of what DX11 can bring to the table.

HD5870-110.jpg

Unlike past DirectX versions, DX11 endeavours to move past the purely graphics-based uses of the API and push it towards being the lynchpin of an entire processing ecosystem. This all begins with the power which DirectX Compute will bring into the fold. Not only can it increase the efficiency of physics processing and in-game NPC intelligence within games by transferring those operations to the GPU but it can also be used to accelerate non-3D applications.

HD5870-111.jpg


HD5870-104.jpg

Through the use of Compute Shader programs in Shader Model 5.0, developers are able to use additional graphical features such as order independent transparency, ray tracing, and advanced post-processing effects. This should add a new depth of realism to tomorrow’s games and as mentioned before, also allow for programs requiring parallel processing to be accelerated on the GPU.

HD5870-101.jpg

For the majority of you reading this review, it is the advances in graphics processing and quality that will interest you the most. As games move slowly towards photo-realistic rendering quality, new technologies must be developed in order to improve efficiency while adding new effects.

HD5870-105.jpg

Some of the technologies that ATI is championing are DX11’s new Depth of Field, OIT (or Order Independent Transparency) and Detail Tessellation. While the pictures above do a good job of showing you how each of these works, it is tessellation which ATI seems most excited about. They have been including hardware tessellation units in their GPUs for years now and finally with the dawn of DX11 will these units be finally put to their full use. OIT on the other hand allows for true transparency to be added to an object in a way that will be more efficient resource-wise than the standard alpha blending method currently used.

HD5870-102.jpg

Let’s talk about DX11 games. As you would expect, due to the ease of programming for this new API and the advanced tools it gives developers, many studios have been quite vocal in their support. Even though some of the titles listed above may not be high on your list of must-have games, A-list titles like Aliens vs. Predator from Rebellion and DiRT 2 are sure to get people interested. What we like to see is that at least three DX11 games will be available before the Christmas buying season, while BattleForge is already available and will have DX11 support added through a patch.

Another exciting addition to the list is EA DICE’s FrostBite 2 Engine which will power upcoming Battlefield games. Considering the popularity of this series, the inclusion of DX11 should open up this API to a huge market.

HD5870-103.jpg
 

OpenCL: The Next Big Thing?



HD5870-115.jpg

As consumers, we have all heard of the inroads GPUs have been making towards offering stunning performance in compute-intensive applications. There have been attempts to harness this power by engines such as NVIDIA’s Compute Unified Device Architecture (CUDA) and ATI’s Stream SDK (which in v2.0 supports OpenCL).

HD5870-113.jpg

“Build it and they will come,” says the old mantra, but industry adoption of CUDA and Stream was anything but quick since there were two standards being pushed for the same market. CUDA in particular is having a hard time of it since it is vendor-specific, without hardware support from any other vendor. The industry needed a language that was universal and available across multiple platforms. That’s where OpenCL (Open Computing Language) along with DirectX Compute come into play. OpenCL is a completely open standard managed by a non-profit organization called the Khronos Group, which also has control over OpenGL and OpenAL.

HD5870-114.jpg

At its most basic level, OpenCL is able to be executed across multiple mediums such as GPUs, CPUs and other types of processors. This makes it possible to prioritize workloads to the processor that will handle them most efficiently. For example, a GPU is extremely good at crunching through data-heavy parallel workloads while an x86 CPU is much more efficient at serial and task-specific workloads. This also allows developers to write their programs for heterogeneous platforms instead of making them specific to one type of processor.

HD5870-116.jpg

So what does this mean for gamers? First of all, AMD has teamed up with Bullet and PixeLux in order to achieve more realistic environments for players. Bullet Physics is an open-source physics engine with an ever-expanding library for soft body dynamics, 3D collision detection and other calculations. Meanwhile, PixeLux’s DMM (Digital Molecular Matter) engine uses the Finite Element Analysis method of calculating physics within a game. In past applications, it has been used to calculate actions which have an impact on the game’s environment such as tumbling rubble or debris movement.

HD5870-117.jpg

With Stream moving to OpenCL, ATI is truly moving towards an open platform for developers which they are hoping will lead to broader developer and market adoption than the competition’s solutions. At this point it looks like we will soon see ATI’s GPUs accelerating engines from Havok, PixeLux and Bullet through the use of OpenCL. Considering these are three of the most popular physics engines on the market, ATI is well placed to make PhysX a thing of the past.
 

ATI’s Eyefinity Technology



2404aba41c60e982.jpg

The term Surround Gaming may not mean much to many of you who are reading this article but with the advent of ATI’s new Eyefinity technology, now is a good time to educate yourself. Basically, Eyefinity will give users the ability to use multiple monitors all running from the same graphics card. In the past, simple dual monitor setups have been used by many graphics, CAD or other industry professionals in order to increase their productivity but gaming on more than one monitor was always a bit of a clunky affair. Granted, some products like Matrox’s TripleHead2Go were able to move multi monitor setups into the public’s perception but there were always limitations (resolution and otherwise) associated with them. ATI is aiming to make the implementation of two or even more monitors as seamless as possible within games and productivity environments while offering the ability to use extreme resolutions.

2404aba41c633257.jpg

The price of two or even three new monitors may be a bit daunting at first for many of you, but good 20” and even 22” LCDs have come down in price to the point where some are retailing below the $200 mark. ATI figures that less than $600 for three monitors will allow plenty of people to make the jump into a true surround gaming setup. Indeed, with three or even six monitors, the level of immersion could be out of this world.

2404aba41c626f4b.jpg

The reason that many in the professional field are familiar with multi monitor setups comes down to one simple matter: they increase productivity exponentially. Imagine watching a dozen stocks without having to minimize windows all the time, or using Photoshop on one screen while watching a sports broadcast on another and keeping Photoshop’s tooltips on a third. The possibilities are virtually limitless if it is implemented properly.

2404aba41c634766.jpg

When it comes to a purely gaming perspective, the thought of a massive view of the battlefield or the ability to see additional enemies in your peripheral vision is enough to make most gamers go weak in the knees. Unfortunately, the additional monitors will naturally mean decreased performance considering the massive amount of real-estate that would need rendering. This will mean tradeoffs may have to be made in terms of image quality if you want to use Eyefinity.
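The scale of that performance hit is easy to estimate. A quick sketch, assuming three 1920x1080 panels in a 3x1 group (panel resolution is our assumption; bezel compensation is ignored):

```python
# Pixels the GPU must shade per frame: one 1920x1080 panel versus a
# 3x1 Eyefinity group of the same panels (bezel compensation ignored).
single = 1920 * 1080      # 2,073,600 pixels on one screen
triple = 3 * single       # 6,220,800 pixels across the group

print(single, triple, triple / single)  # the group is 3.0x the work
```

Roughly triple the per-frame shading and fill work, which is why the image quality settings usually have to come down before a single mid-range card can keep frame rates playable across three screens.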

HD5870-15.jpg

According to ATI, all of the new HD 5800-series graphics cards will have the ability to run up to three monitors simultaneously. This is done by having a pair of DVI connectors as well as a DisplayPort and HDMI connector located on the back of the card. It should be noted that ATI will be releasing a special Eyefinity version of the HD 5870 in the coming months which features six DisplayPort connectors for those of you who want to drive six monitors from a single card.

2404aba41c635d0d.jpg

This technology is all made possible through the use of DisplayPort, but this also presents a bit of a limitation. Above we can see a number of 3-screen output combinations which the current HD 5800-series supports, and one thing is constant: you will need at least one monitor which supports DisplayPort. Unfortunately, at this time DP-supporting monitors tend to carry a price premium over standard screens, which will increase the overall cost of an Eyefinity setup. Luckily, the other two monitors can use either DVI or a combination of DVI and HDMI for connectivity.
 

HD Audio and Video



HD5450-20.jpg
HD5450-21.jpg

One of the main drawing points of the lower-end cards in the HD 5000 series lineup is the fact that they are literally unmatched when it comes to HTPC use. Granted, the GT 210, 220 and 240 cards from NVIDIA are the first cards from the green side of the pond to receive native audio processing without having to resort to a clunky S/PDIF cable, but their HD audio compatibility is limited to non-PAP (Protected Audio Path) implementations. Meanwhile, the HD 5000 series not only features native HDMI audio with compatibility for AC3, 8-channel LPCM and DTS among others, but also introduces PAP support for bitstream output of Dolby TrueHD, DTS-HD Master Audio, AAC and Dolby AC-3. This allows high-end audio from 7.1 sources to be passed unhindered from your computer to your receiver and is a huge step up from what the competition offers.

As for HD video, you get everything that you would expect from an ATI card: compatibility with HDMI 1.3 formats, an option for a DisplayPort connector and full support for ATI’s UVD 2.2.


Enhanced DVD Upscaling & Dynamic Contrast

4670-17.JPG

While there are plenty of us who will run HD signals through the HD 5000-series of cards, whether we like it or not we will still be outputting lower definition signals to our wonderful new HDTVs every now and then. In these cases, a standard 480i picture will look absolutely horrible if it is scaled up to fit on a high definition 1080p TV, so ATI provides the Avivo HD upscaling option in their drivers. What this does is take the low resolution signal and “clean it up”, so to speak, so it looks better when displayed on a high definition screen.
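To give a rough sense of how much picture information the scaler has to synthesize, here is a quick sketch assuming a 720x480 DVD frame stretched to a 1920x1080 panel (the source resolution is our assumption; anamorphic handling and overscan are ignored):

```python
# How much of a 1080p frame must be interpolated when upscaling a
# 720x480 DVD source (assumed resolutions; anamorphic handling ignored).
src = 720 * 480           # 345,600 pixels actually in the source
dst = 1920 * 1080         # 2,073,600 pixels on the display

print(dst / src)          # 6.0 -- five of every six output pixels are invented
```

With the scaler inventing the vast majority of the output pixels, the quality of its interpolation and post-processing matters enormously, which is exactly where options like Avivo HD earn their keep.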

4670-18.JPG

Another interesting feature ATI has packed into their drivers is the Dynamic Contrast Adjustment. Personally, I more often than not adjust the contrast manually based on the application since the values from one game or movie to the next can vary a lot. ATI has taken the guesswork and thrown it out the window by providing a post-processing algorithm which will automatically (and smoothly) adjust the contrast ratio in real time.

While there are other benefits of using the 5000-series for audio and video pass-through for your home theater, we will stop here and get on with the rest of this review.
 

Packaging and Accessories



XFX-HD5830-1.jpg
XFX-HD5830-2.jpg

The packaging for this HD 5830 is typical of most other XFX products with an industrial look and an oversized “X” imprinted in the background. Naturally, XFX also uses this space to advertise the fact that they include the new Aliens versus Predator game along with this card.

XFX-HD5830-3.jpg
XFX-HD5830-4.jpg

Unlike some manufacturers, XFX packages their cards with an exterior sleeve that is nothing more than a cover for another box. Within this box we have a small compartment with the accessories and instruction manuals which sits atop the card itself. For protection, this package makes do with cardboard liners on all sides of the graphics card in order to protect it from blunt force trauma as well as a standard anti-static bag.

XFX-HD5830-5.jpg

XFX hasn’t packed an amazing array of extras in with their HD 5830, but a convenient door knocker listing warranty information and a coupon code for a free download of Aliens versus Predator from Steam does go a long way to differentiate this card from the competition. As with most other cards, Molex to 6-pin power cables, a short Crossfire bridge and a DVI to VGA dongle are also included.
 

A Closer Look at the XFX HD 5830 1GB



XFX-HD5830-6.jpg
XFX-HD5830-7.jpg

The design of XFX’s HD 5830 is unique to say the least when compared to the original HD 5830 we reviewed which featured a full-coverage heatsink shroud. Instead of going the full coverage route, XFX decided to go with the same heatsink they use on one of their HD 5770 cards and graft it onto this higher-end product. It unfortunately doesn’t exhaust heat outside of your case but we will see how well this setup performs a bit later in the review. The black PCB and contrasting red accents look great in our opinion.


The shroud is roughly octagonal in design with graphics that carry the same theme as the card’s package, and it covers an extensive but relatively simple heatsink. There is a thick copper contact plate which touches the GPU and passes off heat to two extremely large heatpipes that are intersected by an aluminum fin array. We have seen this design quite a few times in the past and while it looks robust, higher end GPUs tend to tax it when the heat is turned up.

It should also be noted that the memory modules are left bare without their own heatsinks and the pictures show a bowed PCB that could be an indication of overly tightened screws.

XFX-HD5830-11.jpg
XFX-HD5830-12.jpg

XFX uses a simple 4-phase power layout with open-face chokes for this card, which may seem shockingly bare for those of you who are used to the huge 8-phase and larger arrays on higher end cards. However, the HD 5830 is a low-powered product and as such it doesn’t need an intricate PWM design; even so, the VRMs are covered with a perfectly adequate heatsink.

XFX-HD5830-13.jpg

The backplate houses exactly what we would expect from an Eyefinity-enabled card with single DisplayPort and HDMI connectors as well as a pair of DVI outputs. There is also a small area for heat exhaust but that will go mostly unused with this card.

XFX-HD5830-14.jpg

As with all other HD 5830 cards, the length of the XFX version is a bit shocking when compared to the HD 5770 and even the much more powerful HD 5850. At around 10 ¼” in length plus another inch for the rear-mounted PCI-E connectors, this is one hell of a long card considering it is targeted at a mid-range price bracket.
 

Test System & Setup


Processor: Intel Core i7 920 (ES) @ 4.0GHz (Turbo Mode Enabled)
Memory: Corsair 3x2GB Dominator DDR3 1600MHz
Motherboard: Gigabyte EX58-UD5
Cooling: CoolIT Boreas mTEC + Scythe Fan Controller (Off for Power Consumption tests)
Disk Drive: Pioneer DVD Writer
Hard Drive: Western Digital Caviar Black 640GB
Power Supply: Corsair HX1000W
Monitor: Samsung 305T 30” widescreen LCD
OS: Windows 7 Ultimate N x64 SP1


Graphics Cards:

Sapphire HD 5830 1GB
Sapphire HD 5870 1GB (Stock)
ATI HD 4890 1GB (Reference)
Sapphire HD 5850 1GB (Stock)
EVGA GTX 285 (Stock)
GTX 275 896MB (Stock)
GTX 295 (Stock)
EVGA GTX 260 216 (Stock)


Drivers:

ATI 10.3 Beta
NVIDIA 195.62 WHQL


Applications Used:

Batman: Arkham Asylum
Borderlands
Dawn of War II
DiRT 2
Dragon Age: Origins
Far Cry 2
Left 4 Dead 2


*Notes:

- All games tested have been patched to their latest version

- The OS has had all the latest hotfixes and updates installed

- All scores you see are the averages after 2 benchmark runs

- All game-specific methodologies are explained above the graphs for each game

- All IQ settings were adjusted in-game
 


Batman: Arkham Asylum (DX9)


Even though Batman: AA has its own in-game benchmarking tool, we found that its results are absolutely not representative of real-world gaming performance. As such, we used FRAPS to record a run-through of the first combat challenge, which is unlocked after completing the first of The Riddler’s tasks. It includes close-in combat with up to 8 enemies as well as ranged combat. In addition, we made sure to set the smoothframerate line in the game’s config file to “false”. No AA was used as the game engine does not natively support it.


1680 x 1050

XFX-HD5830-30.jpg


1920 x 1200

XFX-HD5830-31.jpg


2560 x 1600

XFX-HD5830-32.jpg
 