
Sapphire HD 5670 1GB GDDR5 Review


SKYMTL

HardwareCanuck Review Editor
Staff member
Joined
Feb 26, 2007
Messages
12,840
Location
Montreal
TOXIC-31.jpg


Sapphire HD 5670 1GB GDDR5 Review





Product Number: 11168-00
Price:
$125 USD (this version)
$115 USD (reference 1GB)
$100 USD (512MB)
Warranty: 2 years
Buy from: NCIX | DirectCanada | BestDirect



Welcome to the review of yet another ATI DX11 graphics card. In recent months we have seen the release of the ultra high-end HD 5970, the well-endowed but surprisingly affordable HD 5870 and HD 5850, and the more budget-friendly but equally impressive HD 5770 and HD 5750. Now ATI is rounding out their already-full stable not with a thoroughbred designed to push high framerates but rather with a number of cards which will appeal to casual gamers and HTPC enthusiasts. Basically, it is being marketed to people who don’t need blazingly fast performance but understand the value of efficiency, HD decoding capabilities and passable 3D performance. Say hello to Redwood.

Redwood is the code name for a whole series of products ranging from the $115 USD Redwood XT-based HD 5670 1GB to the $100 512MB version and the sub-$100 HD 5570 and HD 5550, which use the Redwood Pro core. These products follow closely in the highly successful footsteps of the HD 4670 and HD 4650, which will stick around through Q2 of 2010, and their goal remains the same: bringing value to a segment of the market that doesn’t get all that much time in the limelight. In this review we will be taking a closer look at Sapphire’s HD 5670 1GB GDDR5, which sports reference clocks but comes decked out in a custom heatsink from Arctic Cooling.

The HD 5670 series also features support for ATI’s HyperMemory technology, which allows system memory to be dedicated on the fly for use by the GPU. While it is doubtful this will be of much use for cards with 1GB or even 512MB of GDDR5 memory, lower-end products can benefit from it in situations where memory bandwidth becomes a limiting factor.

Since the HD decoding capabilities of ATI’s present and past generation chips have been discussed at length in the past, we’ll be concentrating on the gaming capabilities of the HD 5670 1GB in this article. Price-wise it slots into a bit of a grey zone that has popped up in the market within the last few months. With NVIDIA positioning the $99 GT 240 512MB at approximately the same performance as the 9600 GT while still relying on the ages-old 9800 GT 512MB to do most of the grunt work in the sub-$150 market, ATI saw an opening and went for it. Granted, the HD 5670 1GB might be priced above the GT 240 512MB and slightly above the 9800 GT, but competition is tough in this price segment, especially with ATI’s own $135 to $145 HD 5750 1GB knocking at the door as well.

The logistics of releasing new products across such a wide variety of price points, all within four months of one another, simply boggle the mind, but ATI has done it. Let’s see how the newest member of this rapidly growing family fits into the mix.


HD5670-15.jpg
 

A Look at the ATI 5000-series


HD5670-62.jpg

As you can probably tell by the chart above, all of the HD 5000-series cards fit perfectly into ATI’s current lineup. At the top of the heap we have the ultra high performance dual-GPU HD 5970, which carries most of the same specifications as a pair of HD 5870s. There are, however, some sacrifices that had to be made in the clock speed department in order to keep power consumption within reasonable levels. So, while this card has the same number of texture units and stream processors as the HD 5870, its core and memory run at speeds identical to the HD 5850.

Judging from paper specifications alone, the HD 5870 is a technological marvel considering it packs all of the rendering potential of ATI’s past flagship card and then some while not being saddled by an inefficient dual processor design. The fact that this new card could trump the performance of a HD 4890 just a few months after that card’s release is nothing short of stunning.

The HD 5850 on the other hand looks to be the purebred price / performance leader of the new ATI lineup. Aside from slightly lower clock speeds for both the core and memory along with eight disabled texture units (and 160 fewer stream processors), it is basically a clone of the HD 5870. This is the card ATI hopes will compete directly with the GTX 285 for the near future and then come into its own when DX11 games make their way into the market. We believe this card will appeal to the majority of early adopters since it allows them to buy class-leading DX9 and DX10 performance now without gambling $400 on unproven DX11 potential.

We can also see that ATI did some careful price cutting prior to launch: the HD 4890 offers significantly less performance than a HD 5850 and is now priced accordingly. As such, this previously high-end card will stick around for the next few months in the $200 price bracket, but that isn’t to say that it will stay there indefinitely...

HD5870-100.jpg

Meanwhile, we now have the HD 5700 series of Juniper-based cards as well with the HD 5770 and HD 5750. The HD 5770 is one of the first sub-$200 cards to come stock with 1GB of memory, and along with the GDDR5 it sports some hefty clock speeds as well. However, even though upon first glance the HD 5770 looks like it can compete with the HD 4890, this isn’t the case. According to ATI, the 128-bit memory interface limits this card’s performance so that it lies right within its stated price range. We should also mention that ATI won’t be replacing the HD 4890 until at least the first quarter of 2010 even though the HD 5770 is looking to take over from the HD 4850.

The HD 5750 on the other hand is simply a cut-down HD 5770 with lower clocks, fewer SPs and fewer texture units. It is this card that ATI sees going head to head with the NVIDIA GTS 250 and 9800 GT. It uses GDDR5 memory and will be released in both 512MB and 1GB versions to cater to the $100 market along with those looking for a little jump in performance.

Now the HD 5600 series is added into the mix as well; it is basically a further cut-down card featuring roughly half the number of SPs seen on the HD 5700 series. These new “Redwood” products still come equipped with fast GDDR5 memory operating across a 128-bit bus and will be offered in both 512MB and 1GB configurations. This should make the 5600 series perfect competition for NVIDIA’s GT 240 cards in their 512MB and 1GB guises. We should also mention that even though ATI's documentation lists "32 ROPs", this is a bit of a misnomer since the Redwood core features 32 Z/stencil ROP units but only 8 color ROPs.
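For those who like to see the math, here is a quick back-of-the-envelope look at what that ROP split means for fill rates. It is a minimal sketch assuming the 775MHz reference core clock (an assumption on our part; factory overclocked cards will scale these numbers accordingly).

```cpp
#include <cstdio>

// Fill rate = number of units * core clock. Redwood has 8 color ROPs but
// 32 Z/stencil units, so its depth throughput far outpaces its color output.
int main() {
    const double core_ghz = 0.775;  // assumed 775MHz reference core clock
    printf("Color fill rate: %.1f Gpixels/s\n", 8 * core_ghz);
    printf("Z/stencil rate:  %.1f Gsamples/s\n", 32 * core_ghz);
    return 0;
}
```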

So there you have it. In the high stakes game of poker that is the GPU industry, ATI has shown its hand. All that is left is for the competition to respond.
 

Focusing on DX11


It has been a hair under three years since the release of Windows Vista and, with it, the DirectX 10 API. In that time, a mere 33 DX10 games were released. That isn’t exactly a resounding success considering the hundreds of titles launched over the same period. Let’s hope DX11 does a bit better than that.

HD5870-109.jpg

DX11 is focused on taking the lessons learned from the somewhat inefficient DX10 and shaping them into a much more efficient API which demands fewer system resources while being easier to develop for. In addition to the usual 3D acceleration, it will also be used to speed up other applications which in the past have not been associated with the DirectX runtime. This may be a tall order but with the features we will be discussing here, developers have already started using DX11 to expand the PC gaming experience. It is an integral component of Windows 7 and, according to Microsoft, will also be brought to Windows Vista through a software update.

Let’s scratch the surface of what DX11 can bring to the table.

HD5870-110.jpg

Unlike past DirectX versions, DX11 endeavours to move past the purely graphics-based uses of the API and push it towards being the lynchpin of an entire processing ecosystem. This all begins with the power which DirectX Compute will bring into the fold. Not only can it increase the efficiency of physics processing and in-game NPC intelligence within games by transferring those operations to the GPU but it can also be used to accelerate non-3D applications.

HD5870-111.jpg


HD5870-104.jpg

Through the use of Compute Shader programs in Shader Model 5.0, developers are able to implement additional graphical features such as order independent transparency, ray tracing and advanced post-processing effects. This should add a new depth of realism to tomorrow’s games and, as mentioned before, also allow programs requiring parallel processing to be accelerated on the GPU.
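To give that a bit of developer-side context, below is a minimal sketch of what compiling and dispatching a trivial Shader Model 5.0 Compute Shader looks like through Direct3D 11. Buffer/UAV creation, error handling and readback are omitted for brevity, so treat it as an outline rather than a complete program.

```cpp
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstring>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

// A trivial Shader Model 5.0 kernel: double every value in a RW buffer.
static const char* kSource =
    "RWStructuredBuffer<float> data : register(u0);\n"
    "[numthreads(64, 1, 1)]\n"
    "void main(uint3 id : SV_DispatchThreadID) { data[id.x] *= 2.0f; }\n";

int main() {
    // Create a device and immediate context on the default adapter.
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx);

    // Compile the kernel against the cs_5_0 profile and create the shader.
    ID3DBlob* blob = nullptr;
    D3DCompile(kSource, strlen(kSource), nullptr, nullptr, nullptr,
               "main", "cs_5_0", 0, 0, &blob, nullptr);
    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(blob->GetBufferPointer(), blob->GetBufferSize(),
                             nullptr, &cs);

    // Bind the shader and launch 16 groups of 64 threads (1024 threads total).
    // A real program would first bind a UAV for "data" to slot u0.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->Dispatch(16, 1, 1);
    return 0;
}
```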

HD5870-101.jpg

For the majority of you reading this review, it is the advances in graphics processing and quality that will interest you the most. As games move slowly towards photo-realistic rendering quality, new technologies must be developed in order to improve efficiency while adding new effects.

HD5870-105.jpg

Some of the technologies that ATI is championing are DX11’s new Depth of Field, OIT (or Order Independent Transparency) and Detail Tessellation. While the pictures above do a good job of showing you how each of these works, it is tessellation that ATI seems most excited about. They have been including hardware tessellation units in their GPUs for years now and, with the dawn of DX11, these units will finally be put to their full use. OIT on the other hand allows true transparency to be added to an object in a way that is more efficient, resource-wise, than the standard alpha blending method currently used.

HD5870-102.jpg

Let’s talk about DX11 games. As you would expect, due to the ease of programming for this new API and the advanced tools it gives developers, many studios have been quite vocal in their support. Even though some of the titles listed above may not be high on your list of must-have games, A-list titles like the upcoming Aliens vs. Predator from Rebellion and DiRT 2 are sure to get people interested.

Another exciting addition to the list is EA DICE’s FrostBite 2 Engine which will power upcoming Battlefield games. Considering the popularity of this series, the inclusion of DX11 should open up this API to a huge market.

HD5870-103.jpg
 

OpenCL: The Next Big Thing?


HD5870-115.jpg

As consumers, we have all heard of the inroads GPUs have been making towards offering stunning performance in compute-intensive applications. There have been attempts to harness this power through frameworks such as NVIDIA’s Compute Unified Device Architecture (CUDA) and ATI’s Stream SDK (which, in v2.0, supports OpenCL).

HD5870-113.jpg

“Build it and they will come,” says the old mantra, but industry adoption of CUDA and Stream was anything but quick since two standards were being pushed for the same market. CUDA in particular is having a hard time of it since it is vendor-specific, without hardware support from any other manufacturer. The industry needed a language that was universal and available across multiple platforms. That’s where OpenCL (Open Computing Language), along with DirectX Compute, comes into play. OpenCL is a completely open standard managed by a non-profit organization called the Khronos Group, which also controls OpenGL and OpenAL.

HD5870-114.jpg

At its most basic level, OpenCL is able to be executed across multiple types of processors such as GPUs and CPUs. This makes it possible to prioritize workloads to the processor that will handle them most efficiently. For example, a GPU is extremely good at crunching through data-heavy parallel workloads while an x86 CPU is much more efficient at serial and task-specific operations. This also allows developers to write their programs for heterogeneous platforms instead of making them specific to one type of processor.
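To show how that prioritization looks in practice, here is a minimal sketch using OpenCL’s host API: it asks the first platform for a GPU and falls back to the CPU if none is available. Error handling and the actual kernel are trimmed for brevity.

```cpp
#include <CL/cl.h>
#include <cstdio>

int main() {
    // Grab the first OpenCL platform on the system (ATI's Stream SDK 2.0,
    // for example, exposes one).
    cl_platform_id platform;
    clGetPlatformIDs(1, &platform, nullptr);

    // Prefer the GPU for data-parallel work; fall back to the CPU when no
    // OpenCL-capable GPU (or driver) is present.
    cl_device_id device;
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr) != CL_SUCCESS)
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, nullptr);

    char name[256];
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, nullptr);
    printf("Running on: %s\n", name);

    // From here the code is identical regardless of which processor was picked.
    cl_int err;
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);
    // ... clCreateProgramWithSource() / clEnqueueNDRangeKernel() would go here ...
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}
```

The point of the fallback is exactly the heterogeneous story described above: the same host code runs unchanged whichever processor ends up doing the work.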

HD5870-116.jpg

So what does this mean for gamers? First of all, AMD has teamed up with Bullet and PixeLux in order to achieve more realistic in-game environments. Bullet Physics is an open-source physics engine with an ever-expanding library for soft body dynamics, 3D collision detection and other calculations. Meanwhile, PixeLux’s DMM (Digital Molecular Matter) engine uses the Finite Element Analysis method of calculating physics within a game. In past applications, it has been used to calculate actions which have an impact on the game’s environment such as tumbling rubble or debris movement.

HD5870-117.jpg

With Stream moving to OpenCL, ATI is truly moving towards an open platform, one which they hope will lead to broader developer and market adoption than the competition’s solutions. At this point it looks like we will soon see ATI’s GPUs accelerating engines from Havok, PixeLux and Bullet through the use of OpenCL. Considering these are three of the most popular physics engines on the market, ATI is well placed to make PhysX a thing of the past.
 


ATI’s Eyefinity Technology


2404aba41c60e982.jpg

The term Surround Gaming may not mean much to many of you who are reading this article but with the advent of ATI’s new Eyefinity technology, now is a good time to educate yourself. Basically, Eyefinity gives users the ability to run multiple monitors from the same graphics card. In the past, simple dual monitor setups have been used by many graphics, CAD or other industry professionals in order to increase their productivity, but gaming on more than one monitor was always a bit of a clunky affair. Granted, some products like Matrox’s TripleHead2Go were able to move multi-monitor setups into the public eye but there were always limitations (resolution and otherwise) associated with them. ATI is aiming to make the implementation of two or more monitors as seamless as possible within games and productivity environments while offering the ability to use extreme resolutions.

2404aba41c633257.jpg

While the price of two or even three new monitors may seem daunting at first, good 20” and even 22” LCDs have come down in price to the point where some are retailing below the $200 mark. ATI figures that less than $600 for three monitors will allow plenty of people to make the jump into a true surround gaming setup. Indeed, with three or even six monitors, the level of immersion could be out of this world.

2404aba41c626f4b.jpg

The reason many in the professional field are familiar with multi-monitor setups comes down to one simple fact: they dramatically increase productivity. Imagine watching a dozen stocks without having to minimize windows all the time, or using Photoshop on one screen while watching a sports broadcast on another and keeping Photoshop’s tool palettes on the third. The possibilities are virtually limitless if the technology is implemented properly.

2404aba41c634766.jpg

When it comes to a purely gaming perspective, the thought of a massive view of the battlefield or the ability to see additional enemies in your peripheral vision is enough to make most gamers go weak in the knees. Unfortunately, the additional monitors naturally mean decreased performance considering the massive amount of screen real estate that needs rendering (a quick calculation below shows just how much). This means tradeoffs may have to be made in terms of image quality if you want to use Eyefinity.
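As a rough illustration, the sketch below compares the pixels per frame of a single 1080p panel against a 3x1 group of the same panels. The real-world hit also depends on geometry load and the wider field of view, so treat this as a floor rather than an exact figure.

```cpp
#include <cstdio>

// Pixels per frame: one 1080p panel versus a 3x1 Eyefinity group of them.
int main() {
    const long one = 1920L * 1080;   // 2,073,600 pixels
    const long three = 3 * one;      // 6,220,800 pixels
    printf("Single panel: %ld pixels\n", one);
    printf("3x1 group:    %ld pixels (%.0fx the shading work)\n",
           three, (double)three / one);
    return 0;
}
```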

HD5870-15.jpg

According to ATI, all of the new HD 5800-series graphics cards will have the ability to run up to three monitors simultaneously. This is done by having a pair of DVI connectors as well as a DisplayPort and HDMI connector located on the back of the card. It should be noted that ATI will be releasing a special Eyefinity version of the HD 5870 in the coming months which features six DisplayPort connectors for those of you who want to drive six monitors from a single card.

2404aba41c635d0d.jpg

This technology is all made possible through the use of DisplayPort, but that also brings a limitation. Above we can see a number of the 3-screen output combinations which the current HD 5800 series supports, and one thing is constant: you will need at least one monitor which supports DisplayPort. Unfortunately, at this time DP-supporting monitors tend to carry a price premium over standard screens, which will increase the overall cost of an Eyefinity setup. Luckily, the other two monitors can use either DVI or a combination of DVI and HDMI for connectivity.
 


Packaging and Accessories


HD5670-1.jpg
HD5670-2.jpg

The packaging for this particular card follows the general design and size of nearly every other Sapphire HD 5000 series product we have seen so far. However, it is good to see some information about the heatsink included, although the details are sparse.

HD5670-3.jpg

While the back of the box includes a massive amount of information regarding features, there is still no mention of clock speeds, memory bandwidth or any other true specification for that matter. The inclusion of our Dam Good Award in the bottom left-hand corner is a nice touch though…

HD5670-4.jpg
HD5670-5.jpg

The protection scheme meanwhile is pure Sapphire, with a recycled cardboard insert making sure the card doesn’t move around too much while an anti-static bubble wrap bag offers some additional protection.

Since this is a budget-oriented card, the list of accessories is limited but still includes the necessary Crossfire connector, HDMI to DVI break-out cable, DVI to VGA adaptor and a trial version of SimHD. As usual, there is also a quick start guide and a driver CD.
 


A Closer Look at the Sapphire HD 5670 1GB GDDR5


HD5670-6.jpg
HD5670-7.jpg

The first thing you will probably notice about the HD 5670 1GB is how compact it is at a mere 6 ¾“ in length, and that it doesn’t need a stand-alone power connector since its roughly 75W power consumption can be fed by the PCI-E slot alone. Unlike past 5000 series cards, Sapphire has decided to go with a blue PCB instead of the black one used for higher-end cards.

It should also be mentioned that the HD 5670 will come in two forms: one which supports hardware Crossfire with the necessary connectors on the PCB for the usual bridge cables while the other will only support software Crossfire and won’t come equipped with a bridge.

HD5670-8.jpg
HD5670-9.jpg

The heatsink Sapphire chose for this card is the Arctic Cooling Accelero L2 Pro, which is in our opinion an odd choice since it takes up two slots on a card that is touted for its thermal efficiency. Is a heatsink of this size really necessary? Hopefully Sapphire will release a single-slot version in the near future, but we hear rumors that the reference single-slot ATI heatsink isn’t quite up to the job of cooling this card.

The Accelero L2 Pro is a basic cooler consisting of a multi-bladed 92mm fan that pushes air over an aluminum heatsink which doesn’t touch the GDDR5 memory modules. Since the 40nm Redwood core supposedly produces very little heat, there should be no worries about excess heat buildup within your case.

HD5670-10.jpg
HD5670-11.jpg

The back of the card holds an additional four 128MB GDDR5 memory modules. These H5GQ1H24AFR Hynix modules are rated for 5GHz (5Gbps) at 1.5V, so they should have some additional overclocking headroom left in them (the quick bandwidth math below shows what that headroom could be worth).
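For the curious, here is the arithmetic behind that statement. This sketch assumes the reference 1000MHz (4.0Gbps effective) GDDR5 clock for the stock figure; the 5.0Gbps line simply shows the ceiling the Hynix modules are rated for.

```cpp
#include <cstdio>

// Theoretical bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
double bandwidth_gbs(double gbps_per_pin, int bus_bits) {
    return gbps_per_pin * bus_bits / 8.0;
}

int main() {
    // Assumed reference clock: 1000MHz GDDR5 = 4.0Gbps effective per pin.
    printf("Stock (4.0Gbps x 128-bit): %.0f GB/s\n", bandwidth_gbs(4.0, 128));
    // The Hynix modules' rated ceiling of 5.0Gbps (5GHz effective).
    printf("Rated (5.0Gbps x 128-bit): %.0f GB/s\n", bandwidth_gbs(5.0, 128));
    return 0;
}
```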

HD5670-12.jpg

The backplate is an HTPC user’s dream come true. Included are connectors for DVI, HDMI and DisplayPort, which is basically everything that’s needed for a card of this stature, and in case you were wondering, this card DOES support Eyefinity.

HD5670-13.jpg
HD5670-14.jpg

Even though we mentioned length at the beginning of this section, it is only when comparing it to other cards that the compact size of the HD 5670 truly stands out. It is slightly shorter than the HD 5750 and exactly the same size as some companies’ GT 240 cards. However, it is important to remember that some GT 240s (particularly those made by EVGA) are slightly longer than the one we used in the picture above.
 


Test System & Setup

Please note that this test system was specifically chosen to pair our budget GPUs with a configuration that doesn't cost more than $500 CAD for the CPU, motherboard and memory.

Processor: Intel Core i5 750 @ 2.67GHz (Turbo Mode Enabled)
Memory: 2x2GB OCZ Platinum PC-15000 @ 6-7-6-17 1066MHz DDR
Motherboard: MSI H57M ED-65
Cooling: Thermalright TRUE
Disk Drive: Pioneer DVD Writer
Hard Drive: Western Digital Caviar Black 640GB
Power Supply: Corsair HX520
Monitor: Samsung 305T 30” widescreen LCD
OS: Windows 7 Ultimate N x64


Graphics Cards:

Sapphire HD 5670 1GB GDDR5
XFX HD 5750 (Reference)
Diamond HD 4770 (Reference)
Sparkle GT 240 1GB GDDR3
Sparkle GT 240 512MB GDDR5
EVGA 9800 GT 512MB (Reference)
EVGA 9600 GT 512MB (Reference)


Drivers:

ATI 10.1 Beta (HD 5670 1GB)
ATI 9.12 WHQL
NVIDIA 195.62 WHQL


Applications Used:

Batman: Arkham Asylum
Borderlands
Dawn of War II
DiRT 2
Dragon Age: Origins
Far Cry 2
Left 4 Dead 2


*Notes:

- All games tested have been patched to their latest version

- The OS has had all the latest hotfixes and updates installed

- All scores you see are the averages of 2 benchmark runs (a minimal sketch of this averaging follows these notes)

- All game-specific methodologies are explained above the graphs for each game

- All IQ settings were adjusted in-game
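For transparency, here is a minimal sketch of how an average framerate can be derived from FRAPS-style frametime logs and how two runs are combined into a single score. The file names are hypothetical and this illustrates the arithmetic rather than our exact tooling.

```cpp
#include <cstdio>
#include <fstream>
#include <string>

// Average FPS over one run: (frames - 1) / elapsed seconds, taken from a
// FRAPS-style frametimes CSV of "frame index, elapsed ms" rows.
double avg_fps(const std::string& path) {
    std::ifstream file(path);
    std::string line;
    std::getline(file, line);                       // skip the CSV header
    double first = -1.0, last = 0.0;
    long frames = 0;
    while (std::getline(file, line)) {
        if (line.empty()) continue;
        double ms = std::stod(line.substr(line.find(',') + 1));
        if (first < 0.0) first = ms;
        last = ms;
        ++frames;
    }
    return 1000.0 * (frames - 1) / (last - first);
}

int main() {
    // Hypothetical file names for the two recorded runs.
    double run1 = avg_fps("run1_frametimes.csv");
    double run2 = avg_fps("run2_frametimes.csv");
    printf("Reported score: %.1f FPS\n", (run1 + run2) / 2.0);
    return 0;
}
```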
 

Batman: Arkham Asylum (DX9)


Even though Batman: AA has its own in-game benchmarking tool, we found that its results are simply not representative of real-world gaming performance. As such, we used FRAPS to record a run-through of the first combat challenge, which is unlocked after completing the first of The Riddler’s tasks. It includes close-in combat with up to 8 enemies as well as ranged combat. In addition, we made sure to set the smoothframerate line in the game’s config file to “false”. No AA was used as the game engine does not natively support it.


1680 x 1050

HD5670-30.jpg


1920 x 1200

HD5670-31.jpg
 


Borderlands (DX9)


For this game we once again stayed away from the built-in benchmark as it is not representative of an actual gameplay sequence. Instead, a 10 minute combat sequence was played through and the results were recorded using FRAPS. The location of this benchmark is right after the first town you enter and includes explosions and fast-paced action. In addition, we made sure to set the smoothframerate line in the game’s config file to “false”.

1680 x 1050

HD5670-32.jpg


1920 x 1200

HD5670-33.jpg
 