Conclusion
After more than a year of respinning the G80 core again and again, Nvidia has finally given us a brand new architecture to play with and it is a hell of a step forward. The GeForce GTX 280 represents the next logical step in the progression of the Nvidia lineup, moving not only towards extreme gaming performance but also towards parallel computing on an impressive level. To be honest with you, this card is exactly what we have been waiting for since a few months after the release of the legendary 8800 GTX. True, at nearly $700 the GTX 280 is geared towards a very specific market segment and is therefore definitely not for everyone, but it is still hard to deny it is a step in the right direction.
When you take a step back and look at the gaming performance of the GTX 280, it becomes apparent that there just aren’t many games on the market that can take advantage of the raw power it brings to the table. Indeed, we see glimmers of the horsepower this card has hiding in its bowels when we boost the resolution, turn on all the eye candy and hold on for dear life. The GTX 280 wasn’t meant for the DX9 environment since every DX9 game we tested showed a fair amount of CPU bottlenecking at anything under 2560x1600. However, when you move to the DX10 environment this card was built for, it gobbles up the additional textures like nobody’s business. Simply put, if you game on anything under a 24” LCD, the GTX 280 is just not for you and you would be better served by a 9800 GTX or even an 8800 GTS 512MB.
Unfortunately, Nvidia stole a bit of their own thunder with the now-discontinued 9800 GX2, since in many cases its performance and that of the new kid on the block are quite similar. That said, the GTX 280 beats the dual GPU card by quite a large margin in some cases. This in itself is impressive, but we also have to remember that a single GPU solution is much more appealing than the internal SLI setup of the GX2, which is sometimes hobbled by lackluster SLI performance.
As for EVGA’s take on the GTX 280 with the Superclocked Edition we reviewed today, we really must say that this card continues EVGA’s history of excellence. Great customer service, a lifetime warranty and a step-up program go a long way in selling us on literally any card, and as long as EVGA continues their tradition of offering customers good value for their hard-earned money, we will continue to recommend their products.
While this is a great card in many respects, it isn’t all sunshine and roses in the land of the GTX 280, since there are a few areas where it falls flat for us. The first thing that immediately comes to mind is how atrociously high its power consumption is under full load and how loud the fan needs to get in order to keep the core’s rampaging heat under control. Granted, there are over a billion transistors on board, but in this day and age consumers are looking for energy efficiency and silence, not nearly 250W of heat that gets blown right back into your room.
The other major disappointment is that Nvidia neither made full HDMI compatibility mandatory nor added an HDMI connector to their flagship card. This leaves board partners an escape clause for not including the “optional” DVI to HDMI dongle. The fact that EVGA didn’t include the dongle or an SPDIF cable with their Superclocked Edition proves our worries are grounded in reality, and it leaves a bad taste in our mouths considering the price of admission.
When push comes to shove, the EVGA GTX 280 Superclocked is a very good single GPU solution that can go toe-to-toe with literally anything on the market and come out on top. As newer games are released with graphics that put those currently on the market to shame, we are sure that the GTX 280’s lead over the other cards tested here today will continue to grow. However, at this point there are only a few games where this card can really stretch its legs. The addition of applications based on CUDA is a huge selling point, but we must ask ourselves if we are really willing to put up with the heat, noise and power consumption of the GTX 280 in order to transcode a movie a bit faster or play a game at ultra high resolutions. Are these sacrifices I would be willing to make? Definitely, but then again I’m a bit of a speed junkie.
No matter which way you look at the release of the GTX 280, one thing is for certain: the future of computer graphics and computing in general just got a hell of a lot more exciting.
Pros:
- Blistering speed in games
- Great performance in DX10
- CUDA architecture compatible
- Very low idle power consumption
- EVGA Lifetime Warranty
Cons:
- Copious power consumption when under load
- Quite loud
- Runs hot and dumps a lot of that heat into your immediate environment
- No included HDMI connector of any sort
Many thanks to EVGA and Nvidia for making this review possible.
http://www.hardwarecanucks.com/foru...ga-geforce-gtx-280-review-comment-thread.html