Hardware Canucks > HARDWARE > Video Cards

    
  #41 (permalink)  
June 17, 2008, 12:58 PM
bojangles
Hall Of Fame
F@H
 
Join Date: Jan 2008
Location: Oakville, ON
Posts: 2,683


While this card seems like it's on its way to performing a lot of FLOPS, I still don't think this is a big step for gaming. They should have just left the non-gaming stuff to their workstation and server lines.

Honestly, what I think NVIDIA did was slap all those GX2 transistors onto one die. Pure stupidity, I think. We're looking for more efficiency, not insane power consumption. They should have gone to 55nm to take some of the heat off.
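For what it's worth, the FLOPS number everyone keeps quoting is just a spec-sheet calculation. A rough back-of-envelope sketch; the 240 shader cores, 1296 MHz shader clock, and 3 FLOPs per clock (dual-issue MAD + MUL) are NVIDIA's published GTX 280 figures, not anything measured in this thread:

```python
# Theoretical peak single-precision throughput from published specs.
# flops_per_clock=3 assumes the MAD + MUL dual-issue peak NVIDIA quotes;
# real workloads rarely come close to this number.
def peak_gflops(shader_cores, shader_clock_mhz, flops_per_clock=3):
    return shader_cores * shader_clock_mhz * flops_per_clock / 1000.0

print(peak_gflops(240, 1296))  # GTX 280: ~933 GFLOPS
```

That's where the "almost a teraflop" marketing line comes from, heat and power be damned.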
  #42 (permalink)  
June 17, 2008, 01:04 PM
CMetaphor
Quadfather
 
Join Date: May 2007
Location: Montreal, Canada
Posts: 4,994


Quote:
Originally Posted by bojangles
They should have gone to 55nm to take some of the heat off.
That's one of the biggest reasons my next video card(s) will be from AMD/ATI. Right now, with my two 8800GTS 320MBs in SLI, the "upper" one (which has 2x Coolermaster 120mm fans pointing right at it from the side) is constantly about 10 degrees hotter than the lower one (which has 2x Antec Tricool 120mms pointed at it from the bottom). Considering that the Coolermaster fans are running on max and the Tricools are running on low, I'm very disappointed with how well these 65nm graphics cards handle heat. I can only imagine how hot that guy's tri-SLI GTX 280s (from the XS site) get, but I have no doubt it's unbearable. Just my opinion.
__________________
"Backed by common sense and physics!" -Squeetard
Opteron Server for Sale! http://www.hardwarecanucks.com/forum...ade-ideas.html
  #43 (permalink)  
June 17, 2008, 01:26 PM
zlojack
Hall Of Fame
F@H
 
Join Date: Nov 2007
Location: Toronto
Posts: 2,057


Great review SKYMTL, as per usual. Thorough, informative and just generally very good.

This card does look like a beast, but it hasn't blown me away.

I'll be waiting for the 4870 and X2 to see how they compare.

Certainly going to be interesting, but unless I see something that blows me away, I'm not changing my setup for a while.
  #44 (permalink)  
June 17, 2008, 01:46 PM
MpG
Hall Of Fame
 
Join Date: Aug 2007
Location: Kitchener, ON
Posts: 3,141

Quote:
Originally Posted by CMetaphor
...I'm very disappointed with how well these 65nm graphics cards handle heat.
Your GTS 320's are actually 90nm, which is one of the reasons they run so hot.
__________________
i7 2600K | ASUS Maximus IV GENE-Z | 580GTX | Corsair DDR3-2133
  #45 (permalink)  
June 17, 2008, 03:09 PM
S_G
Allstar
 
Join Date: Nov 2007
Location: Montreal, home of the Canadiens
Posts: 830

Looks awesome, though that's quite a hefty price tag. Guess I'm glad to be going ATI this time around.
  #46 (permalink)  
June 17, 2008, 06:43 PM
Rookie
 
Join Date: Jun 2008
Location: Halifax, NS
Posts: 41

DX10 performance in Crysis must have been pretty bad for you to test it in DX9?
  #47 (permalink)  
June 17, 2008, 07:32 PM
SKYMTL
HardwareCanuck Review Editor
 
Join Date: Feb 2007
Location: Montreal
Posts: 11,605

I chose to test in DX9 since I feel that DX10 at High settings does nothing but impact performance with ZERO visual benefits. I try to stay as realistic as possible in as many tests as possible which means I test at resolutions and settings I myself would use. IMO, if someone decides to play Crysis with DX10 on they are a sucker for punishment based on the lack of benefits it offers.

I don't leave certain tests out because the hardware did badly or had less-than-pleasing performance. I keep said tests out because no one in their right mind would actually use the settings these tests would involve.
  #48 (permalink)  
June 17, 2008, 08:21 PM
Retro
Top Prospect
 
Join Date: Mar 2008
Location: B.C.
Posts: 129

Another excellent review from SKYMTL.
One of my favorite features is the customary fitting of a Thermalright heatsink. Although in this case it didn't work, good show trying both. I would think that Thermalright will have a cooler for the 200 series soon enough. I bit the bullet after work today and joined the EVGA step-up list for a 280. I know it isn't going to blow away my GX2, but I look forward to leaving behind the stuttering and getting back the gaming smoothness I had with my previous single-GPU 8800GTS 640 and G92 512 cards. And of course, being able to bolt on a Thermalright HR-03-type cooler with a good 120mm fan.
Too bad I'm #255 on the waiting list. I guess it will be a while before I get to play with the new toy.
__________________
Latest build:
i5 750@3800MHz 20x190@1.38125v
Noctua NH-U12P/NF-P12
4G GSkill DDR3 2000 Ripjaws
Asus P7P55D-Pro
2x EVGA GTX275 in SLI
PC P&C 750 Silencer
Cuda 500
Antec 902
W7 64 RTM

Last edited by Retro; June 17, 2008 at 08:33 PM.
  #49 (permalink)  
June 17, 2008, 08:27 PM
S_G
Allstar
 
Join Date: Nov 2007
Location: Montreal, home of the Canadiens
Posts: 830

Quote:
Originally Posted by SKYMTL
I chose to test in DX9 since I feel that DX10 at High settings does nothing but impact performance with ZERO visual benefits. I try to stay as realistic as possible in as many tests as possible which means I test at resolutions and settings I myself would use. IMO, if someone decides to play Crysis with DX10 on they are a sucker for punishment based on the lack of benefits it offers.
I agree with you about DX9 vs DX10 at High settings, but at Very High there is quite a noticeable difference. Even when you unlock the Very High settings in DX9, there are some missing features, most notably the handling of soft particles, volumetric lighting and dynamic shadows. For me, dynamic shadowing is the best current use of DX10 technology. Clouds casting shade adds considerable realism to a scene.
  #50 (permalink)  
June 17, 2008, 08:48 PM
SKYMTL
HardwareCanuck Review Editor
 
Join Date: Feb 2007
Location: Montreal
Posts: 11,605

Quote:
Originally Posted by S_G
I agree with you about DX9 vs DX10 at High settings, but at Very High there is quite a noticeable difference. Even when you unlock the Very High settings in DX9, there are some missing features, most notably the handling of soft particles, volumetric lighting and dynamic shadows. For me, dynamic shadowing is the best current use of DX10 technology. Clouds casting shade adds considerable realism to a scene.
For the fun of it I just tried my usual custom timedemo at Very High settings in DX10 @ 1280 x 1024. The resulting slideshow from the GTX 280 and GX2 was telling enough about a few things:

- No one is going to buy this card to run Crysis at 1280 x 1024 at Very High settings
- Crysis is ahead of its time. It looked REALLY, REALLY good...
- My custom timedemo is VERY different from the usual "benchmark", since that "built-in" BS achieved a 32 FPS average on the GX2. Optimizations, anyone???
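Part of why two runs of the "same" game can report such different averages is simply how the log gets reduced to one FPS number. A hypothetical illustration (the frame times below are made up, not from either benchmark):

```python
# Per-frame render times in milliseconds, e.g. parsed from a timedemo log.
# A mix of smooth frames and stutter spikes, invented for illustration.
frame_times_ms = [20, 20, 20, 100, 100]

# Honest average: total frames divided by total elapsed time.
true_avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# Naive average of instantaneous per-frame FPS overweights the fast
# frames and hides the stutter, inflating the reported number.
naive_avg_fps = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

print(round(true_avg_fps, 1), round(naive_avg_fps, 1))  # 19.2 34.0
```

Same log, two very different "averages" — which is one reason a custom timedemo and a built-in benchmark can tell such different stories even before any driver optimizations enter the picture.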