#161
August 16, 2008, 03:26 PM
Rookie

Join Date: Aug 2008
Posts: 5

Well, I tried the official 725, beta 711 and beta 611. None of those would work. I don't know what the BIOS version of the card is; I would have to take it back over to my friend's and put it in his P35 again.
__________________
DFI LP Dark X35 T2RS 311
Core 2 Quad Q6600 @ 3.5GHz
G.Skill 2x2GB PC2-8000 5-5-5-15 @ 1052MHz
HIS 4870 X2 787/960 both GPUs
RAID 0 2x500GB Seagate on SATA 1/2
RAID 0 2x500GB Seagate SATA II on 3/4
Sony DVD Burner
SilverStone OP700 PSU
#162
August 16, 2008, 05:46 PM
encorp
Hall Of Fame
F@H

Join Date: Apr 2008
Location: Toronto
Posts: 3,425

Wonderful review, Sky. The only qualm I have is that a lot of people lately are all "rah rah ATI" right now, and while I'm not really on either side of the fence, looking at your graphs I personally wouldn't feel so safe saying the ATI X2 is really killing the 280. Sure, you're right that in the proper system, without any bottlenecks, it's going to obliterate the 280 at very high resolutions. But in a lot of cases, at lower resolutions at least, the 280 stays within the realm of comparable; it may not beat the X2 by much, but for a single-chip, 1GB card it sure is holding its own. And I myself believe it to be far superior in manufacturing compared to the single 4870s.

I'm pretty sure that a GTX 280 with NVIDIA's superior architecture and 2GB of RAM would seriously pull ahead.
__________________
DISCLAIMER: The views and opinions expressed in the immediately preceding post are those of encorp and do not reflect the views and/or opinions of family, friends, or anyone remotely associated with encorp unless explicitly stated. encorp does not make any warranty, express or implied, nor assume any liability or responsibility for the quality, factuality or use of information in the immediately preceding post.
#163
August 16, 2008, 06:13 PM
SKYMTL
HardwareCanuck Review Editor

Join Date: Feb 2007
Location: Montreal
Posts: 11,605

To tell you the truth, I think it is ATI with the superior architecture right now; just look at how well their cards do when AA is enabled. Once NVIDIA goes through with a die shrink... well, we will have to see.
#164
August 16, 2008, 07:06 PM
MpG
Hall Of Fame

Join Date: Aug 2007
Location: Kitchener, ON
Posts: 3,141

I thought the GTX's performance was mostly just a matter of brute force? Also, I'd be surprised to see 2GB of VRAM make much difference over 1GB, except in the most extreme of circumstances, at which point a single card really isn't powerful enough to work with that much information anyway.
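
For rough perspective, here's a back-of-envelope sketch (the buffer layout and sizes are simplified assumptions, not measurements) of what the framebuffer alone eats at various resolutions and AA levels:

Code:
# Rough back-of-envelope for framebuffer VRAM alone (ignores textures,
# geometry and driver overhead). Assumes 32-bit colour front/back buffers,
# plus a multisampled colour buffer and 32-bit depth/stencil per sample.
def framebuffer_mb(width, height, samples):
    pixels = width * height
    color = pixels * 4 * 2             # front + back buffer, 32-bit
    msaa_color = pixels * 4 * samples  # multisampled colour buffer
    depth = pixels * 4 * samples       # depth/stencil, per sample
    return (color + msaa_color + depth) / 1024 ** 2

for w, h in [(1680, 1050), (1920, 1200), (2560, 1600)]:
    for aa in (1, 4, 8):
        print(f"{w}x{h} {aa}xAA: {framebuffer_mb(w, h, aa):.0f} MB")

Even the worst case here (2560x1600 with 8xAA) comes out under 300MB in this simplified model; it's textures and other data that fill the rest of the card, which is why going from 1GB to 2GB rarely shows up outside extreme settings.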
__________________
i7 2600K | ASUS Maximus IV GENE-Z | 580GTX | Corsair DDR3-2133
#165
August 16, 2008, 07:19 PM
encorp
Hall Of Fame
F@H

Join Date: Apr 2008
Location: Toronto
Posts: 3,425

Quote:
Originally Posted by SKYMTL
To tell you the truth, I think it is ATI with the superior architecture right now; just look at how well their cards do when AA is enabled. Once NVIDIA goes through with a die shrink... well, we will have to see.

Well, in that regard you are definitely right. I don't know a lot about this, but ATI's cards have always had a lot of shader processors, while NVIDIA uses fewer and still accomplishes roughly the "same" performance.
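
To put peak-theoretical numbers on that (using the commonly quoted ALU counts and clocks; real-game utilisation is another matter entirely):

Code:
# Peak theoretical shader throughput = ALUs x flops per ALU per clock x clock (GHz).
def peak_gflops(alus, flops_per_clock, clock_ghz):
    return alus * flops_per_clock * clock_ghz

# HD 4870: 800 stream processors at 750MHz, MAD = 2 flops per clock.
print("HD 4870:", peak_gflops(800, 2, 0.750))  # 1200 GFLOPS
# GTX 280: 240 scalar ALUs at a 1296MHz shader clock, MAD+MUL = 3 flops.
print("GTX 280:", peak_gflops(240, 3, 1.296))  # ~933 GFLOPS

On paper the 4870's 800 units come out ahead, but ATI's 5-wide units are rarely fully utilised, which is how NVIDIA's 240 scalar ALUs stay competitive in practice.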
__________________
DISCLAIMER: The views and opinions expressed in the immediately preceding post are those of encorp and do not reflect the views and/or opinions of family, friends, or anyone remotely associated with encorp unless explicitly stated. encorp does not make any warranty, express or implied, nor assume any liability or responsibility for the quality, factuality or use of information in the immediately preceding post.
#166
August 25, 2008, 10:57 AM
Chilly
Hall Of Fame
F@H

Join Date: Sep 2007
Location: Ontario
Posts: 2,593

Quote:
Originally Posted by encorp
Well, in that regard you are definitely right. I don't know a lot about this, but ATI's cards have always had a lot of shader processors, while NVIDIA uses fewer and still accomplishes roughly the "same" performance.
You can't compare things like that. By this logic, a dual-CPU Pentium 4 Xeon system with HT at 3.8GHz should smack down an Intel Core 2 Quad at 2.8GHz, though in reality the Core 2 Quad could probably beat the P4 Xeon while running at only 2.0GHz.
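
To put toy numbers on that analogy (the IPC figures below are invented purely for illustration, not measurements):

Code:
# Toy model only: useful throughput ~ cores x instructions-per-clock x clock (GHz).
def throughput(cores, ipc, clock_ghz):
    return cores * ipc * clock_ghz

print("2x P4 Xeon HT @ 3.8GHz:", throughput(2, 1.0, 3.8))  # 7.6
print("Core 2 Quad   @ 2.8GHz:", throughput(4, 2.5, 2.8))  # 28.0

Clock speed alone tells you almost nothing; per-clock efficiency and unit count dominate, for CPUs and shaders alike.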

NVIDIA is at the level where they are using simple brute force at the moment, with little intelligent design. THIS IS NOT ALWAYS A BAD THING: brute force works, and the GTX 280 proves this (basically two G92s glued together). That being said, intelligent design will always beat out brute force in the long run (i.e. GPU refreshes, respins, next-gen chips, etc.) while being cheaper in the long run as well.

The GTX 280, due to its brute force method, is expensive to manufacture, has low yields, etc. Brute force is cheap in the short run and expensive in the long run (little R&D cost vs. massive manufacturing cost). The RV770 chip (48xx series), on the other hand, is cheap to manufacture, has high yields, etc. Intelligent design is more expensive in the short run but cheaper in the long run (high R&D cost vs. cheap manufacturing cost).

There is nothing wrong with either approach, but when the brute force route is taken it is usually done with smaller, more efficient chips, a la the 4870 X2 (frankly a brute force approach, but a well-designed, intelligent one). In fact, I would not be surprised to see the R8xx (58xx series) from ATI do what NVIDIA did with the GTX 280 (2x G92 in simplest terms), with a lot of tweaking mind you. Either way, this round ATI won trying its hand at intelligent design, while NVIDIA did a decent job with brute force but lost out to the intelligent design.

(I don't believe the brute force method is the one they (ATI) will be taking; I'm simply saying that the design of the RV770 lends itself to this type of brute force more easily than the G92 did.)
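
As a toy illustration of the yield argument (the wafer cost and defect density below are invented numbers; only the approximate die sizes are commonly cited figures):

Code:
import math

# Toy wafer economics behind "big die = low yield = expensive".
def cost_per_good_die(area_mm2, wafer_cost=5000.0, wafer_diam_mm=300.0,
                      defects_per_mm2=0.002):
    wafer_area = math.pi * (wafer_diam_mm / 2) ** 2
    dies = wafer_area / area_mm2                         # ignores edge loss
    yield_rate = math.exp(-defects_per_mm2 * area_mm2)   # Poisson yield model
    return wafer_cost / (dies * yield_rate)

print(f"GT200-ish, 576 mm^2: ${cost_per_good_die(576):.0f} per good die")
print(f"RV770-ish, 256 mm^2: ${cost_per_good_die(256):.0f} per good die")

With these made-up inputs, about 2.25x the die area works out to roughly 4x the cost per good die: fewer dies fit on a wafer, and a bigger die is more likely to catch a defect.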
#167
September 30, 2008, 12:18 PM
Rookie

Join Date: Sep 2007
Posts: 26

Quote:
Originally Posted by SKYMTL
I posted the memory chip in the review now. Hynix.

We will be posting a CrossFireX review with overclocking, mixed CF and everything else soon. I just want to get my processor overclocked some more so I don't repeat what everyone else seems to be doing: Crossfire reviews with a 3.0GHz processor. Pfftttt. I want 4.3GHz at least before I tackle it.
I've been waiting for this new review for a while now. How much longer until you release it?
#168
September 30, 2008, 12:46 PM
SKYMTL
HardwareCanuck Review Editor

Join Date: Feb 2007
Location: Montreal
Posts: 11,605

God, you are right. Way too much going on right now....
#169
October 2, 2008, 11:41 PM
Rookie

Join Date: Sep 2007
Posts: 26

Not sure if I understand: will the review be coming out soon?
#170
October 3, 2008, 06:09 AM
cegras
Top Prospect

Join Date: Feb 2008
Location: Toronto
Posts: 151

When the RV770 reviews came out, I actually took the time to read up on its architecture. It was interesting, but what's most interesting is that most reviewers basically say you get free 8xAA with the card. Not that it REALLY matters at super high resolutions... but eh.