#241 | Perineum | April 21, 2009, 11:33 PM

Quote:
Originally Posted by Perineum
Soooo.... am I correct in thinking that my 8800GT at 750 core/1900 shader/2100 mem is going to out-PPD my 576 core/1242 shader/2000 mem? Is it all about the shaders?
D'oh.

I meant to say....

Is my 8800GT at 750/1900/2100 going to beat my GTX 260 at 576/1242/2000?

So do 216 shader cores still lose out to 96 (or however many the 8800GT has)?
#242 | Dashock | April 21, 2009, 11:49 PM

The GTX 260 will crush it. It's not about how high your shaders can clock, it's how many you have: 216 is almost double the 8800 GT's 112 shaders.
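
As a rough sanity check of that claim, here is a back-of-the-envelope sketch. It assumes, very loosely, that GPU2 folding throughput scales with shader count times shader clock, and it ignores WU type, driver and architecture differences, so treat it as ballpark only.

Code:
# Very rough heuristic: assume GPU2 folding throughput scales with
# (shader count * shader clock). Ignores WU type, driver and
# architecture differences, so treat the result as ballpark only.

cards = {
    "8800 GT (OC'd)": {"shaders": 112, "shader_mhz": 1900},
    "GTX 260-216":    {"shaders": 216, "shader_mhz": 1242},
}

for name, c in cards.items():
    print(f"{name}: relative throughput ~ {c['shaders'] * c['shader_mhz']:,}")

# 8800 GT (OC'd): relative throughput ~ 212,800
# GTX 260-216:    relative throughput ~ 268,272

Even by that crude measure the GTX 260 comes out ahead, which matches the advice above.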
__________________
Intel Xeon E3110 @ 3.0ghz
XFX 8800 GT Alpha Dog Edition
Asus P5Q-E
Mushkin Ascent Redline 8000 4gb DDR2 @ 1066 5-5-5-15 2.05v
OCZ 600W GameXstream
Seagate 7200.11 500gb
Seagate 7200.10 320gb
Antec Spot Cool
Logitech X-230
Samsung 32" 720P HDTV



Coolermaster Cosmos S/
Swiftech Mcp655/ MCR-320-QP/ 6X Noctua NF-P12'S/ Swiftech Apogee GT/ Swiftech Mcw-60/ Swiftech Micro-Res.

#243 | Perineum | April 22, 2009, 04:37 AM

Hmm, ok....

I haven't seen a lot of PPD from this card yet... I know it's done some 511-pointers, or whatever they are.

Anyway, to keep up with everyone and get my spot back in the team's top 20 for output, I have tentatively overclocked my 260. I aimed for roughly what other members of this site have reached (700/1500 or so). It would run, but it artifacted in ATITool. My card seems to like being unlinked with the core pushed higher... It ended up stable (2 hours in both ATITool and the FurMark stability test in extreme burn mode) at 756/1512.

I could "walk" my clocks higher, though. While running ATITool's artifact scan I could move the core up 15 MHz, then walk the shader up a bit more. I did this all the way up to 800/1625 with still no artifacts. However, it failed spectacularly as soon as I loaded the FurMark stability test.

It's quite odd. I'm used to leaving everything at stock, unlinking, then raising the core to find its max, putting it back to stock, finding the max shader, and so on. That method worked well in the past, but this time, with everything else at stock, I couldn't get the shaders much past the 1400 MHz mark. It was only after I upped the core that the shaders would go higher.
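
For what it's worth, the "walk the clocks" routine described above boils down to a simple step-and-test loop. Here is a minimal sketch, where set_clocks() and passes_artifact_scan() are hypothetical stand-ins for whatever your overclocking utility and ATITool/FurMark runs actually do:

Code:
# Sketch of the step-up ("walk") overclock loop described above.
# set_clocks() and passes_artifact_scan() are hypothetical stand-ins
# for your overclocking utility and an ATITool/FurMark artifact scan.

def set_clocks(core_mhz: int, shader_mhz: int) -> None:
    """Hypothetical: apply the given clocks with your OC tool."""
    raise NotImplementedError

def passes_artifact_scan(minutes: int = 10) -> bool:
    """Hypothetical: run an artifact scan and report pass/fail."""
    raise NotImplementedError

def walk_clocks(core: int, shader: int,
                core_step: int = 15, shader_step: int = 25,
                core_max: int = 800, shader_max: int = 1625):
    """Step core and shader up together until a scan fails or the
    caps are hit; return the last clock pair that passed."""
    last_good = None
    while core <= core_max and shader <= shader_max:
        set_clocks(core, shader)
        if not passes_artifact_scan():
            break
        last_good = (core, shader)
        core += core_step
        shader += shader_step
    return last_good

As the post notes, clocks that pass an artifact scan can still fail a FurMark burn test (or fold with errors), so whatever the loop settles on deserves a longer stress run afterwards.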

Anyone know the reason for this? I'm rather curious if there is a technical explanation.

So anyway... er...

MOOOARRR PPD!
#244 | SugarJ | April 22, 2009, 09:38 AM

Keep in mind that EVGA bins their GPUs, saving the ones that will OC higher for their SC, SSC, and FTW models. You may not get a huge OC out of it.

But if it will fold without errors at those clocks, that's actually a really good OC. I had to back off the clock speed on my 55nm SSC model for it to fold without errors. It would run games fine and pass all the tests, but it would error out folding.
#245 | Silvgearx | April 22, 2009, 04:38 PM

I don't know how a highly clocked 8800GT compares to your GTX 260, but I can give you some numbers.

Drivers: 182.50
8800GT at 625/800/2150

The numbers below aren't exact, but they're damn close!
353-point WU: ~6,300 PPD
384-point WU: ~5,900 PPD
511-point WU: ~3,800 PPD

I don't remember the other WUs, as I don't tend to get them much.
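
As a side note on where numbers like those come from: PPD is just the WU's point value scaled by how fast you finish it. A minimal sketch, assuming the usual 100 frames per WU and using made-up frame times (not the actual measurements above):

Code:
# PPD = points per WU * (seconds in a day / seconds per WU).
# Frame times below are illustrative placeholders, not measured values;
# GPU2 work units are assumed to report 100 frames each.

SECONDS_PER_DAY = 86_400
FRAMES_PER_WU = 100

def ppd(points_per_wu: float, seconds_per_frame: float) -> float:
    wu_seconds = seconds_per_frame * FRAMES_PER_WU
    return points_per_wu * SECONDS_PER_DAY / wu_seconds

print(round(ppd(353, 48)))   # ~6354 -- in the ballpark of the 353-pointer above
print(round(ppd(511, 115)))  # ~3839 -- roughly the 511-pointer figure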
#246 | Perineum | April 22, 2009, 05:05 PM

Quote:
Originally Posted by SugarJ
But if it will fold without errors at those clocks, that's actually a really good OC. I had to back off the clock speed on my 55nm SSC model for it to fold without errors. It would run games fine and pass all the tests, but it would error out folding.
Well, it's now run 8+ hours at 756 core / 1512 shader / 1100 mem.

I'm not done clocking the memory, but I didn't want to piss around any more because I've already lost PPD doing all this.

How do I know if I'm erroring out?
#247 | geokilla | April 22, 2009, 05:16 PM

Quote:
Originally Posted by Perineum
Well, it's now run 8+ hours at 756 core / 1512 shader / 1100 mem.

I'm not done clocking the memory, but I didn't want to piss around any more because I've already lost PPD doing all this.

How do I know if I'm erroring out?
Keep in mind that when folding, CPU and memory speed have little to no effect. It's all about the shaders.

You'll know whether you're erroring out by checking your logs for EUEs (Early Unit Ends). You could also run FurMark or the OCCT GPU stress test and see if you get any errors, and try looping 3DMark06 as well.

My 9600GT is currently clocked at 1820 shaders. I had it at 1870 before, but it would give me errors during the OCCT GPU stress test.
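
If you'd rather not eyeball the whole log, something like the sketch below can tally suspect lines. It assumes the classic console client's FAHlog.txt and that errors show up as lines containing markers like "EUE" or "UNSTABLE_MACHINE"; check your own log for the exact wording before trusting the count.

Code:
# Tally suspected folding errors in the console client's log.
# Assumptions: the log file is FAHlog.txt and error lines contain
# markers such as "EUE", "Early unit end" or "UNSTABLE_MACHINE".
# Verify the exact strings against your own log first.

from collections import Counter
from pathlib import Path

MARKERS = ("EUE", "Early unit end", "UNSTABLE_MACHINE")

def count_errors(log_path: str = "FAHlog.txt") -> Counter:
    hits = Counter()
    for line in Path(log_path).read_text(errors="replace").splitlines():
        for marker in MARKERS:
            if marker in line:
                hits[marker] += 1
    return hits

if __name__ == "__main__":
    print(count_errors())  # an empty Counter means no matching lines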
#248 | Silvgearx | April 22, 2009, 05:17 PM

Quote:
Originally Posted by Perineum
Well, it's now run 8+ hours at 756 core / 1512 shader / 1100 mem.

I'm not done clocking the memory, but I didn't want to piss around any more because I've already lost PPD doing all this.

How do I know if I'm erroring out?
I just let the GPU fold; when it was clocked at 2250 shader, the console client would just shut the core down due to errors.
#249 | Top Prospect (Cambridge, ON) | April 22, 2009, 06:17 PM

Quote:
Originally Posted by 5ILVgeARX
I just let the GPU fold; when it was clocked at 2250 shader, the console client would just shut the core down due to errors.
You're keeping pace with me, and I have the Q9450 going at 3.2GHz with the console SMP client and my 4870 cranking them out.

I'm a little jealous of the Nvidia folding results. I did read on the Folding@Home forums (in a post by mhouston, one of the FAH programmers) that the scientific value of the Nvidia and ATI work units is much closer than the points awarded would suggest.

Could Nvidia be paying for some extra points? Nvidia is all about marketing, so I wouldn't be surprised.
#250 | Silvgearx | April 22, 2009, 06:27 PM

I just got Assassin's Creed installed, so I've been a little preoccupied at night. Also got BioShock and Far Cry 2.

And yeah, Nvidia is the only way to go for folding.