  #1
September 15, 2007, 05:34 PM
bushwickbill
Top Prospect
 
Join Date: Mar 2007
Location: West Coast Canada
Posts: 175
8800 GTS/GTX locking tables??

I saw this page a while back and just remembered it when I was doing some GPU overclocking today. According to this table: http://www.overclock.net/hardware-ne...ng-tables.html
It says that when you increase your core clock you also increase your shader clock. So I was trying to find a good overclock while staying at the highest shader clock I could for maximum performance in games (I hope that's how it works?). At 597 MHz on the core my shader clock should have jumped to 1404 MHz, right? Well, according to RivaTuner's shader clock hardware monitoring plugin, it reads 1350 no matter what I do. Does anybody understand why the shader clock stays at 1350 MHz? The Crysis demo is less than two weeks away, and I need to get my rig tweaked some more.
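
For anyone trying to follow the arithmetic: the linking table referred to above is usually explained as the shader domain running at a fixed ratio of the core clock (roughly 2.35:1 at GTX stock clocks) but only moving in coarse steps, and the 1350 and 1404 MHz figures quoted in the post are exactly 54 MHz apart, which matches the step size people commonly quote for G80. Below is a minimal Python sketch of that idea. The stock clocks, the 54 MHz step, and the snap-down rounding are illustrative assumptions, not values taken from the linked table, so check the actual breakpoints for your specific card, BIOS, and driver.

Code:
# Minimal sketch (assumed values, not the linked table) of G80-style
# core/shader clock linking: the shader domain tracks a fixed ratio of
# the core clock but can only change in coarse steps, so it stays put
# until the core crosses the next breakpoint.

STOCK_CORE = 575      # MHz, assumed 8800 GTX stock core clock
STOCK_SHADER = 1350   # MHz, assumed 8800 GTX stock shader clock
SHADER_STEP = 54      # MHz, commonly quoted shader clock granularity

def linked_shader_clock(core_mhz):
    """Estimate the shader clock the driver links to a given core clock."""
    ratio = STOCK_SHADER / STOCK_CORE      # ~2.35 at the assumed stock clocks
    ideal = core_mhz * ratio               # unconstrained linked shader clock
    # Snap down to the last whole 54 MHz step. Whether the hardware rounds
    # up or down at a breakpoint depends on the BIOS/driver, which is one
    # way a table can say 1404 MHz while the monitor still reports 1350 MHz.
    return int(ideal // SHADER_STEP) * SHADER_STEP

for core in (575, 590, 597, 600, 612):
    print(f"{core} MHz core -> ~{linked_shader_clock(core)} MHz shader (estimated)")

Under these particular assumptions the breakpoint falls right around 598-600 MHz on the core, so a 597 MHz core would still report 1350 MHz on the shader domain; a slightly different ratio or rounding rule in the actual driver shifts that breakpoint, which could explain why the table and the RivaTuner reading disagree.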
__________________
System spec: Intel i7 2600K, Gigabyte GTX 570 Super Overclock, Kingston DDR3 2x4 GB, ASUS P8P67 PRO, Cooler Master 700 W PSU, 2x 750 GB SATA + 1 TB SATA III HDDs, Acer 24" widescreen, Antec 300, Windows 7 Ultimate 64-bit
  #2
September 15, 2007, 05:35 PM
bushwickbill
Top Prospect
 
Join Date: Mar 2007
Location: West Coast Canada
Posts: 175

Quote:
Originally Posted by bushwickbill (post #1 above)
Somebody must be able to help me.
__________________
System spec: Intel i7 2600K, Gigabyte GTX 570 Super Overclock, Kingston DDR3 2x4 GB, ASUS P8P67 PRO, Cooler Master 700 W PSU, 2x 750 GB SATA + 1 TB SATA III HDDs, Acer 24" widescreen, Antec 300, Windows 7 Ultimate 64-bit
  #3
September 18, 2007, 02:09 PM
MVP
 
Join Date: Aug 2007
Location: Windsor, Ontario, Canada
Posts: 305

Does it change when you run a 3D application? Run 3DMark and see if it changes during the benchmark.
__________________
i7 2600K Sandy Bridge @ 3.4 GHz / 2x 8800 GTX KO ACS3 PCI-E 630 MHz 768 MB 2.0 GHz in SLI / Raptor 74 GB / Antec Twelve Hundred gaming case / G.Skill Ripjaws DDR3 1600 CL 8-8-8-24, 8 GB / EVGA Z68 SLI mobo / Samsung DVD+RW / Corsair AX850 PSU / 24" Samsung SyncMaster

Cooling:
Swiftech Dual rad
Swiftech Single rad
Swiftech Micro Res
Swiftech Apogee GT Cpu block
Feser One Red Fluid