#2, September 15, 2007, 04:35 PM
bushwickbill
Top Prospect
 
Join Date: Mar 2007
Location: West Coast Canada
Posts: 175

Quote:
Originally Posted by bushwickbill View Post
I saw this page a while back and just remembered it when I was doing some GPU overclocking today. According to this table: http://www.overclock.net/hardware-ne...ng-tables.html
it says that when you increase your core clock you also increase your shader clock. So I was trying to find a good overclock while staying at the highest shader clock I could, for maximum performance in games (I hope that's how it works?). At 597 MHz core, my shader clock should have jumped to 1404 MHz, right? Well, according to RivaTuner's shader clock hardware monitoring plug-in, it reads 1350 MHz no matter what I do. Does anybody understand why the shader clock stays at 1350 MHz? The Crysis demo is less than two weeks away, and I need to get my rig tweaked some more.
Somebody must be able to help me.
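
For context, here is a rough Python sketch of how a linked core/shader clock table like the one above might behave, assuming the G8x shader domain moves in discrete 54 MHz steps. The threshold pairs below are made up for illustration; only the 597 → 1404 expectation and the observed 1350 reading come from the post itself.

Code:
# Rough sketch: a linked core/shader table that snaps the shader clock
# to a discrete step once the core clock crosses a threshold.
# All threshold pairs except 597 -> 1404 are hypothetical.

SHADER_STEP_MHZ = 54  # assumed G8x shader-domain granularity

# hypothetical (core threshold MHz, shader clock MHz) pairs
LINKED_TABLE = [
    (513, 1188),
    (540, 1242),
    (567, 1296),
    (580, 1350),
    (597, 1404),  # the table reportedly maps 597 MHz core to 1404 MHz shader
]

def expected_shader_clock(core_mhz: int) -> int:
    """Return the highest shader clock whose core threshold is at or
    below the requested core clock."""
    shader = LINKED_TABLE[0][1]
    for threshold, shader_clock in LINKED_TABLE:
        if core_mhz >= threshold:
            shader = shader_clock
    return shader

print(expected_shader_clock(597))      # 1404, per the table
print(1404 - 1350 == SHADER_STEP_MHZ)  # True: the 1350 reading is exactly one step lower

If RivaTuner keeps reporting 1350, the driver is likely snapping the shader domain to the step below the one the table predicts, which would explain a reading one full 54 MHz step short of the expected 1404.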
__________________
System spec: Intel i7 2600K, Gigabyte GTX 570 Super Overclock, Kingston DDR3 2×4 GB, ASUS P8P67 PRO, Cooler Master 700 W PSU, 2×750 GB SATA + 1 TB SATA III HDD, Acer 24" widescreen, Antec 300, Windows 7 Ultimate 64-bit