Hardware Canucks

Hardware Canucks (http://www.hardwarecanucks.com/forum/)
-   HardwareCanucks F@H Team (http://www.hardwarecanucks.com/forum/hardwarecanucks-f-h-team/)
-   -   Horrible PPD with XP Pro?? (http://www.hardwarecanucks.com/forum/hardwarecanucks-f-h-team/21202-horrible-ppd-xp-pro.html)

smwoodruff0908 July 25, 2009 12:21 PM

Horrible PPD with XP Pro??
 
I just switched my folding rig over from Vista Home Premium to XP Pro. I have two 9800GX2's, a Phenom II X4 940, and 3GB of RAM. I also changed some of the programs I use to fold. I now use VMware Server 1.0.9 instead of 2.0, and I'm running 2 instances of notfreds utilizing 2 cores per client, rather than running Ubuntu in the VMs. I'm also running the GPU CLI client instead of the version with the nVidia viewer. I have the same CUDA drivers installed as I did under Vista. I'm using WinAFC to spread the work of the clients across all 4 cores. My CPU is only at 1-3% with all the GPU clients running, and right about 100% with both VMs running.

I don't understand what's happening here. The last two days I was folding under Vista, I hit 25k each day. Now with XP I just hit 8,500 for today!! I really need some help here! Any and all advice is *greatly* appreciated!

Woody

Nademon July 25, 2009 01:06 PM

Is the PPD down for both the GPU and CPU clients? Have you adjusted the priority for the clients...is anything taking priority over them? Maybe make sure your GPU clients have higher priority than VMware.

I've had an ongoing issue with my 2 9800GX2's where one GPU usually underperforms by a huge gap, but sometimes it runs fine. I'm not running VM clients on that rig either. My understanding is that XP uses more resources for folding than Vista.

smwoodruff0908 July 25, 2009 01:15 PM

Quote:

Originally Posted by Nademon (Post 230363)
Is the PPD down for both the GPU and CPU clients? Have you adjusted the priority for the clients...is anything taking priority over them? Maybe make sure your GPU clients have higher priority than VMware.

I've had an ongoing issue with my 2 9800GX2's where one GPU usually underperforms by a huge gap, but sometimes it runs fine. I'm not running VM clients on that rig either. My understanding is that XP uses more resources for folding than Vista.

Well, it seems that my GPUs are getting horrible PPD, whereas my CPU clients are about where they were before. My GX2's used to get about 5k PPD per core, but now they're down to ~2k. My CPU clients are at 2,700 PPD each. With all the GPU clients up and running and nothing else, my CPU usage is only 1-3%. When I start up the CPU clients, usage goes to 100%. Also, I did give the GPU clients higher priority than the VMs.

Nademon July 25, 2009 01:18 PM

I see. Are you sure you're not getting stuck with the 511's? Also, did you downgrade the priority of vmware-vmx? I downgraded the priority of EVERY VMware process, but especially vmware-vmx. Any change in your setup that might be causing excessive temps and making the GPUs start throttling?

Oh ya...fresh install of XP...just to confirm.

smwoodruff0908 July 25, 2009 01:23 PM

Quote:

Originally Posted by Nademon (Post 230377)
I see. Are you sure you're not getting stuck with the 511's? Also, did you downgrade the priority of vmware-vmx? I downgraded the priority of EVERY VMware process, but especially vmware-vmx. Any change in your setup that might be causing excessive temps and making the GPUs start throttling?

Oh ya...fresh install of XP...just to confirm.

Yeah, it's a completely fresh install of XP Pro 64. I installed all the latest drivers for everything I could find, and have run all updates for XP.

I have gotten a 511, but I've also gotten many other work units as well. One time under Vista, when I happened to notice all clients running 511's, I was still getting about 4k PPD per GPU. The VMs have lower priority than the GPU clients, if that's what you mean. There really shouldn't be any temp issues; I think the highest they've gotten is ~80°C. I have an Ultra Kaze fan right in front of the cards blowing cold air into them, and another Ultra Kaze removing the hot air from the case.

Nademon July 25, 2009 01:26 PM

Ok...seems pretty straight up so far. But did you make sure the process Vmware-vmx in particular is downgraded in priority? I think that's the one that sucks up the cycles.

smwoodruff0908 July 25, 2009 01:30 PM

Quote:

Originally Posted by Nademon (Post 230386)
Ok...seems pretty straight up so far. But did you make sure the process Vmware-vmx in particular is downgraded in priority? I think that's the one that sucks up the cycles.

Yes, in WinAFC I set vmware-vmx to have a lower priority than the GPU clients. I also assigned each VM to 2 different cores of the CPU: one VM is on CPU0 and CPU1, and the other is on CPU2 and CPU3. I also split up the GPU clients to spread out the workload: two GPU clients on CPU0 and CPU1, and the other two on CPU2 and CPU3. Again, with only the GPU clients up and running, my total CPU usage is only around 1-3%.
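[Editor's note: the per-core assignments described above correspond to Windows processor affinity bitmasks, which is the representation tools like WinAFC and Task Manager ultimately use (bit N set = process may run on CPU N). A minimal sketch of the arithmetic — the `affinity_mask` helper is illustrative, not part of WinAFC:]

```python
def affinity_mask(cores):
    """Build a Windows-style affinity bitmask from a list of core indices."""
    mask = 0
    for core in cores:
        mask |= 1 << core  # bit N set = allowed to run on CPU N
    return mask

# One VM (plus two GPU clients) pinned to CPU0+CPU1,
# the other VM (plus the other two GPU clients) to CPU2+CPU3.
vm1_mask = affinity_mask([0, 1])   # 0b0011 = 3
vm2_mask = affinity_mask([2, 3])   # 0b1100 = 12

# The two masks cover all four cores without overlapping:
assert vm1_mask | vm2_mask == affinity_mask([0, 1, 2, 3])  # 0b1111 = 15
assert vm1_mask & vm2_mask == 0
```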

Nademon July 25, 2009 01:33 PM

OK try this...Kill one of the Vmclients and see if your GPU PPD goes back up to normal. I have a feeling it might be that you're on XP now and it uses more CPU. I would kill all the folding first, reboot, then start up the 4 GPU clients and only 1 Vmclient. you should know within about 15-20 mins what kind of PPD your GPU's are putting up.
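[Editor's note: for a quick check like this, PPD can be estimated from the observed time per frame (TPF) once a client has run a few frames, rather than waiting for full work units to complete. A rough sketch of the arithmetic — the `estimate_ppd` helper and the sample numbers are illustrative, not output from any client; standard work units report 100 frames:]

```python
def estimate_ppd(credit, tpf_seconds, frames=100):
    """Estimate points per day from a work unit's point value (credit)
    and the observed time per frame in seconds."""
    seconds_per_wu = tpf_seconds * frames
    wus_per_day = 86400 / seconds_per_wu  # 86400 seconds in a day
    return credit * wus_per_day

# Hypothetical example: a 511-point WU completing a frame every 45 s.
print(round(estimate_ppd(511, 45)))  # 9811
```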

geokilla July 25, 2009 02:17 PM

Have you tried checking your GPU clocks? Also just verify that your folding clients aren't trying to fold off the same GPU.

smwoodruff0908 July 25, 2009 02:22 PM

Quote:

Originally Posted by geokilla (Post 230430)
Have you tried checking your GPU clocks? Also just verify that your folding clients aren't trying to fold off the same GPU.

Well, GPU-Z shows my clocks just as I set them in NVIDIA's control panel. I also launch the clients using shortcuts on my desktop; each one has a different "-gpu x -local" flag, so I'm guessing they're not trying to use the same GPU? Is there a better way to tell how much load is on each GPU?
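[Editor's note: for a two-GX2 (four-GPU) setup, the shortcut targets typically look like the fragment below — one per GPU index, each launched from its own working directory so the clients don't share data files. The executable name and paths here are illustrative; `-gpu` and `-local` are the flags mentioned above:]

```shell
# One shortcut / working directory per GPU core:
# -gpu selects the device index, -local keeps each client's
# data in its own directory so the four clients don't collide.
cd C:\FAH\gpu0 && Folding@home-gpu.exe -gpu 0 -local
cd C:\FAH\gpu1 && Folding@home-gpu.exe -gpu 1 -local
cd C:\FAH\gpu2 && Folding@home-gpu.exe -gpu 2 -local
cd C:\FAH\gpu3 && Folding@home-gpu.exe -gpu 3 -local
```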


All times are GMT -7.