#21
Old January 25, 2012, 06:44 PM
lowfat's Avatar
Moderator
 
Join Date: Feb 2007
Location: Grande Prairie, AB
Posts: 7,950


If anyone is having issues setting this up, let us know. I'm a Linux/SSH n00b and I was able to get them all working. Added some serious PPD too.
#22
Old January 25, 2012, 06:51 PM
3.0charlie's Avatar
3.0 "I kill SR2's" Charlie
F@H
 
Join Date: May 2007
Location: Laval, QC
Posts: 9,627


Quote:
Originally Posted by SwingOP3
I feel like I'm wading in way over my head.

BUT

A quick search brought up a guide on OCN: Quick and Dirty HPCS F@H Setup Guide
This worked like a charm. Except... once the CTRL+A+D command has been given, how can I get back to the fah screen?
__________________
Hydro-Quebec is salivating...
#23
Old January 25, 2012, 07:06 PM
lowfat's Avatar
Moderator
 
Join Date: Feb 2007
Location: Grande Prairie, AB
Posts: 7,950


Quote:
Originally Posted by 3.0charlie
This worked like a charm. Except... once the CTRL+A+D command has been given, how can I get back to the fah screen?
As in to watch progress? No idea. I just closed all the windows and used HFM to monitor.
#24
Old January 25, 2012, 07:11 PM
Dead Things's Avatar
Hall Of Fame
F@H
 
Join Date: Oct 2008
Location: Centre of the Universe
Posts: 1,572


You'll be able to see it again by doing...

Code:
cd fah
tail -f FAHlog.txt
To quit tailing fah, use Ctrl-C. Note this will NOT stop the client, just your viewing of it. You can verify the client is still running with top.

...which brings up another issue: how to stop the client, since Ctrl-C'ing a tail just kills the tail and not the process. To stop the client, run top and note the PID of the client in the far-left column. Ctrl-C to quit top, then do...

Code:
kill XXXX
...where XXXX is the PID of the client.
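An alternative to picking the PID out of top is pgrep. A minimal sketch, using a sleep as a stand-in for the fah client so it runs anywhere (with the real client you'd match its binary name instead, e.g. pgrep -f fah):

```shell
# A background sleep stands in for the folding client in this sketch
sleep 300 &

# pgrep -f matches against the full command line; -n picks the newest match
PID=$(pgrep -n -f "sleep 300")

# kill sends SIGTERM by default, asking the process to exit cleanly
kill "$PID"
wait "$PID" 2>/dev/null || true
echo "stopped $PID"
```

Same idea as top + kill, just without the interactive step.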

Enjoy boys!

edit - Sorry!

Just noticed that guide uses screen instead of nohup, which is a very clumsy way of doing it if you ask me. But oh well. So, to re-attach a screen, do...

Code:
screen -list
And note whichever instance says "Detached" (that's the one you backgrounded with Ctrl+A+D). Then do...

Code:
screen -r XXXX.hostname
...where XXXX.hostname is the detached screen instance. This method works, but it's cumbersome to use and adds overhead compared to a simple nohup.
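For comparison, here's what the nohup route looks like. A minimal sketch, with a stand-in worker script instead of the real fah binary (whose name and flags depend on your install):

```shell
# Stand-in worker so the sketch runs anywhere; with the real client you
# would launch its binary the same way
cat > worker.sh <<'EOF'
#!/bin/sh
while :; do echo "working"; sleep 1; done
EOF
chmod +x worker.sh

# nohup detaches the process from the terminal, so it survives the SSH
# session closing; output is redirected to a log file
nohup ./worker.sh > fah.log 2>&1 &
echo $! > fah.pid

sleep 2                   # give it a moment to write something
tail -n 1 fah.log         # check progress (tail -f to follow live)
kill "$(cat fah.pid)"     # stop the client when done
```

No screen session to track down later: you just tail the log and kill the saved PID.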
__________________
Follow my folding, mining & benching shenanigans @dt_oc!

Think you can overclock? Then show us what you got!
Join the Hardware Canucks Overclocking team today!

Last edited by Dead Things; January 25, 2012 at 07:18 PM.
#25
Old January 25, 2012, 07:54 PM
lowfat's Avatar
Moderator
 
Join Date: Feb 2007
Location: Grande Prairie, AB
Posts: 7,950


I think you'd better revise your post on the first page about how much PPD this will generate. I ran 4 instances of standard.2xlarge + Ubuntu Maverick 10.10. Seems closer to about 200,000 PPD with the WUs that I've pulled.
#26
Old January 25, 2012, 07:56 PM
Dead Things's Avatar
Hall Of Fame
F@H
 
Join Date: Oct 2008
Location: Centre of the Universe
Posts: 1,572


In light of the 20-core ceiling, may I ask how?
__________________
Follow my folding, mining & benching shenanigans @dt_oc!

Think you can overclock? Then show us what you got!
Join the Hardware Canucks Overclocking team today!
#27
Old January 25, 2012, 08:01 PM
lowfat's Avatar
Moderator
 
Join Date: Feb 2007
Location: Grande Prairie, AB
Posts: 7,950


Quote:
Originally Posted by Dead Things
In light of the 20 core ceiling, may I ask how?
Each instance of standard.2xlarge is able to do bigadv. If they pull a P6903, that's around 60,000 PPD or so each.

2 instances per cluster (AZ1 and AZ2). Three of the instances pulled a P6903, the other just a P6901.
#28
Old January 25, 2012, 08:32 PM
SwingOP3's Avatar
Top Prospect
F@H
 
Join Date: Dec 2009
Location: Regina
Posts: 122

I actually used HP Cloud Services SMP client setup from the EVGA Forums to stay in Windows.

I've set up the first one but I'm not sure it's right... wait and see, I guess.
#29
Old January 25, 2012, 08:58 PM
dandelioneater's Avatar
Hall Of Fame
F@H
 
Join Date: Dec 2010
Location: Kelowna, BC
Posts: 1,114


Just got my 3 clients set up. Very cool. I wonder what the monthly charge will be once the beta is over, and whether it would be more cost-effective to fold via the cloud than with your own machine, i.e. the monthly cloud charge vs. the cost of building a rig and paying the power bill to run it.
#30
Old January 25, 2012, 09:32 PM
Tim_H's Avatar
Top Prospect
 
Join Date: May 2011
Location: Regina, SK
Posts: 216

up and running!!!