Adamis-cluster
Note on AnalysisBackend installation
Software
- Add pyslalib to $PYTHONPATH:
export PYTHONPATH=/usr/local/lib/python2.6/dist-packages/pyslalib:$PYTHONPATH
- Download AnalysisBackend and add its parent directory (the path minus the AnalysisBackend leaf) to $PYTHONPATH, as sketched below:
git clone ssh://bolowiki.berkeley.edu/pbrepo/AnalysisBackend
export ABPATH=path_to_analysisbackend
export PATH=$PATH:$ABPATH/indexing:$ABPATH/misc:$ABPATH/unpack:$ABPATH/calibration:$ABPATH/starcamera
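The exports above only extend $PATH; for the $PYTHONPATH part of this step, the job_header further down puts the directory containing the checkout (not AnalysisBackend itself) on $PYTHONPATH. A minimal sketch, assuming the clone lives under $HOME/Soft:
# hypothetical layout: AnalysisBackend cloned into $HOME/Soft
export PYTHONPATH=$HOME/Soft:$PYTHONPATH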
- Download PyPolarbear and add its path to $PYTHONPATH (see the sketch below):
git clone ssh://bolowiki.berkeley.edu/pbrepo/PyPolarbear
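Here the checkout directory itself goes on $PYTHONPATH (the job_header below lists /home/lejeune/Soft/PyPolarbear directly). A sketch, again assuming the clone lives under $HOME/Soft:
# hypothetical layout: PyPolarbear cloned into $HOME/Soft
export PYTHONPATH=$HOME/Soft/PyPolarbear:$PYTHONPATH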
- Change PyPolarbear/Mapping/DeviceManager.py line 17: remove the relative path.
Data
- Remove the first line ('Fetching archive table...') of /data/polarbear/saturn620/table before unpacking.
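One way to drop that line in place, assuming GNU sed is available on the cluster:
# delete line 1 ('Fetching archive table...') of the table file
sed -i '1d' /data/polarbear/saturn620/table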
- Data has been downloaded to /data/polarbear, which belongs to the polarbear group.
Pipeline
Browsing pipelines
- pipeweb is running on port 8080. The pointing pipeline shortname is pointing.
- To browse the pipeline at http://localhost:8080 through ssh tunneling:
ssh -N -f username@134.158.189.3 -L8080:134.158.189.3:8080
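To check that the tunnel is alive before browsing, assuming curl is available on the local machine:
# expect an HTTP status line from pipeweb through the tunnel
curl -sI http://localhost:8080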
- The pointing pipeline is stored in /data/polarbear/pointing. The SQL database is stored in /home/lejeune/work/polarbear/.sqlstatus.
Running pipelines
- Pipelet jobs should be run from the master node via the job submission system (qsub).
- The HOST environment variable has to be set:
export HOST=134.158.189.3
- Update the job_header variable in main.py according to your .bashrc environment:
job_header = """ #!/bin/bash export PYTHONPATH=/usr/local/lib/python2.6/dist-packages/pyslalib/:/home/lejeune/Soft/PyPolarbear:/home/lejeune/Soft/:$PYTHONPATH PATH=/home/lejeune/bin:$PATH export ABPATH=$HOME/Soft/AnalysisBackend export PATH=$PATH:$ABPATH/indexing:$ABPATH/misc:$ABPATH/unpack:$ABPATH/calibration:$ABPATH/starcamera echo $PYTHONPATH """
- Run the main.py script:
cd AnalysisBackend/pipelet
python2.6 main.py -b 4
to use 4 workers.
- Or run in interactive mode using
qsub -I
and then run main.py -d
__(no process mode allowed)__.
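For reference, the interactive sequence end to end (same commands as above):
qsub -I
# once on the interactive node, assuming AnalysisBackend sits in the current directory:
cd AnalysisBackend/pipelet
python2.6 main.py -d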
Troubleshooting
glob_seg is not able to find an existing file
This may come from a bug in the /data filesystem. Try using another node by adding the following line at the beginning of the job_header variable:
#PBS -l nodes=node5.local:ppn=1,walltime=24:00:00
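One reasonable placement is right after the shebang, before any command, since PBS only reads directives that precede the first executable line; the modified header would then start like:
job_header = """#!/bin/bash
#PBS -l nodes=node5.local:ppn=1,walltime=24:00:00
# ... rest of the exports as above ...
"""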
import multiarray error
Empty $PYTHONPATH.
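A sketch of doing this in the shell that launches the job:
# clear PYTHONPATH in the current shell
unset PYTHONPATH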