How to run analysis pipelines at Bologna (CNAF) computer center

Introduction

The Bologna computer center runs the LSF batch system and is part of the EGEE Grid. Using the concept of so-called pilot jobs, a virtual Condor metacluster is maintained which is accessible from the User Interface machines (see below), so CBC Condor pipelines can be submitted to it. Follow the procedure below and, if you have questions, send an e-mail to:

Gergely.Debreczeni@cern.ch

The procedure

Perform the following steps:
  1. Obtain an account at the CNAF computer center. Instructions can be found here.
  2. Log in to one of the following machines:
         ui01-virgo.cr.cnaf.infn.it
         ui02-virgo.cr.cnaf.infn.it
        
    These are the User Interface machines, which share the following directories with the Worker Nodes:
        /storage/gpfs_virgo3/home
        /storage/gpfs_virgo3/scratch
        /storage/gpfs_virgo3/virgo
      
    If possible, use the scratch area, since it is world-readable.
  3. Change to one of the shared directories.
        (cmd.: cd /storage/gpfs_virgo3/home/gdebrecz)
        
  4. IMPORTANT: Set the file creation mask to 0002, i.e. files you create from now on will be group-writable.
       (cmd.: umask 0002)
        
  5. Create a new directory, and enter into it.
        (cmd.: mkdir mywd; cd mywd)
        
  6. Source the Virgo environment setup file:
       (cmd.:  . /opt/exp_software/virgo/lscsoft/etc/virgoenv.sh)
        
  7. Launch a Condor test job and check whether it works:
        (cmd.:
        cp /storage/gpfs_virgo3/home/gdebrecz/condortest/* .
        condor_submit vog-testjob.des
        )
        
  8. If the test jobs finish OK, then from this point on you can do everything the same way as on other clusters, except that you have to copy the
        /opt/exp_software/virgo/lscsoft/etc/LSCdataFind
         
    executable and overwrite the one that comes with the standard LSCsoft installation!
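
    The submit description file vog-testjob.des is copied from the test directory in step 7. A minimal Condor submit description for such a smoke test typically looks like the following (the contents below are illustrative, not the actual vog-testjob.des):

    ```
    # Hypothetical minimal Condor submit description for a smoke test.
    universe   = vanilla
    executable = /bin/hostname
    output     = test.$(Cluster).out
    error      = test.$(Cluster).err
    log        = test.$(Cluster).log
    queue
    ```

    A job like this simply reports which Worker Node it ran on, which is enough to confirm that the virtual Condor metacluster is accepting and running jobs.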

BE AWARE that all your files and software _should be installed on the shared areas_ and should be group-readable/writable, so that your jobs on the Worker Nodes - which run as a different user - can access them!
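
The effect of step 4 can be checked directly: with umask 0002 in place, newly created files come out group-writable, which is exactly what the Worker Node jobs need. A quick sketch (file name is illustrative):

```shell
# With umask 0002, the group write bit survives file creation:
# 0666 & ~0002 = 0664, i.e. -rw-rw-r--
umask 0002
touch demo_file
ls -l demo_file
```

If the listing does not show group write permission, re-check that umask was set in the same shell before the files were created.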

Useful remarks

  1. You can find a test installation as an example under
        /opt/exp_software/virgo/lscsoft/
        
    for the S5 GRB runs, and under /storage/gpfs_virgo3/home/gdebrecz/lscsoft/ and /storage/gpfs_virgo3/home/gdebrecz/lscsoft/non-lsc for the s6_20090722 branch. Have a look at
        /opt/exp_software/virgo/lscsoft/etc/s5grbenv.sh
         
    to see how the various files are sourced.
  2. The system python installation should not be used; use the one installed in the software area above. This happens automatically if you source the
       /opt/exp_software/virgo/lscsoft/etc/virgoenv.sh
      
    file.
  3. An automated software installation script is available in the RMKI Virgo Group SVN repository. You can check it out with the following command:
        svn co http://grid.kfki.hu/svn/virgo/scripts
       
    See the description inside on how to use it.
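
Remark 2 works because the environment file prepends the software area's bin directory to PATH, so `python` resolves there before the system installation. A self-contained sketch of that mechanism (the /tmp path and stub script are purely illustrative, not the real layout of the Virgo software area):

```shell
# Create a stand-in "software area" with its own python wrapper.
mkdir -p /tmp/virgo_env_demo/bin
printf '#!/bin/sh\necho software-area python\n' > /tmp/virgo_env_demo/bin/python
chmod +x /tmp/virgo_env_demo/bin/python

# This line mimics what an environment file like virgoenv.sh does:
export PATH=/tmp/virgo_env_demo/bin:$PATH

# The shell now resolves "python" to the software-area copy first.
command -v python
```

On the real User Interface machines, `command -v python` after sourcing virgoenv.sh is a quick way to confirm the software-area interpreter is being picked up.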


The RMKI Budapest Virgo Group

Topic revision: r4 - 2009-10-02 - GergelyDebreczeni