Installation

General Presentation

The ECLAIRs instrument pipeline, named ECPI, is distributed as the python package eclairs-gp and is available in the svom project repository at http://drf-gitlab.cea.fr/svom/eclairs. The package ships with an installation script and a module for managing the user space, which allow the pipeline code to be imported as modules of the eclairs-gp package, whose main module is ecpi.

To work cleanly with the pipeline code, eclairs-gp must be installed in a python virtual environment (which will contain ECPI_ROOT), and workspaces (ECPI_HOME) must be created in the user space tree. One advantage of these workspaces is that users can create as many virtual environments and workspaces as they want, because the local configuration of each space is stored in its own .ect/ directory.
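
In practice, a workspace is simply a directory carrying its own local configuration. The listing below is a purely illustrative sketch (the directory names are placeholders) showing two independent workspaces, each with its own .ect/ directory:

$ ls -a ~/ecpi_workspaces/space_A ~/ecpi_workspaces/space_B
~/ecpi_workspaces/space_A:
.  ..  .ect
~/ecpi_workspaces/space_B:
.  ..  .ect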

For project developers, the ecpi_checkout script clones the sub-trees of the pipeline component code and configures the package installed under ECPI_ROOT to read the development version instead of the installed version. To use this feature, the scripts import the GitPython module, which relies on the git software present in your OS. Your git must therefore support the --sparse feature, available in git version >= 2.33. If your git version does not support git --sparse or git sparse-checkout, you cannot use ecpi_checkout.

The modules currently available to clone with ecpi_checkout are the six components dpco, cali, bube, mosa, spex and imag, plus the simulation module simu. Other modules or components can be defined if necessary. Cloning a pipeline subtree relies on a native, stable and well-documented git feature: git sparse-checkout. The usual git version-control workflow remains unchanged.
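
The ecpi_checkout script drives this for you, but for reference here is what a generic sparse checkout looks like with plain git; the repository URL is the one given above, while the component directory names used here (cali, imag) are only an assumption about the repository layout:

$ git --version                      # check that your git supports sparse checkout
$ git clone --sparse http://drf-gitlab.cea.fr/svom/eclairs.git
$ cd eclairs
$ git sparse-checkout set cali imag  # check out only these sub-trees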

If you are installing eclairs-gp as a developer, you must provide your username and an access token with the write_repository scope on http://drf-gitlab.cea.fr. This information will be requested by the ecpi_conf_tool script when creating your workspace (ECPI_HOME). Read the documentation on creating access tokens at https://drf-gitlab.cea.fr (link to be added) for more information.

Local Installation

To install the pipeline locally, follow these three steps:

  1. Install the pipeline.

  2. Define an input dataset.

  3. Create a parameter file.

All the commands presented on this page are executed in a bash shell environment, for example:

$ conda list

Install the pipeline

With the default calibration files

To do a local installation of the pipeline, we need to create a virtual environment with python version 3.8 (the official version of the svom project) and perform a remote installation using a (public) deployment token of the pipeline, which gives read-only access.

In this example, we assume that the python distribution is based on conda (miniconda, anaconda, etc.) and we will create the virtual environment for the pipeline. Follow these steps:

  1. Create a python virtual environment (in this case with conda); there is no constraint on the environment name. In this example we will name the environment ecpi_test and use python 3.8:

    $ conda create -n ecpi_test python=3.8
    
  2. Activate the conda environment

    $ conda activate ecpi_test
    
  3. Install the pipeline from the drf-gitlab.cea.fr repository, using the PyPI registry API access token TCssPbB3kBJjgdJEGa23 (all the necessary dependencies will be installed automatically). The command is as follows:

    $ pip install eclairs-gp -i https://__token__:TCssPbB3kBJjgdJEGa23@drf-gitlab.cea.fr/api/v4/groups/158/-/packages/pypi/simple
    
  4. To complete the installation, retrieve the calibration files with the caldb_init script, using the -s flag to get the default files.

    $ caldb_init -s
    

Done! The pipeline is installed and the default calibration files are in place.
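
As an optional sanity check, you can verify that the package is visible from the new environment and that its main module imports correctly (the module name ecpi is the one given in the general presentation above):

$ pip show eclairs-gp
$ python -c "import ecpi; print(ecpi.__file__)"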

With the calibration files obtained from the server

In order to use the calibration files from the CalDB server, the procedure to follow is the same as before up to step 4.

  1. From there, it is imperative to declare the environment variables KC_USERNAME (your fsc.svom.org identifier), KC_PASSWORD (your fsc.svom.org password) and KC_CLIENT_ID (the type of client, in your case FSC_PUBLIC). A note on keeping the password out of your shell history follows this list.

    $ export KC_USERNAME=username
    $ export KC_PASSWORD=password
    $ export KC_CLIENT_ID=FSC_PUBLIC
    
  2. Check your environment variables with

    $ env | grep KC
    KC_USERNAME=************
    KC_PASSWORD=********
    KC_CLIENT_ID=FSC_PUBLIC
    
  3. Finally, call the caldb_init script (without any flag):

    $ caldb_init
    

Done! The calibration files are downloaded from the CalDB server.
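
As mentioned in step 1, if you prefer not to leave your password in plain text in your shell history, a standard bash alternative is to read it interactively before exporting it:

$ read -s -p "fsc.svom.org password: " KC_PASSWORD && export KC_PASSWORD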

Definition of the input dataset

In this part, we will define input files that are necessary for the execution of the pipeline. Here are the steps:

  1. We assume that you have a triplet of input files consisting of an attitude file SVO-ATT-CNV*.fits, an orbit file SVO-ORB-CNV*.fits and an event file ECL-EVT-SEC*.fits consistent with each other.

    $ ls
    ECL-EVT-SEC_tuto.fits  SVO-ATT-CNV_tuto.fits  SVO-ORB-CNV_tuto.fits
    
  2. Create an input directory for the pipeline

    $ mkdir input

  3. Move the files to this directory

    $ mv ECL-EVT-SEC* SVO-ATT-CNV* SVO-ORB-CNV* input

  4. Create an output directory

    $ mkdir output

  5. Create an empty parameter file *.ini. For this example, we will call it parameters_file.ini. We will see in the next section how to fill this file.

  6. At this stage, you should have an input directory containing the 3 files SVO-ATT-CNV_*.fits, SVO-ORB-CNV_*.fits and ECL-EVT-SEC_*.fits, an output directory, as well as a parameter file.

    $ ls -R
    .:
    parameters_file.ini input  output
    
    ./input:
    ECL-EVT-SEC_tuto.fits  SVO-ATT-CNV_tuto.fits  SVO-ORB-CNV_tuto.fits
    
    ./output:
    

Done! Your input directory is created and filled with the input files, the output directory is created, and your parameter file exists. Now, you have to fill this file in with the correct information.
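
For reference, the steps above can be condensed into a few shell commands, run from the directory containing the three FITS files:

$ mkdir -p input output
$ mv ECL-EVT-SEC* SVO-ATT-CNV* SVO-ORB-CNV* input
$ touch parameters_file.ini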

Filling the parameter file

In order to launch the pipeline, you need a parameter file containing several specific pieces of information.

Below, you will find the structure of such a file:

[general]
# Information level for the logs during the execution of the pipeline.
# Available values: debug, info, warning, error, critical.
# Default: warning.
logger_level = info
# MANDATORY.
# Path to a directory containing a triplet of consistent input files for analysis.
working_in = <path/to/directory/input>
# MANDATORY.
# Path to an existing directory in which the products created by the pipeline will be deposited.
working_out = <path/to/directory/output>
# Path to the directory containing the calibration files.
# @local to use the local caldb, or specify a path to another valid directory.
caldb_dir = @local

[dpco]
# MANDATORY.
# List containing the desired products for the DPCO component.
# No possible products at the time.
out_files = []

[cali]
# MANDATORY.
# List containing the desired products for the CALI component.
# Possible products: 'ECL-GTI-CAL', 'ECL-EVT-CAL'.
out_files = []

[bube]
# List of energy bands desired for treatment.
# Possible values: any value (in channel number) within the energy range defined in the EBOUND column of the RMF file.
# By default, 5 bands are processed [[13,60],[61,140],[141,300],[301,620],[621,1260]].
energy_channels = []
# Type of Earth occultation to be taken into account for the GTI combinations; a version of each product will be created for each type of occultation.
# Possible values: 'NEO', 'PEO', 'TEO'.
# Default: 'PEO'.
ear_occ_bub = []
# Type of correction for each occultation defined above.
# Possible values: 'no_correction', 'flat', 'shape', 'flat_moretti', 'shape_moretti', 'shape_moretti_earth', 'shape_moretti_earth_ref', 'shape_moretti_smart'.
# Default: 'shape_moretti_earth'.
bkg_cor_mod = []
# MANDATORY.
# List containing the desired products for the BUBE component.
# Possible products: 'ECL-DET-IMA', 'ECL-DET-UBC', 'ECL-GTI-UBC', 'ECL-EAR-OFM'.
out_files = []

[imag]
# SNR threshold for source detection.
# Minimum value: 7.
# Default: 0.
snr_ths =
# Half window width for maximum fit with SPFS.
# Minimum value: 1.0.
# Default: 5.0.
win_dim =
# irf correction.
# Possible values: no_correction, cos_theta, irf.
# Default: cos_theta.
irf_corr =
# MANDATORY
# List containing the desired products for the IMAG component.
# Possible products: 'ECL-SKY-IMA', 'ECL-SOP-IMA'.
out_files = []
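
As a concrete illustration, here is a minimal filled-in example. It assumes that the input and output directories created above sit next to parameters_file.ini (if relative paths are not accepted by the pipeline, use absolute paths instead), that the local caldb is used, and that you want a few of the products listed in the comments; the bracketed list syntax simply follows the template above, and every value should be adapted to your own needs:

[general]
logger_level = info
working_in = ./input
working_out = ./output
caldb_dir = @local

[dpco]
out_files = []

[cali]
out_files = ['ECL-GTI-CAL', 'ECL-EVT-CAL']

[bube]
energy_channels = []
ear_occ_bub = []
bkg_cor_mod = []
out_files = ['ECL-DET-UBC', 'ECL-GTI-UBC']

[imag]
snr_ths = 7
win_dim = 5.0
irf_corr = cos_theta
out_files = ['ECL-SKY-IMA']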

Done! You can now run the pipeline with the following command:

$ ecpi_run parameters_file.ini
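
Once the run finishes, the products requested through the out_files entries should appear in the directory declared as working_out:

$ ls output/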

To conclude:

  1. You have installed a local version of the eclairs-gp pipeline.

  2. You have downloaded the calibration files (default versions or directly from the CalDB server).

  3. You have prepared the input data, in particular by creating the input and output directories.

  4. You have built a parameter file that can be used by the pipeline.

With the launch command, you now have all the information you need to run the eclairs-gp pipeline.