This repository contains scripts for analyzing VERITAS data and MC simulations.
These scripts are part of the Eventdisplay package ecosystem and additionally require the following items to be installed:
- binaries and libraries from the Eventdisplay package.
- Eventdisplay analysis files (configuration files, calibration files, and instrument response functions) from Eventdisplay_AnalysisFiles_VTS.
The scripts are optimized for the DESY computing environment, utilizing the HTCondor batch system and Apptainer containers.
Set the following environment variables:
- `$EVNDISPSYS`: path to the Eventdisplay installation (Eventdisplay package).
- `$EVNDISPSCRIPT`: path to the `./scripts` directory of this repository.
- `$VERITAS_ANALYSIS_TYPE` (recommended): indicates the reconstruction methods applied, e.g., `AP_DISP`, `NN_DISP`.
Additional environment variables, especially useful for batch systems, can be found in ./scripts/set_environment.sh.
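A minimal environment setup might look like the following sketch. The installation paths are assumptions; adjust them to your local system:

```shell
# Hypothetical installation paths -- adjust to your local setup.
export EVNDISPSYS="$HOME/software/Eventdisplay"
export EVNDISPSCRIPT="$HOME/software/Eventdisplay_AnalysisScripts_VTS/scripts"
# Reconstruction method (recommended): AP_DISP or NN_DISP.
export VERITAS_ANALYSIS_TYPE="AP_DISP"

echo "Analysis type: $VERITAS_ANALYSIS_TYPE"
```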
Submission commands for various batch systems are available in submissionCommands.dat. Modify these commands according to your local requirements.
Scripts for downloading run-wise information from the VERITAS database are available in ./scripts/db_scripts/README.md.
IRFs are provided for each Eventdisplay release. The following instructions are for the IRF processing team.
VERITAS IRFs are divided into epochs (e.g., summer/winter, throughput epochs, instrument stages like V4, V5, V6). Epochs are defined in ParameterFiles/VERITAS.Epochs.runparameter and should align with calibration throughput corrections (see internal VERITAS wiki page).
Throughput corrections are defined in ParameterFiles/ThroughputCorrection.runparameter.
Analysis scripts require a list of all V6 summer and winter periods, which are listed in IRF_EPOCHS_WINTER.dat and IRF_EPOCHS_SUMMER.dat. UV Filter IRF periods are defined in IRF_EPOCHS_obsfilter.dat. No changes to the analysis scripts are required, except for updating the help message (list of epochs) in ./IRF.production.sh.
Adding a new epoch usually requires re-running the mscw data analysis steps with updated lookup tables and dispBDTs, as these IRFs have changed. This step also updates the IRF flag in the mscw files.
This stage requires the most computing resources and usually takes several days. MC simulation files are required in the directory structure outlined in ./scripts/IRF.production.sh.
Run the following steps for all analysis types (`AP`, `NN`):
./IRF.generalproduction.sh CARE_RedHV_Feb2024 EVNDISP
./IRF.generalproduction.sh CARE_24_20 EVNDISP
Results are stored in `$VERITAS_IRFPRODUCTION_DIR/v491/AP/CARE_24_20/V6_2022_2023w_ATM61_gamma/`. For DESY productions, the evndisp files should be moved to `$VERITAS_IRFPRODUCTION_DIR/v4N/AP/CARE_24_20/V6_2022_2023w_ATM61_gamma/`.
Fill lookup tables per bin:
./IRF.generalproduction.sh CARE_24_20 MAKETABLES
Then combine the tables with:
./IRF.generalproduction.sh CARE_24_20 COMBINETABLES
Move the tables from `$VERITAS_IRFPRODUCTION_DIR/v491/${VERITAS_ANALYSIS_TYPE:0:2}/Tables` to `$VERITAS_EVNDISP_AUX_DIR/Tables`.
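The `${VERITAS_ANALYSIS_TYPE:0:2}` expansion keeps only the first two characters of the analysis type, so the table directory is named after the reconstruction method. A sketch of the move (the commented `mv` line is an assumption about the directory layout described above):

```shell
# Bash substring expansion: ${var:offset:length}
# e.g. AP_DISP -> AP, NN_DISP -> NN.
VERITAS_ANALYSIS_TYPE="AP_DISP"
ANALYSIS="${VERITAS_ANALYSIS_TYPE:0:2}"
echo "$ANALYSIS"   # prints AP

# Hypothetical move of the combined tables (paths as described above):
# mv "$VERITAS_IRFPRODUCTION_DIR/v491/$ANALYSIS/Tables/"*.root \
#    "$VERITAS_EVNDISP_AUX_DIR/Tables/"
```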
./IRF.generalproduction.sh CARE_24_20 TRAINMVANGRES
Copy and zip the files to `$VERITAS_EVNDISP_AUX_DIR/DispBDTs`:

cd $VERITAS_EVNDISP_AUX_DIR/DispBDTs
./copy_DispBDT.sh

(take care of any errors printed to the screen)
Generate background training events using:
./IRF.selectRunsForGammaHadronSeparationTraining.sh <major epoch> <source mscw directory> <target mscw directory> <TMVA run parameter file (full path)>
Use e.g. `$VERITAS_DATA_DIR/processed_data_v491/AP/mscw/` as the source directory, which contains processed mscw files from observations. The main purpose is to select runs with good data quality and runs obtained from observations of strong gamma-ray sources.
This script links mscw files from the archive to a target directory sorted by epoch and zenith bins (read from TMVA run parameter file).
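The resulting layout can be sketched as follows. All run numbers, epoch names, and bin names below are made up for illustration; the real script derives the bins from the TMVA run parameter file:

```shell
# Illustrative sketch of the epoch/zenith-bin layout of linked mscw files.
SRC=$(mktemp -d)     # stands in for the archive of processed mscw files
TARGET=$(mktemp -d)  # target directory for BDT training input

touch "$SRC/64080.mscw.root"   # hypothetical run file

EPOCH="V6_2022_2023w"          # hypothetical epoch name
ZEBIN="Ze_00_35"               # hypothetical zenith bin
mkdir -p "$TARGET/$EPOCH/$ZEBIN"
# Link (rather than copy) the file from the archive into the sorted tree.
ln -s "$SRC/64080.mscw.root" "$TARGET/$EPOCH/$ZEBIN/64080.mscw.root"

ls "$TARGET/$EPOCH/$ZEBIN"   # prints 64080.mscw.root
```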
(only for regular HV)

- Use `TRAINTMVA` in `./IRF.generalproduction.sh`, which calls the script `IRF.trainTMVAforGammaHadronSeparation.sh`.
- Copy TMVA BDT files to `$VERITAS_EVNDISP_AUX_DIR/GammaHadronBDTs` using `$VERITAS_EVNDISP_AUX_DIR/GammaHadronBDTs/copy_GammaHadron_V6_BDTs.sh` (XML files are not zipped).
Requires as input:

- `TMVA.runparameter` file
- mscw files from observations (see above) for background events
- mscw files from simulations for signal events
Cut optimization requires signal rates (from simulations) and background rates (from data). The `$EVNDISPSYS/bin/calculateCrabRateFromMC` tool is used to calculate rates after pre-selection cuts (note: set `CALCULATERATEFILES="TRUE"` in `$EVNDISPSCRIPTS/helper_scripts/IRF.optimizeTMVAforGammaHadronSeparation_sub.sh`).

Important: this script does not work in combination with apptainers.
- Generate effective areas for pre-selection cuts using `PRESELECTEFFECTIVEAREAS`.
- Generate background anasum files for pre-selection cuts. Use `$EVNDISPSCRIPTS/ANALYSIS.anasum_allcuts.sh` with the `PRECUTS` option to submit the corresponding jobs (use the same runs for the background rate calculation as were used for BDT training). These files should be moved into e.g. `$VERITAS_IRFPRODUCTION_DIR/v491/AP/BDTtraining/BackgroundRates/V6/NTel2-Moderate` (adjust epoch and cut directory name).
Cut values are extracted by the optimization tool and written e.g. to `$VERITAS_IRFPRODUCTION_DIR/v491/AP/BDTtraining/BackgroundRates/V6/Optimize-NTel2-Moderate/`. Copy and paste those values into the files defining the gamma/hadron separation cuts in `$VERITAS_EVNDISP_AUX_DIR/GammaHadronCuts`.
Effective area generation requires the MC-generated mscw files and well-defined gamma/hadron cut values in `$VERITAS_EVNDISP_AUX_DIR/GammaHadronCuts`.
./IRF.generalproduction.sh CARE_24_20 EFFECTIVEAREAS
This generates effective areas per bin in parameter space. To combine the effective areas into a single file per cut and epoch, run:
./IRF.generalproduction.sh CARE_24_20 COMBINEEFFECTIVEAREAS
Move the generated effective area files to `$VERITAS_EVNDISP_AUX_DIR/EffectiveAreas`.
For any questions, contact Gernot Maier.
Eventdisplay_AnalysisScripts_VTS is licensed under the BSD 3-Clause License - see the LICENSE file.