
Workflow 2790, Stage 1

Priority 50
Processors 1
Wall seconds 28800
RSS bytes 4194304000 (4000 MiB)
Max distance for inputs 100.0
Enabled input RSEs CERN_PDUNE_EOS, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MANCHESTER, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs CERN_PDUNE_EOS, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MANCHESTER, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IN_TIFR, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Caltech, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_MIT, US_Nebraska, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope usertests
Events for this stage

Output patterns

#  Destination                                                           Pattern         Lifetime  For next stage
1  https://fndcadoor.fnal.gov:2880/dune/scratch/users/lwhite86/02790/1   H4_v34b_*.root

Environment variables

Name       Value
CENTRALP   5
INPUT_DIR  /cvmfs/fifeuser2.opensciencegrid.org/sw/dune/4e1d2c14ec459d1bb9f5407dba2a8385844995e3
NPART      100000
POLARITY   +
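
These values override the fallbacks hard-coded in the jobscript below (CENTRALP falls back to 1 and NPART to 100 when unset). A minimal sketch, using only the variable names from this table, of the "${VAR:-default}" idiom the script relies on:

#!/bin/bash
# Sketch only, not part of the workflow: a workflow-supplied value wins over the
# script's fallback; the fallback applies only when the variable is unset or empty.
export CENTRALP="${CENTRALP:-1}"     # 5 in this workflow, 1 otherwise
export PARTPERJOB="${NPART:-100}"    # 100000 in this workflow, 100 otherwise
export POLARITY="${POLARITY:-+}"     # "+" in this workflow
echo "CENTRALP=$CENTRALP PARTPERJOB=$PARTPERJOB POLARITY=$POLARITY"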

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
10000        0        0            0          0           10000      0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
11440  0          0        0           0           10563     0        141      494      0                128                114
[Chart: Files processed. Number of files processed per bin vs. bin start time (Aug-09 10:00 to Aug-11 04:00), broken down by site: IT_CNAF, US_FNAL-FermiGrid, UK_Brunel, US_UChicago, UK_Sheffield, UK_Lancaster, US_Colorado, CERN, CZ_FZU, NL_SURFsara, UK_Liverpool, NL_NIKHEF, BR_CBPF, FR_CCIN2P3, UK_Imperial, US_Wisconsin, UK_RAL-Tier1, UK_Durham, UK_Manchester, ES_PIC, CA_SFU, US_NotreDame, US_FNAL-T1.]

RSEs used

Name        Inputs  Outputs
MONTECARLO  10587   0

Stats of processed input files are available as CSV or JSON, and stats of uploaded output files as CSV or JSON (up to 10000 files included).

File reset events, by site

Site               Allocated  Outputting
US_NotreDame       29         81
NL_SURFsara        26         27
US_PuertoRico      26         0
US_FNAL-FermiGrid  18         27
UK_Manchester      16         16
ES_PIC             13         4
CZ_FZU             9          7
BR_CBPF            7          13
IT_CNAF            6          5
UK_Sheffield       5          1
UK_Brunel          5          0
NL_NIKHEF          4          11
CERN               3          1
US_UChicago        3          1
UK_Lancaster       2          2
UK_Imperial        2          0
FR_CCIN2P3         2          0
US_Colorado        1          1
CA_SFU             1          0
UK_RAL-Tier1       0          22
UK_Liverpool       0          1

Jobscript

#!/bin/bash

SECONDS=0
#export PMOMENTUM=80000
export POLARITY="${POLARITY:-+}"

if [ "$POLARITY" != "+" ] && [ "$POLARITY" != "-" ]; then
  echo "ERROR MUST SUPPLY + OR - TO POLARITY"
  exit 1
fi

export PMOMENTUM="${POLARITY:-+}80000"
echo "PMOMENTUM: ${PMOMENTUM}"

export INFILE=H4.in
export CENTRALP="${CENTRALP:-1}"
export MOMENTUMVLE="${POLARITY}${CENTRALP}" #"3"
echo "MOMENTUMVLE: ${MOMENTUMVLE}"

export PARTPERJOB=${NPART:-100}
export ADDPARAM="momentumVLE=$MOMENTUMVLE pMomentum=$PMOMENTUM"
export ADDFILES=""

JOBID=1
#export JOBID=$(($st+1)) ##TODO -- REPLACE WITH JUSTIN JOB

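# Ask justIN for the next input file allocated to this job: justin-get-file
# prints a single "DID PFN RSE" line, and the PFN (second field) is extracted below.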
DID_PFN_RSE=$("$JUSTIN_PATH"/justin-get-file)
echo "did_pfn_rse $DID_PFN_RSE"
pfn=$(echo "$DID_PFN_RSE" | cut -f2 -d' ')

echo $INPUT_DIR
ls $INPUT_DIR
#echo $G4DATA_DIR
#ls $G4DATA_DIR
#
#echo $G4BL_DIR
#ls $G4BL_DIR
#
#echo $PACK_DIR
#ls $PACK_DIR

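# All inputs (g4bl, Geant4Data and the steering/input pack) are shipped as
# tarballs in the cvmfs INPUT_DIR set in the workflow environment; unpack each
# one and abort the job if extraction fails.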
echo "Unpacking g4bl"
tar -xzf $INPUT_DIR/g4bl.tar.gz --checkpoint=1000
if [ $? -ne 0 ]
then
  echo "Exiting with error"
  exit 1
fi

echo "Unpacking Geant4Data"
tar -xzf $INPUT_DIR/Geant4Data.tar.gz --checkpoint=1000
if [ $? -ne 0 ]
then
  echo "Exiting with error"
  exit 1
fi
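# g4bl finds its Geant4 datasets through the .data file in its install
# directory; record the absolute path of the unpacked Geant4Data tree there.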
CURDIR=$(pwd)
echo $CURDIR/Geant4Data > g4bl/.data


echo "Unpacking Inputfiles Pack"
tar -xzf $INPUT_DIR/pack.tar.gz --checkpoint=1000
if [ $? -ne 0 ]
then
  echo "Exiting with error"
  exit 1
fi

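# Run the G4beamline simulation with the H4.in steering file, this job's ID,
# PARTPERJOB events, and the momentum parameters assembled above; the output
# is teed to g4bloutput.txt so it can be inspected afterwards.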
$CURDIR/g4bl/bin/g4bl $INFILE jobID=$JOBID totNumEv=$PARTPERJOB $ADDPARAM 2>&1 | tee g4bloutput.txt

#Clean up
rm -rf Geant4Data
rm -rf g4bl



#Add timestamp to the output
now=$(date -u +"%Y%m%dT%H%M%SZ")
oldname=$(ls H4*.root)
newname=$(echo "${oldname}" | sed -e "s/\.root$/_${now}_${pfn}.root/")
mv "${oldname}" "${newname}"

if [ $? -ne 0 ]
then
  echo "Exiting with error"
  exit 1
else
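  # Writing the PFN to justin-processed-pfns.txt marks this input as
  # successfully processed so justIN does not hand it out again.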
  echo "$pfn" > justin-processed-pfns.txt
fi

#errorsSaving=$((`cat g4bloutput.txt | grep "Error in <T" | wc -l`))
#if [ $errorsSaving -ne 0 ]
#then
#  echo "Exiting with error"
#  exit 1
#fi
#
#echo "RUNTIME: $SECONDS seconds elapsed."
justIN time: 2024-11-17 03:18:57 UTC       justIN version: 01.01.09