
Workflow 4095, Stage 1

Priority 50
Processors 1
Wall seconds 3600
RSS bytes 6291456000 (6000 MiB)
Max distance for inputs 30.0
Enabled input RSEs CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IN_TIFR, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Caltech, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_MIT, US_Nebraska, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope usertests
Events for this stage

Output patterns

#   Destination                                                                    Pattern   Lifetime   For next stage
1   https://fndcadoor.fnal.gov:2880/dune/scratch/users/chappell/sec_vtx/04095/1    *.csv
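
After the jobscript exits, justIN matches these patterns against the files left in the job's working directory and uploads the matches to the destination shown above. A minimal sketch of the same glob, purely to illustrate which files row 1 would pick up (the upload itself is handled by justIN, not by this snippet):

# Sketch only: list the local files that the "*.csv" pattern of row 1 would match
dest="https://fndcadoor.fnal.gov:2880/dune/scratch/users/chappell/sec_vtx/04095/1"
shopt -s nullglob
for f in *.csv; do
  echo "would be uploaded to ${dest}/${f}"
done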

Environment variables

Name                 Value
INPUT_TAR_DIR_LOCAL  /cvmfs/fifeuser2.opensciencegrid.org/sw/dune/af8fa889da2d7a9e27183a56aad2c6a7406a99a5
NUM_EVENTS           100
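
These values are injected into the job environment and picked up by the jobscript below. A minimal sketch of consuming them; the default value and the required-variable check are illustrative additions, not part of the actual jobscript:

# Sketch only: read the stage's environment variables with explicit handling
NUM_EVENTS=${NUM_EVENTS:-100}                                # assume 100 events if unset
: "${INPUT_TAR_DIR_LOCAL:?input tarball directory not set}"  # fail fast if justIN did not set it
ls "$INPUT_TAR_DIR_LOCAL"                                    # unpacked code tarball on cvmfs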

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
10           0        0            0          0           10         0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
29     0          0        0           0           28        0        0        1        0                0                  0
[Chart: "Files processed" per time bin, Nov-13 20:00 to Nov-14 00:00 (x-axis: bin start times, y-axis: number per bin), broken down by site: US_Colorado, NL_SURFsara, US_FNAL-FermiGrid, CA_SFU]
[Chart: "Replicas per RSE": DUNE_US_FNAL_DISK_STAGE (47%), FNAL_DCACHE (47%), DUNE_FR_CCIN2P3_DISK (4%)]

RSEs used

Name                     Inputs  Outputs
DUNE_US_FNAL_DISK_STAGE  10      0
DUNE_FR_CCIN2P3_DISK     1       0

Stats of processed input files are available as CSV or JSON, and stats of uploaded output files as CSV or JSON (up to 10000 files included).
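
Once downloaded, the per-file CSV can be summarized with standard tools. A sketch, assuming the file is saved as files.csv with a comma-separated header row containing a column named rse (the actual column names may differ, so check the header first):

# Count processed files per RSE in the downloaded stats CSV (column name "rse" is an assumption)
awk -F, 'NR==1{for(i=1;i<=NF;i++) if($i=="rse") c=i; next} c{n[$c]++} END{for(r in n) print r, n[r]}' files.csv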

File reset events, by site

Site           Allocated  Outputting
US_PuertoRico  1          0

Jobscript

#!/bin/bash
# fcl file and DUNE software version/qualifier to be used
FCL_FILE=pndr.fcl
DUNE_VERSION=${DUNE_VERSION:-v09_92_00d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}
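# (the ${VAR:-default} form keeps any value injected through the workflow's
#  environment variables and falls back to the pinned release otherwise)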
FW_SEARCH_PATH=.:$INPUT_TAR_DIR_LOCAL:$FW_SEARCH_PATH
FHICL_FILE_PATH=.:$INPUT_TAR_DIR_LOCAL:$FHICL_FILE_PATH
echo $FW_SEARCH_PATH

cp $INPUT_TAR_DIR_LOCAL/pndr.fcl .
cp $INPUT_TAR_DIR_LOCAL/setup-grid .
cp -r $INPUT_TAR_DIR_LOCAL/localProducts* .

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
 events_option="-n $NUM_EVENTS"
fi

# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh

# the xroot lib for streaming non-root files is in testproducts, 
# so add it to the start of the path
export PRODUCTS=/cvmfs/dune.opensciencegrid.org/products/dune/testproducts:${PRODUCTS}
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
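# JUSTIN_PROCESSORS is set by justIN to the number of processors requested for
# the stage (1 for this workflow), so OpenMP threading matches the allocation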
export OMP_NUM_THREADS=${JUSTIN_PROCESSORS} 

source setup-grid
mrbslp

for i in {0..19}; do
    # First get an unprocessed file from this stage
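    # justin-get-file prints a single space-separated line, "<DID> <PFN> <RSE>",
    # or nothing at all once no unprocessed files remain in the stage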
    did_pfn_rse=`$JUSTIN_PATH/justin-get-file`

    if [ "$did_pfn_rse" = "" ] ; then
      echo "Nothing to process - exit jobscript"
      break
    fi

    # Keep a record of all input DIDs, for pdjson2meta file -> DID mapping
    echo "$did_pfn_rse" | cut -f1 -d' ' >>all-input-dids.txt

    # pfn is also needed when creating justin-processed-pfns.txt
    pfn=`echo $did_pfn_rse | cut -f2 -d' '`
    echo "Input PFN ${i} = $pfn"

    # Construct outFile from input $pfn 
    now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
    Ffname=`echo $pfn | awk -F/ '{print $NF}'`
    fname=`echo $Ffname | awk -F. '{print $1}'`

    campaign="justIN.r${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"

    (
    # Do the scary preload stuff in a subshell!
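    # Preloading libXrdPosixPreload intercepts POSIX file calls so that readers
    # expecting plain files can stream root:// URLs; keeping it in a subshell
    # stops the preload from affecting the rest of the jobscript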
    export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
    echo "$LD_PRELOAD"

    lar -c $FCL_FILE $events_option "$pfn" > ${fname}_reco_${now}.log 2>&1
    )

    # Subshell exits with exit code of last command
    larExit=$?
    echo "lar exit code $larExit"
    echo "$pfn" > justin-processed-pfns.txt

    ls -l *.csv
done

mv SecVtx_CaloHitListU.csv SecVtx_CaloHitListU_${fname}.csv
mv SecVtx_CaloHitListV.csv SecVtx_CaloHitListV_${fname}.csv
mv SecVtx_CaloHitListW.csv SecVtx_CaloHitListW_${fname}.csv

echo "Post loop ${fname}"
ls -l *.csv

# Create compressed tar file with all log files 
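# (JUSTIN_JOBSUB_ID contains an '@' between the job number and the schedd name;
#  the sed swaps it for '_' so the tarball name is filesystem-friendly)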
tar zcf `echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g'` *.log
exit $larExit
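
For reference, a workflow of this shape is created with the justin command-line client. The sketch below is only indicative: the option names follow the justIN tutorial as best recalled, and the values (jobscript name, MQL query, dataset name) are placeholders rather than a record of how workflow 4095 was actually submitted; check justin --help for the exact interface.

# INPUT_TAR_DIR would come from uploading the code tarball to cvmfs beforehand
justin simple-workflow \
  --jobscript sec-vtx.jobscript \
  --mql "files from usertests:my-input-dataset limit 10" \
  --env NUM_EVENTS=100 \
  --env INPUT_TAR_DIR_LOCAL="$INPUT_TAR_DIR" \
  --output-pattern '*.csv'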
justIN time: 2024-11-17 05:45:47 UTC       justIN version: 01.01.09