
Workflow 3882, Stage 1

Priority: 50
Processors: 1
Wall seconds: 80000
RSS bytes: 6291456000 (6000 MiB)
Max distance for inputs: 30.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MANCHESTER, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MANCHESTER, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IN_TIFR, IT_CNAF, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Caltech, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_MIT, US_Nebraska, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
Events for this stage

Output patterns

 #  Destination                               Pattern             Lifetime (s)       For next stage
 1  Rucio usertests:afm-muons_g4_unfiltered   *_reco_data_*.root  1728000 (20 days)  False

Environment variables

Name        Value
NUM_EVENTS  10

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
1000         0        0            0          0           1000       0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
1698   0          0        0           0           1697      0        0        0        0                0                  1
[Chart: "Files processed" histogram, number per bin vs bin start time (Oct-31 09:00 to Oct-31 11:00), broken down by site: UK_Brunel, CERN, US_UChicago, US_PuertoRico, ES_PIC, CZ_FZU, UK_RAL-Tier1, US_FNAL-T1, US_FNAL-FermiGrid, CA_SFU, UK_Manchester]
[Chart: "Replicas per RSE" pie chart: RAL-PP (51%), DUNE_CERN_EOS (21%), DUNE_US_FNAL_DISK_STAGE (18%), PRAGUE (7%), RAL_ECHO (1%)]

RSEs used

Name                     Inputs  Outputs
RAL-PP                   514     142
DUNE_CERN_EOS            216     360
DUNE_US_FNAL_DISK_STAGE  188     188
PRAGUE                   70      193
RAL_ECHO                 12      109
DUNE_IT_INFN_CNAF        0       8

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)

Jobscript

#!/bin/bash
:<<'EOF'

To use this jobscript to process n files through the g4 stage
and put the output in pnfs scratch,
use this command to create the workflow:

justin simple-workflow \
--mql "$MQL_QUERY" \
--jobscript my-g4.jobscript --max-distance 30 --rss-mb 4000 \
--env NUM_EVENTS=10 --env INPUT_TAR_DIR_LOCAL="$INPUT_TAR_DIR_LOCAL" \
--scope usertests --output-pattern "*_reco_data_*.root:<my-dataset-name>" --lifetime-days 1

The following optional environment variables can be set when creating the
workflow/stage: FCL_FILE, NUM_EVENTS, DUNE_VERSION, DUNE_QUALIFIER 

EOF
echo "AFM g4 jobscript."
# fcl file and DUNE software version/qualifier to be used
#FCL_FILE=${FCL_FILE:-$INPUT_TAR_DIR_LOCAL/standard_g4_dune10kt_1x2x6_filtered.fcl}
FCL_FILE=${FCL_FILE:-standard_g4_dune10kt_1x2x6.fcl}
DUNE_VERSION=${DUNE_VERSION:-v09_75_03d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e20:prof}

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
 events_option="-n $NUM_EVENTS"
fi
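# (if NUM_EVENTS is not set, no -n option is passed and lar processes
# all events in the input file)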

# First get an unprocessed file from this stage
did_pfn_rse=`$JUSTIN_PATH/justin-get-file`
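# justin-get-file prints one line of the form "<DID> <PFN> <RSE>"
# (space separated), or nothing if no unprocessed file is available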

if [ "$did_pfn_rse" = "" ] ; then
  echo "No input files provided."
  exit 0
fi

# Keep a record of all input DIDs, for pdjson2meta file -> DID mapping
echo "$did_pfn_rse" | cut -f1 -d' ' >>all-input-dids.txt

# pfn is also needed when creating justin-processed-pfns.txt
pfn=`echo $did_pfn_rse | cut -f2 -d' '`
echo "Input PFN = $pfn"

# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh

# the xroot lib for streaming non-root files is in testproducts, 
# so add it to the start of the path
export PRODUCTS=/cvmfs/dune.opensciencegrid.org/products/dune/testproducts:${PRODUCTS}
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
export TF_NUM_THREADS=${JUSTIN_PROCESSORS}   
export OPENBLAS_NUM_THREADS=${JUSTIN_PROCESSORS} 
export JULIA_NUM_THREADS=${JUSTIN_PROCESSORS} 
export MKL_NUM_THREADS=${JUSTIN_PROCESSORS} 
export NUMEXPR_NUM_THREADS=${JUSTIN_PROCESSORS} 
export OMP_NUM_THREADS=${JUSTIN_PROCESSORS}  
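# JUSTIN_PROCESSORS is set by justIN to the number of processors allocated
# to the job (1 for this stage), so the common threading libraries above
# are pinned to that count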

# Construct outFile from input $pfn 
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
Ffname=`echo $pfn | awk -F/ '{print $NF}'`
fname=`echo $Ffname | awk -F. '{print $1}'`
outFile=${fname}_reco_data_${now}.root
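# e.g. an input PFN ending in myfile.root would give an outFile like
# myfile_reco_data_2024-11-23T_105821Z.root (illustrative name only)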

campaign="justIN.w${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"
 
(
# Do the scary preload stuff in a subshell!
export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
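# the preload library intercepts POSIX I/O calls so that lar can stream
# the (possibly non-ROOT-format) input file over XRootD as if it were local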
echo "$LD_PRELOAD"

lar -c $FCL_FILE $events_option -o $outFile "$pfn"  > ${fname}_reco_${now}.log 2>&1
)
# The subshell exits with the exit code of its last command (lar), so
# capture it immediately, before any other command overwrites $?
larExit=$?

echo '=== Start last 100 lines of lar log file ==='
tail -100 ${fname}_reco_${now}.log
echo '=== End last 100 lines of lar log file ==='

echo "lar exit code $larExit"
jobscriptExit=1

if [ $larExit -eq 0 ] ; then
  # write metadata file if lar succeeded
  extractor_prod.py --infile "$outFile" --no_crc --appname reco \
    --appversion ${DUNE_VERSION} --appfamily art \
    --campaign ${campaign} > $outFile.ext.json  
  extractorExit=$?
  echo "extractor_prod.py exit code $extractorExit"

  # Run pdjson2meta. THIS SHOULD MOVE TO SOMEWHERE LIKE duneutil ?
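  # pdjson2metadata merges the extractor JSON with the input DIDs recorded
  # in all-input-dids.txt to produce the final metadata file for the output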
  /cvmfs/dune.opensciencegrid.org/products/dune/justin/pro/NULL/jobutils/pdjson2metadata \
     $outFile.ext.json all-input-dids.txt > $outFile.json
  p2mExit=$?
  echo "pdjson2metadata exit code $p2mExit"

  if [ $extractorExit -eq 0 ] && [ $p2mExit -eq 0 ] ; then
    echo "Metadata extraction succeeded"
    echo "$pfn" > justin-processed-pfns.txt
    echo "===Metadata JSON==="
    cat $outFile.json
    echo
    echo "==================="
    jobscriptExit=0
  fi
fi

ls -lRS

# Create compressed tar file with all log files 
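# ($JUSTIN_JOBSUB_ID contains '@' characters, which sed replaces with '_'
# to keep the tar filename simple)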
tar zcf `echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g'` *.log
exit $jobscriptExit