
Workflow 4103, Stage 1

Priority: 50
Processors: 1
Wall seconds: 80000
RSS bytes: 8388608000 (8000 MiB)
Max distance for inputs: 30.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, CZ_FZU, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IN_TIFR, IT_CNAF, NL_NIKHEF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Imperial, UK_Lancaster, UK_Liverpool, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_BNL, US_Caltech, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_MIT, US_Nebraska, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
Events for this stage

Output patterns

 #  Destination                      Pattern             Lifetime (s)       For next stage
 1  Rucio usertests:afm-muons_g4_uf  *_reco_data_*.root  2592000 (30 days)  False

Environment variables

Name        Value
NUM_EVENTS  -1
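
(The jobscript below hands NUM_EVENTS to lar as "-n $NUM_EVENTS"; art treats a negative event count as no limit, so -1 processes every event in each input file.)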

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
1000         0        0            1          0           928        0          71

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
2737   0          0        1           0           2141      0        21       105      353              116                0
Files processed

[Plot: number of files processed per one-hour bin (bin start times Nov-14 13:00 to Nov-14 20:00), broken down by site: US_Colorado, UK_RAL-PPD, CERN, FR_CCIN2P3, US_UChicago, ES_PIC, CZ_FZU, NL_SURFsara, UK_Lancaster, UK_RAL-Tier1, US_FNAL-FermiGrid, UK_QMUL, UK_Edinburgh, UK_Manchester, US_Wisconsin, US_FNAL-T1, UK_Durham]
Replicas per RSE

[Pie chart: DUNE_US_FNAL_DISK_STAGE (67%), RAL-PP (6%), DUNE_FR_CCIN2P3_DISK (5%), DUNE_CERN_EOS (5%), DUNE_UK_GLASGOW (4%), PRAGUE (3%), RAL_ECHO (3%), QMUL (2%), SURFSARA (1%)]

RSEs used

Name                     Inputs  Outputs
DUNE_US_FNAL_DISK_STAGE     854      658
DUNE_UK_GLASGOW             270       15
DUNE_FR_CCIN2P3_DISK         74        5
RAL-PP                       68       18
DUNE_CERN_EOS                67       72
RAL_ECHO                     43       69
PRAGUE                       36       12
QMUL                         24       10
SURFSARA                     15       36
DUNE_UK_MANCHESTER_CEPH       0       28
DUNE_UK_LANCASTER_CEPH        0        5

Stats of processed input files as CSV or JSON, and of uploaded output files as CSV or JSON (up to 10000 files included)
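
For quick checks it can be handy to pull these per-file stats programmatically. A minimal sketch in bash, assuming the JSON link resolves to an array of per-file records that each carry an "rse" field; the URL and the field name here are illustrative assumptions, not the actual justIN schema:

# Hypothetical URL and field name: substitute the real JSON link from this page
curl -sL "https://justin.example.org/stats/wf4103s1-processed.json" -o processed.json
# Count records per RSE (assumes each record has an "rse" field)
jq -r '.[].rse' processed.json | sort | uniq -c | sort -rn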

File reset events, by site

Site               Allocated  Outputting
US_NotreDame              86           0
US_FNAL-FermiGrid          8           0
UK_Manchester              4           0
US_FNAL-T1                 3           0
US_Wisconsin               1           0

Jobscript

#!/bin/bash
:<<'EOF'

To use this jobscript to process n files through the g4 stage
and put the output in pnfs scratch,
use this command to create the workflow:

justin simple-workflow \
  --mql "$MQL_QUERY" \
  --jobscript my-g4.jobscript --max-distance 30 --rss-mb 4000 \
  --env NUM_EVENTS=10 --env INPUT_TAR_DIR_LOCAL="$INPUT_TAR_DIR_LOCAL" \
  --scope usertests --output-pattern "*_reco_data_*.root:<my-dataset-name>" --lifetime-days 1

The following optional environment variables can be set when creating the
workflow/stage: FCL_FILE, NUM_EVENTS, DUNE_VERSION, DUNE_QUALIFIER 

EOF
echo "AFM g4 jobscript."
# fcl file and DUNE software version/qualifier to be used
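# (values passed with --env at workflow creation override the ${VAR:-default}
# fallbacks below)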
#FCL_FILE=${FCL_FILE:-$INPUT_TAR_DIR_LOCAL/standard_g4_dune10kt_1x2x6_filtered.fcl}
FCL_FILE=${FCL_FILE:-standard_g4_dune10kt_1x2x6.fcl}
DUNE_VERSION=${DUNE_VERSION:-v09_75_03d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e20:prof}

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
 events_option="-n $NUM_EVENTS"
fi

# First get an unprocessed file from this stage
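# justin-get-file prints one line per allocated file: DID, PFN and RSE,
# separated by single spaces (hence the cut -d' ' extractions below)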
did_pfn_rse=`$JUSTIN_PATH/justin-get-file`

if [ "$did_pfn_rse" = "" ] ; then
  echo "No input files provided."
  exit 0
fi

# Keep a record of all input DIDs, for pdjson2meta file -> DID mapping
echo "$did_pfn_rse" | cut -f1 -d' ' >>all-input-dids.txt

# pfn is also needed when creating justin-processed-pfns.txt
pfn=`echo $did_pfn_rse | cut -f2 -d' '`
echo "Input PFN = $pfn"

# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh

# the xroot lib for streaming non-root files is in testproducts, 
# so add it to the start of the path
export PRODUCTS=/cvmfs/dune.opensciencegrid.org/products/dune/testproducts:${PRODUCTS}
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"
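
# Cap the common threading libraries at the number of processors justIN
# allocated to this job (this stage requests 1)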
export TF_NUM_THREADS=${JUSTIN_PROCESSORS}   
export OPENBLAS_NUM_THREADS=${JUSTIN_PROCESSORS} 
export JULIA_NUM_THREADS=${JUSTIN_PROCESSORS} 
export MKL_NUM_THREADS=${JUSTIN_PROCESSORS} 
export NUMEXPR_NUM_THREADS=${JUSTIN_PROCESSORS} 
export OMP_NUM_THREADS=${JUSTIN_PROCESSORS}  

# Construct outFile from input $pfn 
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
Ffname=`echo $pfn | awk -F/ '{print $NF}'`
fname=`echo $Ffname | awk -F. '{print $1}'`
outFile=${fname}_reco_data_${now}.root
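# e.g. an input PFN ending in myfile.root gives myfile_reco_data_<timestamp>.root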

campaign="justIN.w${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"
 
(
# Do the scary preload stuff in a subshell!
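# (the subshell keeps LD_PRELOAD from leaking into the metadata and
# log-handling steps that follow)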
export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
echo "$LD_PRELOAD"

lar -c $FCL_FILE $events_option -o $outFile "$pfn"  > ${fname}_reco_${now}.log 2>&1
)
# The subshell exits with the status of its last command (lar), so capture
# that status immediately, before tail overwrites $?
larExit=$?

echo '=== Start last 100 lines of lar log file ==='
tail -100 ${fname}_reco_${now}.log
echo '=== End last 100 lines of lar log file ==='

echo "lar exit code $larExit"
jobscriptExit=1

if [ $larExit -eq 0 ] ; then
  # write metadata file if lar succeeded
  extractor_prod.py --infile "$outFile" --no_crc --appname reco \
    --appversion ${DUNE_VERSION} --appfamily art \
    --campaign ${campaign} > $outFile.ext.json  
  extractorExit=$?
  echo "extractor_prod.py exit code $extractorExit"

  # Run pdjson2meta. THIS SHOULD MOVE TO SOMEWHERE LIKE duneutil ?
  /cvmfs/dune.opensciencegrid.org/products/dune/justin/pro/NULL/jobutils/pdjson2metadata \
     $outFile.ext.json all-input-dids.txt > $outFile.json
  p2mExit=$?
  echo "pdjson2metadata exit code $p2mExit"

  if [ $extractorExit -eq 0 -a $p2mExit -eq 0 ] ; then
    echo "Metadata extraction succeeds"
    echo "$pfn" > justin-processed-pfns.txt
    echo "===Metadata JSON==="
    cat $outFile.json
    echo
    echo "==================="
    jobscriptExit=0
  fi
fi

ls -lRS

# Create compressed tar file with all log files 
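# ($JUSTIN_JOBSUB_ID contains '@' characters, which sed maps to '_')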
tar zcf `echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g'` *.log
exit $jobscriptExit
justIN time: 2024-11-17 03:49:42 UTC       justIN version: 01.01.09