
Workflow 6981, Stage 1

Priority: 50
Processors: 1
Wall seconds: 80000
Image: /cvmfs/singularity.opensciencegrid.org/fermilab/fnal-wn-sl7:latest
RSS bytes: 4194304000 (4000 MiB)
Max distance for inputs: 0.0
Enabled input RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, MONTECARLO, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled output RSEs: CERN_PDUNE_EOS, DUNE_CA_SFU, DUNE_CERN_EOS, DUNE_ES_PIC, DUNE_FR_CCIN2P3_DISK, DUNE_IN_TIFR, DUNE_IT_INFN_CNAF, DUNE_UK_GLASGOW, DUNE_UK_LANCASTER_CEPH, DUNE_UK_MANCHESTER_CEPH, DUNE_US_BNL_SDCC, DUNE_US_FNAL_DISK_STAGE, FNAL_DCACHE, FNAL_DCACHE_STAGING, FNAL_DCACHE_TEST, NIKHEF, PRAGUE, QMUL, RAL-PP, RAL_ECHO, SURFSARA, T3_US_NERSC
Enabled sites: BR_CBPF, CA_SFU, CA_Victoria, CERN, CH_UNIBE-LHEP, ES_CIEMAT, ES_PIC, FR_CCIN2P3, IN_TIFR, IT_CNAF, NL_SURFsara, UK_Bristol, UK_Brunel, UK_Durham, UK_Edinburgh, UK_Lancaster, UK_Manchester, UK_Oxford, UK_QMUL, UK_RAL-PPD, UK_RAL-Tier1, UK_Sheffield, US_Caltech, US_Colorado, US_FNAL-FermiGrid, US_FNAL-T1, US_Michigan, US_MIT, US_Nebraska, US_NotreDame, US_PuertoRico, US_SU-ITS, US_Swan, US_UChicago, US_UConn-HPC, US_UCSD, US_Wisconsin
Scope: usertests
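As a sanity check on the RSS limit above, 4000 MiB converts exactly to the byte value shown (a quick shell arithmetic sketch, not part of the jobscript):

```shell
# 1 MiB = 1024 * 1024 bytes, so 4000 MiB should match the "RSS bytes" value above
rss_bytes=$((4000 * 1024 * 1024))
echo "$rss_bytes"   # 4194304000
```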
Events for this stage

Output patterns

 #  Destination                                                                               Pattern      Lifetime  For next stage  RSE expression
 1  https://fndcadoor.fnal.gov:2880/dune/scratch/users/smanthey/hd_1x2x6_centralAPA/06981/1   *hist*.root

Environment variables

Name                 Value
INPUT_TAR_DIR_LOCAL  /cvmfs/fifeuser2.opensciencegrid.org/sw/dune/3ba1bff89cc9b3d406f9fd78e51d95a1e8e1f791

File states

Total files  Finding  Unallocated  Allocated  Outputting  Processed  Not found  Failed
1005         0        0            0          0           1005       0          0

Job states

Total  Submitted  Started  Processing  Outputting  Finished  Notused  Aborted  Stalled  Jobscript error  Outputting failed  None processed
1809   0          0        0           0           1700      0        0        109      0                0                  0
[Chart: Files processed per bin (bin start times May-16 11:00 to 13:00 UTC), broken down by site: UK_RAL-PPD, CERN, UK_Lancaster, UK_RAL-Tier1, US_FNAL-FermiGrid, UK_QMUL, UK_Manchester]

[Chart: Replicas per RSE - DUNE_US_FNAL_DISK_STAGE (52%), SURFSARA (10%), DUNE_UK_MANCHESTER_CEPH (9%), RAL-PP (6%), DUNE_US_BNL_SDCC (5%), NIKHEF (3%), DUNE_IT_INFN_CNAF (3%), DUNE_CERN_EOS (2%), PRAGUE (1%), RAL_ECHO (1%), DUNE_FR_CCIN2P3_DISK (1%), DUNE_UK_LANCASTER_CEPH (0%), QMUL (0%)]

RSEs used

Name                     Inputs  Outputs
DUNE_US_FNAL_DISK_STAGE  647     0
DUNE_UK_MANCHESTER_CEPH  176     0
RAL-PP                   131     0
RAL_ECHO                 28      0
DUNE_UK_LANCASTER_CEPH   11      0
QMUL                     7       0
DUNE_CERN_EOS            5       0

Stats of processed input files are available as CSV or JSON, and stats of uploaded output files as CSV or JSON (up to 10000 files included).

Jobscript

#!/bin/bash
:<<'EOF'

Use this jobscript to process files from the dataset
fardet-hd:fardet-hd__hit-reconstructed__v09_91_04d00__reco1_supernova_dune10kt_1x2x6__prodmarley_nue_cc_flat_radiological_decay0_dune10kt_1x2x6_centralAPA__out1__v1_official,
register the output in the $USER namespace (MetaCat), and save the output files in /scratch.
Use this script by doing

source hd_1x2x6_centralAPA.sh

(use "source fdhdcentral_test.sh" first to test it locally - it is also a good
idea to remove the folder created in /tmp afterwards).
This example uses a custom dune repository that is provided via a tar file.
I recommend uploading the tar to cvmfs in advance, to avoid any problems, with this command:

INPUT_TAR_DIR_LOCAL=`justin-cvmfs-upload larsoft_v10_05_00.tar.gz`

The following optional environment variables can be set when creating the
workflow/stage: FCL_FILE, NUM_EVENTS, DUNE_VERSION, DUNE_QUALIFIER 

EOF

# fcl file and DUNE software version/qualifier to be used
FCL_FILE=${FCL_FILE:-solar_ana_marley_flash_radiological_decay0_dune10kt_1x2x6_centralAPA}
DUNE_VERSION=${DUNE_VERSION:-v10_05_00d00}
DUNE_QUALIFIER=${DUNE_QUALIFIER:-e26:prof}

# number of events to process from the input file
if [ "$NUM_EVENTS" != "" ] ; then
 events_option="-n $NUM_EVENTS"
fi

# First get an unprocessed file from this stage
did_pfn_rse=$("$JUSTIN_PATH"/justin-get-file)

if [ "$did_pfn_rse" = "" ] ; then
  echo "Nothing to process - exit jobscript"
  exit 0
fi

# Keep a record of all input DIDs, for pdjson2meta file -> DID mapping
echo "$did_pfn_rse" | cut -f1 -d' ' >>all-input-dids.txt

# pfn is also needed when creating justin-processed-pfns.txt
pfn=$(echo "$did_pfn_rse" | cut -f2 -d' ')
echo "Input PFN = $pfn"

# Setup DUNE environment
source /cvmfs/dune.opensciencegrid.org/products/dune/setup_dune.sh
export PRODUCTS="${INPUT_TAR_DIR_LOCAL}/localProducts_larsoft_v10_05_00_e26_prof/:$PRODUCTS"
# Then we can set up our local products
setup dunesw "$DUNE_VERSION" -q "$DUNE_QUALIFIER"

# Construct outFile from input $pfn 
now=$(date -u +"%Y-%m-%dT_%H%M%SZ")
Ffname=$(echo "$pfn" | awk -F/ '{print $NF}')   # basename of the input PFN (currently unused below)
fname=$(echo "$Ffname" | awk -F. '{print $1}')  # basename without extension (currently unused below)
outFile=fdhd_ana_${now}.root
outHistFile=fdhd_ana_${now}_hist.root

campaign="justIN.w${JUSTIN_WORKFLOW_ID}s${JUSTIN_STAGE_ID}"

# Here is where the LArSoft command is called
(
# Do the scary preload stuff in a subshell!
export LD_PRELOAD=${XROOTD_LIB}/libXrdPosixPreload.so
echo "$LD_PRELOAD"

# $events_option is deliberately unquoted so "-n N" splits into two arguments
lar -c "$FCL_FILE" $events_option -o "$outFile" -T "$outHistFile" "$pfn" > fdhd_ana_${now}.log 2>&1
)
# Capture the subshell's exit code immediately - the subshell exits with the
# exit code of its last command (lar), and any command run in between would
# overwrite $?
larExit=$?

echo '=== Start last 100 lines of lar log file ==='
tail -100 fdhd_ana_${now}.log
echo '=== End last 100 lines of lar log file ==='

echo "lar exit code $larExit"

if [ $larExit -eq 0 ] ; then
  # Success !
  echo "$pfn" > justin-processed-pfns.txt
  jobscriptExit=0
else
  # Oh :(
  jobscriptExit=1
fi

# Create compressed tar file with all log files 
#tar zcf `echo "$JUSTIN_JOBSUB_ID.logs.tgz" | sed 's/@/_/g'` *.log
exit $jobscriptExit
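For reference, justin-get-file returns a single space-separated line of DID, PFN and RSE, which the jobscript above splits with cut. A minimal sketch of that parsing, using a made-up sample line (the DID, URL and RSE name here are illustrative, not taken from this workflow):

```shell
# Hypothetical example of a justin-get-file output line: "<did> <pfn> <rse>"
did_pfn_rse='usertests:example.root root://xrootd.example.org//dune/example.root DUNE_CERN_EOS'

did=$(echo "$did_pfn_rse" | cut -f1 -d' ')   # dataset identifier (scope:name)
pfn=$(echo "$did_pfn_rse" | cut -f2 -d' ')   # physical file name (URL) passed to lar
rse=$(echo "$did_pfn_rse" | cut -f3 -d' ')   # storage element the replica came from

echo "$did"   # usertests:example.root
echo "$pfn"   # root://xrootd.example.org//dune/example.root
echo "$rse"   # DUNE_CERN_EOS
```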
justIN time: 2025-05-22 23:06:23 UTC       justIN version: 01.03.01