Getting Started with MicroBooNE Computing

Getting Accounts and Logging In at Fermilab

You must be on the MicroBooNE Collaboration member list in order to get an account (check whether you are on it here):
Talk to your Institutional Board representative to get on the list. The IB representative for each institution is listed on that webpage.

Instructions for getting accounts:

Fermilab uses Kerberos to implement strong authentication (no passwords on the internet) when logging in to Fermilab machines.
Make sure the Fermilab KDCs are listed in your /etc/krb5.conf file.
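As a sketch, the FNAL.GOV realm entry in /etc/krb5.conf looks something like the following. The KDC hostnames here are illustrative assumptions; copy the realm block from the krb5.conf file distributed by Fermilab rather than typing one by hand.

```
[realms]
    FNAL.GOV = {
        kdc = krb-fnal-1.fnal.gov:88
        kdc = krb-fnal-2.fnal.gov:88
        admin_server = krb-fnal-admin.fnal.gov
        default_domain = fnal.gov
    }
```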

Make sure your ~/.ssh/config file has default login options like the following -

ForwardX11 yes
ForwardX11Trusted yes
GSSAPIAuthentication yes
GSSAPIDelegateCredentials yes

The following example shows how to log in from a Mac with Kerberos and ssh installed -

# check that kinit is available
my-mac$ type kinit
kinit is /usr/bin/kinit

# get a ticket; case matters for FNAL.GOV, log in using your Kerberos password
my-mac$ kinit -af -r7d <your-kerberos-principal>@FNAL.GOV
# Note that depending on your OS, you may need -A or -a to get an address-less ticket

# this is the OpenSSH client
my-mac$ type ssh
ssh is hashed (/usr/bin/ssh)

# log in, forwarding your Kerberos credentials
my-mac$ ssh -K <your-username>@<node-name>.fnal.gov

Kerberos Tips and Info

Kerberos tickets (what you get with the kinit command) have a default lifetime of 26 hours, after which they expire. If you use the -r option on the kinit line, then your ticket can be renewed (with kinit -R) instead of having to obtain a new one.

Users must have a valid Kerberos ticket at the time they attempt to log in to a Fermilab machine. The ticket is obtained by executing the following command at a terminal prompt:

$ kinit <your_Kerberos_principal>@FNAL.GOV

where <your_Kerberos_principal> is the user's Kerberos principal (i.e., username or uid). If a user is attempting to access the repository from a non-Fermilab machine, the following lines must be in the user's ~/.ssh/config:

Host *
ForwardAgent yes
ForwardX11 yes
ForwardX11Trusted yes
GSSAPIAuthentication yes
GSSAPIDelegateCredentials yes

It is possible to allow other users (or yourself on another machine or with another Kerberos identity) to access your account via a .k5login file in your $HOME directory. A warning, however: if you create a .k5login file, make sure you put your own username in it, or you can be locked out of your own account. It should have the line

<your_Kerberos_principal>@FNAL.GOV

in it.
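A minimal sketch of building that file (the principal names are hypothetical, and a scratch directory stands in for $HOME so the snippet is self-contained):

```shell
# Build a .k5login in a scratch directory; on a real node you would
# write to $HOME/.k5login instead.
DEMO_HOME=$(mktemp -d)

# Your own principal MUST be listed, or you lock yourself out.
printf '%s\n' 'jdoe@FNAL.GOV' 'collaborator@FNAL.GOV' > "$DEMO_HOME/.k5login"

cat "$DEMO_HOME/.k5login"
```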

Additional help (if you want to know more or need to troubleshoot) -- useful tips on logging in with Kerberos:

and an introductory explanation of tickets, certificates, and proxies is available at:

Some links which might be helpful for using non-Fermilab-managed Windows systems. These instructions have
not been tried by the authors of this wiki.

and some help with Redmine for Windows users:

UPS Tips and Info

UPS is environment management software for handling software products with many versions and different 'flavors' of components. You use it to make sure you are using the correct version of the product you need, along with compatible versions of any products it depends on.

In the table below the following example values are used:
  • uboonecode - the product being setup
  • v06_26_01_17 - the version of the uboonecode product for this example. This may not be the latest or best version. (See the Releases page.)
  • e10:prof - the qualifiers for the version of uboonecode. Qualifiers further define something about the version of uboonecode, such as how it was built (with profiling or debug) or the version of the gcc compiler used (e10, e14, etc.). (Available qualifiers may vary with version, and can be discovered using the "ups list" command described below.)

Basic UPS Command Information

These commands are run after logging in to one of the MicroBooNE interactive nodes.

  • source /cvmfs/ - Configure your environment; get access to software versions stored in UPS product areas on CVMFS. Use: once at login.
  • setup uboonecode v06_26_01_17 -q e10:prof - Set up a particular version of uboonecode and all of its dependencies. Use: once per login, after running the source command above.
  • ups list -aK+ uboonecode - Find out which versions and flavors of uboonecode exist on this node. Use: whenever you need to find out what is available.
  • ups active - Find out what has been set up. Use: when you want to see what is currently in your environment.
  • ups depend uboonecode v06_26_01_17 -q +e10:+prof - Find out what depends on what for this version of uboonecode. Use: when you want to inspect its dependency tree.

After setting up uboonecode, you can see where the software is installed by looking at the UBOONECODE_DIR (<PRODUCT>_DIR) variable.
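A self-contained sketch of what to expect (the path below is made up for illustration; on a real node UPS exports the variable for you during setup):

```shell
# On a real node, `setup uboonecode ...` exports UBOONECODE_DIR itself.
# Here an illustrative value is set by hand so the snippet runs anywhere.
export UBOONECODE_DIR=/cvmfs/uboone.opensciencegrid.org/products/uboonecode/v06_26_01_17

# See where the product lives:
echo "$UBOONECODE_DIR"
```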


Further Documentation

When you need to learn more about UPS, visit the following links.

Info on Qualifiers:
UPS Full Documentation:

Login Scripts

This section contains tips and best practices about what to include in your login scripts, such as .bashrc and .bash_profile. As a reminder, .bash_profile is executed at initialization for login shells, and .bashrc is executed at initialization for interactive non-login shells. Neither initialization script is executed for non-interactive shells (i.e. shell scripts).

  • Include in .bash_profile any "set once" environment variables that you never change.
  • It is normally a good idea to have your .bash_profile source your .bashrc. Simply add the following line to your .bash_profile.
    source ~/.bashrc
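A self-contained sketch of that pattern, using a scratch directory in place of $HOME so it runs anywhere:

```shell
# Create throwaway .bashrc and .bash_profile files.
DEMO=$(mktemp -d)

echo 'echo "bashrc was sourced"' > "$DEMO/.bashrc"

# Guarding with a file test is slightly more robust than a bare source:
cat > "$DEMO/.bash_profile" <<EOF
[ -f "$DEMO/.bashrc" ] && source "$DEMO/.bashrc"
EOF

# Simulate a login shell reading .bash_profile:
source "$DEMO/.bash_profile"
```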

You should not initialize ups or set up any ups product in your login scripts. Rather, define a convenient alias in your .bashrc to initialize ups, something like:

alias setup_uboone='source /grid/fermiapp/products/uboone/'

or, using CVMFS:

alias setup_uboone='source /cvmfs/'

The setup script does the following things:

  • Initializes ups (sets your $PRODUCTS path).
  • Sets up a few basic ups products, including git, gitflow, and mrb.
  • Sets several environment variables, including experiment-selecting environment variables used by other ups products ($EXPERIMENT, $SAM_EXPERIMENT, and $JOBSUB_GROUP), also the MicroBooNE-specific art-framework search path $FW_SEARCH_PATH.

The setup script does not set up uboonecode. Therefore, before embarking on code development or running jobs, you should set up your favorite version of uboonecode. Setting up uboonecode will set up many dependent products, including the following.

  • root
  • python
  • jobsub_client
  • sam_web_client
  • ifdhc
  • geant4
  • genie, and several other MC generators
  • larsoft
  • art

General Overviews

Info on many of the tools you will use - Getting Started With FIFE Tools

Read the landing page to learn a little about art

Read an intro about gallery

Some Exercises to Help You Get Started

Log in to a MicroBooNE interactive node

Instructions for logging in using ssh are in the section above on Getting Accounts and Logging In at Fermilab.

Some Useful Commands to Work Through to Set Up Your Environment

Run these commands on the MicroBooNE interactive node after you have logged in. They will show you some information and set up the environment for the rest of the exercises. You will only have to do these once.

  • df -h - Find out what disks are connected to the node and their capacities.
  • pwd - Show what directory you are in.
  • cd - Change directory to your home area.
  • ls - Look at the files in your directory.
  • echo $USER - Make sure the USER variable is set.
  • export USER=`whoami` - Set the USER variable if it is not already set.
  • ls /uboone/app/users/$USER - Check whether you have a directory in the user area.
  • mkdir /uboone/app/users/$USER - If it does not exist, make your directory on the app disk.
  • mkdir /uboone/data/users/$USER - If it does not exist, make your directory on the BlueArc data disk (mounted no-execute).
  • mkdir /pnfs/uboone/scratch/users/$USER - If it does not exist, make your directory on the scratch dCache disk.
  • mkdir /pnfs/uboone/persistent/users/$USER - If it does not exist, make your directory on the persistent dCache disk.
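The directory-creation steps above can be made idempotent with mkdir -p. A self-contained sketch, using a temporary root and an example username so it runs anywhere (on the real nodes the roots are /uboone and /pnfs/uboone, and you would use $USER):

```shell
# Use a scratch root and an example username so this sketch runs anywhere.
ROOT=$(mktemp -d)
USER_NAME=alice

# -p creates parent directories as needed and is a no-op if the
# directory already exists, so rerunning this is harmless.
for area in uboone/app/users uboone/data/users \
            pnfs/uboone/scratch/users pnfs/uboone/persistent/users; do
    mkdir -p "$ROOT/$area/$USER_NAME"
done

ls "$ROOT/uboone/app/users"
```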

To understand more about storage volumes on interactive nodes, please refer to Understanding Storage Volumes

Run Simulation and Reconstruction

Execute the following commands after logging in to the MicroBooNE interactive node and creating the directories above -

cd /uboone/data/users/$USER

mkdir example_july_collab_2017
cd example_july_collab_2017

# setup the uboone environment
source /cvmfs/

# see what versions of uboonecode there are
ups list -aK+ uboonecode

# setup a recent one on the list:

setup uboonecode v06_10_00 -q e10:prof

# ups automatically sets up all the packages uboonecode depends on. Show a list of active packages:
ups active

# the main executable program is called "lar".  Find out where it is:
which lar

# root is also set up
which root

# run the generator for a single proton in the MicroBooNE geometry
# all lar commands, if successful, will end with the line: Art has completed and will exit with status 0.
lar -n 1 -c prod_p_0.02-1.5GeV_isotropic_uboone.fcl

# see what you got.  It's just the generator, which makes mctruth and mcparticle data
ls -lrt

# run GEANT4 on the output of the generator made above.
lar -n 1 -c standard_g4_uboone.fcl -s `ls $PWD/prod_p_0.02-1.5GeV_isotropic_uboone_*_gen.root`

# see what you got
ls -lrt

# run the detector simulation on the G4 output.
# This generates waveforms on the wires, adds noise, digitizes and zero-suppresses the output
lar -c standard_detsim_uboone.fcl -s `ls $PWD/prod_p_0.02-1.5GeV_isotropic_uboone_*_gen_*g4.root`

# see what you got.  Newest file is on the bottom
ls -lrt

# run the production reco stages on the output of the detsim stage
#  Note: as of June 1, 2018, you should not use "standard_reco_uboone.fcl" because it is not used
#  in production and does not work with the MCC8 uboonecode releases (the v06_26_01_* series).

lar -c reco_uboone_mcc8_driver_stage1.fcl -s `ls $PWD/prod_p_0.02-1.5GeV_isotropic_*_detsim.root`
lar -c reco_uboone_mcc8_driver_stage2.fcl -s `ls $PWD/prod_p_0.02-1.5GeV_isotropic_*_reco2D.root`

Copy a File to dCache Mass Storage

Execute the following commands after logging in to the MicroBooNE interactive node, creating the directories above and executing the Simulation and Reconstruction commands -

# copy the output file to dCache scratch space
ifdh cp -D prod_p_0.02-1.5GeV_isotropic_*_reco.root /pnfs/uboone/scratch/users/$USER/

Execute SAMweb Commands

This exercise will start you accessing data files that have been defined to the MicroBooNE Data Catalog.
Execute the following commands after logging in to the MicroBooNE interactive node, creating the directories above -

mkdir samweb
cd samweb
samweb locate-file PhysicsRun-2017_7_3_13_53_23-0011921-00004_20170703T203136_bnb_hsnc0_20170703T222628_merged.root

This will print the file's location on tape. We can use that path to copy the file from tape to our local disk.
NOTE: we are working on the data disk (/uboone/data/users/...); you should never store data files on the app disk (/uboone/app/users/...).

ifdh cp -D /pnfs/uboone/data/uboone/raw/run2_swizzle_trigger_streams/mergebnb_hsnc0/prod_run2_v05_08_00_03/00/01/19/21/PhysicsRun-2017_7_3_13_53_23-0011921-00004_20170703T203136_bnb_hsnc0_20170703T222628_merged.root .

To get SAM metadata for a file for which you know the full name:
samweb get-metadata PhysicsRun-2017_7_3_13_53_23-0011921-00004_20170703T203136_bnb_hsnc0_20170703T222628_merged.root

To list raw data files for a given run:
samweb list-files "run_number=4024 and data_tier=raw" 

Look at the FIFE Monitoring

The FIFE Monitoring displays statistics on jobs, storage use, etc. This can be a useful place to look if things are not happening as you expect -

You will be prompted to log in with your Fermilab services username and password.
Click on the graphs to see what you can do.

Extra Credit: Looking Around

Execute the following commands after logging in to the MicroBooNE interactive node and running the setup commands above -

# There are four colon-separated search paths for finding stuff. They are built by the setup commands above.
# For commands:
echo $PATH

# for shared libraries:
echo $LD_LIBRARY_PATH

# for fcl files (which steer a job):
echo $FHICL_FILE_PATH

# for other files that the framework needs (like GDML files for geometry, or photon libraries):
echo $FW_SEARCH_PATH

# ups sets up environment variables per package that point to the locations of the installed products.
# Subdirectories in there include source code, headers, pre-built and installed libraries, and fcl files:
echo $UBOONECODE_DIR

# look at them all!
env | grep DIR

Further reading