This Wiki describes the implementation of an instance of OSGMM at FNAL to manage reference installations of the Geant4 software on the Open Science Grid for the purposes of validation.

Implementation details

An OSG client and an operating instance of OSGMM are installed and currently running on patvm8. This host handles the installation of the Geant4 reference software on OSG nodes supporting the Geant4 VO, and is also a node from which one can submit jobs to Geant4-VO-supporting sites on OSG.

A description of how patvm8 was installed and configured (and incidentally, how to repeat the exercise should there be a need) may be found at InstallConfigureInstructions.

OSG Match Maker quick reference.

  • See the installation and configuration Wiki page for details on how to get an installation going.
  • The OSGMM SourceForge page is the definitive place for information and documentation on the OSGMM package.
  • Check the log, maintenance-runs and verification-runs directories under ~osgmm/osg-match-maker/var for details on how the OSGMM instance is running, handling jobs and executing the maintenance and verification jobs on each site.
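When checking those directories, a small helper can save some digging. The sketch below assumes the `~osgmm/osg-match-maker/var` layout described above; the helper name and the default path are illustrative, not part of OSGMM itself.

```shell
# Location of the OSGMM state directory; adjust if your installation differs.
VAR_DIR="${VAR_DIR:-$HOME/osg-match-maker/var}"

latest_run_log() {
    # $1 is one of: log, maintenance-runs, verification-runs.
    # Prints the most recently modified entry under that directory.
    ls -1t "$VAR_DIR/$1" 2>/dev/null | head -n 1
}
```

For example, `latest_run_log verification-runs` names the newest verification run, which you can then inspect directly.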

Managing the installation of the Geant4 reference packages.

The management of the Geant4 reference software on remote sites is handled by the script source:trunk/etc/extra.maintenance-script.fork, installed in ~osgmm/osg-match-maker/etc (/usr/local/osg/osg-match-maker/etc).

The variables version and source_path control the location from which the source packages are obtained for installation on each remote site. The packages themselves should be obtained from the current release build manager at CERN, and they must carry the following architecture names to be recognized by the installation script:

  • slc4_ia32_gcc34
  • slc4_amd64_gcc34
  • slc5_x86_64_gcc43

If the list changes or grows, you will need to alter the definition of the arch_list array, and you may also want to revise or extend the identify_release() function. Since the name of the installation directory for each release is used by the execution scripts, you should ensure that the final installation directory names under $OSG_APP are recognized by the execution script, coordinating with the validation run manager as necessary.
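The arch_list/identify_release mechanics might look something like the following sketch. The architecture names match the list above, but the body of identify_release is a hypothetical reconstruction; consult the real extra.maintenance-script.fork before relying on it.

```shell
# Architectures recognized by the installation script (from the wiki text).
arch_list="slc4_ia32_gcc34 slc4_amd64_gcc34 slc5_x86_64_gcc43"

identify_release() {
    # Hypothetical sketch: strip the .tar.gz suffix and any trailing
    # architecture tag from a tarball name, leaving the release name
    # used for the installation directory under $OSG_APP.
    local release=${1%.tar.gz} arch
    for arch in $arch_list; do
        release=${release%-$arch}
    done
    echo "$release"
}
```

With this sketch, a tarball named geant4.9.4.p02-slc5_x86_64_gcc43.tar.gz would map to an installation directory named geant4.9.4.p02.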

You will need to place the tarballs in the correct place as specified by source_path using globus-url-copy. In order to do this you must be a member of the lcgadmin role of the Geant4 VO. If you need to create new directories in the structure, you will need to do so with globus-job-run <jobmanager> <command>.
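A staging session might look like the sketch below. The host, path, and tarball name are placeholders; substitute the real source_path from the maintenance script, and make sure you have a valid lcgadmin proxy (e.g. via voms-proxy-init) before running the globus commands.

```shell
GRIDFTP_HOST="gridftp.example.edu"                 # placeholder host
SOURCE_PATH="/grid/geant4/releases"                # placeholder path
TARBALL="geant4.9.4.p02-slc5_x86_64_gcc43.tar.gz"  # placeholder name

DEST_URL="gsiftp://$GRIDFTP_HOST$SOURCE_PATH/$TARBALL"

# Create the target directory first, if needed:
#   globus-job-run $GRIDFTP_HOST/jobmanager-fork /bin/mkdir -p $SOURCE_PATH
# Then copy the tarball into place:
#   globus-url-copy "file://$PWD/$TARBALL" "$DEST_URL"
echo "$DEST_URL"
```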

A note on WLCG-WMS and WLCG-BDII advertising.

It was folklore that in order for jobs to be executed on an OSG site, the site had to be advertising to the WLCG-BDII. I am not sure whether this is the case, and neither is the current release manager. The CERN-based validation is currently run by giving an explicit list of sites to the WLCG-WMS rather than allowing it to pick them itself. One would have to obtain data from the release manager on whether jobs are running successfully on non-advertising sites in order to be sure. One can check whether a site is advertising to WLCG-BDII either with the source:trunk/old/geant4-check-bdii script, or by checking the ClassAd attribute GEANT4_WLCG_BDII for that site using the condor_status command. The source:trunk/util-scripts/gkfind utility script is often handy in situations like this in order to compose the machine-specific constraint for the command.
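The condor_status check might be composed along the lines of the sketch below. The attribute name GEANT4_WLCG_BDII comes from the text above; the exact constraint and output format are assumptions, to be run on the OSGMM host where the pool is visible.

```shell
# Build a constraint selecting ads that carry the GEANT4_WLCG_BDII attribute.
CONSTRAINT="GEANT4_WLCG_BDII =!= UNDEFINED"

# Hypothetical invocation; run this on the OSGMM host (or use gkfind to
# compose a machine-specific constraint for one site):
CMD="condor_status -any -constraint '$CONSTRAINT' -format '%s\n' GEANT4_WLCG_BDII"
echo "$CMD"
```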