Matthew, Nate, Michael, Kanika, Susan, Dominick, Duyang, Chris, Ryan, Craig.

News (Matthew)

Grid access to the BlueArc /nova/app area will remain; however, /nova/data and /nova/ana will now be accessible through grid tools only. This shouldn't be an issue for production. The migration effort is being chaired by Art Kreymer.

Mr-Brem (Duyang)

Slides posted to docdb: 11998. The cosmic-removal brem module (BremShowerFilter) aims to study EM showers induced by cosmic muons. It consists of a module that runs after cosmic tracking to produce raw digits, plus a filter so that only 1% of events are processed.

After some discussion it was decided that the best way to implement this in production would probably be to include this filter in our standard reconstruction step. This will produce a second output stream that can then be fed back into reco, PID, and CAF. To do this we'll need to test the new dual-output FHiCL, implement new data tiers to hold the output objects, and then come up with a strategy for reconstructing the output data.
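A minimal sketch of what a dual-output FHiCL could look like, assuming art's standard RootOutput/SelectEvents mechanism. All module labels, path names, and file-name patterns below are placeholders, not the actual NOvA configuration, and the exact SelectEvents syntax depends on the art version in use.

```fhicl
physics:
{
  filters:
  {
    bremfilter: { module_type: BremShowerFilter }   # placeholder config
  }

  brempath:  [ bremfilter ]     # filter path: passes ~1% of events
  fullout:   [ out1 ]
  bremout:   [ out2 ]
  end_paths: [ fullout, bremout ]
}

outputs:
{
  # First stream: everything, as in the current single-stream reco job.
  out1:
  {
    module_type: RootOutput
    fileName:    "reco_%ifb.root"
  }

  # Second stream: only events passing the brem filter.  (In some art
  # versions this is written SelectEvents: { SelectEvents: [ brempath ] }.)
  out2:
  {
    module_type:  RootOutput
    fileName:     "reco_%ifb_brem.root"
    SelectEvents: [ brempath ]
  }
}
```

The second stream is what the new data tiers would need to hold before it can be fed back into reco.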

Action Item: Matthew will follow this up in an email.

Discussion of user tools for SAM analysis (Gavin et al.)

No Gavin, so we'll put this off to a later date (again).

Amazon cloud test (Paola)

Slides posted to docdb: 11999. The Grid and Cloud Services department at FNAL wants to run an AWS & FermiCloud scalability test using NOvA jobs. The proposed scale is 8,000 hours on Amazon, i.e. 1,000 VMs running 4 two-hour jobs each, or something of this form. They need this test in order to apply for an AWS science grant.
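A quick sanity check that the quoted numbers hang together. The commented submission line is only a hypothetical sketch; the real flags for steering jobs to AWS/FermiCloud would come from the Grid and Cloud Services people.

```shell
# Numbers from the slides: 1,000 VMs x 4 jobs x 2 hours each.
VMS=1000
JOBS_PER_VM=4
HOURS_PER_JOB=2
TOTAL_HOURS=$(( VMS * JOBS_PER_VM * HOURS_PER_JOB ))
echo "requested: ${TOTAL_HOURS} hours"   # 8000, matching the 8,000-hour figure

# Something like (placeholder script path; flags to be confirmed):
#   jobsub_submit -G nova -N 4000 --expected-lifetime=2h \
#       --resource-provides=usage_model=OFFSITE file:///path/to/test_job.sh
```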

To run these tests we need to update to the new jobsub client (something we need to do anyway), and we also need to define the parameters of the test jobs.

Action Item: Paola and Ken Herner will work on updating some of our scripts to do this.
Action Item: Matthew to contact the CD people with defined parameters.

Keep-up samples on BlueArc (Ryan)

Slides posted to docdb: 12000. His idea is to keep a small sample of keep-up files on BlueArc. He proposes to do this with a cronjob that scrapes the Official Datasets web page and then downloads the last three files in each dataset on a daily basis. Nate had a similar idea involving the FTS. We decided that Ryan's idea was simpler and didn't involve altering the FTS to do something outside its scope, so this will be implemented.
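A minimal sketch of the cronjob Ryan described, assuming samweb and ifdh are available on the node. The page URL, destination directory, and copy step are all placeholders; in particular, resolving a SAM filename to a physical path (e.g. with `samweb locate-file`) is omitted.

```shell
#!/bin/bash
# Hypothetical sketch of the keep-up scraper: find dataset definition
# names on the Official Datasets page, then copy the newest three files
# of each dataset to BlueArc.  Nothing here is the production config.

KEEPUP_DIR=${KEEPUP_DIR:-/nova/app/users/keepup}            # assumed destination
DATASETS_URL=${DATASETS_URL:-http://example.invalid/official_datasets}

# Print only the last N lines of a file listing (the newest files,
# assuming the listing is ordered oldest to newest).
last_n_files() {
    tail -n "$1"
}

sync_keepup() {
    # For each definition name found on the page, list its files with
    # samweb and copy the newest three.
    for defname in $(curl -s "$DATASETS_URL" | grep -o 'defname:[^"< ]*'); do
        samweb list-files "defname: ${defname#defname:}" \
            | last_n_files 3 \
            | while read -r f; do
                  ifdh cp "$f" "$KEEPUP_DIR/$f"    # placeholder copy step
              done
    done
}

# Run the sync only when invoked with "run", so the file can be sourced
# for testing.  Example crontab entry for the daily run:
#   0 3 * * * /path/to/keepup_scrape.sh run
if [ "${1:-}" = "run" ]; then
    sync_keepup
fi
```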

Action Item: Matthew will set up somewhere on SVN for this to live.

Raw2root keep-up audit (Jeny)

Her audit reveals that we are missing around 300 ND files (~100 MB) and around 100,000 FD files (~400 MB). Of these, 50 ND files and 500 FD files are in the NuMI stream. The missing files are spread across a large region in time. She suggests reprocessing them now.

PChits keep-up (Paola, Satish)

Paola is working on the scripts and expects that they should be up and running by next Thursday.

Reco keep-up (Jeny, Dominick)

Dominick provided an old script and Jeny will look at it this week.


  • Action item for the data quality group: how should we define run periods?
  • The keep-up tag is bad due to a database problem. This is fixed in development; the need to backport the change to S14-09-09 will be discussed with Jon.
  • What do we need to generate official multi-point FD files?