Support #21451

Minos SCPMT FY2019 computing support request

Added by Arthur Kreymer 12 months ago. Updated 10 months ago.

Status:
New
Priority:
Normal
Start date:
11/28/2018
Due date:
01/17/2019
% Done:

90%

Estimated time:
5.00 h
Duration: 51

Description

The requests for 2019/20/21 computing are due Dec 14.
Formal presentations are on Feb 25/26 2019 ( Minos does not present ).

History

#1 Updated by Arthur Kreymer 12 months ago

  • Due date set to 12/14/2018
  • % Done changed from 0 to 10
  • Estimated time set to 5.00 h

#2 Updated by Arthur Kreymer 12 months ago

Date: Thu, 15 Nov 2018 22:07:04 +0000
From: Margaret Votava <>
To: cs-liaison <>
Cc: scd_seniormgrs <>, Jon A Bakken <>
Subject: SCPMT 2019
Parts/Attachments:
1 OK ~2.9 KB Text
2 Shown ~12 KB Text
----------------------------------------

 

Dear Liaisons,

 

Based on the feedback we have had from the review committee, we are making several changes to the preparation material. Please
note the specifics in this email and notify your experiments and speakers accordingly.

 

1. We are moving away from experiments filling out spreadsheets and having all experiments (whether or not presenting) prepare
a slide deck based on the following template [docs.google.com].
2. SCD will prepare a personalized deck for your experiment in the upcoming week with the most recent plots and tables updated
to the best of our knowledge. Please use that specific deck for your updates. FYI the source for the plots are here
[fifemon.fnal.gov].
3. Experiments must use this deck to both convey their requests AND for the presentations. Additional slides may be added to
the deck for any special needs or concerns you may have.
4. Final presentations are due by December 14th. Please post them in SCPMT sharepoint area. [fermipoint.fnal.gov]
5. We are slightly shifting the annual reporting periods. All dates will now reflect Calendar Year and not Fiscal Year.
6. The experiments that are presenting are: dune, uboone, sbnd, icarus, nova, gm2, mu2e, cms, des, lsst. The proposed schedule
is in indico [indico.fnal.gov]. Please let me know your presenters by December 14th.
7. Slide 13 is where we really want to hear where you see computing heading for your experiment over the next few years: where
are you focusing your efforts and where you need the division to focus ours.

 

This is also the opportunity to audit various pieces of information that the division knows about your experiment. Please
confirm in email by Dec 31st that this information is current:

  • Liaison information in sharepoint [fermipoint.fnal.gov]. (This should be moving to SNOW sometime in the next year.)
  • Members of <experiment>-support assignment group in SNOW. Should be able to find it in SNOW here [fermi.service-now.com].

 

 

Also note that the FIFE workshop is Jan 17th. The division will have had time to process your requests and will feed back to
you at that time any concerns that we have as a division wrt meeting your needs.

 

Thanks,

Margaret


#5 Updated by Arthur Kreymer 11 months ago

Reply yesterday :

Date: Wed, 2 Jan 2019 21:03:24 +0000
From: Margaret Votava <>
To: Arthur E Kreymer <>
Cc: Stuart C Fuess <>
Subject: MINOS SCPMT slides

Hi Art,

 

I am looking over [your?] slides and have a few questions.

 

  • Slide two. I know the experiment is only doing analysis, but that's it in terms of computing? All questions go to Tom?
    He is physics coordinator, release manager, database expert, etc.?
  • Slide 3 – Minos has no important conferences you are targeting results for?
  • Memory usage has doubled since the summer. Do you have a sense of why that is?
  • Slide 7. Given you've used less than 6M hours in 2018, why do you expect that to double in 2019? It seems that Minos has
    always been off by a factor of two. Seems like it's time to lower your quota?
  • Slide 8. You are OSG ready, but don't appear to be running any jobs offsite. Do you know why? You are not HPC or cloud
    ready, but that's ok. You may want to say you don't expect analysis to ever run there.
  • Slide 12. We are not asking what limitations you may have wrt tape migration, but what your priorities are – ie, what
    needs to get migrated vs what can actually stay on the older media which won't be readable forever.
  • Is there a target date (ie, year) when analysis might be complete? Are you accepting new students?

 

 

Thanks,

Margaret

#6 Updated by Arthur Kreymer 11 months ago

  • % Done changed from 30 to 40

My draft answers to the questions . Corrections/improvements are welcome.
Particularly regarding OSG usage, and long term shutdown plans.

  • Slide two. I know the experiment is only doing analysis,
    but that's it in terms of computing?
    All questions go to Tom?
    He is physics coordinator, release manager, database expert, etc ?

Tom is the current Minos Offline Computing Coordinator.
All computing questions should go through him.

  • Slide 3 - Minos has no important conferences you are targeting results for?

We plan no substantial computing pushes for conferences.

  • Memory usage had doubled since the summer.
    Do you have a sense of why that is?

Recent jobs have been dominated by needs of a new paper being published.
That has changed the mix of jobs.

  • Slide 7. Given you've used less than 6M hours in 2018,
    why do you expect that to double in 2019?
    It seems that Minos has always been off by a factor of two.
    Seems like it's time to lower your quota?

The Minos Near detector, used by Minerva, shuts down in Summer 2019.
Minerva may want a full uniform analysis of the entire Minos Near dataset,
and associated Monte Carlo processing. These jobs would run as Minos.

  • Slide 8. You are OSG ready, but don't appear to be running any jobs offsite.
    Do you know why? You are not HPC or cloud ready, but that's ok.
    You may want to say you don't expect analysis to ever run there.

( Tom - have we tried OSG processing ? Were there issues ? )

So far, local resources have been sufficient for Minos analysis.
Historically, all Minos Monte Carlo generation has run on OSG sites.
Admittedly this was done before modern OSG submission tools existed.

Minos has no present plans for HPC or GPU processing.

  • Slide 12. We are not asking what limitations you may have wrt
    tape migration, but what your priorities are
    ie, what needs to get migrated vs what can actually stay
    on the older media which won't be readable forever.

We will review Data Preservation plans during 2019.

  • Is there a target date (ie, year) when analysis might be complete?
    Are you accepting new students?

Minos has about 12 papers in progress,
which we should publish in the next couple of years, by 2020.
Computing infrastructure needs to remain in place to support Minerva.
So 2022 might be a good target for a full shutdown.

( Spokes - any thoughts ?
CDF and D0 are shutting down this month, in January 2019 )

#7 Updated by Arthur Kreymer 11 months ago

  • Due date changed from 12/14/2018 to 01/17/2019

Changed the Issue Due date to Jan 17, the FIFE workshop.

#8 Updated by Arthur Kreymer 11 months ago

Date: Thu, 3 Jan 2019 22:23:19 +0000
From: Kenneth Richard Herner <>
To: Kenneth Richard Herner <>
Subject: Friendly reminder to register for the FIFE roadmap workshop on Jan 17

Hi everyone,
A Happy New Year to you all, and we're looking forward to welcoming you to the next FIFE Roadmap workshop on the 17th.
Please register so we can have an accurate count:

https://indico.fnal.gov/event/19031/

Registration is free. There will also be a Zoom connection available for those who can't make it in person.

Regards,

Ken for the FIFE Project

#9 Updated by Arthur Kreymer 10 months ago

I sent the following reply regarding the SCPMT planning slides:

Date: Tue, 15 Jan 2019 21:51:46 +0000 (GMT)
From: Arthur Kreymer <>
To: Margaret Votava <>
Cc: Stuart C Fuess <>, , , ,
Subject: Re: MINOS SCPMT slides

Slide two. I know the experiment is only doing analysis,
but that's it in terms of computing?
All questions go to Tom?
He is physics coordinator, release manager, database expert, etc ?

Tom is the current Minos Offline Computing Coordinator.
All computing questions should go through him.

Slide 3 - Minos has no important conferences
you are targeting results for?

We plan no substantial computing pushes for conferences.

Memory usage had doubled since the summer.
Do you have a sense of why that is?

Recent jobs have been dominated by needs of a new paper being published.
That has changed the mix of jobs.

Slide 7. Given you've used less than 6M hours in 2018,
why do you expect that to double in 2019?
It seems that Minos has always been off by a factor of two.
Seems like it's time to lower your quota?

The Minos Near detector, used by Minerva, shuts down in Summer 2019.
Minerva may want a full uniform analysis of the entire Minos Near dataset,
and associated Monte Carlo processing. These jobs would run as Minos.

Slide 8. You are OSG ready,
but don't appear to be running any jobs offsite.
Do you know why? You are not HPC or cloud ready, but that's ok.
You may want to say you don't expect analysis to ever run there.

So far, local resources have been sufficient for Minos analysis.
Historically, all Minos Monte Carlo generation has run on OSG sites.
Admittedly this was done before modern OSG submission tools existed.

Minos has no present plans for HPC or GPU processing.

Slide 12. We are not asking what limitations you may have wrt
tape migration, but what your priorities are
ie, what needs to get migrated vs what can actually stay
on the older media which won't be readable forever.

We will review Data Preservation plans during 2019.
Most Minos data is on LTO4 media, which will need to migrate soon.
Roughly 550 of 600 TB is on LTO4.

Is there a target date (ie, year) when analysis might be complete?
Are you accepting new students?

Minos has about 12 papers in progress,
which we should publish in the next couple of years, by 2021.

Computing infrastructure needs to remain in place to support Minerva.
So around 2022 might be a good target for a full shutdown.

#10 Updated by Arthur Kreymer 10 months ago

Date: Tue, 15 Jan 2019 23:52:25 +0000
From: Margaret Votava <>
To: Jorge Chaves <>, Thomas J. Carroll <>
Cc: Jiyeon Han <>, Gabriel Nathan Perdue <>, Stuart C Fuess <>, Arthur E Kreymer <>
Subject: Minerva needing MINOS  

Dear Jorge and Tom,

We’re looking over SCPMT presentations.
MINOS is asking for resources for Minerva (because the jobs run as Minos).
From Art’s description, Minerva may want a full uniform analysis of the entire Minos Near dataset, and associated Monte Carlo processing. 

Even though the jobs run as MINOS,
it is better if the request for that processing and storage shows up in the MINERVA presentation
and is taken out of MINOS.
It is not so straightforward for the review committee to understand the complete needs for MINERVA
if it is split across presentations.

With kind regards,

Margaret

#11 Updated by Arthur Kreymer 10 months ago

  • % Done changed from 40 to 60

I updated the CPU estimates to 6, 3, 2 Million hours,
to reflect current and decreasing analysis needs,
removing contingency for Minerva usage, at the official slide site
https://docs.google.com/presentation/d/12OKVLW2di32Z8jUNcUjQkYcZfiwzWcFLjj62Uhcqqv8/edit?ts=5c0ea7f4#slide=id.g48805930d7_2_78

#12 Updated by Arthur Kreymer 10 months ago

  • % Done changed from 60 to 90

I revised the slides to show Minerva shutdown by April 2019.

All data presently on LTO4 needs to migrate to LTO8.
