Bug #3524

limited by the number of galaxies

Added by Brian Nord over 6 years ago. Updated over 6 years ago.

Status: Closed
Priority: Urgent
Assignee: -
Category: Speed
Target version: -
Start date: 02/24/2013
Due date: -
% Done: 0%
Estimated time: -
Duration: -

Description

We need a plan to deal with large numbers of galaxies.

The pipeline slows down in Sim Gal Spec Clean when allocating enough memory to hold all of the reconstructed/simulated spectra:
Ngal = len(id_fiber)
Nwave = len(tspec_v0[0,:])
flux = np.zeros((Nwave, Ngal)) #!!! The program takes a long time just to instantiate this 2d array

First fix: reduce the number of galaxies for testing purposes.
Second fix: append to an empty array instead of instantiating the full array at the beginning; this will probably only kick the can down the road.
Longer-term fix: how do we do this for millions of galaxies by Friday? One option is to process the galaxies in fixed-size chunks; see the sketch below.
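
A minimal sketch of the chunked approach, assuming the aim is to bound peak memory by processing galaxies in fixed-size batches. The helper names (load_galaxy_ids, simulate_spectrum, write_chunk) and all sizes below are placeholders for illustration, not the pipeline's actual functions or values:

import numpy as np

# Hypothetical stand-ins for the Sim Gal Spec Clean pieces; sizes are kept small here,
# while the real pipeline has millions of galaxies.
def load_galaxy_ids():
    return np.arange(20_000)                 # placeholder for id_fiber

def simulate_spectrum(gal_id, nwave):
    return np.random.rand(nwave)             # placeholder for one reconstructed spectrum

def write_chunk(chunk, start):
    np.save(f"flux_{start:09d}.npy", chunk)  # persist the chunk, then let it leave memory

id_fiber = load_galaxy_ids()
Ngal = len(id_fiber)
Nwave = 1000                                 # placeholder for len(tspec_v0[0, :])
chunk_size = 5_000                           # peak memory ~ Nwave * chunk_size * 8 bytes

for start in range(0, Ngal, chunk_size):
    ids = id_fiber[start:start + chunk_size]
    flux = np.zeros((Nwave, len(ids)))       # small per-chunk allocation instead of (Nwave, Ngal)
    for j, gal_id in enumerate(ids):
        flux[:, j] = simulate_spectrum(gal_id, Nwave)
    write_chunk(flux, start)

The per-chunk array is released between iterations, so peak memory stays flat no matter how many galaxies there are in total.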

History

#1 Updated by Laurenz Gamper over 6 years ago

It is trying to allocate approx. 4.3 TB of RAM (on my system, where an int is 24 bytes), which is a lot. We are working on splitting the spectra into parts that can be processed sequentially.
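
For scale, a rough estimate of the allocation under assumed sizes (the Ngal and Nwave values here are guesses, not the pipeline's actual numbers; numpy's np.zeros defaults to 8-byte float64, whereas the 4.3 TB figure above was computed at 24 bytes per element, but either way the full array lands in the terabyte range):

import numpy as np

# Illustrative sizes only; the pipeline's actual Ngal and Nwave may differ.
Ngal = 10_000_000                                  # "millions of galaxies"
Nwave = 50_000                                     # assumed wavelength samples per spectrum
bytes_per_value = np.dtype(np.float64).itemsize    # 8 bytes for numpy's default float64

full_array = Ngal * Nwave * bytes_per_value
print(f"full (Nwave, Ngal) array: {full_array / 1e12:.1f} TB")   # ~4 TB at these sizes

chunk_size = 10_000                                # galaxies handled per sequential pass
per_chunk = chunk_size * Nwave * bytes_per_value
print(f"per-chunk allocation:     {per_chunk / 1e9:.1f} GB")     # ~4 GB, one chunk in memory at a time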

#2 Updated by Brian Nord over 6 years ago

  • Status changed from New to Closed

