DES Science Portal

Contact: <>

Introduction

The DES Science Portal is a web-based system being developed by DES-Brazil and LIneA for the DES collaboration.

The infrastructure provides a workflow system and an API to wrap external codes and submit jobs to a computing cluster with monitoring and provenance capabilities. Each execution is registered in the database and identified by a process id. Process results can be displayed as static HTML pages or through interactive plots.
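The provenance idea described above can be sketched minimally: each execution gets a row in a database and a process id that later identifies its results. The table and function names below are illustrative only, not the portal's actual schema or API.

```python
import sqlite3

# Illustrative process registry: every pipeline execution is registered
# and identified by a process id (names here are assumptions, not the
# portal's real schema).

def make_db():
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE process (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        pipeline TEXT NOT NULL,
        status TEXT NOT NULL DEFAULT 'submitted')""")
    return db

def submit(db, pipeline):
    """Register an execution and return its process id."""
    cur = db.execute("INSERT INTO process (pipeline) VALUES (?)", (pipeline,))
    db.commit()
    return cur.lastrowid

def set_status(db, process_id, status):
    """Update the status of a registered execution."""
    db.execute("UPDATE process SET status = ? WHERE id = ?", (status, process_id))
    db.commit()

db = make_db()
pid = submit(db, "photoz_compute")
set_status(db, pid, "done")
status = db.execute("SELECT status FROM process WHERE id = ?", (pid,)).fetchone()[0]
print(pid, status)  # 1 done
```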

The portal can play an important role in the collaboration by helping with the quality assessment of data releases and the creation of science-ready data products.

The main infrastructure projects are:

  • Quick Reduce
    Quick assessment of the image quality during observations, running at CTIO since October 2012
  • Data Release Interface
    Tools for visual inspection of the images and associated catalogs (DES Sky, DES Tiles, DES Targets, Upload Interface, Cutout Service and User Query). A prototype with SV and Y1 data has been running at Fermilab since April 2014 and will be deployed to NCSA
https://des-portal.fnal.gov  (same redmine username/password)
  • Catalog Infrastructure
    The catalog infrastructure is composed of a set of pipelines for data installation, data preparation and the creation of science-ready catalogs for LSS, Cluster, GA and GE. The infrastructure is running at LIneA and will be deployed to NCSA to support the Science Release process.

Science-ready data products

Wiki page with the Science Portal products available in the NCSA database

Operations Model

Currently we have instances of the portal running at LIneA and Fermilab. Another instance is being installed at NCSA, integrated with the DES Science Database and the DES archive server. Until that instance is in production, the current operations model takes advantage of the computing infrastructure at LIneA to process large datasets and of the instance running at Fermilab (https://des-portal.fnal.gov), which provides open access to the collaboration.

Catalog Infrastructure

The infrastructure we are building consists of three stages:

1- The first stage, called Data Installation, transfers the data products released by DESDM and ingests them into the portal; it also provides tools for image visualization and for associating images with catalogs.

2- The second stage, called Data Preparation, provides pipelines to create training sets, to train and compute photo-z with several codes, and to run different S/G (star/galaxy) classification algorithms. These results are registered in a products database and used to create science-ready catalogs in an automated way.

3- The third stage is where we create Science-ready Catalogs, including:

  • science-ready catalogs, where one chooses the set of parameters critical for a specific science application and prunes the catalog using region and object selections, with default or pre-defined configurations from the Science Working Groups
  • generic catalogs for distribution, with user-defined cuts and the possibility of combining the release data with results from various photo-z and S/G classification methods.
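The three-stage flow above can be sketched as a chain of functions. The stage names follow the text; the function bodies and dictionary keys are placeholders, not the portal's actual implementation.

```python
# Illustrative sketch of the three catalog-infrastructure stages described
# above (all names and structures here are assumptions for illustration).

def data_installation(raw_release):
    """Stage 1: ingest the data products released by DESDM into the portal."""
    return {"objects": raw_release, "installed": True}

def data_preparation(installed):
    """Stage 2: add photo-z and S/G classification results."""
    prepared = dict(installed)
    prepared["photoz"] = True
    prepared["sg_class"] = True
    return prepared

def science_ready_catalog(prepared, science_case):
    """Stage 3: prune columns/regions for a specific science application."""
    return {"science_case": science_case, "inputs": prepared}

catalog = science_ready_catalog(
    data_preparation(data_installation(["coadd_objects"])), "LSS")
print(catalog["science_case"])  # LSS
```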
| Stage | Pipeline | Description |
|---|---|---|
| Data Installation | Upload | Upload ancillary data to the portal, such as spectroscopic catalogs used to build training sets for photo-z codes |
| | Install Catalog | Transfer and ingest the coadd objects table produced by DESDM |
| | Install Mangle Mask | Transfer and ingest the molygon, the coadd_objects_molygon association and the healpix map version of the mangle produced by DESDM into the portal |
| | Install Depth Map | Ingest depth maps created by the Science Release group into the portal |
| | Install Bad Regions Map | Ingest the bad regions map created by the Science Release group into the portal |
| | Systematic Maps | Create systematic maps; implements the method and code by Boris Lei |
| | Zeropoint Adjustment | Implements SLR (bigmacs) and applies SFD98 or Planck extinction correction |
| | Spectroscopic Sample | Creates a spectroscopic sample unifying the uploaded spectroscopic catalogs |
| Data Preparation | S/G Separation | Implements the S/G methods Class Star, Spread Model, Modest, Y1 Modest v1, Y1 Modest v2, Y1 Modest v3 |
| | Training Set Maker | Matches the spectroscopic sample with coadd objects and creates training and validation samples |
| | Photo-z Training | Creates training files for ANNz2, MLZ/TPZ, DNF, LePhare, ArborZ, SkyNet, PofZ |
| | Photo-z Compute | Computes photo-z with ANNz2, MLZ/TPZ, DNF and LePhare |
| | Galaxy Properties | Computes galaxy properties using LePhare |
| Science-ready Catalogs | LSS | Creates a catalog for LSS science |
| | Cluster | Creates a catalog for Cluster science |
| | GA | Creates a catalog for GA science |
| | GE | Creates a catalog for GE science |
| | Generic | Creates a generic catalog |
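The Training Set Maker step above matches the spectroscopic sample against the coadd objects. A hypothetical sketch of such a positional match is shown below, using a flat-sky small-angle approximation with a tolerance in arcseconds; a real pipeline would use proper spherical matching (e.g. a KD-tree on the sphere), and all names here are illustrative.

```python
import math

# Illustrative nearest-neighbor match between a spectroscopic sample and a
# photometric catalog (flat-sky approximation; not the portal's actual code).

def match(spec, phot, tol_arcsec=1.0):
    """Return (spec_index, phot_index) pairs closer than tol_arcsec.

    spec, phot: lists of (ra_deg, dec_deg) tuples.
    """
    tol_deg = tol_arcsec / 3600.0
    pairs = []
    for i, (ra_s, dec_s) in enumerate(spec):
        best, best_d = None, tol_deg
        for j, (ra_p, dec_p) in enumerate(phot):
            # RA separation shrinks with cos(dec) away from the equator
            dra = (ra_s - ra_p) * math.cos(math.radians(dec_s))
            d = math.hypot(dra, dec_s - dec_p)
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            pairs.append((i, best))
    return pairs

spec = [(10.0, -30.0), (10.5, -30.2)]
phot = [(10.0001, -30.0001), (50.0, 10.0)]
print(match(spec, phot))  # only the first spec object has a counterpart
```

Objects with a match would then be split into training and validation samples for the photo-z codes.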

Useful Links: