The Dark Energy Survey (DES) collaboration will study cosmic acceleration with a 5000 deg<sup>2</sup> <i>grizY</i> survey in the
southern sky over 525 nights from 2011-2016. The DES data management (DESDM) system will be used to process
and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated
archive, a processing framework, an ensemble of astronomy codes, and a data access framework. We are developing the
DESDM system for operation in the high performance computing (HPC) environments at the National Center for
Supercomputing Applications (NCSA) and Fermilab. Operating the DESDM system in an HPC environment offers
both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive
tasks such as large scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and
massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the
public through a virtual-observatory-compatible web portal. Our approach leverages investments in publicly available
HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the
storage and database platforms, orchestration nodes, and web-portal nodes that are specific to DESDM. In Fall 2007, we tested
the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES
nights (3 TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database.
We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera.
Comparisons with truth tables for the simulated data and internal cross-checks for the real data indicate
excellent astrometric and photometric data quality.
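As a rough illustration (not part of the original abstract), the Fall 2007 TeraGrid test figures imply the following per-night rates; the variable names and the back-of-envelope calculation are our own:

```python
# Back-of-envelope per-night rates from the Fall 2007 TeraGrid test
# (input numbers taken from the abstract; calculation is illustrative only).
nights = 10
raw_tb = 3.0              # raw simulated data processed over the 10 nights
objects = 250_000_000     # objects ingested into the DES Archive database

raw_per_night_gb = raw_tb * 1024 / nights   # ~307 GB of raw data per night
objects_per_night = objects // nights       # 25 million objects per night
```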
The Dark Energy Survey (DES; operations 2009-2015) will address the nature of dark energy using four independent and complementary techniques: (1) a galaxy cluster survey over 4000 deg<sup>2</sup> in collaboration with the South Pole Telescope Sunyaev-Zel'dovich effect mapping experiment, (2) a cosmic shear measurement over 5000 deg<sup>2</sup>, (3) a galaxy angular clustering measurement within redshift shells to redshift z = 1.35, and (4) distance measurements to 1900 Type Ia supernovae. The DES will produce 200 TB of raw data in four bands. These data will be processed into science-ready images and catalogs and co-added into deeper, higher-quality images and catalogs. In total, the DES dataset will exceed 1 PB, including a 100 TB catalog database that will serve as a key science analysis tool for the astronomy/cosmology community. The data rate, volume, and duration of the survey require a new type of data management (DM) system that (1) offers a high degree of automation and robustness and (2) leverages existing high-performance computing infrastructure to meet the project's DM targets. The DES DM system consists of (1) grid-enabled, flexible, and scalable middleware developed at NCSA for the broader scientific community, (2) astronomy
modules that build upon community software, and (3) a DES archive to support automated processing and to serve DES catalogs and images to the collaboration and the public. In the recent DES Data Challenge 1 we deployed and tested the first version of the DES DM system, successfully reducing 700 GB of raw simulated images into 5 TB of reduced data products and cataloguing 50 million objects with calibrated astrometry and photometry.
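As a hedged sanity check of the quoted volumes (our own arithmetic, not a figure from the abstract), the Data Challenge 1 raw-to-reduced expansion factor, applied to the projected 200 TB of raw data, is consistent with the stated >1 PB total:

```python
# Illustrative consistency check of the abstract's data volumes.
# The DC1 numbers (700 GB raw -> 5 TB reduced) and the 200 TB raw
# projection come from the abstract; the extrapolation is ours.
raw_dc1_gb = 700
reduced_dc1_gb = 5 * 1024
expansion = reduced_dc1_gb / raw_dc1_gb          # ~7.3x raw-to-reduced growth

raw_survey_tb = 200
projected_reduced_pb = raw_survey_tb * expansion / 1024  # ~1.4 PB
```

A ~7x expansion from raw images to reduced data products, applied survey-wide, lands comfortably above the 1 PB total the abstract quotes.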