The Dark Energy Survey (DES) collaboration will study cosmic acceleration with a 5000 deg² grizY survey in the
southern sky over 525 nights from 2011 to 2016. The DES data management (DESDM) system will be used to process
and archive these data and the resulting science-ready data products. The DESDM system consists of an integrated
archive, a processing framework, an ensemble of astronomy codes and a data access framework. We are developing the
DESDM system for operation in the high performance computing (HPC) environments at the National Center for
Supercomputing Applications (NCSA) and Fermilab. Operating the DESDM system in an HPC environment offers
both speed and flexibility. We will employ it for our regular nightly processing needs, and for more compute-intensive
tasks such as large scale image coaddition campaigns, extraction of weak lensing shear from the full survey dataset, and
massive seasonal reprocessing of the DES data. Data products will be available to the Collaboration and later to the
public through a Virtual Observatory-compatible web portal. Our approach leverages investments in publicly available
HPC systems, greatly reducing hardware and maintenance costs to the project, which must deploy and maintain only the
storage and database platforms and the orchestration and web-portal nodes that are specific to DESDM. In Fall 2007, we tested
the current DESDM system on both simulated and real survey data. We used TeraGrid to process 10 simulated DES
nights (3 TB of raw data), ingesting and calibrating approximately 250 million objects into the DES Archive database.
We also used DESDM to process and calibrate over 50 nights of survey data acquired with the Mosaic2 camera.
Comparisons to truth tables for the simulated data and internal cross-checks for the real data indicate
that the astrometric and photometric data quality is excellent.