New instrumentation that produces extremely large quantities of data presents a challenge in data processing and
management. The Solar Dynamics Observatory (SDO) will house instruments that will produce 1.4 TB of data per day,
and processing and storing that quantity of data is a serious challenge. The instrument team for the Atmospheric Imaging
Assembly (AIA) that will fly on SDO spent the last two weeks of September conducting a large-scale, side-by-side comparison of
archive equipment from Apple, BlueArc, EMC, Network Appliance, SGI and Sun Microsystems. Each vendor provided
100 TB of SATA disk space and the required servers to showcase its unique solution to the problem of petabyte-sized
archives. The results of the testing demonstrate some of the options available in this arena. We will discuss the results of
the testing, the differences and similarities among the vendors, and the applicability of the technologies to various applications.
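To see why the quoted 1.4 TB/day rate leads to petabyte-sized archives, a back-of-the-envelope calculation is enough. This sketch is illustrative only; the mission duration used here is an assumption, not a figure from the abstract.

```python
# Rough archive sizing from the abstract's 1.4 TB/day data rate.
DAILY_RATE_TB = 1.4        # TB/day, from the abstract
DAYS_PER_YEAR = 365

yearly_tb = DAILY_RATE_TB * DAYS_PER_YEAR       # roughly 511 TB per year
years_to_petabyte = 1000.0 / yearly_tb          # years until 1 PB accumulates

print(f"~{yearly_tb:.0f} TB/year; ~{years_to_petabyte:.1f} years to reach 1 PB")
```

At this rate the archive crosses the petabyte mark in about two years, so any multi-year mission must plan for multi-petabyte storage from the outset.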

System management for any organization can be a challenge, but satellite projects present their own issues. I will present the network and system architecture chosen to support the scientists in the Chandra X-ray Center. My group provides the infrastructure for science data processing, mission planning, user support, archive support and software development. Our challenge is to create a stable environment with enough flexibility to roll with the changes during the mission.

I'll discuss system and network choices, web services, backups, security and systems monitoring. I'll also cover how to build infrastructure that's flexible, how to support a large group of scientists with a relatively small staff, the challenges we faced (anticipated and unanticipated) and the lessons we learned over the six years since the launch of Chandra. Finally, I'll outline our plans for the future, including Beowulf cluster support, an improved helpdesk system, and methods for dealing with the explosive growth in the amount of data that needs to be managed.