Sophisticated robotic platforms with diverse sensor suites are quickly replacing the eyes and ears of soldiers on the complex battlefield. The Army Research Laboratory (ARL) in Adelphi, Maryland, has developed a robot-based acoustic detection system that detects an impulsive noise event, such as a sniper's weapon firing or a door slam, and activates a pan-tilt unit to orient visible and infrared cameras toward the detected sound. Once the cameras are cued to the target, onboard image processing can track the target and/or transmit the imagery to a remote operator for navigation, situational awareness, and target detection. Such a vehicle can provide reconnaissance, surveillance, and target acquisition for soldiers, law enforcement, and rescue personnel, removing them from hazardous environments.

ARL's primary robotic platforms carry 16-in.-diameter, eight-element acoustic arrays; a 9-in. array is also being developed in support of DARPA's Tactical Mobile Robot program. The robots have been tested in both urban and open terrain. The current acoustic processing algorithm has been optimized to detect the muzzle blast from a sniper's weapon and to reject many interfering noise sources, such as wind gusts, generators, and self-noise. Additional detection algorithms for speech and for vehicle detection/tracking are being developed for implementation on this and smaller robotic platforms.

Collaboration between two robots, each with a known position and orientation, can provide triangulation information for more precise localization of acoustic events. These robots can serve as mobile sensor nodes in a larger, more expansive sensor network that may include stationary ground sensors, UAVs, and other command and control assets. This report documents the performance of the robot's acoustic localization, describes the algorithm, and outlines future work.
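The report does not give the detection algorithm here, but the idea of flagging an impulsive event while rejecting steady interference (wind, generators, self-noise) can be illustrated with a minimal sketch: a short-time energy detector that compares each frame against a slowly adapting background estimate. The function name, frame size, and threshold below are illustrative assumptions, not ARL's actual implementation.

```python
def detect_impulse(samples, frame=256, ratio_thresh=8.0):
    """Flag frames whose short-time energy jumps sharply above a running
    background estimate -- a toy onset detector for impulsive events such
    as a muzzle blast or a door slam (hypothetical, not the ARL algorithm).

    Returns the indices of frames flagged as impulsive.
    """
    hits = []
    background = None
    for i in range(0, len(samples) - frame + 1, frame):
        # Mean squared amplitude of the current frame.
        energy = sum(s * s for s in samples[i:i + frame]) / frame
        if background is not None and energy > ratio_thresh * background:
            hits.append(i // frame)
        # Adapt the background slowly so steady noise sources
        # (wind gusts, generators, self-noise) are absorbed
        # into the baseline rather than flagged.
        if background is None:
            background = energy
        else:
            background = 0.95 * background + 0.05 * energy
    return hits
```

A sustained noise source raises the background estimate gradually and is never flagged, whereas a sudden blast exceeds the ratio threshold in a single frame.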
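The two-robot triangulation mentioned above amounts to intersecting two bearing rays in a shared world frame. As a sketch of that geometry (the function and its interface are illustrative assumptions, not the report's implementation), each robot contributes its position and a world-frame bearing to the event, and the source estimate is the ray intersection:

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Estimate an acoustic source position from two bearing measurements.

    p1, p2: (x, y) positions of the two robots in a shared world frame.
    bearing1, bearing2: world-frame bearings to the event, in radians.
    Returns the (x, y) intersection of the two bearing rays, or None if
    the bearings are (nearly) parallel and no unique intersection exists.
    """
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    # Solve p1 + t1*d1 = p2 + t2*d2 as a 2x2 linear system for t1.
    det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(det) < 1e-9:
        return None  # rays parallel: degenerate geometry
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det  # Cramer's rule
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

In practice the bearings carry error, so the estimate degrades as the rays approach parallel; this is why the relative geometry of the two robots matters for localization precision.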