Intelligence, surveillance, and reconnaissance (ISR) operations in urban environments can be particularly challenging, due in part to the proximity and height of buildings, which can occlude sensor coverage. Operational and laboratory settings have shown that it is very difficult for a single operator to manually track a moving target through an urban environment using a set of ground-based, steerable sensors. Although computer vision technologies are available for automated tracking, they are often unreliable under variations in lighting, visibility, and visual clutter. As a result, the Air Force Research Laboratory (AFRL) is developing novel interface technologies that leverage automation to flexibly assist a human operator with the task of tracking one or more moving targets across an array of fixed pan-tilt-zoom (PTZ) electro-optical (EO) sensors in an urban environment. The automated functions explored in this research effort focus on maintaining visual momentum and include automated sensor steering and system-recommended perspective switching. These automated functions were compared against each other and against a baseline (no-automation) condition; operator performance improved as the level of automated assistance increased. These results indicate that surveillance technologies should incorporate automation. Further research is recommended to identify additional operator functions that could be automated to overcome the common challenges of real-time target tracking in an urban environment.
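To make the automated sensor steering concept concrete, the sketch below shows one minimal way a system could compute the pan and tilt angles needed to aim a fixed PTZ sensor at an estimated target position. This is an illustrative geometric example only, not the AFRL implementation; the function name, coordinate frame (shared x, y, z in meters, pan measured from the +x axis, tilt as elevation from horizontal), and angle conventions are all assumptions made for this sketch.

```python
import math

def pan_tilt_to_target(sensor_pos, target_pos):
    """Compute (pan, tilt) in degrees to aim a fixed PTZ sensor at a target.

    Both positions are (x, y, z) tuples in a shared frame, in meters.
    Pan is the azimuth from the +x axis in the ground plane; tilt is the
    elevation angle from horizontal (negative means looking down).
    """
    dx = target_pos[0] - sensor_pos[0]
    dy = target_pos[1] - sensor_pos[1]
    dz = target_pos[2] - sensor_pos[2]
    pan = math.degrees(math.atan2(dy, dx))          # azimuth to target
    ground_range = math.hypot(dx, dy)               # horizontal distance
    tilt = math.degrees(math.atan2(dz, ground_range))  # elevation angle
    return pan, tilt

# Hypothetical example: rooftop sensor at (0, 0, 30) aiming at a
# ground-level target at (40, 40, 0).
pan, tilt = pan_tilt_to_target((0.0, 0.0, 30.0), (40.0, 40.0, 0.0))
```

In a real steering loop, angles like these would be recomputed as the target-track estimate updates, and the system would also decide when to hand off to (or recommend) a neighboring sensor with a better, unoccluded perspective.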