Surveillance and spatial-awareness systems that require the capture of panoramic imagery pose new challenges with respect to latency, data fusion and cost. Arrays of image sensors offer one means of capturing panoramic imagery, and imagery from such multi-sensor arrays may be fused through data capture and subsequent computer manipulation. However, this approach introduces several forms of latency of concern to military users, cost implications for volume markets, and size/power considerations for remote or mobile systems. For an array of image sensors aligned such that their overlapping fields of view cover consecutive parts, in azimuth and/or elevation, of a common scene, staggering their relative synchronization in line and frame timing allows time-continuous luminance information to be considered to exist around the array. Fusion of this time-continuous luminance information may be achieved live at source by tapping luminance from consecutive sensors and switching source at the point of image correspondence. When the fused luminance stream is punctuated by the insertion of new synchronization information, the live panoramic imagery may be displayed on conventional displays. This paper addresses the design considerations of such systems: latency, at-source data-fusion control loops, health monitoring, and additional data-rate processing.
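The staggered-synchronization scheme described above can be illustrated with a toy model. The sketch below is an assumption-laden illustration, not the paper's implementation: all sensor counts, pixel dimensions, overlap widths and clock periods are invented for the example. It computes the line-sync delay for each sensor so that the active scan of one sensor hands over to the next within their overlap region, and selects which sensor's luminance to tap at each panoramic pixel, i.e. switching source at the correspondence boundary.

```python
# Toy model of staggered line synchronization across a sensor array and
# at-source luminance switching. All parameter values are illustrative
# assumptions, not taken from the paper.

N_SENSORS = 4          # sensors arranged in azimuth (assumed)
ACTIVE_PIXELS = 640    # active pixels per sensor line (assumed)
OVERLAP = 40           # pixels shared by adjacent fields of view (assumed)
PIXEL_CLOCK_NS = 74    # assumed pixel period in nanoseconds

def sync_offsets_ns(n=N_SENSORS):
    """Line-sync delay for each sensor so that scanning is continuous:
    sensor k starts its line exactly when sensor k-1 reaches the
    start of their shared overlap region."""
    step = (ACTIVE_PIXELS - OVERLAP) * PIXEL_CLOCK_NS
    return [k * step for k in range(n)]

def source_for_pixel(p, n=N_SENSORS):
    """Return the index of the sensor whose luminance is tapped for
    panoramic pixel p; the source switches at the overlap boundary."""
    span = ACTIVE_PIXELS - OVERLAP   # unique pixels contributed per sensor
    return (p // span) % n

if __name__ == "__main__":
    print(sync_offsets_ns())                              # staggered delays
    print([source_for_pixel(p) for p in (0, 599, 600)])   # source handover
```

In this model each sensor contributes its non-overlapping span of pixels to the fused line, so the panoramic luminance stream is continuous in time without frame buffering; a real system would additionally fine-tune the switching point to the measured image correspondence rather than a fixed geometric offset.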