As sensor technology advances, air- and space-borne military systems are being equipped with an increasing variety of highly specialized sensors. These sensors perform specific tasks centered on the detection, identification, and tracking of various objects. In a multiple-sensor environment a great deal of information is constantly produced and is available, at some level, for use by other sensors. The majority of today's sensors, however, operate independently; information that could collectively improve sensor performance is not being shared. The Wright Laboratory Armament Directorate has recently discovered greatly extended capabilities in a uniquely powerful set of Image Flow/Inertial algorithms being developed for the Smart Tactical Autonomous Guidance program. These new capabilities hold great promise for direct application to many aspects of the information-sharing problem. Target and scene image transformation algorithms are under development that may permit translation of target and scene information to a person (or machine) located at some other vantage point. It is believed that this translation can be performed such that the person (or machine) receiving the image can view it, properly scaled and transformed, as if it were being obtained directly from the perspective of the new vantage point.
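The report does not specify the transformation algorithms themselves. As a rough illustration of the kind of viewpoint transfer described, the sketch below uses a plane-induced homography, H = K2 (R - t n^T / d) K1^-1, which warps pixels seen by one camera into the image plane of a second camera under the simplifying assumption that the viewed scene is (locally) planar. The function names, the planar-scene assumption, and the use of pinhole intrinsics are illustrative choices, not elements of the program's actual algorithms.

```python
import numpy as np

def plane_induced_homography(K1, K2, R, t, n, d):
    """Homography mapping pixels in camera 1 to pixels in camera 2,
    valid for points on a world plane with unit normal n at distance d
    from camera 1 (a standard simplifying assumption, not the report's
    method). K1, K2 are 3x3 pinhole intrinsics; (R, t) is the pose of
    camera 2 relative to camera 1."""
    H = K2 @ (R - np.outer(t, n) / d) @ np.linalg.inv(K1)
    return H / H[2, 2]  # normalize so the scale is fixed

def warp_point(H, u, v):
    """Apply a homography to one pixel (u, v), returning the warped pixel."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Sanity check: identical cameras (R = I, t = 0) must give the
# identity homography, i.e. every pixel maps to itself.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
H = plane_induced_homography(K, K, np.eye(3), np.zeros(3),
                             np.array([0.0, 0.0, 1.0]), 5.0)
u2, v2 = warp_point(H, 100.0, 50.0)
```

A second sensor (or operator display) holding only its own intrinsics and the relative pose to the reporting sensor could, under these assumptions, re-render the shared imagery as if viewed from its own vantage point.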