The central concept of the Air Force Smart Tactical Autonomous Guidance (STAG) program is the use of passive millimeter wave (PMMW) imagery to enable an autonomous vehicle to perform its own smart guidance and attack. The algorithms on board the vehicle use image flow to derive the range information needed for real-time navigation updates. Results of a natural-imagery feasibility program aimed at validating the STAG approach will be reported. PMMW imagery will be taken from land-based vantage points that mimic the geometry of an airborne, down-looking sensor. In essence, a hardware-in-the-loop simulation will be performed in which the PMMW data are real and the loop extends over many miles of outdoor terrain. The only departure from an actual mission is that it will not be real-time: all imagery will be gathered using frame times consistent with existing camera capabilities, and image flow and other data processing will be done off-line. The key ingredients are the sequences of imagery and the computer processing of that imagery; the means for accomplishing both have been developed under the STAG program. The camera to be used operates at W-band and consists of an f/1, refractive, telecentric image-forming system with a 30 cm diameter input aperture. Image flow involves a model whose parameters are determined via automated pixel tracking from frame to frame. Passive range maps are then generated, and navigation is accomplished through subsequent correlation of these maps with reference elevation maps. Automatic target recognition is also addressed.
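The range-from-image-flow step can be illustrated with a minimal sketch. This is not the STAG algorithm itself (which fits a model whose parameters come from automated pixel tracking); it assumes the simplest case of a camera translating along its optical axis at a known speed, where the flow field is radial about the focus of expansion and its magnitude at each pixel scales as r/Z, so depth follows directly from the flow magnitude. The function name and arguments are illustrative, not from the source.

```python
import numpy as np

def range_from_flow(flow_u, flow_v, vz, cx, cy):
    """Illustrative depth-from-flow for pure forward translation.

    For a pinhole camera moving along its optical axis at speed vz,
    the image flow at a pixel a distance r from the focus of
    expansion (cx, cy) has magnitude vz * r / Z, so the per-pixel
    range estimate is Z = vz * r / |flow|.
    """
    h, w = flow_u.shape
    # Pixel coordinates relative to the focus of expansion.
    x, y = np.meshgrid(np.arange(w) - cx, np.arange(h) - cy)
    r = np.hypot(x, y)              # distance from focus of expansion
    mag = np.hypot(flow_u, flow_v)  # flow magnitude per pixel
    with np.errstate(divide="ignore", invalid="ignore"):
        z = vz * r / mag            # undefined where flow vanishes
    return z
```

In this toy geometry the estimate is exact; with real PMMW sequences the tracked flow is noisy and the camera motion is general, which is why the approach described above fits a parametric flow model before generating the passive range maps that are correlated against reference elevation maps.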