Unmanned aircraft systems (UAS) have gained utility in the Navy for many purposes, including facility needs, security, and intelligence, surveillance, and reconnaissance (ISR). UAS surveys can be employed in place of personnel to reduce safety risks, but they generate significant quantities of data that often require manual review. Research and development of automated methods to identify targets of interest in this type of imagery can provide multiple benefits, including increased efficiency, decreased cost, and potentially saved lives through the identification of hazards or threats. This paper presents a methodology for efficiently and effectively identifying cryptic target objects in UAS imagery. The approach involves flying and processing airborne imagery in low-light conditions to find low-profile objects (i.e., birds) in beach and desert-like environments. The object classification algorithms address the low-light conditions and the low-profile nature of the objects of interest using cascading models and a tailored deep convolutional neural network (CNN) architecture. The models were able to identify and count endangered birds (California least terns) and their nesting sites on beaches from UAS survey data, achieving negative/positive classification accuracies on candidate images upwards of 97% and an <i>f</i><sub>1</sub> score for detection of 0.837.
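For readers unfamiliar with the detection metric cited above, the <i>f</i><sub>1</sub> score is the harmonic mean of precision and recall, computable directly from true-positive, false-positive, and false-negative counts. The sketch below is illustrative only; the counts used are hypothetical and are not taken from the paper's results.

```python
# Illustrative computation of the f1 score from detection counts.
# The counts below are hypothetical examples, not the paper's data.
def f1_score(tp: int, fp: int, fn: int) -> float:
    """Harmonic mean of precision and recall; equals 2*TP / (2*TP + FP + FN)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts: 90 correct detections, 10 false alarms, 25 misses.
print(round(f1_score(tp=90, fp=10, fn=25), 3))  # → 0.837
```

Note that a high negative/positive classification accuracy and a lower <i>f</i><sub>1</sub> can coexist when true targets are rare in the candidate images, since accuracy is dominated by the abundant negative class while <i>f</i><sub>1</sub> weights only detections.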