Autonomous air vehicles (AAVs) are used for survey, patrol, and exploration. Command and control of such vehicles is typically performed by a human operator using a joystick and a visual camera feed, and most perception, target detection, and maneuvering is likewise done by the human. This approach leaves little room for autonomy to evolve further, and conventional path-planning algorithms are limited in scope and do not account for real-time events.

A proposed fly-eye antenna array system gives the vehicle the opportunity to scan its surroundings and detect multiple targets. It consists of angularly spaced, overlapping directional antennas with wide-area coverage; each directional antenna is coupled with a front-end circuit and a digital processor. Signals from such an antenna array record information about position, direction, and target proximity, with good penetration through scattering media.

Object detection using radio-frequency signals from these sensors is challenging. An object reflects electromagnetic signals arriving from a multitude of the fly-eye antennas, and the location, shape, and size of the object are determined from the correlative values obtained across the sensors. Such patterns are recorded and used to train a convolutional neural network. The reflected and positional information from the radar sensors provides the pattern system with valuable input: nine sensors covering a 360° view, summarized as a correlative signal-strength matrix. Combining signals from the antenna array with artificially labeled data makes the system intelligent. This approach is much simpler than other RF signal-processing approaches, and we expect it to offer better generalization and performance.
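To make the sensing geometry concrete, the following is a minimal numerical sketch of how nine angularly spaced, overlapping directional antennas could yield a correlative signal-strength matrix for a single point target. The cosine-shaped lobe model, the two-way 1/r⁴ echo falloff, and all function names are illustrative assumptions, not the paper's actual front-end model.

```python
import numpy as np

NUM_SENSORS = 9
SECTOR = 2 * np.pi / NUM_SENSORS  # 40 degrees of boresight spacing per antenna

def antenna_gains(target_bearing_rad):
    """Directional gain of each antenna toward a target bearing.
    Overlapping, clipped cosine lobes (an assumed pattern model)."""
    boresights = np.arange(NUM_SENSORS) * SECTOR
    # wrapped angular difference between target bearing and each boresight
    diff = np.angle(np.exp(1j * (target_bearing_rad - boresights)))
    # widened cosine lobe so neighboring sectors overlap; no negative gain
    return np.clip(np.cos(diff / 1.5), 0.0, None)

def signal_strengths(bearing, distance, rcs=1.0):
    """Received echo power per sensor: two-way gain (g^2) with 1/r^4 radar falloff."""
    g = antenna_gains(bearing)
    return rcs * g**2 / distance**4

def correlation_matrix(strengths):
    """9x9 correlative signal-strength matrix: pairwise products of sensor echoes.
    This is the kind of pattern that could be fed to a CNN as a training sample."""
    return np.outer(strengths, strengths)

def estimate_bearing(strengths):
    """Strength-weighted circular mean of boresight angles, as a crude bearing estimate."""
    boresights = np.arange(NUM_SENSORS) * SECTOR
    z = np.sum(strengths * np.exp(1j * boresights))
    return np.angle(z) % (2 * np.pi)
```

A usage example: a target at a 100° bearing excites the antennas whose lobes face it, the outer product of the nine echo strengths gives a symmetric 9×9 pattern matrix, and the weighted circular mean recovers the bearing for this idealized noise-free model.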