Deep neural networks have been widely used in remote sensing image classification. However, training a classifier requires a large number of labelled samples, which are costly to obtain. We propose a classification method based on active learning with a deep neural network. First, the deep neural network is trained on an initial set of labelled samples to obtain an initial classifier; active learning then selects the most informative samples from the unlabelled pool to be labelled by experts, and the newly labelled samples are added back into the training set, so that the classifier is updated iteratively. This method needs only a small number of training samples to match, or even exceed, the classification accuracy achievable with a large training set.
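The iterative loop described above can be sketched with uncertainty sampling, a common query strategy in active learning. This is a minimal illustration, not the authors' implementation: the data is synthetic, the "deep neural network" is replaced by a small logistic-regression model trained by gradient descent, and the batch sizes and round counts are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data standing in for remote sensing samples
# (dimensions and the generating model are illustrative assumptions).
X = rng.normal(size=(500, 4))
w_true = np.array([1.5, -2.0, 0.5, 1.0])
y = (X @ w_true + rng.normal(scale=0.5, size=500) > 0).astype(float)

def train(X, y, steps=500, lr=0.1):
    """Fit a logistic-regression stand-in for the classifier."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Small initial labelled set; the rest plays the role of unmarked samples.
labelled = list(range(20))
unlabelled = list(range(20, 500))

for _ in range(5):
    w = train(X[labelled], y[labelled])
    p = 1.0 / (1.0 + np.exp(-(X[unlabelled] @ w)))
    # Uncertainty sampling: query the samples whose predicted
    # probability is closest to 0.5 (the most informative ones).
    order = np.argsort(np.abs(p - 0.5))[:10]
    picked = [unlabelled[i] for i in order]
    # The "expert" labels the queried samples (here: the known truth),
    # and they rejoin the training set for the next iteration.
    labelled += picked
    unlabelled = [i for i in unlabelled if i not in picked]

w = train(X[labelled], y[labelled])
acc = ((1.0 / (1.0 + np.exp(-(X @ w))) > 0.5) == y).mean()
print(f"labelled: {len(labelled)}, accuracy: {acc:.2f}")
```

After five query rounds only 70 of the 500 samples are labelled, yet the classifier is trained on the points it was least certain about, which is the effect the abstract describes.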
Standard parts are necessary components of mechanical structures such as bogies and connectors; if standard parts are lost, these structures may shatter or loosen. Real-time standard-parts inspection systems are therefore essential to guarantee their safety. Researchers favour inspection systems based on deep learning because it performs well on images with complex backgrounds, which are common in standard-parts inspection. A typical detection system contains two basic components: a feature extractor and an object classifier. For the object classifier, the Region Proposal Network (RPN) is one of the most essential architectures in most state-of-the-art object detection systems. However, in the basic RPN architecture the Region of Interest (RoI) proposals have fixed sizes (9 anchors per pixel); they are effective but waste considerable computing resources and time. In standard-parts detection the parts have known sizes, so anchor sizes can be selected from the ground-truth boxes by machine learning. Our experiments show that 2 anchors achieve almost the same accuracy and recall as the default configuration. Our standard-parts detection system reaches 15 fps on an NVIDIA GTX 1080 GPU while achieving a detection accuracy of 90.01% mAP.
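One common way to derive a small set of anchor sizes from ground-truth boxes is k-means clustering over the box dimensions. The sketch below is an assumption about how such a selection could be done, not the paper's actual procedure: the box sizes are made up, the distance is plain Euclidean over (width, height), and the deterministic seeding is a simplification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative ground-truth (width, height) boxes for two part sizes;
# in practice these come from the labelled standard-parts dataset.
small = rng.normal(loc=(32.0, 32.0), scale=3.0, size=(100, 2))
large = rng.normal(loc=(64.0, 48.0), scale=4.0, size=(100, 2))
boxes = np.vstack([small, large])

def kmeans_anchors(boxes, k=2, iters=50):
    """Cluster (w, h) pairs to pick k anchor sizes."""
    # Deterministic seeding: narrowest and widest boxes as initial centers.
    centers = boxes[np.argsort(boxes[:, 0])[[0, -1]]][:k].copy()
    for _ in range(iters):
        # Distance from every box to every center, shape (n_boxes, k).
        d = np.linalg.norm(boxes[:, None] - centers[None], axis=2)
        assign = d.argmin(axis=1)
        centers = np.array([boxes[assign == i].mean(axis=0)
                            for i in range(k)])
    return centers

anchors = kmeans_anchors(boxes, k=2)
print(np.round(anchors, 1))
```

The two resulting centers approximate the two dominant box sizes, which is how 2 anchors can stand in for the 9 generic ones when the object sizes are known in advance. Some detectors use an IoU-based distance instead of Euclidean for this clustering, which weights small and large boxes more evenly.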