14 May 2019 Physically realizable adversarial examples for convolutional object detection algorithms
In our work, we make two primary contributions to adversarial example generation for convolutional neural network based perception technologies. First, we extend recent work on physically realizable adversarial examples to make them more robust to translation, rotation, and scale in real-world scenarios. Second, we attack object detection networks rather than considering only the simpler problem of classification, forcing these networks to mislocalize as well as misclassify objects. We demonstrate our method on multiple object detection frameworks, including Faster R-CNN, YOLO v3, and our own single-shot detection architecture.
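The abstract does not give implementation details, but the standard way to make an adversarial patch robust to translation, rotation, and scale is Expectation over Transformation (EOT): optimize the patch against the *expected* loss under a distribution of random transforms. The sketch below is a toy, assumption-laden illustration of that idea in numpy: the "detector" is an invented fixed linear score (real attacks backpropagate through Faster R-CNN or YOLO), and the transform family is limited to wrap-around shifts and 90-degree rotations so the gradient can be written in closed form.

```python
# Toy Expectation-over-Transformation (EOT) patch attack.
# ASSUMPTIONS: the linear "detector" score W, the transform family, and all
# hyperparameters are invented for illustration; none come from the paper.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))  # toy stand-in for a detector's objectness weights

def score(patch):
    # Objectness score the attacker wants to drive down.
    return float((W * patch).sum())

def transform(p, dx, dy, k):
    # Wrap-around translation plus a 90-degree rotation: a crude stand-in
    # for the affine transforms a physical patch undergoes.
    return np.rot90(np.roll(p, (dx, dy), axis=(0, 1)), k)

def grad_through_transform(dx, dy, k):
    # d/dp of score(transform(p, dx, dy, k)) for the linear score above:
    # apply the inverse transform to W (both ops are permutations of pixels).
    return np.roll(np.rot90(W, -k), (-dx, -dy), axis=(0, 1))

def eot_attack(patch, steps=100, lr=0.05, samples=8):
    # Stochastic gradient descent on the score averaged over random transforms.
    patch = patch.copy()
    for _ in range(steps):
        g = np.zeros_like(patch)
        for _ in range(samples):
            dx, dy = (int(v) for v in rng.integers(-2, 3, size=2))
            k = int(rng.integers(0, 4))
            g += grad_through_transform(dx, dy, k)
        patch -= lr * g / samples          # descend the expected score
        patch = np.clip(patch, 0.0, 1.0)  # keep pixel values printable
    return patch

patch0 = np.full((8, 8), 0.5)  # start from a uniform gray patch
adv = eot_attack(patch0)
```

The key design point EOT captures is that a patch optimized for a single pose tends to fail under even small pose changes; averaging the gradient over sampled transforms trades a little per-pose strength for robustness across the whole transform distribution.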
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
David R. Chambers and H. Abe Garza, "Physically realizable adversarial examples for convolutional object detection algorithms", Proc. SPIE 10988, Automatic Target Recognition XXIX, 109880R (14 May 2019).
