9 August 2018 Detection of sticker based adversarial attacks
Proceedings Volume 10806, Tenth International Conference on Digital Image Processing (ICDIP 2018); 108066Y (2018) https://doi.org/10.1117/12.2503219
Event: Tenth International Conference on Digital Image Processing (ICDIP 2018), 2018, Shanghai, China
Abstract
Adversarial examples have revealed an important aspect of convolutional neural networks and are receiving increasing attention in machine learning. It was shown that not only small perturbations covering the whole image, but also sticker based attacks concentrated on small regions of the image, can cause misclassification. While the first type of attack is mostly theoretical, the latter can be applied in practice and lead to misclassification in image processing pipelines. In this paper we show a method by which sticker based adversarial samples can be detected, by calculating the responses of the neurons in the last layers and estimating a measure of region based classification consistency.
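The abstract does not spell out the detection procedure, but a region based classification-consistency check can be illustrated with a minimal sketch: occlude one patch of the input at a time, reclassify, and measure how often the prediction agrees with the prediction on the unmodified image. If the decision hinges on a single small region (as it would under a sticker attack), agreement drops. This is only an assumed illustration, not the authors' exact algorithm; the model, patch size, occlusion value and decision threshold below are placeholders.

```python
# Sketch of a region-based classification-consistency score (assumed
# illustration, not the paper's exact method): occlude patches one at a
# time and count how often the class prediction stays the same.
import torch
import torchvision.models as models


def consistency_score(model, image, patch=56, stride=56):
    """image: (1, 3, H, W) tensor, already preprocessed for `model`."""
    model.eval()
    with torch.no_grad():
        base_class = model(image).argmax(dim=1).item()
        _, _, h, w = image.shape
        agree, total = 0, 0
        for y in range(0, h - patch + 1, stride):
            for x in range(0, w - patch + 1, stride):
                occluded = image.clone()
                # blank out one region; a sticker hidden here changes the output
                occluded[:, :, y:y + patch, x:x + patch] = 0.0
                pred = model(occluded).argmax(dim=1).item()
                agree += int(pred == base_class)
                total += 1
    # close to 1.0: prediction is consistent across regions;
    # low values: the decision depends on one small area (suspicious)
    return agree / total


if __name__ == "__main__":
    net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    img = torch.randn(1, 3, 224, 224)  # stand-in for a preprocessed input image
    print(f"region consistency: {consistency_score(net, img):.2f}")
```

In practice one would threshold this score (or, as the abstract suggests, compare the responses of neurons in the last layers rather than only the final class label) to flag inputs whose classification is driven by a single localized region.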
András Horváth and Csanád Egervári "Detection of sticker based adversarial attacks", Proc. SPIE 10806, Tenth International Conference on Digital Image Processing (ICDIP 2018), 108066Y (9 August 2018); https://doi.org/10.1117/12.2503219