Paper | 18 November 2019
Dynamic-stride-net: deep convolutional neural network with dynamic stride
Abstract
It is crucial to reduce the computational cost of deep convolutional neural networks while preserving their accuracy. Existing methods adaptively prune DNNs in a layer-wise or channel-wise manner based on the input image. In this paper, we develop a novel dynamic network, namely Dynamic-Stride-Net, which improves residual networks with layer-wise adaptive strides in the convolution operations. Dynamic-Stride-Net leverages a gating network to adaptively select the stride of each convolutional block based on the output of the previous layer. To optimize the selection of strides, the gating network is trained by reinforcement learning. The number of floating point operations (FLOPs) is significantly reduced by adapting the strides of the convolutional layers without loss of accuracy. Dynamic-Stride-Net reduces the computational cost by 35%-50% while matching the accuracy of the original model on the CIFAR-10 and CIFAR-100 datasets, and it outperforms state-of-the-art dynamic networks and static compression methods.
© 2019 Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Zerui Yang, Yuhui Xu, Wenrui Dai, and Hongkai Xiong "Dynamic-stride-net: deep convolutional neural network with dynamic stride", Proc. SPIE 11187, Optoelectronic Imaging and Multimedia Technology VI, 1118707 (18 November 2019); https://doi.org/10.1117/12.2537799
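To make the mechanism described in the abstract concrete, below is a minimal sketch (not the authors' code) of a residual block whose convolution stride is picked at run time by a small gating module that looks at the previous layer's output. All module and variable names, the stride candidates (1, 2), and the single hard decision shared across the batch are assumptions made for illustration; the paper trains the gate with reinforcement learning, which is not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F


class StrideGate(nn.Module):
    """Maps the previous layer's output to logits over candidate strides."""

    def __init__(self, in_channels, num_choices):
        super().__init__()
        self.fc = nn.Linear(in_channels, num_choices)

    def forward(self, x):
        pooled = F.adaptive_avg_pool2d(x, 1).flatten(1)  # (N, C)
        return self.fc(pooled)                           # (N, num_choices)


class DynamicStrideBlock(nn.Module):
    """Residual block whose convolution stride is chosen per input."""

    def __init__(self, in_channels, out_channels, strides=(1, 2)):
        super().__init__()
        self.gate = StrideGate(in_channels, len(strides))
        # One conv path (and matching shortcut) per candidate stride so that
        # spatial dimensions stay consistent whichever stride is picked.
        self.paths = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(in_channels, out_channels, 3, stride=s, padding=1, bias=False),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            )
            for s in strides
        )
        self.shortcuts = nn.ModuleList(
            nn.Conv2d(in_channels, out_channels, 1, stride=s, bias=False)
            for s in strides
        )

    def forward(self, x):
        logits = self.gate(x)
        # Simplification: one hard decision shared by the whole batch.
        choice = logits.mean(dim=0).argmax().item()
        return self.paths[choice](x) + self.shortcuts[choice](x)


if __name__ == "__main__":
    block = DynamicStrideBlock(16, 32)
    out = block(torch.randn(8, 16, 32, 32))  # CIFAR-sized feature map
    print(out.shape)  # (8, 32, 32, 32) or (8, 32, 16, 16), depending on the gate

In the paper, the discrete stride choices are optimized with reinforcement learning, presumably with a reward that balances accuracy against computational cost; the hard argmax decision above is only a stand-in for that learned policy at inference time.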
KEYWORDS
Convolutional neural networks
Convolution
Data modeling
Visualization