Paper
29 August 2016
A dropout distribution model on deep networks
Fengqi Li, Helin Yang
Proceedings Volume 10033, Eighth International Conference on Digital Image Processing (ICDIP 2016); 1003360 (2016) https://doi.org/10.1117/12.2243971
Event: Eighth International Conference on Digital Image Processing (ICDIP 2016), 2016, Chengdu, China
Abstract
Dropout has been shown to control overfitting effectively and to improve the generalization of deep networks. However, conventional dropout applies a constant rate when training the parameters of every layer, which reduces classification accuracy and efficiency. To address this problem, this paper proposes a dropout-rate distribution model based on an analysis of the relationship between the dropout rate and the layers of a deep network. First, we give a formal description of the dropout rate that reveals its relationship to the layers of the deep network. Second, we propose a distribution model for determining the dropout rate used when training each layer. Experiments on the MNIST and CIFAR-10 datasets evaluate the proposed model against networks trained with constant dropout rates. The results demonstrate that the proposed model outperforms conventional dropout in both classification accuracy and efficiency.
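The abstract describes assigning a different dropout rate to each layer rather than one constant rate for the whole network. As a rough illustration only, the sketch below wires layer-dependent dropout rates into a small MLP of the kind one might train on MNIST; the linear schedule in dropout_schedule, the hidden widths, and the class name MLPWithLayerwiseDropout are hypothetical stand-ins, not the distribution model proposed in the paper.

```python
# Minimal sketch of layer-wise dropout rates in PyTorch.
# The linear schedule below is a placeholder assumption; the paper's actual
# dropout-rate distribution model is not reproduced in this abstract.
import torch
import torch.nn as nn

def dropout_schedule(num_layers, p_first=0.5, p_last=0.2):
    """Hypothetical linear schedule: later layers drop fewer units."""
    if num_layers == 1:
        return [p_first]
    step = (p_last - p_first) / (num_layers - 1)
    return [p_first + i * step for i in range(num_layers)]

class MLPWithLayerwiseDropout(nn.Module):
    def __init__(self, in_dim=784, hidden=(1024, 1024, 1024), num_classes=10):
        super().__init__()
        rates = dropout_schedule(len(hidden))
        layers, prev = [], in_dim
        for width, p in zip(hidden, rates):
            # Each hidden block gets its own dropout probability from the schedule.
            layers += [nn.Linear(prev, width), nn.ReLU(), nn.Dropout(p)]
            prev = width
        layers.append(nn.Linear(prev, num_classes))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x.flatten(1))

# Usage: one forward pass on a batch of flattened MNIST-sized inputs.
model = MLPWithLayerwiseDropout()
logits = model(torch.randn(32, 784))  # -> shape (32, 10)
```

Any per-layer schedule could be substituted for the linear one here, since nn.Dropout takes its probability per module; varying the rate by depth therefore requires no change to the training loop, only to how the network is constructed.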
© (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Fengqi Li and Helin Yang "A dropout distribution model on deep networks", Proc. SPIE 10033, Eighth International Conference on Digital Image Processing (ICDIP 2016), 1003360 (29 August 2016); https://doi.org/10.1117/12.2243971
KEYWORDS
Neural networks
Performance modeling
Data modeling
Visualization
Binary data
Object recognition
Software development