SparseNet: Deep Convolutional Network with Sparse Connections between Blocks

Conference: ICMLCA 2021 - 2nd International Conference on Machine Learning and Computer Application
17.12.2021 - 19.12.2021 in Shenyang, China

Proceedings: ICMLCA 2021

Pages: 5 | Language: English | Type: PDF


Authors:
Wang, Honglin; Yang, Jinping; Kou, Wanting; Chen, Xinru; Li, Jun; Li, Yunfan (Shenyang Ligong University, School of Information Science and Engineering, Shenyang, China)

Abstract:
With the development of deep learning technology, optimizing the structure of deep convolutional neural networks has become one of the most active research topics. Building on previous work, this paper proposes a new model called SparseNet. Compared with traditional network models, the most important characteristic of SparseNet is that only selected blocks of the model, rather than the whole model, are trained at training time. While the accuracy of the model is maintained, the number of parameters actually trained is greatly reduced, and so is the computational power required to train the model. In this paper, the model is composed of microcell convolution blocks, each consisting of a convolutional layer with a small kernel and a normalization layer. For the selection of training blocks, an Evolutionary Blocking Screening Algorithm is proposed, which screens the blocks to be trained by maintaining an array that records block weights. In the experimental part, three SparseNet models of different depths are constructed and compared against DenseNet and ResNet models of different depths. The results on the CIFAR dataset confirm the effectiveness of the optimized model. Compared with current classic network structures, SparseNet combined with the Evolutionary Blocking Screening Algorithm shows clear advantages in the number of training parameters and in training speed, meeting the established requirements.
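The abstract does not give the exact block definition or screening rule, so the following PyTorch sketch only illustrates the two stated ideas: a microcell block built from a small-kernel convolution plus a normalization layer, and a selection step that keeps a per-block score array and trains only the picked blocks while freezing the rest. All names (MicrocellBlock, select_blocks_to_train, the score array, k) are hypothetical placeholders, not the paper's actual implementation.

    import torch.nn as nn

    class MicrocellBlock(nn.Module):
        """One microcell convolution block: a small-kernel convolutional
        layer followed by a normalization layer, as the abstract describes."""
        def __init__(self, channels, kernel_size=3):
            super().__init__()
            self.conv = nn.Conv2d(channels, channels, kernel_size,
                                  padding=kernel_size // 2)
            self.norm = nn.BatchNorm2d(channels)  # assumed norm type

        def forward(self, x):
            return self.norm(self.conv(x))

    def select_blocks_to_train(scores, k):
        """Hypothetical screening step: rank blocks by their recorded
        weights and return the indices of the k blocks to train."""
        ranked = sorted(range(len(scores)), key=lambda i: scores[i],
                        reverse=True)
        return set(ranked[:k])

    # Sketch of one training step: only selected blocks receive gradients.
    blocks = nn.ModuleList(MicrocellBlock(16) for _ in range(8))
    scores = [1.0] * len(blocks)  # weight-recording array, updated elsewhere
    active = select_blocks_to_train(scores, k=3)
    for i, block in enumerate(blocks):
        for p in block.parameters():
            p.requires_grad = i in active  # train picked blocks, freeze rest

The rule for updating the score array is deliberately left out here, since the abstract only states that an array recording weights is maintained; the evolutionary update itself is part of the paper's algorithm.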