Fernandes, Jr., Francisco E.
Yen, Gary G.
Total Authors: 2
 Oklahoma State Univ, Stillwater, OK 74078 - USA
 Univ Sao Paulo, Inst Math Sci & Comp, BR-13566590 Sao Carlos, SP - Brazil
 Oklahoma State Univ, Sch Elect & Comp Engn, Stillwater, OK 74078 - USA
Total Affiliations: 3
Currently, Deep Convolutional Neural Networks (DCNNs) are used to solve a wide range of problems in machine learning and artificial intelligence due to their learning and adaptation capabilities. However, most successful DCNN models have a high computational complexity, which makes them difficult to deploy on mobile or embedded platforms. This problem has prompted many researchers to develop algorithms and approaches that reduce the computational complexity of such models. One such approach is filter pruning, in which convolution filters are eliminated to reduce the number of parameters and, consequently, the computational complexity of the given model. In the present work, we propose a novel filter-pruning algorithm based on a Multi-Objective Evolution Strategy (MOES), called DeepPruningES. Our approach avoids the need for any prior knowledge during the pruning procedure and helps decision-makers by returning three pruned models with different trade-offs between performance and computational complexity. We show that DeepPruningES can significantly reduce a model's computational complexity by testing it on three DCNN architectures: Convolutional Neural Networks (CNNs), Residual Neural Networks (ResNets), and Densely Connected Convolutional Networks (DenseNets). (C) 2020 Elsevier Inc. All rights reserved. (AU)
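To make the idea concrete, the sketch below illustrates multi-objective, evolution-strategy-style filter pruning on a toy problem. This is not the authors' DeepPruningES implementation: the binary filter mask encoding, the surrogate objectives (importance of pruned filters vs. number of filters kept), and the function names are all illustrative assumptions.

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimising both)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def evaluate(mask, importance):
    # Toy surrogate objectives: "performance loss" = total importance of
    # the pruned filters; "complexity" = number of filters kept.
    loss = sum(w for keep, w in zip(mask, importance) if not keep)
    return (loss, sum(mask))

def mutate(mask, rate=0.2):
    # Flip each keep/prune bit with probability `rate`.
    return [1 - b if random.random() < rate else b for b in mask]

def prune_es(importance, pop_size=20, generations=50, seed=0):
    """Evolve binary filter masks; return three trade-off solutions."""
    random.seed(seed)
    n = len(importance)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(random.choice(pop)) for _ in range(pop_size)]
        union = pop + offspring
        scored = [(m, evaluate(m, importance)) for m in union]
        # Environmental selection: keep the non-dominated front first.
        front = [m for m, f in scored
                 if not any(dominates(g, f) for _, g in scored)]
        pop = (front + union)[:pop_size]
    scored = [(evaluate(m, importance), m) for m in pop]
    # Three trade-offs: heavy (lowest loss), light (fewest filters), middle.
    heavy = min(scored, key=lambda s: s[0][0])
    light = min(scored, key=lambda s: s[0][1])
    middle = sorted(scored)[len(scored) // 2]
    return heavy, middle, light

# Hypothetical per-filter importance scores for a six-filter layer.
importance = [0.9, 0.05, 0.8, 0.02, 0.7, 0.01]
heavy, middle, light = prune_es(importance)
```

Returning three solutions from the evolved front mirrors the decision-support idea in the abstract: the heavy model sacrifices the least performance, the light model is the most aggressively pruned, and the middle one balances the two objectives.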