Model Pruning Based on Quantified Similarity of Feature Maps

(arXiv:2105.06052)
Published May 13, 2021 in cs.CV and cs.LG

Abstract

Convolutional Neural Networks (CNNs) have been applied in numerous Internet of Things (IoT) devices for multifarious downstream tasks. However, with the increasing amount of data on edge devices, CNNs can hardly complete some tasks in time given their limited computing and storage resources. Recently, filter pruning has been regarded as an effective technique to compress and accelerate CNNs, but existing methods rarely prune CNNs from the perspective of compressing high-dimensional tensors. In this paper, we propose a novel theory to find redundant information in three-dimensional tensors, namely Quantified Similarity between Feature Maps (QSFM), and utilize this theory to guide the filter pruning procedure. We evaluate QSFM on several datasets (CIFAR-10, CIFAR-100, and ILSVRC-12) and on edge devices, demonstrating that the proposed method can effectively find redundant information in neural networks, with comparable compression ratios and a tolerable drop in accuracy. Without any fine-tuning operation, QSFM can compress ResNet-56 on CIFAR-10 significantly (reducing FLOPs by 48.7% and parameters by 57.9%) with only a 0.54% loss in top-1 accuracy. For practical application on edge devices, QSFM can accelerate MobileNet-V2 inference by 1.53 times with only a 1.23% loss in ILSVRC-12 top-1 accuracy.
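The abstract does not specify QSFM's exact similarity measure or pruning schedule, so the PyTorch sketch below only illustrates the general idea of similarity-guided filter pruning: compute pairwise similarity between the feature maps produced by a layer's filters on a calibration batch, then greedily mark one filter from each of the most similar pairs as redundant. The cosine similarity, the greedy selection rule, and the function name `select_redundant_filters` are all illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch of similarity-guided filter selection for one conv layer.
# The cosine-similarity measure and greedy pair selection are assumptions;
# the abstract does not give QSFM's exact formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F

def select_redundant_filters(conv: nn.Conv2d, inputs: torch.Tensor, n_prune: int):
    """Return indices of filters whose feature maps are most similar to another map."""
    with torch.no_grad():
        fmaps = conv(inputs)                      # (batch, C_out, H, W)
        # Average over the calibration batch and flatten each channel's map.
        flat = fmaps.mean(dim=0).flatten(1)       # (C_out, H*W)
        flat = F.normalize(flat, dim=1)
        sim = flat @ flat.t()                     # pairwise cosine similarity
        sim.fill_diagonal_(-float("inf"))         # ignore self-similarity

        pruned = []
        for _ in range(n_prune):
            # Greedily drop one filter from the most similar remaining pair.
            keep, victim = divmod(int(torch.argmax(sim)), sim.size(1))
            pruned.append(victim)                 # keep `keep`, drop `victim`
            sim[victim, :] = -float("inf")
            sim[:, victim] = -float("inf")
        return pruned

# Toy usage: a random layer and a random calibration batch stand in for real data.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
calib = torch.randn(8, 3, 32, 32)
print(select_redundant_filters(conv, calib, n_prune=4))
```

In practice the selected filters (and the corresponding input channels of the following layer) would be removed to reduce FLOPs and parameters, optionally followed by fine-tuning; the paper reports results both with and without fine-tuning.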
