Document Type

Thesis - Open Access

Award Date

2022

Degree Name

Master of Science (MS)

Department / School

Electrical Engineering and Computer Science

First Advisor

Kwanghee Won

Abstract

A Convolutional Neural Network (CNN) is a neural network developed for processing image data. CNNs have been studied extensively and applied to numerous computer vision tasks, such as image classification and segmentation, and object detection and recognition [1]. Although CNN-based approaches have shown human-level performance in these tasks [2], they require heavy computation in both the training and inference stages, and the models consist of millions of parameters. This hinders the development and deployment of CNN-based models for real-world applications. Neural network pruning and compression techniques have been proposed [3, 4] to reduce the computational complexity of trained CNNs by removing less important filters or weights. However, the performance of a pruned network is affected by many factors, such as the complexity of the model before pruning, the dataset, and the initial weights of the model. This thesis considers the effects of the initial model complexity, the compression ratio, the target task (dataset), and the initial weight values on performance after pruning. Several complex models were trained and compressed to compare the testing results. Four VGG networks [5] (VGG11, VGG13, VGG16, and VGG19) were used to study the complexity of scale-dependent and scale-invariant CNN-based classification models under various pruning ratios. VGG networks fine-tuned on the target dataset were also used to show the effect of initial weight values on the pruned networks. The Gradient-weighted Class Activation Mapping (Grad-CAM) approach [6] was used to visualize the activation maps of the pruned networks and to explain the accuracy changes as the networks were compressed. The experimental results showed that the initial model complexity and pre-trained weights can affect the performance of the pruned models.
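The abstract describes pruning as removing less important filters from a trained CNN. One common importance criterion in filter-pruning work [3, 4] is the L1 norm of each convolutional filter; the sketch below illustrates that idea in NumPy. The function name, the `keep_ratio` parameter, and the use of L1-norm ranking are illustrative assumptions, not details taken from the thesis itself.

```python
import numpy as np

def prune_filters(weights: np.ndarray, keep_ratio: float) -> np.ndarray:
    """Keep the top `keep_ratio` fraction of conv filters, ranked by L1 norm.

    `weights` has shape (out_channels, in_channels, k, k); the lowest-norm
    output filters are removed. (Illustrative sketch, not the thesis code.)
    """
    n_keep = max(1, int(round(weights.shape[0] * keep_ratio)))
    # L1 norm of each output filter -- a common proxy for filter importance
    norms = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    # Indices of the n_keep largest-norm filters, in their original order
    keep_idx = np.sort(np.argsort(norms)[-n_keep:])
    return weights[keep_idx]

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 3, 3, 3))      # e.g. a first VGG conv layer
pruned = prune_filters(w, keep_ratio=0.5)
print(pruned.shape)                      # (32, 3, 3, 3)
```

In practice, pruning one layer this way also requires removing the matching input channels of the next layer and then fine-tuning, which is where the initial-weight effects studied in the thesis come into play.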

Number of Pages

58

Publisher

South Dakota State University


Rights Statement

In Copyright