Event Details

Improving Capsule Networks using Zero-Skipping and Pruning

Presenter: Ramin Sharifi
Supervisor:

Date: Mon, October 18, 2021
Time: 09:00 - 10:00
Place: ZOOM - Please see below.

Zoom meeting link: https://uvic.zoom.us/j/83975244723?pwd=V2dvVlpiU0VkYjdqTVJwTWhtVFZvQT09

Meeting ID: 839 7524 4723
Password: 275387

Note: Please log in to Zoom via SSO using your UVic NetLink ID.

Abstract:

Capsule Networks are the next generation of image classifiers. Although they have several advantages over conventional Convolutional Neural Networks, they remain computationally heavy. Because inference on Capsule Networks is time-consuming, their use is limited to tasks in which latency is not critical. Approximation methods in Deep Learning help networks shed redundant parameters, increasing speed and lowering energy consumption.

In the first part of this work, we examine an algorithm called “zero-skipping.” More than 50% of the values in a trained CNN are zero, or small enough to be treated as zero. Since multiplication by zero is a trivial operation, the zero-skipping algorithm can substantially speed up computation throughout the network. We investigate whether Capsule Networks are eligible for this algorithm on two different datasets. Our results suggest that Capsule Networks contain enough zeros in their Primary Capsules to benefit from it.
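
For intuition, the following minimal Python sketch contrasts a dense dot product with a zero-skipping one. The dot-product setting, the `eps` threshold, and the synthetic ReLU-style sparsity are illustrative assumptions for this example, not the implementation evaluated in the work.

    import numpy as np

    def dense_dot(weights, activations):
        # Baseline: compute every weight-activation product,
        # including the products that are zero.
        return sum(w * a for w, a in zip(weights, activations))

    def zero_skipping_dot(weights, activations, eps=1e-6):
        # Zero-skipping: skip any term whose activation is
        # (effectively) zero, saving that multiply-accumulate.
        # eps is an assumed "small enough to be zero" threshold.
        return sum(w * a for w, a in zip(weights, activations) if abs(a) > eps)

    rng = np.random.default_rng(0)
    acts = np.maximum(rng.normal(size=1024), 0.0)  # ReLU-like: roughly half zeros
    wts = rng.normal(size=1024)

    print(np.isclose(dense_dot(wts, acts), zero_skipping_dot(wts, acts)))  # True
    print(f"skippable terms: {np.mean(acts == 0):.0%}")                    # ~50%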

In the second part of this work, we investigate pruning, one of the most popular Neural Network approximation methods. Pruning is the act of finding and removing neurons that have little or no impact on the output. We run our experiments on four different datasets. Pruning Capsule Networks removes redundant Primary Capsules. Our results show a significant increase in speed with little or no drop in accuracy. We also discuss how dataset complexity affects the pruning strategy.
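
As a rough illustration of the idea, the sketch below scores each Primary Capsule by the magnitude of its output vector and keeps only the strongest fraction. The function name `prune_primary_capsules`, the norm-based scoring, and the `keep_ratio` parameter are assumptions made for this example; the actual pruning criterion used in the work may differ.

    import numpy as np

    def prune_primary_capsules(capsule_poses, keep_ratio=0.5):
        # capsule_poses: (num_capsules, capsule_dim) output vectors.
        # Score each capsule by the L2 norm of its output (a simple
        # proxy for its impact) and keep only the top fraction.
        norms = np.linalg.norm(capsule_poses, axis=1)
        k = max(1, int(round(len(norms) * keep_ratio)))
        kept = np.sort(np.argsort(norms)[-k:])  # indices of surviving capsules
        return capsule_poses[kept], kept

    rng = np.random.default_rng(1)
    # Toy capsule outputs: many capsules are exactly zero, i.e. redundant.
    poses = rng.normal(size=(32, 8)) * (rng.random((32, 1)) < 0.4)
    pruned, kept = prune_primary_capsules(poses, keep_ratio=0.25)
    print(poses.shape, "->", pruned.shape)  # (32, 8) -> (8, 8)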