Background

SNIP: Single-shot Network Pruning based on Connection Sensitivity

MingSun-Tse/Awesome-Pruning-at-Initialization (GitHub): [IJCAI'22 Survey] Recent Advances on Neural Network Pruning at Initialization

FracTrain: Fractionally Squeezing Bit Savings Both Temporally and Spatially for Efficient DNN Training

Picking Winning Tickets Before Training by Preserving Gradient Flow

Progressive Skeletonization: Trimming more fat from a network at initialization

The lottery ticket hypothesis refers to a phenomenon in training deep neural networks: a model pruned at initialization may still recover the accuracy of the dense model trained from scratch. The lottery ticket work started a field known as 'prune-at-init', and this project aims to make a contribution to that field.

Later work in this field, 'prune-at-init', has focused on using gradient information at the very start of training to guide the choice of a sparsity mask, so that subsequent training can benefit from it.
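
As a concrete illustration, here is a minimal PyTorch sketch of a SNIP-style saliency computation: score each weight by |w * dL/dw| from a single minibatch at initialization, then keep the highest-scoring weights globally. The helper names snip_saliency and global_mask are assumptions for illustration, not taken from the paper's reference code.

```python
import torch
import torch.nn as nn


def snip_saliency(model: nn.Module, loss_fn, inputs, targets):
    # Connection sensitivity in the spirit of SNIP: run one minibatch at
    # initialization and score each weight by |w * dL/dw|.
    model.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    return {
        name: (p * p.grad).abs()
        for name, p in model.named_parameters()
        if p.grad is not None and p.dim() > 1  # weight tensors only, skip biases
    }


def global_mask(scores, sparsity):
    # Keep the top (1 - sparsity) fraction of weights across all layers.
    # (Ties at the threshold may keep slightly more; fine for a sketch.)
    flat = torch.cat([s.flatten() for s in scores.values()])
    k = max(1, int((1.0 - sparsity) * flat.numel()))
    threshold = torch.topk(flat, k).values.min()
    return {name: (s >= threshold).float() for name, s in scores.items()}
```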

Project aim

The core element of this project is to replicate state-of-the-art methods from the 'prune-at-init' field, such as SNIP and Progressive Skeletonization. The subsequent investigation will focus on 1) whether a dynamic pruning mask can help sparse training, and 2) whether the sparsity level can be changed progressively, so as to maximise sparsity while maintaining the original dense accuracy.
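
A minimal sketch of how questions 1) and 2) might be combined, reusing the snip_saliency and global_mask helpers sketched above: rescore and rebuild the mask every few steps while ramping the sparsity target up over training. The cubic schedule and all names here are assumptions for illustration, not part of any of the referenced methods.

```python
import torch


def sparsity_at(step, total_steps, final_sparsity):
    # Hypothetical cubic ramp from 0 to final_sparsity, in the spirit of
    # gradual-pruning schedules; one of many possible choices for aim 2).
    t = min(step / total_steps, 1.0)
    return final_sparsity * (1.0 - (1.0 - t) ** 3)


def train_with_dynamic_mask(model, loader, optimizer, loss_fn,
                            total_steps, final_sparsity, remask_every=100):
    # Aim 1): periodically rebuild the mask (dynamic) at the current,
    # progressively higher sparsity target. Assumes `loader` yields at
    # least `total_steps` batches.
    mask, step = {}, 0
    for inputs, targets in loader:
        if step % remask_every == 0:
            scores = snip_saliency(model, loss_fn, inputs, targets)
            mask = global_mask(scores, sparsity_at(step, total_steps, final_sparsity))
        optimizer.zero_grad()
        loss_fn(model(inputs), targets).backward()
        optimizer.step()
        # Re-apply the mask so pruned weights stay zero after each update.
        with torch.no_grad():
            for name, p in model.named_parameters():
                if name in mask:
                    p.mul_(mask[name])
        step += 1
        if step >= total_steps:
            return
```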

Skill requirements

The candidate should be experienced in object-oriented programming in Python. Ideally, the candidate should have experience with, or at least be willing to learn, machine learning frameworks in Python (such as PyTorch and PyTorch Lightning).

Meeting notes (Dec 13th)

Setups and Todos