SNIP: Single-shot Network Pruning based on Connection Sensitivity
Picking Winning Tickets Before Training by Preserving Gradient Flow
Progressive Skeletonization: Trimming more fat from a network at initialization
The lottery ticket hypothesis refers to a phenomenon in training deep neural networks: models pruned at initialization may still recover the accuracy of a dense model trained from scratch. The lottery ticket work started a field named 'prune-at-init', and this project aims to make a contribution to this field.
Later work in this 'prune-at-init' field has focused on using gradient information at the start of training to better guide the choice of a sparsity mask, so that subsequent training can benefit from it.
The core element of this project is to replicate some state-of-the-art methods in the 'prune-at-init' field, such as SNIP and Progressive Skeletonization. The investigation would then focus on 1) whether a dynamic pruning mask can help sparse training, and 2) whether the sparsity can be changed progressively so as to maximise sparsity while maintaining the original accuracy.
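As a reference point for the replication, SNIP scores each connection by the magnitude of the weight-gradient product at initialization and keeps only the top-scoring fraction. The minimal sketch below illustrates the scoring and mask construction with NumPy standing in for an autograd framework; the `snip_mask` helper and the toy weights and gradients are illustrative assumptions, not taken from the papers.

```python
import numpy as np

def snip_mask(weights, grads, sparsity):
    """SNIP-style connection sensitivity: score each connection by
    |weight * gradient| and keep the top (1 - sparsity) fraction.

    This is an illustrative sketch; in practice the gradients would come
    from one backward pass over a mini-batch at initialization."""
    scores = np.abs(weights * grads).ravel()
    k = int(round((1 - sparsity) * scores.size))  # number of connections to keep
    if k == 0:
        return np.zeros_like(weights)
    threshold = np.partition(scores, -k)[-k]  # k-th largest score
    return (np.abs(weights * grads) >= threshold).astype(weights.dtype)

# Toy example: a 2x3 weight matrix with made-up gradients
w = np.array([[0.5, -0.1, 0.3], [0.05, -0.8, 0.25]])
g = np.array([[1.0, 2.0, 0.1], [0.1, 1.5, 1.0]])
mask = snip_mask(w, g, sparsity=0.5)  # keeps the 3 most sensitive connections
```

After the mask is fixed, training proceeds on the surviving weights only (e.g. by multiplying weights by the mask in each forward pass); Progressive Skeletonization differs in reaching the target sparsity gradually rather than in one shot.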
The candidate should be experienced in object-oriented programming in Python. Ideally, the candidate should have experience with, or at least be willing to learn, various machine learning frameworks in Python (such as PyTorch and PyTorch Lightning).