Meng, M. H., Bai, G., Teo, S. G., & Dong, J. S. (2023, June 14). Supervised Robustness-preserving Data-free Neural Network Pruning. 2023 27th International Conference on Engineering of Complex Computer Systems (ICECCS). https://doi.org/10.1109/iceccs59891.2023.00013
Abstract:
When deploying pre-trained neural network models in real-world applications, model consumers often target resource-constrained platforms such as mobile and smart devices. They typically apply pruning to reduce the size and complexity of the model, producing a lighter one that consumes fewer resources. Nonetheless, most existing pruning methods are built on the premise that the pruned model can be fine-tuned or even retrained on the original training data. This premise is often unrealistic in practice, as data controllers are frequently reluctant to share the original data with their model consumers.
In this work, we study neural network pruning in the data-free context, aiming to yield lightweight models that are not only accurate in prediction but also robust against undesired inputs in open-world deployments. Because fine-tuning and retraining, which could otherwise repair mis-pruned units, are unavailable, we replace the traditional aggressive one-shot strategy with a conservative one that treats model pruning as a progressive process. We propose a pruning method based on stochastic optimization that uses robustness-related metrics to guide the pruning process. We evaluate our method in a series of experiments on diverse neural network models; the results show that it significantly outperforms existing one-shot data-free pruning approaches in both robustness preservation and accuracy.
License type:
Publisher Copyright
Funding Info:
This research received no specific funding.
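
The abstract gives no implementation details, but its high-level idea, progressive pruning guided by a robustness-related metric when no training data is available, can be sketched. The following is a minimal illustrative sketch, not the authors' algorithm: the toy NumPy MLP, the random probe inputs, and the perturbation-drift proxy are all assumptions standing in for the paper's models, stochastic optimizer, and robustness-related metrics.

import numpy as np

rng = np.random.default_rng(0)

# Toy pre-trained 2-layer MLP (in practice the weights would be loaded,
# not sampled; this stands in for a consumer's pre-trained network).
W1, b1 = rng.normal(size=(16, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 4)), np.zeros(4)

def forward(x, mask):
    # Forward pass with a binary mask over the hidden units; masked
    # (pruned) units contribute nothing downstream.
    h = np.maximum(x @ W1 + b1, 0.0) * mask
    return h @ W2 + b2

def robustness_proxy(mask, probes, eps=0.05, n_noise=8):
    # Hypothetical stand-in metric: mean output drift under small input
    # perturbations (lower drift = more robust). The paper defines its own
    # robustness-related metrics; this only illustrates the metric's role.
    drift = 0.0
    for x in probes:
        y = forward(x, mask)
        for _ in range(n_noise):
            noisy = forward(x + eps * rng.normal(size=x.shape), mask)
            drift += np.linalg.norm(noisy - y)
    return drift / (len(probes) * n_noise)

# Data-free setting: random probe inputs replace the unavailable training data.
probes = rng.normal(size=(32, 16))

# Progressive pruning: one unit per step. Each step stochastically samples a
# few candidate units and keeps the removal that least degrades the proxy,
# instead of pruning aggressively in one shot.
mask = np.ones(8)
target_units = 4
while mask.sum() > target_units:
    candidates = np.flatnonzero(mask)
    sampled = rng.choice(candidates, size=min(3, len(candidates)), replace=False)
    scores = []
    for u in sampled:
        trial = mask.copy()
        trial[u] = 0.0
        scores.append((robustness_proxy(trial, probes), u))
    _, best = min(scores)  # candidate whose removal hurts robustness least
    mask[best] = 0.0
    print(f"pruned unit {best}; {int(mask.sum())} hidden units remain")

The loop mirrors the conservative, progressive strategy described in the abstract: because no fine-tuning can repair a bad pruning decision, each removal is vetted against the robustness metric before it is committed.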