The Generalization-Stability Tradeoff in Neural Network Pruning

Pruning neural network parameters is often viewed as a means to compress models and to reduce memory/computation requirements, but the original motivation for pruning was the prevention of overfitting. Analyzing the behavior of pruning over the course of training, the authors find that pruning's benefit to generalization increases with pruning's instability, defined as the drop in test accuracy immediately after pruning. The generalization-stability tradeoff is essentially a positive correlation between test accuracy at convergence (i.e., generalization) and the size of that post-pruning drop (i.e., instability).
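To make the definition concrete, here is a minimal sketch (not the paper's code) of measuring pruning instability: evaluate test accuracy, prune, and evaluate again. The PyTorch model, data loader, choice of L1 magnitude pruning, and the 20% pruning amount are all illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.utils.prune as prune

def test_accuracy(model, loader, device="cpu"):
    """Plain top-1 accuracy over a test DataLoader."""
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            pred = model(x).argmax(dim=1)
            correct += (pred == y).sum().item()
            total += y.numel()
    return correct / total

def pruning_instability(model, loader, amount=0.2):
    """Instability = drop in test accuracy immediately after a pruning event."""
    acc_before = test_accuracy(model, loader)
    # Prune the smallest-magnitude (lowest L1-norm) weights in each layer.
    for module in model.modules():
        if isinstance(module, (torch.nn.Linear, torch.nn.Conv2d)):
            prune.l1_unstructured(module, name="weight", amount=amount)
    acc_after = test_accuracy(model, loader)
    return acc_before - acc_after  # larger drop = less stable pruning
```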
Summary and Contributions (from a NeurIPS review): The paper addresses the commonly observed phenomenon that test error tends to increase immediately after a small amount of pruning.
The authors demonstrate that this "generalization-stability tradeoff" is present across a wide variety of pruning settings and propose a mechanism for its cause: pruning regularizes similarly to noise injection. These results explain the compatibility of pruning-based generalization improvements with the high generalization recently observed in overparameterized networks.
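As a hedged illustration of how the tradeoff itself could be quantified (the paper's actual methodology is more involved), one can correlate mean instability with final test accuracy across several training runs that use different pruning schedules; a positive coefficient indicates that less stable pruning tends to generalize better. The run values below are hypothetical placeholders, not results from the paper.

```python
import numpy as np

# Hypothetical per-run statistics: mean accuracy drop at pruning events
# and final test accuracy at convergence (placeholder numbers only).
mean_instability = np.array([0.001, 0.004, 0.010, 0.025, 0.060])
final_test_acc = np.array([0.912, 0.915, 0.919, 0.924, 0.921])

# Pearson correlation between instability and generalization.
r = np.corrcoef(mean_instability, final_test_acc)[0, 1]
print(f"generalization-stability correlation: r = {r:.2f}")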