Jun 9, 2019 · We analyze the behavior of pruning over the course of training, finding that pruning's benefit to generalization increases with pruning's instability.
Pruning neural network parameters is often viewed as a means to compress models, but pruning has also been motivated by the desire to prevent overfitting.
Pruning deep neural network (DNN) parameters to reduce memory/computation requirements is an area of much interest, but a variety ...
Pruning neural network parameters to reduce model size is an area of much interest, but the original motivation for pruning was the prevention of overfitting.
Summary and Contributions: The paper addresses the commonly observed phenomenon that test error tends to increase after a small amount of pruning. The paper ...
Oct 14, 2020 · These results explain the compatibility of pruning-based generalization improvements and the high generalization recently observed in ...
Dec 6, 2020 · We demonstrate that this "generalization-stability tradeoff" is present across a wide variety of pruning settings and propose a mechanism for ...
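A minimal sketch of the kind of pruning these snippets discuss, with "instability" taken to be the immediate drop in test accuracy caused by a pruning event. This is an illustrative assumption, not the paper's exact procedure; PyTorch is used, and `model` and `test_loader` are hypothetical objects supplied by the caller.

# Hedged sketch: magnitude pruning plus an instability measurement.
# Assumes a trained classifier `model` and a `test_loader` of (x, y) batches.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def test_accuracy(model, loader, device="cpu"):
    """Fraction of correctly classified examples in `loader`."""
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for x, y in loader:
            preds = model(x.to(device)).argmax(dim=1)
            correct += (preds == y.to(device)).sum().item()
            total += y.numel()
    return correct / total

def prune_and_measure_instability(model, test_loader, amount=0.2):
    """Remove the smallest-magnitude weights from each Linear/Conv2d layer,
    then return the accuracy drop the pruning event causes (the instability)."""
    acc_before = test_accuracy(model, test_loader)
    for module in model.modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            prune.l1_unstructured(module, name="weight", amount=amount)
    acc_after = test_accuracy(model, test_loader)
    return acc_before - acc_after

Under the framing described above, calling this repeatedly over the course of training and fine-tuning between pruning events would let one track how the instability of each event relates to the network's final generalization.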