Peer reviewed
Guo, Suhan; Lai, Bilan; Yang, Suorong; Zhao, Jian; Shen, Furao
Pattern Recognition, August 2023, Volume 140. Journal Article.
Highlights:
• We integrate the sensitivity measure from SNIP into the "pruning while fine-tuning" framework to form a more powerful pruning strategy by adapting the unstructured pruning measure from SNIP to allow filter-level compression. In practice, the sensitivity score can be computed simply as the gradient of the connection mask applied to the weight matrix. Because it is independent of the model structure, the sensitivity score can be applied to most neural networks for pruning purposes.
• We mitigate the sampling bias in the single-shot influence score by introducing the difference between the learned pruning strategy and the single-shot strategy as a second loss component. Filter influence is measured on batched data, where a convolutional layer is used to recover a robust influence estimate from the noise of the batch. The learning process is guided by the score provided by the influence measure.
• Our algorithm can dynamically shift the training goal between improving model accuracy and pruning more filters by means of a self-adaptive hyper-parameter.
As neural networks get deeper for better performance, the demand for deployable models on resource-constrained devices also grows. In this work, we propose eliminating less sensitive filters to compress models. The previous method evaluates neuron importance using the connection-mask gradient in a single shot. To mitigate the sampling bias, we integrate this measure into the previously proposed "pruning while fine-tuning" framework. Besides the classification error, we introduce the difference between the learned strategy and the single-shot strategy as a second loss component, with a self-adaptive hyper-parameter that balances the training goal between improving accuracy and pruning more filters. Our Sensitivity Pruner (SP) adapts the unstructured pruning saliency metric to structured pruning tasks and enables the strategy to be derived sequentially to accommodate the updating sparsity.
Experimental results demonstrate that SP significantly reduces the computational cost and the pruned models give comparable or better performance on CIFAR10, CIFAR100, and ILSVRC-12 datasets.
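The SNIP-style sensitivity score the abstract describes can be sketched roughly as follows. This is a toy illustration, not the paper's implementation: it uses a single linear layer (rows of W standing in for convolutional filters), a squared-error loss, hypothetical shapes, and an illustrative pruning ratio. By the chain rule, the gradient of the loss with respect to a connection mask c (evaluated at c = 1, where the effective weights are c ⊙ W) equals (∂L/∂W) ⊙ W, which is what the sketch computes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer "network": Y = W @ X, squared-error loss on one batch.
# (Hypothetical shapes; rows of W play the role of the paper's filters.)
n_out, n_in, batch = 4, 6, 8
W = rng.normal(size=(n_out, n_in))
X = rng.normal(size=(n_in, batch))
T = rng.normal(size=(n_out, batch))

Y = W @ X
# dL/dW for L = 0.5 * mean ||Y - T||^2
grad_W = (Y - T) @ X.T / batch

# SNIP-style sensitivity: |dL/dc| at c = 1 equals |(dL/dW) * W| elementwise,
# since the mask multiplies the weights.
saliency = np.abs(grad_W * W)

# Filter-level adaptation: aggregate connection scores per output "filter"
# and normalize so the scores sum to 1.
filter_score = saliency.sum(axis=1)
filter_score /= filter_score.sum()

# Prune the least sensitive half of the filters (illustrative ratio).
k = n_out // 2
keep = np.argsort(filter_score)[k:]   # indices of the k most sensitive filters
mask = np.zeros(n_out, dtype=bool)
mask[keep] = True
print(mask.sum())  # prints 2
```

In SP itself this single-shot score is not applied once and discarded: it serves as the target in a second loss term that penalizes the learned pruning strategy for drifting from it, with a self-adaptive hyper-parameter trading that term off against classification accuracy.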