Article

Exploration of block-wise dynamic sparseness

Journal

Pattern Recognition Letters
Volume 151, Pages 187-192

Publisher

Elsevier
DOI: 10.1016/j.patrec.2021.08.013

Keywords

Neural network; Dynamic sparseness; Block-wise matrix multiplication

This paper introduces a new method for dynamic sparseness that combines sparsity with block-wise matrix-vector multiplications to improve efficiency. Unlike static sparseness, the method preserves the full capabilities of the network and outperforms static sparseness baselines on language modeling.
Neural networks have achieved state-of-the-art performance across a wide variety of machine learning tasks, often with large and computation-heavy models. Inducing sparseness as a way to reduce the memory and computation footprint of these models has seen significant research attention in recent years. In this paper, we present a new method for dynamic sparseness, whereby part of the computations is omitted dynamically, based on the input. For efficiency, we combine the idea of dynamic sparseness with block-wise matrix-vector multiplications. In contrast to static sparseness, which permanently zeroes out selected positions in weight matrices, our method preserves the full network capabilities by potentially accessing any of the trained weights. Yet, matrix-vector multiplications are accelerated by omitting a pre-defined fraction of weight blocks from the matrix, based on the input. Experimental results on the task of language modeling, using recurrent and quasi-recurrent models, show that the proposed method can outperform static sparseness baselines. In addition, our method reaches language modeling perplexities similar to the dense baseline at half the computational cost at inference time. (c) 2021 Published by Elsevier B.V.
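To make the idea concrete, below is a minimal PyTorch sketch (not the authors' code) of block-wise dynamic sparseness: the weight matrix is partitioned into fixed-size blocks, a small gating layer scores each input block for each output block from the current input, and only a pre-defined fraction of the highest-scoring blocks takes part in the matrix-vector product. The class name, block_size, keep_ratio, and the linear gating layer are illustrative assumptions; the dense masking at the end merely emulates the effect, whereas an efficient implementation would skip the masked blocks entirely.

```python
import torch
import torch.nn as nn


class BlockSparseLinear(nn.Module):
    """Illustrative block-wise dynamically sparse linear layer (a sketch)."""

    def __init__(self, in_features, out_features, block_size=64, keep_ratio=0.5):
        super().__init__()
        assert in_features % block_size == 0 and out_features % block_size == 0
        self.block_size = block_size
        self.n_in_blocks = in_features // block_size
        self.n_out_blocks = out_features // block_size
        self.keep_ratio = keep_ratio
        self.weight = nn.Parameter(0.02 * torch.randn(out_features, in_features))
        # Gating layer: scores every (output block, input block) pair from the input.
        self.gate = nn.Linear(in_features, self.n_out_blocks * self.n_in_blocks)

    def forward(self, x):
        # x: (batch, in_features)
        batch = x.size(0)
        scores = self.gate(x).view(batch, self.n_out_blocks, self.n_in_blocks)
        # Keep only a pre-defined fraction of input blocks per output block.
        k = max(1, int(self.keep_ratio * self.n_in_blocks))
        topk = scores.topk(k, dim=-1).indices
        mask = torch.zeros_like(scores).scatter_(-1, topk, 1.0)
        # Expand the block-level mask to element resolution.
        mask_full = (mask.repeat_interleave(self.block_size, dim=1)
                         .repeat_interleave(self.block_size, dim=2))
        # Dense emulation of the sparse product; a real kernel would compute
        # only the selected block-times-subvector products.
        return torch.einsum('boi,bi->bo', mask_full * self.weight, x)


# Usage example: half of the weight blocks are skipped for every input.
layer = BlockSparseLinear(256, 256, block_size=64, keep_ratio=0.5)
y = layer(torch.randn(8, 256))
print(y.shape)  # torch.Size([8, 256])
```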
