SWIS–Shared Weight bIt Sparsity for Efficient Neural Network Acceleration

Published in TinyML Research Symposium (TinyML), 2021

Recommended citation: Li, S., Romaszkan, W., Graening, A. and Gupta, P., 2021. SWIS--Shared Weight bIt Sparsity for Efficient Neural Network Acceleration. arXiv preprint arXiv:2103.01308. https://arxiv.org/pdf/2103.01308.pdf

A systolic array design that uses bit-serial computation to replace multipliers with shift-and-add logic, exploiting shared weight bit sparsity to make neural network execution more efficient.
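To illustrate the general idea, here is a minimal Python sketch of a bit-serial multiply-accumulate that skips zero weight bits. This is an assumption-laden illustration of bit-serial computation with bit sparsity, not the paper's actual SWIS architecture; the function name and parameters are hypothetical.

```python
# Minimal sketch of bit-serial multiply-accumulate (MAC).
# Illustrates the general idea only; NOT the paper's exact SWIS design.
# Weights and activations are assumed to be small signed integers.

def bit_serial_dot(weights, activations, num_bits=8):
    """Compute dot(weights, activations) without multipliers.

    Each weight is processed one bit at a time: for every set bit at
    position b, the activation shifted left by b is added to the
    accumulator. Zero weight bits are skipped entirely (bit sparsity),
    so sparser weight encodings need fewer shift-and-add cycles.
    """
    acc = 0
    for w, a in zip(weights, activations):
        sign = -1 if w < 0 else 1
        w = abs(w)
        for b in range(num_bits):
            if (w >> b) & 1:            # skip zero bits: no cycle spent
                acc += sign * (a << b)  # shift-and-add replaces multiply
    return acc

# Sanity check: matches the ordinary dot product.
ws = [3, -2, 5]
xs = [7, 4, 1]
assert bit_serial_dot(ws, xs) == sum(w * x for w, x in zip(ws, xs))
```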

Download paper here: https://arxiv.org/pdf/2103.01308.pdf

BibTeX citation:

@article{li2021swis,
  title={SWIS--Shared Weight bIt Sparsity for Efficient Neural Network Acceleration},
  author={Li, Shurui and Romaszkan, Wojciech and Graening, Alexander and Gupta, Puneet},
  journal={arXiv preprint arXiv:2103.01308},
  year={2021}
}