
LNS-Madam: Low-Precision Training in Logarithmic Number System using Multiplicative Weight Update

Zhao, Jiawei and Dai, Steve and Venkatesan, Rangharajan and Zimmer, Brian and Ali, Mustafa and Liu, Ming-Yu and Khailany, Brucek and Dally, William J. and Anandkumar, Anima (2022) LNS-Madam: Low-Precision Training in Logarithmic Number System using Multiplicative Weight Update. IEEE Transactions on Computers, 71 (12). pp. 3179-3190. ISSN 0018-9340. doi:10.1109/tc.2022.3202747. https://resolver.caltech.edu/CaltechAUTHORS:20221202-906480600.2

Full text is not posted in this repository. Consult Related URLs below.

Use this Persistent URL to link to this item: https://resolver.caltech.edu/CaltechAUTHORS:20221202-906480600.2

Abstract

Representing deep neural networks (DNNs) in low precision is a promising approach to enable efficient acceleration and memory reduction. Previous methods that train DNNs in low precision typically keep a high-precision copy of the weights during weight updates, because directly training with low-precision weights leads to accuracy degradation due to complex interactions between the low-precision number system and the learning algorithm. To address this issue, we develop a co-designed low-precision training framework, termed LNS-Madam, in which we jointly design a logarithmic number system (LNS) and a multiplicative weight update algorithm (Madam). We prove that LNS-Madam results in low quantization error during weight updates, leading to stable performance even when precision is limited. We further propose a hardware design of LNS-Madam that resolves practical challenges in implementing an efficient datapath for LNS computations. Our implementation effectively reduces the energy overhead incurred by LNS-to-integer conversion and partial-sum accumulation. Experimental results show that LNS-Madam achieves accuracy comparable to full-precision counterparts with only 8 bits on popular computer vision and natural language tasks. Compared to FP32 and FP8, LNS-Madam reduces energy consumption by over 90% and 55%, respectively.
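The co-design described in the abstract rests on a simple observation: if weights are stored in a logarithmic number system, a multiplicative weight update becomes an addition on the integer log-domain representation, so the update can be applied directly to the low-precision weights. The minimal Python/NumPy sketch below only illustrates that general idea; the class LNSWeights, the frac_bits parameter, and the gradient normalization are hypothetical stand-ins, not the paper's actual data format or update rule (the full text is not posted in this repository).

```python
# Hypothetical sketch (not the paper's implementation): weights are kept in a
# logarithmic number system as (sign, integer exponent) pairs, and a
# multiplicative update becomes an integer addition on the exponents.
import numpy as np

class LNSWeights:
    """Weights stored as sign * 2**(exp / 2**frac_bits)."""
    def __init__(self, w, frac_bits=3):
        self.scale = 2 ** frac_bits          # log-domain quantization step = 1/scale
        self.sign = np.sign(w).astype(np.int8)
        # quantize log2|w| to a fixed-point integer exponent
        self.exp = np.round(np.log2(np.abs(w) + 1e-30) * self.scale).astype(np.int32)

    def to_float(self):
        return self.sign * np.exp2(self.exp / self.scale)

def madam_step(weights, grad, lr=0.01):
    """Multiplicative update w <- w * 2**(-lr * sign(w) * g_hat),
    applied as a quantized addition to the stored exponents."""
    g_hat = grad / (np.sqrt(np.mean(grad ** 2)) + 1e-12)      # assumed normalization
    delta = -lr * weights.sign * g_hat                        # real-valued log-domain step
    weights.exp += np.round(delta * weights.scale).astype(np.int32)

# Usage: one update step on random weights and gradients.
w = LNSWeights(np.random.randn(4).astype(np.float32))
g = np.random.randn(4).astype(np.float32)
print("before:", w.to_float())
madam_step(w, g)
print("after: ", w.to_float())
```

Because the exponent is the only quantity that changes during training, the quantization error of each update is bounded by the log-domain step size, which is the property the abstract refers to as low quantization error during weight updates.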


Item Type: Article
Related URLs:
  https://doi.org/10.1109/TC.2022.3202747 (DOI) - Article
ORCID:
  Zhao, Jiawei: 0000-0002-5726-6040
  Dai, Steve: 0000-0002-5045-1964
  Zimmer, Brian: 0000-0001-9997-3141
  Ali, Mustafa: 0000-0002-4452-6464
  Liu, Ming-Yu: 0000-0002-2951-2398
  Khailany, Brucek: 0000-0002-7584-3489
  Dally, William J.: 0000-0003-4632-2876
  Anandkumar, Anima: 0000-0002-6974-6797
Issue or Number: 12
DOI: 10.1109/TC.2022.3202747
Record Number: CaltechAUTHORS:20221202-906480600.2
Persistent URL: https://resolver.caltech.edu/CaltechAUTHORS:20221202-906480600.2
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 118207
Collection: CaltechAUTHORS
Deposited By: Research Services Depository
Deposited On: 04 Jan 2023 17:16
Last Modified: 04 Jan 2023 17:16
