Open Access System for Information Sharing


Symmetry Regularization and Saturating Nonlinearity for Robust Quantization

Authors
Park, Sein; Jang, Yeongsang; Park, Eunhyeok
Date Issued
2022-10-27
Publisher
Springer Science and Business Media Deutschland GmbH
Abstract
Robust quantization improves a network's tolerance to diverse implementations, allowing reliable output across different bit-widths or fragmented low-precision arithmetic. In this work, we perform extensive analyses to identify the sources of quantization error and present three insights for robustifying a network against quantization: reduction of error propagation, range clamping for error minimization, and inherited robustness against quantization. Based on these insights, we propose two novel methods called symmetry regularization (SymReg) and saturating nonlinearity (SatNL). Applying the proposed methods during training enhances the robustness of arbitrary neural networks against quantization under existing post-training quantization (PTQ) and quantization-aware training (QAT) algorithms, and enables us to obtain a single set of weights flexible enough to maintain output quality under various conditions. We conduct extensive studies on the CIFAR and ImageNet datasets and validate the effectiveness of the proposed methods.
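The abstract names two techniques without giving their formulations. As an illustrative sketch only (the paper's exact definitions of SatNL and SymReg are not reproduced here, so both functions below are hypothetical stand-ins), range clamping via a saturating nonlinearity and a symmetry penalty on the weight distribution might look like:

```python
import numpy as np

def saturating_nonlinearity(x, threshold=3.0):
    """Hypothetical SatNL stand-in: clamp activations into
    [-threshold, threshold] so the quantization range stays bounded
    regardless of input scale (illustrates 'range clamping')."""
    return np.clip(x, -threshold, threshold)

def symmetry_penalty(weights):
    """Hypothetical SymReg stand-in: penalize asymmetry of the weight
    distribution around zero; a symmetric range avoids wasting
    representable levels in a symmetric quantizer."""
    return float(weights.max() + weights.min()) ** 2

# Unbounded activations are clamped into the saturation range.
acts = np.array([-5.0, 0.5, 4.0])
print(saturating_nonlinearity(acts))  # values limited to [-3, 3]

# A weight tensor symmetric about zero incurs zero penalty.
w = np.array([-1.0, -0.25, 0.0, 0.25, 1.0])
print(symmetry_penalty(w))
```

Such a penalty would be added to the training loss alongside the task objective; the actual regularizer and nonlinearity used in the paper may differ substantially from this sketch.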
URI
https://oasis.postech.ac.kr/handle/2014.oak/116833
ISSN
0302-9743
Article Type
Conference
Citation
17th European Conference on Computer Vision (ECCV 2022), pp. 206-222, 2022-10-27
Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
