
Released

Paper

Random and Adversarial Bit Error Robustness: Energy-Efficient and Secure DNN Accelerators

MPS-Authors
/persons/resource/persons228449

Stutz, David
Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society;

/persons/resource/persons45383

Schiele, Bernt
Computer Vision and Machine Learning, MPI for Informatics, Max Planck Society;

External Resource
No external resources are shared
Fulltext (public)

arXiv:2104.08323.pdf
(Preprint), 3MB

Supplementary Material (public)
There is no public supplementary material available
Citation

Stutz, D., Chandramoorthy, N., Hein, M., & Schiele, B. (2021). Random and Adversarial Bit Error Robustness: Energy-Efficient and Secure DNN Accelerators. Retrieved from https://arxiv.org/abs/2104.08323.


Cite as: https://hdl.handle.net/21.11116/0000-0009-8108-C
Abstract
Deep neural network (DNN) accelerators have received considerable attention in recent years due to their potential to save energy compared to mainstream hardware. Low-voltage operation of DNN accelerators can reduce energy consumption even further, but it causes bit-level failures in the memory storing the quantized DNN weights. Furthermore, DNN accelerators have been shown to be vulnerable to adversarial attacks on voltage controllers or on individual bits. In this paper, we show that a combination of robust fixed-point quantization, weight clipping, and random bit error training (RandBET) or adversarial bit error training (AdvBET) significantly improves robustness against random or adversarial bit errors in quantized DNN weights. This not only enables high energy savings from low-voltage operation and low-precision quantization, but also improves the security of DNN accelerators. Our approach generalizes across operating voltages and accelerators, as demonstrated on bit errors from profiled SRAM arrays, and achieves robustness against both targeted and untargeted bit-level attacks. Without losing more than 0.8%/2% in test accuracy, we can reduce energy consumption on CIFAR10 by 20%/30% for 8/4-bit quantization using RandBET. Allowing up to 320 adversarial bit errors, AdvBET reduces test error from above 90% (chance level) to 26.22% on CIFAR10.
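
The following is a minimal sketch of the random bit error training idea described in the abstract: weights are quantized to fixed point, individual bits are flipped with some probability during the forward pass, and gradients flow through via a straight-through estimator. The quantization scheme, bit width, error rate, and all function names below are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of RandBET-style random bit error injection (PyTorch).
# Assumptions: symmetric per-tensor fixed-point quantization, 8-bit weights,
# independent bit flips with probability p; not the authors' exact method.
import torch

def quantize(w, bits=8):
    """Quantize a float tensor to signed fixed-point integers plus a scale."""
    qmax = 2 ** (bits - 1) - 1
    scale = w.abs().max().clamp(min=1e-8) / qmax
    q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax)
    return q.to(torch.int64), scale

def inject_random_bit_errors(q, bits=8, p=0.01):
    """Flip each bit of the quantized weights independently with probability p."""
    # View signed integers as unsigned two's-complement bit patterns.
    u = q & ((1 << bits) - 1)
    for b in range(bits):
        flip = (torch.rand_like(u, dtype=torch.float32) < p).to(torch.int64)
        u = u ^ (flip << b)
    # Map back to the signed range.
    return torch.where(u >= 2 ** (bits - 1), u - 2 ** bits, u)

def perturbed_weights(w, bits=8, p=0.01):
    """Dequantized weights with random bit errors; the perturbation is treated
    as constant so gradients pass straight through to the clean weights."""
    q, scale = quantize(w.detach(), bits)
    q_err = inject_random_bit_errors(q, bits, p)
    w_err = q_err.to(w.dtype) * scale
    return w + (w_err - w.detach())
```

In such a sketch, a layer's forward pass during training would use `perturbed_weights(layer.weight)` in place of the clean weights, so the network learns to tolerate injected bit errors; adversarial bit error training (AdvBET) would instead choose the bit flips to maximize the loss rather than sampling them at random.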