Acharya, J., Iyer, L. R., & Jiang, W. (2022). Low Precision Local Learning for Hardware-Friendly Neuromorphic Visual Recognition. ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). https://doi.org/10.1109/icassp43922.2022.9746618
Abstract:
Quantization is an important approach for making implementations hardware-friendly. However, while various quantization techniques have been extensively explored in deep learning to reduce the memory and computational footprint of models, similar investigations are scarce in neuromorphic computing, which is expected to offer high power and memory efficiency over its more traditional counterpart. In this work, we explore quantization-aware training (QAT) for SNNs as well as fully quantized transfer learning, using the DECOLLE learning algorithm as the base system; its local-loss-based learning is bio-plausible, avoids complex backpropagation-through-time, and is potentially hardware-friendly. We also evaluate different rounding functions and analyze their effects on learning. We validate our results on two datasets, DVS-Gestures and N-MNIST, where we come within 0.3% of full-precision accuracy on both datasets using only 3-bit weights with a convolutional neural network. We are currently exploring other datasets to understand the generalizability of the explored quantization schemes.
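The abstract mentions quantization-aware training of weights with different rounding functions. As a point of reference, the sketch below shows one common way such weight QAT is realized: a symmetric uniform fake-quantizer with a straight-through estimator (STE) and a selectable nearest/stochastic/floor rounding mode. This is a minimal PyTorch illustration under assumed conventions, not the paper's DECOLLE-based implementation; the function names, bit-width handling, and per-tensor scale choice are assumptions.

```python
# Minimal sketch (not the authors' implementation) of weight quantization-aware
# training with a configurable rounding function and a straight-through estimator.
import torch


class QuantizeSTE(torch.autograd.Function):
    """Fake-quantize weights in the forward pass; pass gradients straight through."""

    @staticmethod
    def forward(ctx, w, n_bits, rounding):
        # Symmetric uniform quantization to n_bits (one sign bit), per-tensor scale.
        q_max = 2 ** (n_bits - 1) - 1
        scale = w.abs().max().clamp(min=1e-8) / q_max
        x = w / scale
        if rounding == "nearest":
            x_q = torch.round(x)
        elif rounding == "stochastic":
            # Round up with probability equal to the fractional part.
            floor = torch.floor(x)
            x_q = floor + (torch.rand_like(x) < (x - floor)).float()
        else:  # "floor"
            x_q = torch.floor(x)
        return torch.clamp(x_q, -q_max - 1, q_max) * scale

    @staticmethod
    def backward(ctx, grad_output):
        # STE: ignore the non-differentiable rounding in the backward pass.
        return grad_output, None, None


def quantize_weights(w, n_bits=3, rounding="nearest"):
    return QuantizeSTE.apply(w, n_bits, rounding)


if __name__ == "__main__":
    w = torch.randn(16, 8, requires_grad=True)
    w_q = quantize_weights(w, n_bits=3, rounding="stochastic")
    w_q.sum().backward()                   # gradients flow to the full-precision weights
    print(w_q.detach().unique().numel())   # at most 2**3 distinct quantized levels
```

In a QAT setup like this, the full-precision weights are kept and updated by the optimizer, while the quantized copies are what the (spiking) layers actually use in the forward pass; swapping the rounding mode is how one would compare rounding functions as the abstract describes.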
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the A*STAR RIE2020 Advanced Manufacturing and Engineering programme.
Grant Reference no.: A1687b0033