Biologically inspired Spiking Neural Networks (SNNs) offer a promising path toward energy-efficient artificial intelligence systems. However, the hardware deployment of deep SNNs has stagnated, because the wide dynamic range of a spiking neuron's membrane potential poses a significant challenge to hardware efficiency. To address this issue, this work proposes a guideline and a novel hardware-friendly method for constraining the membrane potential, reducing the associated hardware overhead while fully preserving inference accuracy. Experiments demonstrate that the proposed method is effective, achieving a substantial reduction in memory usage for a 20-layer ResNet model. This work paves the way toward the efficient hardware implementation of even deeper SNNs.
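The core idea of the abstract, bounding the neuron's membrane potential so that hardware can store it with fewer bits, can be sketched with a minimal leaky integrate-and-fire (LIF) update. This is an illustrative sketch only: the clamp range, decay factor, threshold, and reset scheme below are assumptions for demonstration, not values or details taken from the paper.

```python
import numpy as np

def lif_step(v, x, v_th=1.0, v_min=-1.0, v_max=1.0, decay=0.5):
    """One LIF step with a clamped membrane potential.

    Clamping v to [v_min, v_max] bounds the range the hardware must
    represent, so fewer bits suffice for membrane-potential memory.
    All parameter values are illustrative assumptions.
    """
    v = decay * v + x                      # leak, then integrate the input
    spike = (v >= v_th).astype(v.dtype)    # fire where the threshold is crossed
    v = v - spike * v_th                   # soft reset by subtraction
    v = np.clip(v, v_min, v_max)           # constrain the membrane potential
    return v, spike

# A strongly negative input would normally drive v far out of range;
# the clamp keeps it within the representable interval.
v, s = lif_step(np.zeros(3), np.array([0.5, 2.0, -5.0]))
```

Here the unclamped potentials after one step would be 0.5, 1.0, and -5.0; the clamp maps the last to -1.0, illustrating how a bounded range caps the storage bit-width at the cost of saturating extreme values.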