Document type:
Conference paper
Author(s):
Etienne Mueller, Daniel Auge, Simon Klimaschka, Alois Knoll 
Title:
Neural Oscillations for Energy-Efficient Hardware Implementation of Sparsely Activated Deep Spiking Neural Networks 
Abstract:
Spiking neural networks offer the potential for ultra-low power computations on specialized hardware, as they only consume energy when information is being transmitted. High energy efficiency is ensured when sparse temporal codes are used in which every neuron emits at most one single action potential. We present a mathematically lossless approach to convert a conventionally trained neural network to a temporal-coded spiking network. By analyzing the characteristics of spiking neurons, we elimin...
 
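The sparse temporal code the abstract describes can be illustrated with a minimal sketch of time-to-first-spike coding, where a larger activation maps to an earlier spike and each neuron fires at most once. This is an illustrative mapping only, not the authors' conversion method; the function name `activation_to_spike_time` and its max-normalization are assumptions.

```python
import numpy as np

def relu(x):
    """Standard ReLU activation of a conventionally trained network."""
    return np.maximum(x, 0.0)

def activation_to_spike_time(a, t_max=1.0):
    """Illustrative time-to-first-spike coding (assumed scheme, not the paper's).

    Larger activation -> earlier spike time in [0, t_max].
    Inactive neurons (activation 0) never fire, returned as np.inf,
    which is where the energy savings of sparse temporal codes come from.
    """
    a = np.asarray(a, dtype=float)
    t = np.full_like(a, np.inf)   # default: neuron stays silent
    active = a > 0
    if active.any():
        # normalize so the strongest activation fires at t = 0
        t[active] = t_max * (1.0 - a[active] / a[active].max())
    return t

spike_times = activation_to_spike_time(relu(np.array([-0.3, 0.0, 0.5, 1.0])))
```

In this sketch, each neuron contributes at most one action potential, and silent neurons cost nothing, matching the energy-efficiency argument of the abstract.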
Keywords:
Spiking Neural Networks, Conversion, Temporal Coding 
Book / Congress title:
Association for the Advancement of Artificial Intelligence (AAAI) 
Congress (additional information):
Practical Deep Learning in the Wild 
Year:
2022