Title:

Neural Oscillations for Energy-Efficient Hardware Implementation of Sparsely Activated Deep Spiking Neural Networks

Document type:
Conference paper
Author(s):
Mueller, Etienne; Auge, Daniel; Klimaschka, Simon; Knoll, Alois
Abstract:
Spiking neural networks offer the potential for ultra-low power computations on specialized hardware, as they only consume energy when information is being transmitted. High energy efficiency is ensured when sparse temporal codes are used in which every neuron emits at most one single action potential. We present a mathematically lossless approach to convert a conventionally trained neural network to a temporal-coded spiking network. By analyzing the characteristics of spiking neurons, we elimin...
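The abstract describes mapping a conventionally trained network to a temporal code in which each neuron fires at most once. A minimal sketch of this idea, under the assumption of a time-to-first-spike scheme where larger activations fire earlier (all names and the linear mapping here are illustrative assumptions, not the paper's actual method):

```python
# Hypothetical time-to-first-spike (TTFS) encoding of ReLU activations,
# sketching the kind of lossless ANN-to-SNN mapping the abstract mentions.
# The linear activation-to-spike-time mapping is an assumption for
# illustration only.

def relu(x):
    return [max(0.0, v) for v in x]

def encode_ttfs(activations, t_max=1.0):
    """Map each activation to a single spike time within [0, t_max]:
    larger activation fires earlier. Zero activations emit no spike
    (None), so silent neurons transmit nothing and cost no energy."""
    a_max = max(activations) or 1.0  # avoid division by zero
    return [None if a == 0.0 else t_max * (1.0 - a / a_max)
            for a in activations]

def decode_ttfs(spike_times, a_max, t_max=1.0):
    """Invert the encoding: recover the original activation values."""
    return [0.0 if t is None else a_max * (1.0 - t / t_max)
            for t in spike_times]

acts = relu([0.2, -0.5, 1.0, 0.0])   # negative and zero inputs stay silent
spikes = encode_ttfs(acts)           # at most one spike per neuron
recovered = decode_ttfs(spikes, a_max=max(acts))
```

Because the encoding is an invertible linear map, decoding recovers the activations up to floating-point rounding, which is the sense in which such a conversion can be lossless.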
Keywords:
Spiking Neural Networks, Conversion, Temporal Coding
Book / Congress title:
Association for the Advancement of Artificial Intelligence (AAAI)
Congress (additional information):
Practical Deep Learning in the Wild
Year:
2022
Reviewed:
yes
Fulltext / DOI:
https://doi.org/<att:doi>
Notes:
Accepted: 06.12.2021; Date of conference: 28.02.2022