Document type:
Conference paper
Author(s):
Mueller, Etienne; Auge, Daniel; Klimaschka, Simon; Knoll, Alois
Title:
Neural Oscillations for Energy-Efficient Hardware Implementation of Sparsely Activated Deep Spiking Neural Networks
Abstract:
Spiking neural networks offer the potential for ultra-low power computations on specialized hardware, as they only consume energy when information is being transmitted. High energy efficiency is ensured when sparse temporal codes are used in which every neuron emits at most one single action potential. We present a mathematically lossless approach to convert a conventionally trained neural network to a temporal-coded spiking network. By analyzing the characteristics of spiking neurons, we elimin…
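The abstract describes converting a conventionally trained network into a temporal code in which each neuron emits at most one spike. A minimal, hypothetical sketch of such a latency ("time-to-first-spike") encoding is shown below; the function names and the simple linear activation-to-time mapping are illustrative assumptions for intuition only, not the paper's actual conversion method:

```python
import numpy as np

def activations_to_spike_times(a, t_max=1.0):
    """Map non-negative activations (assumed normalized to [0, 1]) to
    first-spike times: larger activation -> earlier spike, and a zero
    activation never fires (time = inf), so each neuron spikes at most once.
    NOTE: illustrative linear mapping, not the authors' scheme."""
    a = np.asarray(a, dtype=float)
    return np.where(a > 0, t_max * (1.0 - a), np.inf)

def spike_times_to_activations(t, t_max=1.0):
    """Inverse mapping: recover the analog activation from the spike time.
    A finite round trip is lossless under this toy encoding."""
    t = np.asarray(t, dtype=float)
    return np.where(np.isfinite(t), 1.0 - t / t_max, 0.0)

acts = np.array([0.0, 0.25, 1.0])
times = activations_to_spike_times(acts)        # silent neuron -> inf
recovered = spike_times_to_activations(times)   # round-trips to acts
```

Under this toy mapping the encode/decode round trip is exact for normalized activations, which conveys the "mathematically lossless" idea the abstract claims for the real conversion.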
Keywords:
Spiking Neural Networks, Conversion, Temporal Coding
Conference / book title:
Association for the Advancement of Artificial Intelligence (AAAI)
Conference / additional information:
Practical Deep Learning in the Wild
Year:
2022
Reviewed:
yes
Full text / DOI:
https://doi.org/<att:doi>
Notes:
Accepted: 6 December 2021; Date of conference: 28 February 2022