Spiking neural networks can leverage the high efficiency of temporal coding by converting architectures that were previously trained with the backpropagation algorithm. In this work, we present the application of a time-coded neuron model for the conversion of conventional artificial neural networks that reduces the computational complexity of the synaptic connections. By adapting the ReLU activation function, the network achieved a sparsity of 0.142 spikes per neuron. Classification of handwritten digits from the MNIST dataset shows that the neuron model is able to convert convolutional neural networks with several hidden layers.
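To make the idea of time coding concrete, the following is a minimal sketch of a time-to-first-spike encoding of ReLU activations, under the common assumption that larger activations fire earlier and zero activations produce no spike at all (which is where the spike sparsity of a converted network comes from). The function name `ttfs_encode`, the normalization, and the window length `t_max` are illustrative choices, not the model from this work:

```python
import numpy as np

def relu(x):
    """Standard ReLU activation."""
    return np.maximum(x, 0.0)

def ttfs_encode(activations, t_max=1.0):
    """Map ReLU activations to time-to-first-spike latencies.

    Illustrative encoding: the largest activation fires at t = 0,
    smaller activations fire proportionally later within [0, t_max],
    and zero activations never fire (encoded here as np.inf).
    """
    a = relu(activations)
    peak = a.max()
    if peak == 0.0:
        return np.full_like(a, np.inf)  # no neuron fires
    times = t_max * (1.0 - a / peak)   # larger activation -> earlier spike
    times[a == 0.0] = np.inf           # silent neurons emit no spike
    return times

# Example: one ReLU layer's outputs converted to spike times
acts = np.array([0.0, 0.2, 1.0, 0.5])
print(ttfs_encode(acts))  # [inf, 0.8, 0.0, 0.5]
```

Under such an encoding each neuron spikes at most once per input, so a downstream synapse performs at most one operation per presynaptic neuron, which is the source of the reduced computational complexity compared with rate-coded conversion.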