“Plasticity and Temporal Coding in Spiking Neural Networks Applied to Representation Learning”
During this CONECT seminar, Adrien Fois presented his recent work on “Plasticity and Temporal Coding in Spiking Neural Networks Applied to Representation Learning”:
The brain is a highly efficient computational system, capable of delivering an estimated 600 petaFLOPS while consuming only 20 W, comparable to a light bulb. Its computation rests on neural impulses: information is encoded as spikes, and learning is driven by these spikes. Under the dominant paradigm, information is encoded in the number of spikes (rate coding). An alternative paradigm holds that information is carried by the precise timing of individual spikes, offering significant advantages in energy efficiency and in the speed of information transfer.
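To make the contrast between the two paradigms concrete, here is a minimal Python sketch (not from the talk) comparing a rate code with a time-to-first-spike temporal code for a scalar stimulus. The 100 ms window, the 100 Hz maximum rate, and the linear intensity-to-latency mapping are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_encode(intensity, window_ms=100.0, max_rate_hz=100.0):
    """Rate code: the stimulus intensity (in [0, 1]) sets the expected
    spike count in the window; the timing of each spike is random."""
    n_spikes = rng.poisson(intensity * max_rate_hz * window_ms / 1000.0)
    return np.sort(rng.uniform(0.0, window_ms, size=n_spikes))

def latency_encode(intensity, window_ms=100.0):
    """Temporal code (time-to-first-spike): a stronger stimulus fires
    earlier, so a single well-timed spike carries the value."""
    return np.array([(1.0 - intensity) * window_ms])

for x in (0.2, 0.9):
    print(f"intensity {x}: rate code -> {len(rate_encode(x))} spikes, "
          f"latency code -> 1 spike at t = {latency_encode(x)[0]:.1f} ms")
```

The latency code conveys the same scalar with a single spike, which is one way to see where the claimed gains in energy and transmission speed come from.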
My work aims to extract representations from temporal codes using event-based learning rules that are both spatially and temporally local. In particular, I will present a learning model that stores representations not in synaptic weights but in transmission delays, which inherently operate in the temporal domain. Learning delays proves particularly relevant for processing temporal codes and unlocks a key function of spiking neurons: the detection of temporal coincidences.
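As a toy illustration of delay learning for coincidence detection (a minimal sketch under simple assumptions, not the model presented in the talk), the snippet below nudges per-input delays so that spikes from a fixed temporal pattern arrive at the neuron simultaneously. The learning rate, the input pattern, and the use of the mean arrival time as a stand-in for the postsynaptic spike time are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

n_inputs, eta, n_epochs = 5, 0.2, 30
# A repeating temporal pattern: each input fires once at a fixed time (ms).
pattern = rng.uniform(0.0, 10.0, size=n_inputs)
# Learnable transmission delays, randomly initialized (ms).
delays = rng.uniform(10.0, 20.0, size=n_inputs)

for _ in range(n_epochs):
    arrivals = pattern + delays  # when each spike reaches the neuron
    t_post = arrivals.mean()     # stand-in for the postsynaptic spike time
    # Local, event-based update: shift each delay so its spike
    # arrives closer to the postsynaptic spike (coincidence).
    delays -= eta * (arrivals - t_post)
    delays = np.clip(delays, 0.0, None)

# The spread of arrival times shrinks toward zero as delays learn the pattern.
print("spread of arrival times (ms):", np.ptp(pattern + delays))
```

Once the delayed spikes coincide, a spiking neuron acting as a coincidence detector responds selectively to that temporal pattern, which is the key function the delay-learning approach exploits.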