We contribute to the field of neural networks, and recurrent ones in particular, in three ways. First, we show how neural networks can process not only points but also random variables summarised by their expectations and variances. Second, we introduce a framework for reducing sequences to points. Third, leveraging variational inference, we find latent state representations of sequences, enabling arbitrarily complex distributions over sequences. The methods are experimentally verified on human motion prediction, music generation, and other domains.