Abstract—Despite the importance of the linear consensus
algorithm for networked systems, there is no agreement on
the intrinsic mathematical structure that supports the observed
exponential averaging behavior among n agents for any initial
condition. Here we add to this discussion in linear consensus
theory by introducing relative entropy as a novel Lyapunov
function. We show that the configuration space of consensus
systems is isometrically embedded into a statistical manifold.
On projective (n-1)-space, relative entropy is a common time-invariant Lyapunov function along solutions of the time-varying
algorithm. For cases of scaled symmetry of the update law, we
expose a gradient flow structure underlying the dynamics: relative
entropy decreases along a steepest-descent gradient scheme.
On this basis we provide exact performance rates, together with
upper bounds in terms of spectral properties of the update law,
for the behavior on the statistical manifold. The condition of scaled
symmetry allows us to exhibit gradient flow structures in cases
where the original update law is neither doubly stochastic nor
self-adjoint. The results related to the gradient flow structure
are obtained by exploiting lossless passivity properties. We show
that lossless passivity of a dynamical system implies a gradient
flow structure on a manifold and vice versa. Exploiting lossless
passivity amounts to constructing a suitable combination of a
dissipation (pseudo-)metric and a Lyapunov function.
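
To make the decay property concrete, here is a minimal numerical sketch, not the paper's construction: it simulates discrete-time linear consensus x(k+1) = A x(k) for a doubly stochastic matrix A, normalizes the positive initial state onto the probability simplex, and prints the relative entropy D(x || u) = sum_i x_i log(x_i / u_i) to the uniform distribution u, which is non-increasing along the iteration. The graph, weights, and all names are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n = 5

# Symmetric, doubly stochastic update matrix on a path graph
# (illustrative choice; the paper treats more general update laws).
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0 / n
A += np.diag(1.0 - A.sum(axis=1))  # rows (hence, by symmetry, columns) sum to 1

def relative_entropy(p, q):
    # Kullback-Leibler divergence D(p || q) for strictly positive p, q
    return float(np.sum(p * np.log(p / q)))

x = rng.random(n)
x /= x.sum()              # project the positive initial state onto the simplex
u = np.full(n, 1.0 / n)   # the consensus point: the uniform distribution

for k in range(20):
    print(f"k={k:2d}  D(x || u) = {relative_entropy(x, u):.6f}")
    x = A @ x             # doubly stochastic A keeps x on the simplex

In this doubly stochastic case the monotone decay is classical; the abstract's point is that the scaled-symmetry condition extends the gradient flow picture, and hence such decay estimates, to update laws that are neither doubly stochastic nor self-adjoint.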