Many important tensor network algorithms benefit from orthogonality constraints on the tensors of the network, or even require that the tensors form isometries when viewed as matrices. We explore different methods to orthogonalize tensor networks and present a canonical form that allows centers of orthogonality to be shifted flexibly. With these orthogonalization methods established, we show how a network can be regarded as an element of a product of Riemannian manifolds, and on this basis introduce a modified line-search method based on gradient descent for minimizing complex-valued functions on tensor networks.