Kernel-based methods and their underlying structure of reproducing kernel Hilbert spaces (RKHS) are widely used in many areas of
applied mathematics, such as spatial statistics, machine learning, and approximation theory. In this thesis, we provide an overview of RKHSs of vector-valued functions and their corresponding operator-valued kernels. We show the link between
conditionally positive definite operator-valued kernels and reproducing kernel Pontryagin spaces. Furthermore, we provide a method to construct parameterized matrix-valued kernels. Moreover, we transfer concepts for quantitative estimates in approximation and statistical learning to the vector-valued
setting. To be precise, we demonstrate how stability and error estimates from approximation theory lead to estimates of covering numbers used in statistical learning.
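
As background, the following is the standard formulation of the objects named above (a textbook definition, not a construction specific to this thesis): given a Hilbert space $Y$, an operator-valued kernel $K\colon \Omega \times \Omega \to \mathcal{L}(Y)$ is positive definite if
\[
\sum_{i,j=1}^{n} \langle K(x_i, x_j)\, y_j,\, y_i \rangle_Y \;\ge\; 0
\]
for all $n \in \mathbb{N}$, $\{x_i\}_{i=1}^{n} \subset \Omega$, and $\{y_i\}_{i=1}^{n} \subset Y$, and the associated RKHS $\mathcal{H}$ of functions $f\colon \Omega \to Y$ is characterized by the reproducing property
\[
\langle f,\, K(\cdot, x)\, y \rangle_{\mathcal{H}} \;=\; \langle f(x),\, y \rangle_Y
\qquad \text{for all } f \in \mathcal{H},\; x \in \Omega,\; y \in Y.
\]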