Applied and Computational Mathematics Seminar
Wednesday, June 22, 2016 - 11:00am
1 hour (actually 50 minutes)
Symmetric positive definite (SPD) matrices play important roles in numerous areas of mathematics, statistics, and their applications in machine learning, optimization, computer vision, and related fields. Among the most important topics in the study of SPD matrices are distances that properly capture the geometry of the set of SPD matrices. Two of the most widely used are the affine-invariant distance and the Log-Euclidean distance, which are geodesic distances corresponding to two different Riemannian metrics on this set. In this talk, we present our recently developed concept of the Log-Hilbert-Schmidt (Log-HS) distance between positive definite Hilbert-Schmidt operators on a Hilbert space. This is the generalization of the Log-Euclidean distance between SPD matrices to the infinite-dimensional setting. In the case of reproducing kernel Hilbert space (RKHS) covariance operators, we obtain closed-form formulas for the Log-HS distance, expressed via Gram matrices. As a practical example, we demonstrate an application of the Log-HS distance to the problem of image classification in computer vision.
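For context, the finite-dimensional Log-Euclidean distance mentioned in the abstract has the well-known closed form d(A, B) = ||log(A) - log(B)||_F, where log is the matrix logarithm. A minimal NumPy sketch (the function names here are illustrative, not from the talk):

```python
import numpy as np

def spd_log(A):
    """Matrix logarithm of an SPD matrix via its eigendecomposition.
    For SPD A = V diag(w) V^T with w > 0, log(A) = V diag(log w) V^T."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.log(w)) @ V.T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance: Frobenius norm of log(A) - log(B)."""
    return np.linalg.norm(spd_log(A) - spd_log(B), ord="fro")

# Example with two 2x2 SPD matrices
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 3.0]])
d = log_euclidean_distance(A, B)
```

The Log-HS distance discussed in the talk generalizes this construction to positive definite Hilbert-Schmidt operators, where the matrix logarithm and Frobenius norm are replaced by their operator-theoretic counterparts.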