Description
Randomized algorithms have gained significant attention in numerical linear algebra during the last decade. In particular, randomized sketching is a simple but effective technique in which a random matrix acts as a dimension reduction map: a problem that features a potentially large input matrix $A \in \mathbb{R}^{n \times n}$ is reduced to a smaller one by replacing $A$ with $A \Omega$, where $\Omega \in \mathbb{R}^{n \times \ell}$ is a random matrix with $\ell \ll n$. This has been used with great success in, e.g., the randomized SVD and subspace projection methods for large-scale eigenvalue problems, such as FEAST.
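To make the sketching step concrete, here is a minimal NumPy sketch (illustrative only, not code from the talk): $A$ is a synthetic low-rank-plus-noise matrix, and the rank-$\ell$ approximation follows the standard randomized SVD recipe.

```python
import numpy as np

rng = np.random.default_rng(0)
n, ell = 1000, 20

# Synthetic input for illustration: low-rank plus small noise.
U = rng.standard_normal((n, ell // 2))
A = U @ U.T + 1e-3 * rng.standard_normal((n, n))

# Gaussian sketching: compress A from n x n to n x ell.
Omega = rng.standard_normal((n, ell))
Y = A @ Omega                    # sketch capturing the dominant range of A

# Orthonormalize the sketch and form a rank-ell approximation
# (the basic randomized SVD step).
Q, _ = np.linalg.qr(Y)
A_approx = Q @ (Q.T @ A)
print(np.linalg.norm(A - A_approx) / np.linalg.norm(A))
```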
Algorithms based on sketching typically draw the random matrix $\Omega$ from standard distributions, such as the Gaussian distribution. However, in certain applications it may be advantageous to run computations with an $\Omega$ that is compatible with the underlying structure of the problem. In this talk we discuss algorithms that use random Khatri-Rao product matrices: each column of $\Omega$ is generated as the Kronecker product of two Gaussian random vectors. This allows for faster operations when the matrix $A$ is represented as a short sum of Kronecker products, which arises frequently, e.g., from the discretization of PDEs on tensor product domains. We focus on applications in large-scale eigenvalue computation, and provide theoretical and numerical evidence that using random Khatri-Rao product matrices $\Omega$ instead of unstructured Gaussian random matrices leads to accurate estimates.
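As a hedged illustration of why this structure helps (a sketch under assumed notation, not the speakers' implementation): when $A = \sum_k B_k \otimes C_k$ with $B_k, C_k \in \mathbb{R}^{m \times m}$, the mixed-product identity $(B \otimes C)(x \otimes y) = (Bx) \otimes (Cy)$ lets each Khatri-Rao sketch column be applied term by term, at $O(m^2)$ per term per column instead of the $O(m^4)$ of a dense multiply, without ever forming $A$.

```python
import numpy as np

rng = np.random.default_rng(1)
m, ell, s = 50, 10, 3           # A is (m*m) x (m*m); ell columns; s Kronecker terms

# A represented as a short sum of Kronecker products: A = sum_k kron(B[k], C[k]).
B = rng.standard_normal((s, m, m))
C = rng.standard_normal((s, m, m))

# Khatri-Rao sketch: column j of Omega is kron(X[:, j], Y[:, j]), X, Y Gaussian.
X = rng.standard_normal((m, ell))
Y = rng.standard_normal((m, ell))

# Fast A @ Omega via (B kron C)(x kron y) = (B x) kron (C y),
# applied term by term; A itself is never formed.
AOmega = np.zeros((m * m, ell))
for k in range(s):
    BX = B[k] @ X               # m x ell
    CY = C[k] @ Y               # m x ell
    # Column j becomes kron(BX[:, j], CY[:, j]).
    AOmega += (BX[:, None, :] * CY[None, :, :]).reshape(m * m, ell)

# Sanity check against the dense computation.
A = sum(np.kron(B[k], C[k]) for k in range(s))
Omega = (X[:, None, :] * Y[None, :, :]).reshape(m * m, ell)
print(np.allclose(A @ Omega, AOmega))
```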
This is joint work with Luka Grubišić, Daniel Kressner and Hei Yin Lam.