▶ Topic: Low-rank Approximation via Partial Matrix Sampling: Assumption-free Local Minimum Analysis and Applications in Memory-efficient Kernel PCA
Author: Ji Chen and Xiaodong Li
Time & Date: 9:30-10:30, Wednesday, August 29
Venue: Room 207, Chengdao Building
Speaker: Xiaodong Li
UC Davis, Statistics Department
Abstract
In this talk, we study nonconvex matrix completion from the perspective of assumption-free approximation: with no assumptions on the underlying positive semidefinite matrix in terms of rank, eigenvalues, or eigenvectors, we establish a low-rank approximation error bound for any local minimum of the proposed objective function. As interesting byproducts, when certain assumptions are imposed on the rank, eigenvalues, eigenvectors, and sampling rates, corollaries of our main theorem improve upon state-of-the-art results in the literature on nonconvex matrix completion with no spurious local minima. We also discuss how the proposed low-rank approximation framework applies to memory-efficient kernel PCA, and numerical experiments show that our approach is competitive in approximation accuracy with the well-known Nyström method.
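
To make the partial-sampling idea concrete, here is a minimal Python sketch; it is not the speakers' exact algorithm, and the Bernoulli sampling scheme, spectral initialization, and step size below are illustrative assumptions. It observes a random subset of the entries of a PSD kernel matrix and fits a rank-r factorization X X^T by gradient descent on a sampled least-squares objective.

import numpy as np

def lowrank_from_partial_sampling(M, p, r, steps=500, seed=0):
    """Sketch: rank-r approximation of a symmetric PSD matrix M from a
    Bernoulli(p) sample of its entries, via gradient descent on
    f(X) = (1/p) * ||P_Omega(X X^T - M)||_F^2.  Details here are
    illustrative assumptions, not the exact objective from the talk."""
    n = M.shape[0]
    rng = np.random.default_rng(seed)
    upper = np.triu(rng.random((n, n)) < p)
    mask = upper | upper.T                      # symmetric observation pattern Omega
    M_obs = np.where(mask, M, 0.0)              # only these entries are ever used

    # Spectral initialization from the rescaled observed matrix.
    w, V = np.linalg.eigh(M_obs / p)            # eigenvalues in ascending order
    X = V[:, -r:] * np.sqrt(np.clip(w[-r:], 0.0, None))

    lr = 0.1 / max(float(w[-1]), 1e-12)         # step size scaled to the data
    for _ in range(steps):
        R = mask * (X @ X.T) - M_obs            # residual on observed entries only
        X -= lr * (4.0 / p) * (R @ X)           # gradient of the sampled objective
    return X @ X.T

# Toy usage: approximate an RBF kernel matrix from ~30% of its entries.
# (The full kernel K is formed here only to measure the error; in the
# memory-efficient setting one would compute just the sampled entries.)
rng = np.random.default_rng(1)
data = rng.standard_normal((300, 5))
sq_dists = ((data[:, None, :] - data[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / (2.0 * data.shape[1]))
K_hat = lowrank_from_partial_sampling(K, p=0.3, r=10)
print("relative Frobenius error:", np.linalg.norm(K_hat - K) / np.linalg.norm(K))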
Biography
Dr. Xiaodong Li is an assistant professor in the Department of Statistics at UC Davis. Prior to that, he worked in the Statistics Department of the Wharton School at the University of Pennsylvania for two years. He received his Ph.D. in mathematics from Stanford University in 2013 and his B.S. from Peking University in 2008. His general research interests are in machine learning, statistics, optimization, and signal processing; in particular, he is interested in the connection between optimization/spectral methods and the underlying low-rank/spectral structures. His papers have been published in journals of statistics, mathematics, and engineering such as AoS, ACHA, FOCM, JACM, and IEEE TIT.