Title: Matrix Learning Problems and First-Order Optimization
Speaker: Dr. Andreas Argyriou
Affiliation: Toyota Technological Institute at Chicago
Location: Room 217 Huxley Building
Time: 2:00pm
Abstract. In the past few years, there has been significant interest in nonsmooth convex optimization problems involving matrices, especially in the areas of machine learning, statistics and control. Instances of such problems include multi-task learning, matrix completion, robust PCA and sparse inverse covariance selection. I will present PRISMA, a new optimization algorithm for minimizing a convex objective that decomposes into three parts: a smooth part, a simple nonsmooth Lipschitz part, and a simple nonsmooth non-Lipschitz part. Our algorithm combines the methodologies of smoothing and accelerated proximal methods. Moreover, our convergence result removes the bounded-domain assumption required by Nesterov's smoothing methods. I will show how PRISMA can be applied to the problems of max-norm regularized matrix completion and clustering, robust PCA and sparse inverse covariance selection, and compare it with state-of-the-art methods.
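For readers unfamiliar with proximal methods, the following is a minimal sketch of one of the building blocks the abstract refers to: a proximal-gradient iteration (ISTA) that handles a smooth part plus a simple nonsmooth part via the proximal operator. This is not the PRISMA algorithm itself (which additionally combines smoothing and acceleration to handle a third, non-Lipschitz term); the lasso-style objective, the problem sizes and the variable names below are illustrative assumptions only.

```python
import numpy as np

# Illustrative proximal-gradient (ISTA) sketch for
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
# i.e. a smooth part (least squares) plus a simple nonsmooth part (l1 norm).
# NOTE: this is a toy sketch of the proximal machinery, not PRISMA.

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iters=500):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)  # prox step on the nonsmooth part
    return x

# Toy data: a sparse signal recovered from noiseless linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

Each iteration takes a gradient step on the smooth term, then applies the closed-form proximal operator of the nonsmooth term; accelerated variants of this scheme, together with smoothing of a third non-Lipschitz term, are the ingredients the abstract says PRISMA combines.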
About the speaker. Andreas Argyriou received degrees in Computer Science from MIT and a PhD in Computer Science from UCL. His PhD work focused on machine learning methodologies for integrating different tasks and data sources. He has held postdoctoral and research faculty positions at UCL, TTI Chicago and KU Leuven, and is currently at Ecole Centrale Paris with an RBUCE-UP fellowship. His current interests are in the areas of machine learning with big and complex data, compressed sensing and convex optimization methods.

