Mathematics, Information, and Computation (MIC) Seminar
A seminar of the NYU Math and Data Group


2017-03-21:
Liza Rebrova (University of Michigan)
March 21st, 2.30pm-3.30pm at Center for Data Science (60 5th Av) in room 650
Title: Local and global obstructions for the random matrix norm regularization
Abstract: We study large n by n random matrices A with i.i.d. entries. If the distribution of the entries has mean zero and at least Gaussian decay, then the operator norm ||A|| is at most of order sqrt(n) with high probability. However, for distributions with heavier tails we can no longer expect the same norm bound. This motivates the question: under what conditions can the operator norm of a heavy-tailed matrix be improved by modifying just a small fraction of its entries (a small sub-matrix of A)? I will explain why this happens exactly when the entries of A have zero mean and bounded variance. I will also discuss the almost optimal dependence we have obtained between the size of the removed sub-matrix and the resulting operator norm. This is joint work with Roman Vershynin, inspired by methods developed recently by Can Le and R. Vershynin and in our joint work with Konstantin Tikhomirov.
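
A small numerical sketch of the norm behavior described in the abstract (my own illustration, not code from the talk): an n by n Gaussian matrix has spectral norm close to 2*sqrt(n), a heavy-tailed matrix does not (Student-t entries with 2.1 degrees of freedom are used here as an example of a mean-zero, finite-variance, heavy-tailed distribution), and zeroing out the small fraction of unusually large entries, a crude entrywise stand-in for the sub-matrix removal studied in the talk, brings the norm back to the order of sqrt(n). The threshold value below is illustrative, not the one from the paper.

import numpy as np

n = 1000
rng = np.random.default_rng(0)

# Gaussian entries: spectral norm concentrates near 2*sqrt(n).
A_gauss = rng.standard_normal((n, n))
print("Gaussian:     ||A|| / sqrt(n) =", np.linalg.norm(A_gauss, 2) / np.sqrt(n))

# Heavy-tailed entries: Student-t with 2.1 degrees of freedom has mean zero and
# finite variance, but a few very large entries push the norm well past sqrt(n).
A_heavy = rng.standard_t(df=2.1, size=(n, n))
print("Heavy-tailed: ||A|| / sqrt(n) =", np.linalg.norm(A_heavy, 2) / np.sqrt(n))

# Crude regularization: zero out entries above an illustrative constant
# threshold; the printed fraction shows how few entries get modified.
threshold = 10.0
A_reg = np.where(np.abs(A_heavy) > threshold, 0.0, A_heavy)
print("Regularized:  ||A|| / sqrt(n) =", np.linalg.norm(A_reg, 2) / np.sqrt(n))
print("Fraction of entries modified:", np.mean(np.abs(A_heavy) > threshold))

This entrywise truncation is only a toy stand-in; the talk concerns removing a small sub-matrix and the almost optimal dependence of the achievable norm on its size.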