Optimization-based Data Analysis

Instructor: Carlos Fernandez-Granda (cfgranda@cims.nyu.edu)

This course covers data-analysis methods that exploit low-dimensional structure, captured by sparse or low-rank models, to extract information from data using optimization.

Announcements

  • For Problem 2 of Homework 2, it is highly recommended that you use CVX.

  • The deadline to submit the project report has been extended to May 12.

  • The lecture notes on sparse regression have been updated to include a section on group sparsity.

  • A description of the project report has been added to the Project section.

  • There will be no presentations during the last lecture; instead, I will give a review of the main ideas in the course.

  • Please schedule an appointment during April to discuss your project.

Syllabus

  • Convex optimization, optimization algorithms, denoising, learning representations, sparse regression, concentration of measure, compressed sensing, super-resolution, randomized linear algebra, low-rank models, phase retrieval.

  • See the schedule for more details.
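To give a flavor of the sparse-model methods listed above, the soft-thresholding operator (the proximal operator of the scaled l1 norm) is a basic building block of many sparse denoising and sparse regression algorithms. A minimal, illustrative sketch in Python (not course-provided code):

```python
def soft_threshold(x, lam):
    """Proximal operator of lam * |x|: shrink x toward zero by lam,
    setting entries with magnitude at most lam exactly to zero."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

# Applying it entrywise to a noisy signal zeroes out small entries,
# producing a sparse estimate (the key idea behind l1-based denoising).
noisy = [3.0, -0.4, 0.2, -2.5, 0.05]
denoised = [soft_threshold(v, 0.5) for v in noisy]
```

Here the threshold level 0.5 is an arbitrary illustrative choice; in practice it is set according to the noise level or by cross-validation.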

General Information

Prerequisites

Linear algebra and probability. Some programming skills and some exposure to statistics, machine learning and/or optimization are desirable.

Lecture

Monday 1:25-3:15 pm, CIWW 517

Office hours

By appointment via email. You are required to schedule at least two appointments to discuss the project.

Grading policy

Homework (40%) + Project proposal (10%) + Project (50%)

Books

We will provide self-contained notes. In addition, the book Statistical Learning with Sparsity: The Lasso and Generalizations by Hastie, Tibshirani, and Wainwright has been placed on reserve in the library. It is also available online.

Additional references are listed in the schedule.

Other courses

Professor Overton is offering the course Convex and Nonsmooth Optimization this semester. I recommend taking it; the contents of the two courses are highly complementary.