DS-GA 1014: Optimization and Computational Linear Algebra for Data Science
NYU Center for Data Science, Fall 2020
Instructor: Léo Miolane lm4271
Lectures: Thu. 3:30pm-5:10pm, 19 West 4th Street, Room 101.
Léo's Office Hours: Tuesdays 9-10am and 8:30-9:30pm, Fridays 4-5pm
Section leaders
Irina Espejo Morales iem244 (Sec. 3, online)
Alex Weng Dong awd275 (Sec. 2 & 4, in-person)
Carles Domingo cd2754 (Sec. 2 & 4, remote)


  • Section 2: Tuesdays, 4-4:50pm. In person with Alex at 194 Mercer Street, Room 206; remote with Carles. Section 2 & 4 material
  • Section 3: Wednesdays, 8-8:50am. Remote with Irina. Section 3 material
  • Section 4: Wednesdays, 10:30-11:20am. In person with Alex at Meyer Hall, Room 102; remote with Carles. Section 2 & 4 material


  • NYU Classes: links to Zoom meetings and recordings, seating assignments, and links to the pre-recorded videos (in the 'Overview' section).
  • Piazza: online forum used for asking/answering questions, making announcements, etc. The access code is available on NYU Classes.
  • Gradescope: this is where you upload your homework and take the weekly quizzes.


This course covers the basics of optimization and computational linear algebra used in Data Science. About two-thirds of the lectures will be about linear algebra and about one-third about convex optimization. The first 5 lectures will cover basic linear algebra. Then we will study applications: Markov chains and PageRank, PCA and dimensionality reduction, spectral clustering, and linear regression. Lastly, we will go over convex functions, optimality conditions, and gradient descent. See the syllabus and outline for an overview of the contents of the course.
Important: This course is "proof-based", meaning that we will prove theorems and that you will have to prove things in homeworks/exams.
This course was previously taught by Afonso Bandeira. The material of this page was inspired by his Fall 2016 and Fall 2018 lectures.

Questions and feedback

I am here to help: please feel free to ask me any questions you may have, in class, during office hours or by email.

Feedback: If you have any comments or feedback on the class (it's going too fast, too slow, ...), please let me know (in person or by email) or submit an anonymous comment through this Google form. Having direct feedback from you is the best way for me to give lectures that work for you!


There will be weekly homework assignments, weekly quizzes, a midterm, and a final exam. Check out the syllabus for more details. You will find exams from past years in the Archive section.
  • Midterm: Thursday, October 29. The questions have to be downloaded from Gradescope (between 12:01am and 9:59pm). You then have to upload your scanned work within the next 2 hours.
  • Midterm review questions: review problems. Solutions: solutions.
  • Midterm: midterm. Solutions: solutions.
  • Final review questions: you can use the 2018 review questions and the 2019 review questions. You will also find the final exams from 2018 and 2019 at the bottom of this page. There will be no fully typed solutions to the review questions or to the 2018 final (note that solutions to the 2019 final are available). However, here are hints for the problems that should be enough for you to find the solutions. If you still have trouble solving some problems, please send me an email or come to office hours.
  • Final: Thursday, December 17. The questions have to be downloaded from Gradescope (between 12:01am and 9:59pm). You then have to upload your scanned work within the next 2 hours.

Grade = 40% Homework + 5% Quizzes + 20% Midterm + 35% Final. The exams are open book/notes.
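To make the weighting concrete, here is a small illustrative script (the component names and example scores are made up; only the weights come from the formula above):

```python
# Grade weights from the course formula: 40% Homework + 5% Quizzes
# + 20% Midterm + 35% Final.
WEIGHTS = {"homework": 0.40, "quizzes": 0.05, "midterm": 0.20, "final": 0.35}

def course_grade(scores):
    """Weighted average of component scores, each on a 0-100 scale."""
    return sum(WEIGHTS[part] * scores[part] for part in WEIGHTS)

# Hypothetical example: strong homework and quizzes, weaker exams.
print(course_grade({"homework": 95, "quizzes": 100, "midterm": 80, "final": 85}))
```

Note that homework carries the largest single weight, so consistent work during the semester matters as much as the final.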

Books (optional, some references are also given in the notes)

We will not follow any particular book. If you are looking for practice exercises, I recommend looking at the course's archive. However, if you are looking for further material on linear algebra and optimization, you can refer to the following books:
  • Strang: Introduction to Linear Algebra (there are very good lecture videos available on YouTube)
  • Boyd & Vandenberghe: Introduction to Applied Linear Algebra (available online here)
  • Nocedal & Wright: Numerical Optimization (should be available online via NYU here)
  • Boyd & Vandenberghe: Convex Optimization (available online here)
There are also great videos of Stephen Boyd teaching linear algebra for linear dynamical systems.


Warning: These notes are not meant to be proper lecture notes! They only gather the main concepts and results from the lectures, without additional explanations, motivation, examples, or figures. Also, most of them are preliminary versions and will be updated.


Slides of the videos and lectures will be posted here. Also, you can find the slides without the annotations on the GitHub Repository.
  1. Vector spaces : Lecture, Video 1.1, Video 1.2, Video 1.3.
  2. Linear transformations: Lecture, Video 2.1, Video 2.2, Video 2.3.
  3. Rank: Lecture, Video 3.1, Video 3.2.
  4. Norm & dot product: Lecture, Video 4.1, Video 4.2.
  5. Matrices & orthogonality: Video 5.1, Video 5.2.
  6. Eigenvalues, eigenvectors & Markov chains: Lecture, Lecture bis, Video 6.1.
  7. Spectral theorem, PCA, singular value decomposition: Lecture, Video 7, Video 8.
  8. Graphs and linear algebra: Lecture.
  9. Convex functions: Lecture, Video 9.1, Video 9.2.
  10. Regression: Lecture.
  11. Optimality conditions: Lecture, Video 11.
  12. Gradient descent: Lecture, Gradient descent demo notebook.
  13. Stochastic gradient descent: Lecture, SGD demo notebook.


Homeworks are typically weekly and will be posted here.
You are encouraged to write your homework using LaTeX, the most popular way to produce nice and clean scientific documents. It is not difficult! If you are new to LaTeX, I recommend that you:
  1. Sign up on Overleaf, an online LaTeX editor.
  2. Create a new document and paste in the following template.
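If you just want to see what a bare-bones homework file looks like before opening the linked template, a minimal sketch is below (the document class and packages are common defaults, not necessarily those of the official template):

```latex
\documentclass[11pt]{article}
\usepackage{amsmath, amssymb, amsthm} % standard math and proof environments
\usepackage[margin=1in]{geometry}

\title{DS-GA 1014, Homework 1}
\author{Your Name (NetID)}
\date{\today}

\begin{document}
\maketitle

\section*{Problem 1}
Let $A \in \mathbb{R}^{n \times n}$ be a symmetric matrix.

\begin{proof}
Write your proof here.
\end{proof}

\end{document}
```

Compiling this on Overleaf produces a titled document with a numbered-free problem heading and a proof environment ending in the usual QED square.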

Archive from past years