Mathematical Foundation of Data Science and Machine Learning

Course Objective: An introduction to the mathematical foundations of machine learning. Prerequisites include calculus, linear algebra, and probability theory at the undergraduate level.

Instructor: Shuyang LING (sl3635@nyu.edu)

Lecture Time/Location: 1:15PM - 2:30PM on Mondays and Wednesdays.

Discussion/Recitation: 1:15PM - 2:30PM on Fridays. This session will be used to discuss course projects.

Office: Room 1162-3

Textbook: I will provide lecture notes and reading materials throughout the course, along with several references we will use.

Grading policy:

  • Homework 40%

  • Final project 60%

Course schedule: Lecture notes will be updated after each lecture.

Date Topics
Feb 07 (M) Singular value decomposition
Feb 09 (W) Singular value decomposition
Feb 14 (M) Principal component analysis
Feb 16 (W) Random projection
Feb 21 (M) Statistical learning theory
Feb 23 (W) Statistical learning theory
Feb 28 (M) Model selection and regularization
Mar 02 (W) Ridge regression and Lasso
Mar 07 (M) Rademacher complexity
Mar 09 (W) Uniform law of large numbers
Mar 14 (M) VC-dimension and excess risk
Mar 16 (W) Generalization for ridge regression and Lasso
Mar 21 (M) Support vector machine
Mar 23 (W) Support vector machine
Mar 28 (M) Generalization for SVM
Mar 30 (W) Kernel methods: feature, Hilbert space, kernel
Apr 04 (M) Kernel methods: PSD kernel and Mercer theorem
Apr 06 (W) Kernel methods: Bochner's theorem and random features
Apr 11 (M) Reproducing kernel Hilbert space
Apr 13 (W) Representer theorem
Apr 18 (M) Convex function
Apr 20 (W) Gradient descent
Apr 24 (U) Subgradient descent
Apr 25 (M) Stochastic gradient method
Apr 27 (W) Projected gradient method
May 04 (W) Neural network, approximation and generalization
May 09 (M) Interpolation, random feature model, NTK
May 11 (W) Optimization in shallow networks