Date. Topics. Notes. Coursework.
8/28 Administrivia; perceptron. pdf. hw0 out: tex, pdf.
8/30 Perceptron; decomposition of learning problems. pdf.
9/6 Failure of linear predictors; box approximation (linear over boxes, decision trees). pdf. hw0 due!
9/11 End of box approximation: boosted decision trees, branching programs, 3-layer ReLU nets. Start of polynomial fit: Stone-Weierstrass! pdf.
9/13 Polynomial fit via Stone-Weierstrass: sums of exponentials, RBF kernels, 2-layer networks. pdf.
9/18 RKHS interlude. pdf.
9/20 Succinct deep networks; multiplication with networks; networks and polynomials, smooth functions; Wasserstein distance, probability modeling, and GANs.
Convexity bootcamp; gradient descent in the smooth case; subgradient descent in the bounded+Lipschitz case; mirror descent and geometry; GLORIOUS MAUREY SPARSIFICATION; consistency of convex risk minimization; something non-convex; clustering.
Concentration bootcamp; finite classes and primitive covering; symmetrization and Rademacher complexity; Lipschitz losses, margin losses, finite classes; full covering, Dudley, and Sudakov; linear predictors via covers and Rademacher; VC dimension and VC for neural networks; Rademacher and covering bounds for neural networks.
Depends on how much time remains! I hope: heavy tails; online learning; reinforcement learning; spectral methods.
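The perceptron from the opening lectures can be sketched in a few lines; the toy data, pass limit, and variable names below are illustrative, not from the course notes.

```python
import numpy as np

def perceptron(X, y, max_passes=100):
    # Classical perceptron: on each mistake, add y_i * x_i to the weights.
    w = np.zeros(X.shape[1])
    for _ in range(max_passes):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:  # mistake (or point on the boundary)
                w += yi * xi
                mistakes += 1
        if mistakes == 0:  # a full clean pass: data is separated, stop
            break
    return w

# Toy linearly separable data (illustrative).
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
print(np.sign(X @ w))  # → [ 1.  1. -1. -1.]
```

On separable data the mistake bound guarantees termination; on non-separable data the pass limit is what stops the loop.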
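Stone-Weierstrass, as covered in the polynomial-fit lectures, says polynomials uniformly approximate any continuous function on a compact interval. A small numerical check of the phenomenon (the target function and degrees are illustrative, and a least-squares Chebyshev fit stands in for the best uniform approximant):

```python
import numpy as np

# Fit Chebyshev polynomials of increasing degree to a smooth target on [-1, 1]
# and watch the uniform (sup-norm) error shrink.
xs = np.linspace(-1.0, 1.0, 400)
target = np.exp(xs) * np.sin(3 * xs)  # illustrative continuous target

errs = []
for deg in (2, 5, 10):
    coeffs = np.polynomial.chebyshev.chebfit(xs, target, deg)
    approx = np.polynomial.chebyshev.chebval(xs, coeffs)
    errs.append(np.abs(approx - target).max())

print(errs)  # uniform error decreases as the degree grows
```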
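Gradient descent in the smooth case, from the convexity bootcamp, can be sketched on a least-squares objective with the standard 1/L step size; the random problem instance and iteration count are illustrative.

```python
import numpy as np

# Gradient descent on the smooth convex objective f(w) = 0.5 * ||A w - b||^2,
# using step size 1/L with L the largest eigenvalue of A^T A (the smoothness
# constant of f). Problem data is randomly generated for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

L = np.linalg.eigvalsh(A.T @ A).max()
w = np.zeros(5)
for _ in range(500):
    grad = A.T @ (A @ w - b)
    w -= grad / L

# Compare against the least-squares solution.
w_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(w - w_star))  # small: GD converges on smooth convex f
```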
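The symmetrization lectures introduce empirical Rademacher complexity; for a finite class it can be estimated by Monte Carlo and compared against Massart's finite-class bound sqrt(2 ln|F| / n). The class below (random sign vectors) and trial count are illustrative.

```python
import numpy as np

# Monte Carlo estimate of empirical Rademacher complexity of a finite class:
#   E_sigma [ max_f (1/n) sum_i sigma_i f(x_i) ],
# where each row of F holds one function's values on the n sample points.
rng = np.random.default_rng(0)
n = 100
F = rng.choice([-1.0, 1.0], size=(8, n))  # 8 illustrative +/-1-valued functions

trials = 2000
est = 0.0
for _ in range(trials):
    sigma = rng.choice([-1.0, 1.0], size=n)  # Rademacher signs
    est += (F @ sigma).max() / n
est /= trials

# Massart's finite-class bound for functions bounded by 1.
bound = np.sqrt(2 * np.log(F.shape[0]) / n)
print(est, bound)  # the estimate should sit below the bound
```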

Homework policies

Project policies


Other learning-theory-ish classes. These courses all differ and all have good material, and there are many I neglected to include!

Textbooks and surveys. Again, there are many others, but here are a few key ones.