Hello, Friend.

My name is Matus Telgarsky.

I am an assistant professor at UIUC.

I received my PhD at UCSD in 2013 under Sanjoy Dasgupta.

I study mathematical aspects of machine learning.

Current interests.

Approximation/representation power of deep networks. I proved that there exist deep networks which can only be approximated by shallower networks if those shallower networks have exponentially many more nodes (arXiv, video, lecture notes one and two), and I continue to work on related questions (e.g., rational functions, and generative networks (with Bolton Bailey)).
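
For intuition, here is a toy Python sketch of the triangle-map construction behind that result; the piece-counting scaffolding is an illustrative choice of mine, not code from the paper. Composing the tent map with itself k times is computable by a ReLU network of depth O(k), yet the composition has 2^k linear pieces, and matching that many oscillations is what forces a shallow network to be exponentially wide.

    # Toy sketch (pure Python, no plotting).
    def relu(x):
        return max(0.0, x)

    def triangle(x):
        # Tent map on [0, 1], realized by two ReLU units:
        # 2x on [0, 1/2], 2 - 2x on [1/2, 1].
        return 2 * relu(x) - 4 * relu(x - 0.5)

    def sawtooth(x, k):
        # k-fold composition: a depth-O(k) network.
        for _ in range(k):
            x = triangle(x)
        return x

    k, n = 4, 1 << 12
    ys = [sawtooth(i / n, k) for i in range(n + 1)]
    slopes = [ys[i + 1] - ys[i] for i in range(n)]
    # Count slope sign flips: one per interior breakpoint.
    flips = sum(1 for i in range(n - 1) if slopes[i] * slopes[i + 1] < 0)
    print(f"k={k}: about {flips + 1} linear pieces (expect {2 ** k})")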

Generalization of deep networks. Empirically, the excess risk of trained networks correlates with their Lipschitz constant (the product of the spectral norms of the weight matrices), which in turn yields a generalization bound (arXiv, poster, video).
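
Very roughly, suppressing logarithmic factors, reference matrices, and activation Lipschitz constants (this is my loose paraphrase; see the paper for the precise statement), the bound reads

    \[
      \Pr[\text{test error}]
      \;\lesssim\;
      \widehat{R}_\gamma
      + \frac{\|X\|_F}{\gamma n}
        \Bigl( \prod_{i=1}^{L} \|A_i\|_\sigma \Bigr)
        \Bigl( \sum_{i=1}^{L} \frac{\|A_i\|_{2,1}^{2/3}}{\|A_i\|_\sigma^{2/3}} \Bigr)^{3/2},
    \]

where A_1, ..., A_L are the weight matrices of the L layers, \|\cdot\|_\sigma is the spectral norm (so the product term is the Lipschitz bound above), \widehat{R}_\gamma is the fraction of training points with margin below \gamma, and \|X\|_F is the Frobenius norm of the data.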

Optimization and implicit regularization of deep networks. In grad school I studied AdaBoost, and found that taking the step size to 0 leads to margin maximization (arXiv). Ziwei Ji and I have been studying margin maximization for deep networks, first proving it for logistic regression (arXiv), and then for deep linear networks (arXiv).
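
The logistic regression phenomenon is easy to see numerically. Here is a toy Python sketch of mine (an illustration, not the paper's argument): on linearly separable data, gradient descent on the unregularized logistic loss sends ||w|| to infinity while the direction w/||w|| drifts, at a logarithmic rate, toward the maximum-margin separator.

    # Toy sketch: two separable points (labels folded in), whose
    # max-margin direction is (1, 0).
    import math

    data = [(2.0, 1.0), (2.0, -1.0)]

    def grad(w):
        # Gradient of the mean logistic loss log(1 + exp(-<w, z>)).
        g = [0.0, 0.0]
        for z in data:
            s = w[0] * z[0] + w[1] * z[1]
            c = -1.0 / (1.0 + math.exp(s))
            g[0] += c * z[0] / len(data)
            g[1] += c * z[1] / len(data)
        return g

    w, lr = [0.0, 0.5], 1.0  # deliberately misaligned start
    for t in range(1, 100001):
        g = grad(w)
        w = [w[0] - lr * g[0], w[1] - lr * g[1]]
        if t in (10, 100, 1000, 10000, 100000):
            norm = math.hypot(w[0], w[1])
            print(f"t={t:6d}  w/||w|| = ({w[0]/norm:+.4f}, {w[1]/norm:+.4f})")
    # The printed direction approaches (1, 0): the margin is maximized.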

Other interests. In the past I focused on boosting and clustering; in the future I hope to study reinforcement learning.

Teaching.

Machine learning theory (CS 598 TEL): fall 2019, fall 2018, fall 2017, fall 2016.

Machine learning (CS 446): spring 2020, spring 2019, spring 2018.

Miscellaneous.

I have two glorious PhD students: Bolton Bailey and Ziwei Ji.

My research is funded by an NSF CAREER award, and an NVIDIA GPU grant.

During summer 2019 I am co-organizing a Simons Institute summer program on deep learning; I was also at the Simons Institute during spring 2017.

I co-founded the Midwest ML Symposium (MMLS) and co-chaired the 2017 and 2018 editions, all together with the glorious Po-Ling Loh.

I have a degree in violin performance from Juilliard, but hardly play anymore.

I coded a screensaver, a 3-d plotting tool, and a few other things if you know where to look.

My desk is always messy.

I like sci-fi books, pencils, ramen, and Aphex Twin.