Knowledge Distillation as Semiparametric Inference

  1,887 views

Microsoft Research


More accurate machine learning models often demand more computation and memory at test time, making them difficult to deploy on CPU- or memory-constrained devices. Knowledge distillation alleviates this burden by training a less expensive student model to mimic the expensive teacher model while maintaining most of the original accuracy. To explain and enhance this phenomenon, we cast knowledge distillation as a semiparametric inference problem with the optimal student model as the target, the unknown Bayes class probabilities as nuisance, and the teacher probabilities as a plug-in nuisance estimate. By adapting modern semiparametric tools, we derive new guarantees for the prediction error of standard distillation and develop two enhancements, cross-fitting and loss correction, to mitigate the impact of teacher overfitting and underfitting on student performance. We validate our findings empirically on both tabular and image data and observe consistent improvements from our knowledge distillation enhancements.
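To make the setup concrete, here is a minimal sketch of the standard (Hinton-style) distillation objective that the talk builds on: the student is fit to the teacher's soft class probabilities, which serve as a plug-in estimate of the unknown Bayes class probabilities. This is not the paper's semiparametric estimator, and the temperature `T` and mixing weight `alpha` are illustrative hyperparameters, not values from the talk.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Tempered softmax; higher T produces softer probabilities."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Convex combination of hard-label cross-entropy and cross-entropy
    against the teacher's tempered probabilities (the plug-in nuisance)."""
    n = len(labels)
    # Hard-label term: ordinary cross-entropy at temperature 1.
    p_student = softmax(student_logits)
    hard = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    # Soft term: match the teacher's tempered class probabilities.
    p_teacher = softmax(teacher_logits, T)
    log_q = np.log(softmax(student_logits, T) + 1e-12)
    soft = -(p_teacher * log_q).sum(axis=-1).mean() * T**2  # T^2 restores gradient scale
    return alpha * hard + (1 - alpha) * soft
```

The paper's cross-fitting enhancement would, in this picture, replace `teacher_logits` on each data fold with predictions from a teacher trained on the other folds, so teacher overfitting does not leak into the student's targets.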
Lester is a statistical machine learning researcher at Microsoft Research New England and an adjunct professor at Stanford University. He received his Ph.D. in Computer Science (2012), his M.A. in Statistics (2011) from UC Berkeley, and his B.S.E. in Computer Science (2007) from Princeton University. Before joining Microsoft, Lester spent three wonderful years as an assistant professor of Statistics and, by courtesy, Computer Science at Stanford and one as a Simons Math+X postdoctoral fellow, working with Emmanuel Candes. Lester’s Ph.D. advisor was Mike Jordan, and his undergraduate research advisors were Maria Klawe and David Walker. He got his first taste of research at the Research Science Institute and learned to think deeply of simple things at the Ross Program. Lester’s current research interests include statistical machine learning, scalable algorithms, high-dimensional statistics, approximate inference, and probability. Lately, he’s been developing and analyzing scalable learning algorithms for healthcare, climate forecasting, approximate posterior inference, high-energy physics, recommender systems, and the social good.
Learn more about the 2020-2021 Directions in ML: AutoML and Automating Algorithms virtual speaker series: www.microsoft.com/en-us/resea...

Comments: 4
@ericlaufer5804 (3 years ago)
a random forest of 500 trees IS my favorite classifier
@sayantandasgupta9605 (2 years ago)
Where are the slides available?