10 minutes paper (episode 2); Offline knowledge distillation.

1,339 views

AIology

2 years ago

In this video, I give a brief overview of the theory of knowledge distillation, then use a Keras example to show how simple it is to implement.
In summary, knowledge distillation means transferring knowledge from a big model (the teacher) to a small model (the student). The lottery ticket hypothesis may help explain why knowledge distillation works.
original code: keras.io/examples/vision/knowledge_distillation/
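For reference, below is a condensed sketch of the Distiller pattern used in the linked Keras example, written against the TF2-era Keras API (Keras 3 changes the train_step/metrics plumbing). The temperature and alpha defaults are illustrative, not tuned settings.

import tensorflow as tf
from tensorflow import keras

class Distiller(keras.Model):
    """Trains a small student to match a frozen teacher's softened outputs."""

    def __init__(self, student, teacher, temperature=3.0, alpha=0.1):
        super().__init__()
        self.student = student
        self.teacher = teacher
        self.temperature = temperature  # softens logits; higher values expose more "dark knowledge"
        self.alpha = alpha              # weight on the hard-label loss vs. the distillation loss

    def compile(self, optimizer, metrics, student_loss_fn, distillation_loss_fn):
        super().compile(optimizer=optimizer, metrics=metrics)
        self.student_loss_fn = student_loss_fn
        self.distillation_loss_fn = distillation_loss_fn

    def train_step(self, data):
        x, y = data
        # The teacher is only used for inference; its weights stay fixed.
        teacher_logits = self.teacher(x, training=False)

        with tf.GradientTape() as tape:
            student_logits = self.student(x, training=True)
            # Hard loss: student predictions against the ground-truth labels.
            student_loss = self.student_loss_fn(y, student_logits)
            # Soft loss: student matches the teacher's temperature-softened distribution.
            distillation_loss = self.distillation_loss_fn(
                tf.nn.softmax(teacher_logits / self.temperature, axis=1),
                tf.nn.softmax(student_logits / self.temperature, axis=1),
            )
            loss = self.alpha * student_loss + (1 - self.alpha) * distillation_loss

        # Only the student's weights are updated.
        grads = tape.gradient(loss, self.student.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.student.trainable_variables))
        self.compiled_metrics.update_state(y, student_logits)
        return {m.name: m.result() for m in self.metrics}

A typical setup compiles the distiller with keras.losses.SparseCategoricalCrossentropy(from_logits=True) as the student loss and keras.losses.KLDivergence() as the distillation loss, then calls distiller.fit(x_train, y_train) as with any Keras model.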

Comments: 4
@jeromeeusebius, 3 months ago
Thank you for sharing this short and concise video. It is helpful for understanding knowledge distillation, and for showing and walking through the code examples.
@sakurasadkj5839, 7 months ago
Thank you for sharing!
@ehxanhaq2883, 1 year ago
Thank you. Loved the coding part and your explanation. Will try to replicate this in PyTorch.
@auresdz701, 5 months ago
What if we don't have access to the teacher's data?