Knowledge Distillation Explained with Keras Example

3,863 views

Rithesh Sreenivasan

3 years ago

In this video I explain the concept of knowledge distillation. In machine learning, knowledge distillation is the process of transferring knowledge from a large model to a smaller one. While large models (such as very deep neural networks or ensembles of many models) have a higher knowledge capacity than small models, that capacity may not be fully utilized, and evaluating a model can be just as computationally expensive even when it uses only a little of its capacity. Knowledge distillation transfers the knowledge from a large model to a smaller one without loss of validity. Because smaller models are less expensive to evaluate, they can be deployed on less powerful hardware, such as a mobile device.
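To make the idea concrete, here is a minimal sketch of a distillation training step in Keras, loosely in the spirit of the keras.io example linked below (the actual code walked through in the video is in the Colab notebook). The model architectures, the temperature, and the alpha weighting here are illustrative assumptions, not the exact values from the video:

import tensorflow as tf
from tensorflow import keras

# Illustrative teacher/student pair: two classifiers of different widths.
# Any models with matching inputs and logit outputs would work the same way.
def build_classifier(width):
    return keras.Sequential([
        keras.layers.Input(shape=(28, 28, 1)),
        keras.layers.Conv2D(width, 3, activation="relu"),
        keras.layers.MaxPooling2D(),
        keras.layers.Flatten(),
        keras.layers.Dense(10),  # raw logits; softmax is applied in the loss
    ])

teacher = build_classifier(64)  # large model, assumed already trained
student = build_classifier(8)   # small model we distill into

temperature = 3.0  # softens the teacher's output distribution
alpha = 0.1        # weight on the hard-label loss vs. the soft loss

hard_loss_fn = keras.losses.SparseCategoricalCrossentropy(from_logits=True)
kl_loss_fn = keras.losses.KLDivergence()
optimizer = keras.optimizers.Adam()

@tf.function
def distill_step(x, y):
    teacher_logits = teacher(x, training=False)  # teacher stays frozen
    with tf.GradientTape() as tape:
        student_logits = student(x, training=True)
        # Hard loss: student predictions vs. the ground-truth labels.
        hard_loss = hard_loss_fn(y, student_logits)
        # Soft loss: KL divergence between the temperature-softened
        # distributions; the T^2 factor rescales the gradients
        # (Hinton et al., 2015, linked below).
        soft_loss = kl_loss_fn(
            tf.nn.softmax(teacher_logits / temperature),
            tf.nn.softmax(student_logits / temperature),
        ) * temperature ** 2
        loss = alpha * hard_loss + (1 - alpha) * soft_loss
    grads = tape.gradient(loss, student.trainable_variables)
    optimizer.apply_gradients(zip(grads, student.trainable_variables))
    return loss

Once the student has been trained this way, only the small model needs to be deployed. The keras.io example packages the same logic more neatly as a Distiller subclass of keras.Model with a custom train_step.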
If you like such content, please subscribe to the channel here:
kzfaq.info...
If you would like to support me financially, it is totally optional and voluntary.
Buy me a coffee here: www.buymeacoffee.com/rithesh
Relevant links:
en.wikipedia.org/wiki/Knowled...
keras.io/examples/vision/know...
cs231n.stanford.edu/reports/20...
arxiv.org/pdf/1503.02531.pdf
blog.floydhub.com/knowledge-d...
stats.stackexchange.com/quest...
colab.research.google.com/dri...

Comments: 9
@VLM234 · 3 years ago
Great content. I think Hugging Face uses the same approach to compress their models. Thank you so much for sharing such valuable content...
@RitheshSreenivasan · 3 years ago
Thanks!! Yes, I want to cover the DistilBERT model in another video.
@VLM234 · 3 years ago
@@RitheshSreenivasan That would be great... If possible, try to cover custom training of BERT or any other transformer model as well. 🙏
@shaikbalajibabu4408 · 3 years ago
Great explanation, sir!
@RitheshSreenivasan · 3 years ago
Thank you!!
@mohdoimomjoim7975 · 2 years ago
Your explanation is so good!
@RitheshSreenivasan · 2 years ago
Thank you!!
@anikde9800 · 1 year ago
You explained a figure on knowledge distillation, but I couldn't find it in any of the resources. Can you help me with that?
@RitheshSreenivasan · 1 year ago
Look in the description of the video. All the links are present there.