This is the foundational paper that started the research area of Knowledge Distillation.
Knowledge Distillation studies methods and techniques for transferring the knowledge of a cumbersome model (also called the Teacher model) into a simpler model (also called the Student model). Student models are the ones deployed for inference (especially on resource-constrained devices) and are expected to deliver both high accuracy and fast prediction.
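To make the idea concrete, here is a minimal sketch of the distillation loss described in the paper, assuming PyTorch; the names (student_logits, teacher_logits, T, alpha) are illustrative choices, not from the video:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: the teacher's softmax computed at a high temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    # The student is trained to match the softened distribution (KL divergence),
    # scaled by T^2 so the soft-target gradients stay comparable in magnitude.
    soft_loss = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                         soft_targets, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the true (hard) labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    # Weighted combination of the two objectives.
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

Raising the temperature softens the teacher's softmax output, exposing the "dark knowledge" in the relative probabilities of the wrong classes, which the student then learns to imitate.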
Link to the paper:
arxiv.org/abs/1503.02531
Link to the summary of the paper:
towardsdatascience.com/paper-summary-distilling-the-knowledge-in-a-neural-network-dc8efd9813cc
#KnowledgeDistillation
#deeplearning
#softmax
#machinelearning