Stochastic Depth for Neural Networks Explained

1,026 views

Machine Learning Explained

1 day ago

Stochastic depth is a powerful regularization method for residual neural networks that speeds up training and improves test performance.
In this tutorial, we'll go through the methodology and a PyTorch implementation.
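Before the timestamps, here is a minimal PyTorch sketch of the core idea (illustrative only, not the code from the linked repo): a residual block whose convolutional branch is kept with probability survival_prob during training, and scaled by that same probability at test time. The branch layout and names are my assumptions; the paper uses standard ResNet blocks.

```python
import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    """Residual block that is randomly bypassed during training (sketch)."""

    def __init__(self, channels, survival_prob=0.8):
        super().__init__()
        self.survival_prob = survival_prob
        # Illustrative two-conv residual branch.
        self.branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        if self.training:
            # One Bernoulli draw per mini-batch: either run the branch
            # or skip it entirely via the identity shortcut.
            if torch.rand(1).item() < self.survival_prob:
                return self.relu(x + self.branch(x))
            return x
        # Test time: the full network runs, with each branch scaled by its
        # survival probability so expected activations match training.
        return self.relu(x + self.survival_prob * self.branch(x))
```

When a block is skipped, its forward and backward passes are skipped too, which is where the training-time savings come from.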
Table of Contents
- Introduction: 0:00
- Background and Context: 0:30
- Questions: 2:46
- Architecture Changes: 3:00
- Why use the ResNet architecture for stochastic depth?: 5:07
- Difference between Dropout and Drop Path: 5:23
- Speedup and performance: 6:06
- Stochastic Depth example: 6:33
- How does stochastic depth achieve both a speedup and better performance?: 7:27
- Data sets: 7:44
- Main Results: 8:09
- Analytical Experiment Result: 10:10
- Code Walkthrough: 11:22
- Conclusion: 26:10
📌 Github: github.com/yacineMahdid/deep-...
📌 Original Code: github.com/shamangary/Pytorch...
📌 Paper: arxiv.org/pdf/1603.09382
Abstract:
"Very deep convolutional networks with hundreds of layers have led to significant reductions in error on competitive benchmarks. Although the unmatched expressiveness of the many layers can be highly desirable at test time, training very deep networks comes with its own set of challenges.
The gradients can vanish, the forward flow often diminishes, and the training time can be painfully slow. To address these problems, we propose stochastic depth, a training procedure that enables the seemingly contradictory setup to train short networks and use deep networks at test time.
We start with very deep networks but during training, for each mini-batch, randomly drop a subset of layers and bypass them with the identity function. This simple approach complements the recent success of residual networks.
It reduces training time substantially and improves the test error significantly on almost all data sets that we used for evaluation. With stochastic depth we can increase the depth of residual networks even beyond 1200 layers and still yield meaningful improvements in test error (4.91% on CIFAR-10)."
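One detail worth pulling out of the paper: each block's survival probability follows a linear decay, from close to 1 at the first block down to p_L (0.5 in their experiments) at the last. A quick sketch of that schedule (the function name is mine):

```python
def linear_decay_survival_probs(num_blocks, p_last=0.5):
    # Paper's linear decay rule: p_l = 1 - (l / L) * (1 - p_L), for l = 1..L.
    return [1.0 - (l / num_blocks) * (1.0 - p_last)
            for l in range(1, num_blocks + 1)]

# For 5 blocks, probabilities decay linearly toward p_last:
print(linear_decay_survival_probs(5))  # ≈ [0.9, 0.8, 0.7, 0.6, 0.5]
```

With p_L = 0.5, roughly a quarter of the blocks are skipped in expectation on each mini-batch, which accounts for the training speedup the paper reports.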
---
Join the Discord for general discussion: / discord
---
Follow Me Online Here:
GitHub: github.com/yacineMahdid
LinkedIn: / yacinemahdid
---
Have a great week! 👋

Comments: 1
@machinelearningexplained 28 days ago
Hey, FYI I had to reshoot some of the sections in this video because I couldn't stop saying Drop Path (from FractalNet) instead of Stochastic Depth. There is still 1 wrong mention of drop path in there that I wasn't able to fix, haha. That's what you get from reading two papers simultaneously! 😅