
138 - The need for scaling, dropout, and batch normalization in deep learning

18,024 views

DigitalSreeni

Comments: 35
@hudaankara5616 · 2 years ago
You are the best at explaining deep learning concepts. Thank you very much, dear teacher.
@yashodhanvivek8086 · 3 years ago
Following your videos will make anyone excellent at ML and DL architecture. You are pinpointing issues that are not addressed by others. Excellent work, thanks, best wishes!
@lokeshbaviskar3206 · 2 years ago
Awesome explanation! Keep sharing your valuable knowledge.
@leamon9024 · 2 years ago
Awesome! Thanks for sharing your knowledge. It's very informative.
@bishnukarki1577 · 1 year ago
Thank you so much for your effort. It helped me a lot.
@boatengalexander5178 · 1 year ago
Wonderful tutorial. Many thanks, Sir
@percydiegolicaresascona4991 · 2 years ago
Thanks a lot for your videos. :)
@JaydeepSinghTindori · 6 months ago
Nice lecture, but why is dropout used before the max pooling layer?
@BeketMyrzanov · 1 year ago
You can totally replace scaling with Batch Normalization. If you add a Batch Normalization layer as the first layer in your network, your inputs will be normalized, a.k.a. scaled, right before being fed into the neural network.
@DigitalSreeni · 1 year ago
Even if Batch Normalization is added as the first layer in the neural network, it is still recommended to normalize the input data during preprocessing. The reason for this is that Batch Normalization is designed to normalize the activations of each layer in the network, but it does not normalize the input data.
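For reference, a minimal Keras sketch contrasting the two approaches discussed above; the shapes, layer choices, and random data are illustrative, not from the video:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Approach 1 (the reply's recommendation): scale inputs during
# preprocessing, e.g. bring 8-bit pixel values into the 0-1 range.
x_train = np.random.randint(0, 256, (100, 28, 28, 1)).astype("float32")
x_train_scaled = x_train / 255.0

# Approach 2 (the commenter's suggestion): put a BatchNormalization
# layer first, so raw inputs are normalized per batch inside the model.
model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.BatchNormalization(),           # normalizes each incoming batch
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```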
@salarghaffarian4914 · 3 years ago
Thank you so much!
@DigitalSreeni · 3 years ago
You're welcome!
@rabieelhakouni1167 · 1 year ago
I have a question: I have a large dataset containing 2381 features (the EMBER dataset). When I convert it to a 48*48 grayscale image, I must remove 77 features, but I want to drop the least useful ones. How can I remove these 77 features, please?
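One possible way to drop the 77 least useful features before reshaping (2381 - 77 = 2304 = 48*48) is a scikit-learn univariate selector; this is a hedged sketch with placeholder data, not an endorsed recipe:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

# X: (n_samples, 2381) feature matrix, y: class labels (placeholders here)
X = np.random.rand(500, 2381)
y = np.random.randint(0, 2, 500)

# Keep the 2304 most informative features (48*48 = 2304),
# i.e. drop the 77 with the weakest ANOVA F-scores.
selector = SelectKBest(score_func=f_classif, k=48 * 48)
X_selected = selector.fit_transform(X, y)

# Reshape each sample into a 48x48 grayscale "image".
images = X_selected.reshape(-1, 48, 48)
```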
@thejll · 7 months ago
How do you scale inputs when it comes to using a trained model?
@mr.shouvikdey8482 · 2 years ago
Normalizing is OK, but when using the trained model we need to normalize the input with the same scaling parameters. How do we do that?
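A common pattern for this (one option among several, not necessarily what the video uses) is to fit the scaler on the training data only, persist it, and reuse the same fitted statistics at inference time; file name and data are placeholders:

```python
import joblib
import numpy as np
from sklearn.preprocessing import StandardScaler

X_train = np.random.rand(100, 10)  # placeholder training data

# Fit the scaler on training data only, then save its mean/std.
scaler = StandardScaler().fit(X_train)
joblib.dump(scaler, "scaler.joblib")

# Later, at inference time: load the same scaler and transform new
# inputs with the training-set statistics (transform, never refit).
scaler = joblib.load("scaler.joblib")
X_new = np.random.rand(5, 10)
X_new_scaled = scaler.transform(X_new)
```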
@danielniels22 · 2 years ago
Thank you, sir. I'm new to your channel 💙
@abderrahmaneherbadji5478 · 4 years ago
First, thank you so much. Second, I would like to ask: how can I visualize the output of a Conv layer of the trained model?
@surajshah4317 · 4 years ago
Uhh... can you tell me what the output of your training network is?
@abderrahmaneherbadji5478 · 4 years ago
@@surajshah4317 For example, the final output of the CNN is a class label, but I need to visualize the output of a Conv layer.
@DigitalSreeni · 4 years ago
You just extract the single convolutional layer and make a new model with just that layer. Then supply an input image and visualize the responses. This is the easiest way I can think of. I will record a video on this topic soon... maybe in a week.
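A minimal sketch of the approach described above, using the Keras functional API; the toy model, layer name, and input shape are illustrative stand-ins for your own trained network:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
import matplotlib.pyplot as plt

# A toy model stand-in; in practice, load your trained model instead.
model = keras.Sequential([
    layers.Input(shape=(64, 64, 1)),
    layers.Conv2D(8, 3, activation="relu", name="conv1"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(2, activation="softmax"),
])

# New model that stops at the conv layer of interest.
feature_extractor = keras.Model(
    inputs=model.inputs,
    outputs=model.get_layer("conv1").output,
)

# Feed one image and plot each of the 8 feature maps.
image = np.random.rand(1, 64, 64, 1)
feature_maps = feature_extractor.predict(image)  # shape (1, 62, 62, 8)
for i in range(feature_maps.shape[-1]):
    plt.subplot(2, 4, i + 1)
    plt.imshow(feature_maps[0, :, :, i], cmap="gray")
    plt.axis("off")
plt.show()
```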
@abderrahmaneherbadji5478 · 4 years ago
@@DigitalSreeni Looking forward to your video
@indointanchannel · 1 year ago
Sir, why do I get an error when running BatchNormalization? I looked for a solution but did not find one, so can you help me? ImportError: cannot import name 'BatchNormalization' from 'keras.layers.normalization' (/usr/local/lib/python3.8/dist-packages/keras/layers/normalization/__init__.py). Is there another module for BatchNormalization? Or is it a function or a method? I installed everything: Keras and TensorFlow. I am confused. Thanks in advance. Greetings.
@DigitalSreeni · 1 year ago
If you are trying to perform batch normalization, try keras.layers.BatchNormalization
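Concretely, the internal `keras.layers.normalization` path from the error message is not a public import location; the public path suggested in the reply looks like this:

```python
# Works across recent TensorFlow/Keras versions:
from tensorflow.keras.layers import BatchNormalization
# or, with standalone Keras:
# from keras.layers import BatchNormalization

# BatchNormalization is a layer class, instantiated like any other layer.
bn = BatchNormalization()
```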
@indointanchannel · 1 year ago
@@DigitalSreeni Thank you sir. I'll try it.
@heshamabdelghany536 · 3 years ago
Thanks for the great video. Do you also have results from applying dropout after batch normalization, and does the order of applying dropout and batch normalization matter? I am especially interested in the MLP case. Thanks!
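For reference, a minimal MLP sketch showing how the two orderings the question asks about are wired up; whether one outperforms the other is an empirical question this sketch does not answer, and the sizes and rates are arbitrary:

```python
from tensorflow import keras
from tensorflow.keras import layers

def make_mlp(bn_before_dropout: bool) -> keras.Model:
    """Build an MLP with BatchNorm/Dropout in the requested order."""
    block = [layers.BatchNormalization(), layers.Dropout(0.3)]
    if not bn_before_dropout:
        block.reverse()  # Dropout first, then BatchNormalization
    return keras.Sequential([
        layers.Input(shape=(100,)),
        layers.Dense(64, activation="relu"),
        *block,
        layers.Dense(10, activation="softmax"),
    ])

model_a = make_mlp(bn_before_dropout=True)   # Dense -> BN -> Dropout
model_b = make_mlp(bn_before_dropout=False)  # Dense -> Dropout -> BN
```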
@teeg-wendezougmore6663 · 2 years ago
Thanks for sharing! Can we use dropout in time series forecasting with deep learning methods?
@DigitalSreeni · 2 years ago
Yes, you can
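As a hedged sketch of one way to do this, a recurrent forecasting model can use both plain dropout and the standard `dropout`/`recurrent_dropout` arguments of the Keras LSTM layer; the window length, units, and rates below are illustrative:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Forecast one step ahead from windows of 30 time steps, 1 feature.
model = keras.Sequential([
    layers.Input(shape=(30, 1)),
    layers.LSTM(32, dropout=0.2, recurrent_dropout=0.2),
    layers.Dropout(0.2),  # plain dropout on the LSTM's output
    layers.Dense(1),      # next-step prediction
])
model.compile(optimizer="adam", loss="mse")
```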
@tilkesh · 2 years ago
Thanks
@lendixful7932 · 4 years ago
When you finish explaining these topics, could you make some videos about VAEs? 😬
@DigitalSreeni · 4 years ago
I'll try. Thanks for the suggestion.
@yepnah3514 · 3 years ago
Hey, I have the same lamp haha
@DigitalSreeni · 3 years ago
You seem to be a wise man with good taste in lamps 😌
@yepnah3514 · 3 years ago
@@DigitalSreeni I have a question. I trained a model (a typical dog/cat classifier). I am using the saved model file to see how it performs when I modify the weights/biases and test at various standard deviation values (from .005 to .01) during inference. For example, I use a for loop to run the program at .005 sd 50 times, then save the number of correctly classified images from each run. I do this for each sd value and then plot the results. The problem is that I think I should get more correct images near .005 sd and fewer as the sd increases, but that's not what happens; I get different results each time I run the program. Is this expected/normal? Sorry, I hope this makes sense. This is part of my senior capstone project and I have no idea what is causing this behavior.
@fadilyassin4661 · 3 years ago
Hi, could you kindly post another link for yann.lecan.com/exdb/punlis/pdf/lecun-98b.pdf? The current link redirects to a Chinese-language site with nothing to see or download. Thank you.
@matancadeporco · 2 years ago
Your hyperlink is completely wrong.