
165 - An introduction to RNN and LSTM

76,030 views

DigitalSreeni


A day ago

Comments: 85
@elisavasta2684 2 years ago
One of the best explanations ever on LSTM! Greetings from Politecnico di Milano!
@sonhdang 3 years ago
I've watched dozens of videos on LSTM and this is the best one so far. Thank you so much, sir. Greetings from UCLA!
@DigitalSreeni 3 years ago
Glad it was helpful!
@pfever 2 years ago
Best LSTM explanation I have watched! All your videos are superb! I want to watch them all from beginning to end! Thank you for such detailed and intuitive explanations! :D
@christophbrand9015 3 years ago
The first YouTube tutorial I saw which explains an LSTM in detail, e.g. why a sigmoid or why a tanh is used within the cell. Great!
@Balakrish-cl9kq 2 years ago
I feel very lucky that KZfaq suggested the right video to me....
@DigitalSreeni 2 years ago
I am glad you found my channel :)
@edwardbowora A year ago
Best teacher ever.
@DigitalSreeni A year ago
Thanks
@learn2know79 2 years ago
I was struggling to understand the basic concept of LSTM and watched dozens of videos, and I finally found the best one so far. Thank you so much for helping us understand. Greetings from GIST!
@DigitalSreeni 2 years ago
Great to hear!
@saaddahmani1870 A year ago
Good, thanks a lot.
@DigitalSreeni A year ago
You are welcome
@AshishBamania95 2 years ago
Can't believe that this is free. Thanks a lot. You are building a community of future researchers and innovators here!
@DigitalSreeni 2 years ago
My pleasure!
@user-fd9tv9fq8z 4 months ago
I gained valuable understanding. I really appreciate the way you explain.
@mehdisdikiene8752 3 years ago
I've watched many videos and read a lot about LSTM, but this is the first time I really understand how LSTM works. Thumbs up, thank you!
@DigitalSreeni 3 years ago
Great to hear!
@dizhang947 A year ago
Amazing work, thank you so much!
@lh2738 2 years ago
Nice video, so well explained and not too long, along with a full tutorial. Probably one of the best ones about LSTM. Thanks and please keep up the good work! Greetings from France!
@yangfarhana3660 3 years ago
I've viewed several videos on LSTM but this breakdown is the best!!
@AhmedFazals 4 months ago
Awesome! Thanks, sir.
@jahanzaibasgher1275 2 years ago
Thank you so much :) Subscribed after watching your first video.
@gadisaadamuofficial2946 A year ago
Really, thank you for the extra clarification!
@davidomarparedesparedes8718 3 months ago
Great explanation! Thank you so much!! : )
@omniscienceisdead8837 2 years ago
Best explanation out there. I understood what is happening both conceptually and mathematically.
@claybowlproductions A year ago
Sir, you are a gem!
@aristideirakoze8098 2 years ago
We are infinitely grateful
@DigitalSreeni 2 years ago
Thank you :)
@Toss3geek 9 months ago
Thank you, teacher
@alteshaus3149 3 years ago
Thank you very much for this video, sir!
@zhiyili6707 2 years ago
Thank you very much! It is well explained!
@s.e.7268 3 years ago
I am so happy to discover this channel! :)
@indranisen5877 A year ago
Thank you, Sir. Nice explanations.
@kukuhiksanmusyahada7615 2 years ago
Great presentation, sir! Thank you so much!
@RAHUDAS 2 years ago
At 19:31, he mentioned how many units of LSTM. The units parameter is not the number of LSTM cells in a layer; it is the hidden state dimension. How many times the LSTM is unrolled depends on the input shape (the number of timesteps, input_shape[0]).
@Droiduxx A year ago
So if I understand correctly, if we consider the input to be a sequence of x elements, each "LSTM" unit holds x states and returns a list of x vectors that are passed to the LSTM units of the next hidden layer. Am I right?
@RAHUDAS A year ago
@@Droiduxx Yes, but also consider the return_sequences and return_state arguments; their default values are False. To see the full picture, turn on return_sequences. Example:
x = tf.reshape(tf.range(30, dtype=tf.float32), (5, 3, 2))  # shape: (batch, time, num_features)
lstm = tf.keras.layers.LSTM(7, return_sequences=True)
output = lstm(x)
print(output.shape)  # (5, 3, 7)
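A slightly fuller sketch of the same point (assuming TensorFlow 2.x; the toy shapes are made up) contrasting the default return_sequences=False with return_sequences=True and return_state=True:

import tensorflow as tf

# Dummy batch: 5 sequences, 3 timesteps, 2 features per timestep.
x = tf.random.normal((5, 3, 2))

# Default: only the hidden state of the last timestep is returned.
last_only = tf.keras.layers.LSTM(7)(x)
print(last_only.shape)  # (5, 7)

# return_sequences=True: one 7-dimensional hidden state per timestep.
per_step = tf.keras.layers.LSTM(7, return_sequences=True)(x)
print(per_step.shape)  # (5, 3, 7)

# return_state=True: the output plus the final hidden and cell states.
out, h, c = tf.keras.layers.LSTM(7, return_state=True)(x)
print(out.shape, h.shape, c.shape)  # (5, 7) (5, 7) (5, 7)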
@gakhappy 3 years ago
Great work, sir. Keep on doing a great job.
@-mle566 2 years ago
Thank you, nice video for new LSTM learners :)
@rolandosantos7755 3 years ago
I love your videos... I am just starting to learn machine learning and it's very useful.
@aminasgharisooreh9243 3 years ago
Thank you, it is really helpful
@DigitalSreeni 3 years ago
You’re welcome.
@alex-beamslightchanal8743 2 years ago
Nice tutorial! Thank you!
@nisa_ssa 3 years ago
Thank you so much for this video...
@jolittevillaruz5234 3 years ago
Very intuitive video!
@karamjeetsinghmakkar3323 A year ago
Dear Dr. S. Sreeni, thank you for your informative videos on CNNs. Kindly make a video on LSTM for image classification tasks. Thank you.
@DigitalSreeni A year ago
LSTM is primarily used for processing sequential data. While it is possible to use LSTM for image classification tasks, it is generally not the best choice, as it is designed to model sequential dependencies in data, whereas images are inherently spatial and do not have an obvious sequential structure. Images are typically processed using CNNs, which are specifically designed to handle spatial data and can effectively extract features from images using convolutions.
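To make that recommendation concrete, here is a minimal sketch of a CNN-style image classifier (assuming TensorFlow 2.x; the 64x64 RGB input and 10 classes are hypothetical, not from the video):

import tensorflow as tf

# Convolutions extract spatial features; pooling reduces resolution.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()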
@VCodes 3 years ago
Great. Thanks a lot.
@rahuliron1635 3 years ago
Awesome explanation, thank you very much.
@DigitalSreeni 3 years ago
Glad it was helpful!
@sadafmehdi2991 3 years ago
Nice explanation, Sir!
@cryptodude5359 2 years ago
Amazing tutorial! I have a question: at 14:59 you explain the forget gate. In the lower-left corner, the cell gets ht-1 (the last timestep) as input. Is it possible to have a sequence of past days as input? For example ht-1 & ht-2 & ht-3 ... etc., to spot potential trends in the data. Maybe with multiple variables, giving every single timestep an additional weight.
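For context on this question: in the standard LSTM formulation (which the diagrams in the video appear to follow), the gates receive only the immediately previous hidden state h_{t-1} together with the current input x_t; earlier steps such as h_{t-2} and h_{t-3} influence them only indirectly through the recurrence on the hidden and cell states:

f_t = \sigma\left(W_f\,[h_{t-1}, x_t] + b_f\right), \qquad C_t = f_t \odot C_{t-1} + i_t \odot \tilde{C}_t

So longer-term trends are carried forward in the cell state C_t rather than by wiring several past hidden states directly into the forget gate.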
@dantec.dagandanan3732 2 years ago
Thanks!
@dantec.dagandanan3732 2 years ago
I know this little amount of money is not enough to say thank you. Keep up the good work, sir 🥰
@DigitalSreeni 2 years ago
Thank you very much. No amount of money is little. Every penny counts :) The bulk of the money goes to charities that help with cancer research and eye surgeries for poor people. So society benefits from any amount that is contributed. Thanks again.
@manideepgupta2433 3 years ago
Amazing, Sir.
@aomo5293 A year ago
Thank you, honestly it is very clear. I am looking for a tutorial on image classification using a local image dataset. Have you made one before? Thank you again.
@nicolamenga8943 A year ago
Thank you for the video. I have a question: is the number of units (50) the number of so-called "hidden units", also known as the "hidden size"?
@awesome-ai1714 A year ago
11:40 What is going on with the arrows? The signal from the previous cell merges with the current Xt, but there is no operator. A signal from the left and a signal from the bottom (Xt), and they both go to the 3 gates? Edit: OK I see, it's explained later.
@tchintchie 3 years ago
I can't help but find this channel incredibly undersubscribed!!!
@DigitalSreeni 3 years ago
I’m glad you like the content. I rely on you guys to spread the word :)
@kanui3618 3 years ago
Nice explanation!
@DigitalSreeni 3 years ago
Thanks! 😃
@vzinko A year ago
Why is there a dropout after the final LSTM layer?
@ziyuelu1734 2 years ago
Thanks!
@AveRegina_ 2 years ago
I'm using RNN for my PG thesis work. I have a query: do we have to run a stationarity test on our time series data before feeding it into the neural network model... or is this step only required for traditional time series models like ARIMA?
@DigitalSreeni 2 years ago
RNNs are capable of learning nonlinearities (compared to ARIMA) and therefore should be able to learn from the input data without doing any stationarity pre-processing. This is especially true if you use LSTMs. Also, please note that you need a lot more training data for RNNs compared to ARIMA. You may find this blog useful to understand the effectiveness of RNNs: karpathy.github.io/2015/05/21/rnn-effectiveness/
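As an illustration of that point, here is a minimal sketch of feeding a raw, possibly non-stationary series straight into an LSTM with sliding windows and no differencing (assuming TensorFlow 2.x and NumPy; the data and window size are made up):

import numpy as np
import tensorflow as tf

# A trending (non-stationary) toy series; no differencing applied.
series = np.cumsum(np.random.randn(500)).astype("float32")

window = 30  # use the past 30 steps to predict the next value
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., np.newaxis]  # (samples, timesteps, features=1)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(50),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, verbose=0)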
@sherrlynrasdas8387 A year ago
Can you teach us how to use LSTM and ARIMA in ensemble learning for forecasting time series data?
@hudaankara5616 3 years ago
Hi sir. Thank you so much for all your videos. Could you provide us with a tutorial on implementing LSTM & RNN in Python, please?
@DigitalSreeni 3 years ago
Yes... they should be out this week.
@stevenzhou7358 2 years ago
Thanks for your videos! They are really helpful. I have a small question. Could you explain a little more about the meaning of units? Does it mean the number of hidden layers or the number of neurons in a layer?
@DigitalSreeni 2 years ago
Maybe this helps... stats.stackexchange.com/questions/241985/understanding-lstm-units-vs-cells
@stevenzhou7358 2 years ago
@@DigitalSreeni Thanks a lot! It's very helpful.
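One quick way to check that units is the hidden-state size rather than a number of layers is the parameter count (a sketch assuming TensorFlow 2.x; the shapes here are made up):

import tensorflow as tf

features, units = 2, 7
layer = tf.keras.layers.LSTM(units)
layer.build((None, 3, features))  # batch x 3 timesteps x 2 features

# LSTM parameter count = 4 * (units*features + units*units + units)
print(layer.count_params())                            # 280
print(4 * (units * features + units * units + units))  # 280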
@JJGhostHunters A year ago
Hi DigitalSreeni... I am a PhD candidate investigating applications of MLPs, CNNs and LSTMs. I see that you have amazing graphics for these model types in your videos. Would you be willing to share these graphics of the model architectures with me so that I may use them in my dissertation and defense presentation? I certainly would give you credit for them. Thank you for your time!
@bobaktadjalli6516 A year ago
Hi, well explained! Could I have your slides?
@ramchandracheke 3 years ago
First like the video, then watch it!
@DigitalSreeni 3 years ago
Thanks for your blind confidence in the video, I hope your opinion doesn’t change after watching the video :)
@aminasgharisooreh9243 3 years ago
Please make a video about attention in images.
@DigitalSreeni 3 years ago
I got your attention :)
@XX-vu5jo 3 years ago
Lol, ever heard of transformers???
@DigitalSreeni 3 years ago
Not sure what you meant by your comment; was that a question?
@pattiknuth4822 3 years ago
His continuing use of "ok?" "ok?" "ok?" "ok?" is incredibly annoying.
@adhoc3018 3 years ago
And you are not annoying at all.
@DigitalSreeni 3 years ago
Poor choice to comment on a personal trait rather than the content of the tutorial, ok?
166 - An introduction to time series forecasting - Part 5 Using LSTM
23:42
Recurrent Neural Networks (RNNs), Clearly Explained!!!
16:37
StatQuest with Josh Starmer
533K views
Transformer Neural Networks Derived from Scratch
18:08
Algorithmic Simplicity
136K views
LSTM Recurrent Neural Network (RNN) | Explained in Detail
19:32
Coding Lane
57K views
LSTM is dead. Long Live Transformers!
28:48
Seattle Applied Deep Learning
527K views
MIT Introduction to Deep Learning | 6.S191
1:09:58
Alexander Amini
504K views
[핵심 머신러닝] RNN, LSTM, and GRU
1:17:39
‍김성범[ 교수 / 산업경영공학부 ]
17K views
Long Short-Term Memory (LSTM), Clearly Explained
20:45
StatQuest with Josh Starmer
538K views
180 - LSTM Autoencoder for anomaly detection
26:53
DigitalSreeni
88K views