0:00 Intro | 2:50 Basic Example | 13:45 Medium Example | 16:11 Sentiment Analysis
Code Notebook: github.com/ritvikmath/KZfaq...
RNN Theory: • Recurrent Neural Netwo...
Backpropagation: • Backpropagation : Data...
Comments: 14
@jkim527 1 year ago
welcome back king!
@_alt 1 year ago
Ritvik, big thanks for coming back! I've been preparing for an ML engineer role at Yandex for the last 6 months, and your channel is literally the best resource. A diamond. You're doing a really great job here.
@matthewchunk3689 1 year ago
Great to see you back!
@khudadatbaluch7884 1 year ago
You're the best, please keep going forever, one of the greats. If you truly understand something, then you can explain it, but that's not the case for most people. Thanks!
@randyng3336 1 year ago
Love it!
@ChocolateMilkCultLeader 1 year ago
Love that you're doing RNN tutorials when transformers are the meta these days. You should look into attention, btw.
@mndhamod 2 months ago
Question: I find it interesting that in most examples the output of the RNN, not the hidden state, is the input to the dense layer. You would think the embedding (the hidden state) would be used to represent the sentence. I understand they differ only by a matrix multiplication. Still, I wonder why people more often choose the output rather than the embedding.
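To make the "differ only by a matrix multiplication" point concrete, here is a tiny NumPy sketch of a vanilla RNN (hypothetical dimensions and random weights, not the notebook's code): the final hidden state h is the sequence embedding, and the per-step output y is just a fixed linear map of h.

```python
import numpy as np

# Hypothetical dimensions for illustration.
rng = np.random.default_rng(0)
d_in, d_h, d_out = 4, 8, 3
W_xh = rng.normal(size=(d_h, d_in))   # input -> hidden
W_hh = rng.normal(size=(d_h, d_h))    # hidden -> hidden (recurrence)
W_hy = rng.normal(size=(d_out, d_h))  # hidden -> output

def rnn_forward(xs):
    """Run a vanilla RNN over a sequence; return final hidden state and output."""
    h = np.zeros(d_h)
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent update
    y = W_hy @ h                          # output is a linear map of h
    return h, y

xs = rng.normal(size=(5, d_in))  # a sequence of 5 input vectors
h, y = rnn_forward(xs)
# Either h (the "embedding") or y could be fed to a dense classifier;
# they carry the same information up to the fixed linear map W_hy.
```

Since a downstream dense layer is itself a learned linear map plus nonlinearity, it can absorb W_hy, which is one reason the choice between h and y often makes little practical difference.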
@danielmyers76 1 year ago
You mentioned a while back that if anyone wanted to hear about listwise learning to rank, they should speak up. I do! I do! I have a problem I know how to tackle with pairwise ranking, but I don't know if listwise is doable for my problem because I just can't figure it all out. If I knew a little more, I might know if I could!
@RAHUDAS 1 year ago
Really elegant. I was wondering about LSTM layers: are they only used in sequence-to-sequence problems?
@dawitabdisa7262 1 year ago
Welcome back! How would you apply an SVM model to classify alpha data in order to detect driver drowsiness? Thank you.
@hws9999 1 year ago
In this example, does the vectorization layer just make one-hot vectors?
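For context on this question: text-vectorization layers typically emit integer token indices rather than explicit one-hot vectors; the one-hot form is the equivalent sparse view of those indices. A minimal sketch (hypothetical 5-word vocabulary, not the notebook's code):

```python
import numpy as np

# Hypothetical vocabulary mapping tokens to integer ids (0 = padding/unknown).
vocab = {"<pad>": 0, "the": 1, "movie": 2, "was": 3, "great": 4}

def to_ids(tokens):
    """What a vectorization layer typically produces: integer indices."""
    return [vocab.get(t, 0) for t in tokens]

def to_one_hot(ids, vocab_size):
    """The equivalent one-hot view: one row per token, a single 1 per row."""
    out = np.zeros((len(ids), vocab_size))
    out[np.arange(len(ids)), ids] = 1.0
    return out

ids = to_ids(["the", "movie", "was", "great"])
oh = to_one_hot(ids, len(vocab))
```

In practice the integer ids are usually passed straight to an embedding layer, which looks up a dense vector per id; that lookup is mathematically the same as multiplying the one-hot matrix by the embedding matrix, just far cheaper.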
@tamirfri1 1 year ago
Please do a video on transformers and GPT.
@khudadatbaluch7884 1 year ago
What I believe is that our ML models are not as smart as they should be, because they are not built by people like you. We need to improve our understanding of tensors; you can do it, just see through to the depths... Thanks, which you may not need. Regardless, being understood is what matters!!