Neural Networks Pt. 2: Backpropagation Main Ideas

477,272 views

StatQuest with Josh Starmer

A day ago

Backpropagation is the method we use to optimize parameters in a Neural Network. The ideas behind backpropagation are quite simple, but there are tons of details. This StatQuest focuses on explaining the main ideas in a way that is easy to understand.
NOTE: This StatQuest assumes that you already know the main ideas behind...
Neural Networks: • The Essential Main Ide...
The Chain Rule: • The Chain Rule
Gradient Descent: • Gradient Descent, Step...
LAST NOTE: When I was researching this 'Quest, I found this page by Sebastian Raschka to be helpful: sebastianraschka.com/faq/docs...
For a complete index of all the StatQuest videos, check out:
statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Buying my book, The StatQuest Illustrated Guide to Machine Learning:
PDF - statquest.gumroad.com/l/wvtmc
Paperback - www.amazon.com/dp/B09ZCKR4H6
Kindle eBook - www.amazon.com/dp/B09ZG79HXC
Patreon: / statquest
...or...
KZfaq Membership: / @statquest
...a cool StatQuest t-shirt or sweatshirt:
shop.spreadshirt.com/statques...
...buying one or two of my songs (or go large and get a whole album!)
joshuastarmer.bandcamp.com/
...or just donating to StatQuest!
www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
/ joshuastarmer
0:00 Awesome song and introduction
3:55 Fitting the Neural Network to the data
6:04 The Sum of the Squared Residuals
7:23 Testing different values for a parameter
8:38 Using the Chain Rule to calculate a derivative
13:28 Using Gradient Descent
16:05 Summary
#StatQuest #NeuralNetworks #Backpropagation
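The chapter list above walks from computing the Sum of the Squared Residuals to testing different values for a parameter. A minimal sketch of that "testing different values" step (the data and the `b_final`/`squiggle` names below are made-up stand-ins, not the video's actual numbers):

```python
import numpy as np

# Toy stand-ins for the video's example (numbers are made up):
observed = np.array([0.0, 1.0, 0.0])   # observed values
squiggle = np.array([-0.3, 1.6, 0.4])  # network output before adding b_final

def ssr(b_final):
    """Sum of the squared residuals for a candidate value of b_final."""
    predicted = squiggle + b_final
    return np.sum((observed - predicted) ** 2)

# Brute-force "testing different values for a parameter":
candidates = np.linspace(-1, 1, 201)
best = candidates[np.argmin([ssr(b) for b in candidates])]
print(best)  # close to the value gradient descent would find
```

Plotting `ssr(b)` over the candidates gives the bowl-shaped curve the video shows; gradient descent is just a faster way to slide to its bottom.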

Comments: 511
@statquest · 2 years ago
The full Neural Networks playlist, from the basics to deep learning, is here: kzfaq.info/get/bejne/edd_mcxllrLKdKs.html Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
@motherisape · 2 years ago
Bamm
@gbchrs · 2 years ago
@seanleith5312 · 10 months ago
Quit the singing, please
@statquest · 10 months ago
@@seanleith5312 Noted
@dcscla · 3 years ago
Man, your promotions are not shameless! Actually, what you do is a gift for us; for the price that you charge and for the level of the content, we are being gifted, not buying something. You are far better than a lot of paid (and expensive) courses. Just check your video comments to see how happy people feel when they discover your videos!! Great work as always. Thank you so much!!!👏🏻👏🏻👏🏻👏🏻
@statquest · 3 years ago
Thank you very much! :)
@Luxcium · 11 months ago
He is using the concept of reverse psychology by presenting great stuff at a good price, and as you mentioned, these promotions are not shameless… They are shameful; as you hinted, he should indeed be ashamed of giving us such a good and advantageous offer… 😅😅😅😅
@jonforce360 · 3 years ago
You released this video just in time for my AI exam! Thank you. Sometimes I think professors use really complex notation just to feel smarter than students; it doesn't help learning. I love your content.
@statquest · 3 years ago
Thank you very much!
@sarazahoor9133 · 2 years ago
I want to copy-paste this comment! :D
@puppergump4117 · 2 years ago
Ain't that right. They must be mad that they don't understand the actually smart people, so they don't want to be understood either.
@zhongtianjackwang5346 · 1 year ago
lol, that is exactly what I want to say
@katwoods8514 · 3 years ago
omg yay! I just discovered that you've made a million videos on ML. I'm going to go binge all of them now :D
@statquest · 3 years ago
Hope you enjoy!
@motherisape · 2 years ago
Bamm
@ElNick09 · 2 years ago
I have been a student my entire life and have taught college level courses myself, and I must say you are one of the finest lecturers I have ever seen. This StatQuest is a gem. Your work is so succinct and clear, it's as much art as it is instruction. Thank you for this incredible resource!
@statquest · 2 years ago
Thank you very much! :)
@ksrajavel · 3 years ago
Finally. The wait is over. BAM!!!
@statquest · 3 years ago
TRIPLE BAM!!!
@mariolira9279 · 2 years ago
F I F T H B A M!
@syco_Rax · 2 years ago
SUPER BAM!!!
@akeslx · 7 months ago
I finished business school 25 years ago, where I studied statistics and math. So happy to see that neural networks are fundamentally just a (much) more advanced regression analysis.
@statquest · 7 months ago
BAM!!! Thank you for supporting StatQuest! Yes, neural networks are a lot like regression, but now we can fit non-linear shapes to the data, and we don't have to know in advance what that shape should be. Given enough activation functions and hidden layers, the neural network can figure it out on its own.
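The "fit non-linear shapes" point in the reply above can be sketched in a few lines: each hidden node bends the input with its own weights and biases, and the output "squiggle" is just a weighted sum of those bends plus a final bias. The softplus activation and every number below are illustrative stand-ins, not values from the video:

```python
import numpy as np

def softplus(x):
    # A smooth activation function: log(1 + e^x)
    return np.log1p(np.exp(x))

x = np.linspace(0, 1, 5)

# Each hidden node produces its own bent version of the input
# (weights, biases, and scales are made up for illustration):
blue   = -1.3 * softplus(-34.4 * x + 2.14)
orange =  2.3 * softplus(-2.52 * x + 1.29)

# The "green squiggle" is the sum of the scaled node outputs plus a bias:
green = blue + orange - 0.58
print(green.round(2))
```

No single straight line could produce `green`'s shape; that is what the extra activation functions buy you over plain linear regression.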
@iskrega · 2 years ago
I just want you to know your channel has been instrumental in helping me towards my Data Science degree; I'm currently in my last semester. I'll be forever grateful for your channel and the time you take to make these videos. Thank you so much.
@statquest · 2 years ago
Thank you and good luck with your final semester! BAM! :)
@mot7 · 2 years ago
You are the best. I wish every ML learner would find you first. I am going to do my part and tweet about you. Thanks for making these videos! Wish you more success.
@statquest · 2 years ago
Wow! Thank you very much! I really appreciate the support. BAM! :)
@hamzasaaran3011 · 1 year ago
I am studying for a Master's degree in bioinformatics now, and as someone who knows little about statistics, I really can't thank you enough for your videos and the effort that you have put into them.
@statquest · 1 year ago
Thank you!
@ML-jx5zo · 3 years ago
I'm reading about backpropagation now. I was worried this video wouldn't come for a long time, and finally I got a treasure.
@statquest · 3 years ago
bam! :)
@advaithsahasranamam6170 · 1 year ago
This is excellent stuff! As a visual learner, your channel is a BLESSING. Thank you so much for your fantastic work on breaking down concepts into small, bite-sized pieces. It's much less intimidating, and you deserve so much more appreciation. You also gained my subscription to your channel! Keep doing a great job, and thank you SO MUCH for having my back!
@statquest · 1 year ago
Thank you very much!!! :)
@mohammadrahman1126 · 3 years ago
Amazing explanation! I've spent years trying to learn this, and it always went too quickly into the gory mathematical details. The aha moment for me was when the green squiggle equals the blue plus orange squiggles lol. Thank you for this, Josh!!!
@statquest · 3 years ago
Glad it was helpful!
@amandak1396 · 2 years ago
Kind of like how Feynman reduced gory math in physics to actual squiggles, double bam!
@katwoods8514 · 3 years ago
Love this! You've explained it far better than anywhere else I've seen, and you made it entertaining at the same time! Thank you so much for making this.
@statquest · 3 years ago
Awesome, thank you!
@evangeliamm · 3 years ago
You have no idea how much I appreciate your work. Your explanations are so fun and simple, I'm just so grateful!
@statquest · 3 years ago
Thank you very much! :)
@yasameenmohammed4366 · 1 year ago
My Machine Learning exam is tomorrow, and re-watching your videos to review concepts is helping me so much! Thank you!!!
@statquest · 1 year ago
Good luck! BAM! :)
@raunak5344 · 3 months ago
I just iterated on a gradient descent and found that this is the best possible way to teach this topic; no other lecture in existence is better than this one.
@statquest · 3 months ago
bam!
@jennystephens3215 · 3 years ago
Josh, this is amazing. You really make things so easy to visualise, which is crazy considering hidden layers are meant to be so hard that they are referred to as a black box! Thanks for all your videos. I have used heaps over the last twelve months. Thank you again.
@statquest · 3 years ago
Hooray!!! I'm so glad that you like my videos. :)
@babarali4313 · 1 year ago
It's the teacher who makes a subject easy or difficult, and the way you explained neural networks left me speechless.
@statquest · 1 year ago
Thanks!
@dinara8571 · 3 years ago
JUST WOW! Thank you so much, Josh! I cannot express the feeling I had when EVERYTHING made sense!!! TRIPLE BAM! I never thought I would be extremely excited to pause the video and try to solve everything by hand before looking at the next steps.
@statquest · 3 years ago
BAM! :)
@NadaaTaiyab · 2 years ago
oh that's a good idea!
@Amir-gc8re · 1 year ago
Finally a proper, detailed, step-by-step explanation. This guy is absolutely AMAZING! Thank you so much for all the hard work in putting these videos together for us.
@statquest · 1 year ago
Thank you very much! :)
@jays9591 · 2 years ago
May I say... you are such a good teacher that it is most enjoyable to watch your videos. I am proficient in statistics (via university econometrics 101), and I did not realise all those fancy terms in machine learning, e.g., biases and weights, labels, activation functions etc., are actually concepts that were common items in the stats I learned in the 1970s. Anyway, I can see that a lot of viewers appreciate your work and teaching. I have also 'updated' myself. Thank you.
@statquest · 2 years ago
Thank you very much!
@tagoreji2143 · 1 year ago
Teaching such complicated topics in a simple, easily understandable way.👏👏👏 Thank you, Professor.
@statquest · 1 year ago
Thanks!
@perhaps467 · 1 year ago
Thank you so much for this series! I haven't been able to find any other videos that really break down the mechanics of neural networks like this.
@statquest · 1 year ago
Thanks!
@mjcampbell1183 · 1 year ago
Wow! This is an incredible video. Thank you SO MUCH for making this for us. This is one of the best videos I've seen to explain this concept. The hard work you have put into this is something that I am incredibly appreciative of. Thanks, man.
@statquest · 1 year ago
Wow, thank you!
@hyonnj9563 · 2 months ago
Honestly, you do a much better job teaching with a pre-recorded video than my instructors do with both the written and live materials that I'm paying for.
@statquest · 2 months ago
I'm glad my videos are helpful! :)
@manalisingh1128 · 2 years ago
Wow Josh, way to go!!!! You have the concepts so clear in your own head that it seems like a piece of cake for us 🍰♥️ Love from India! 🇮🇳
@statquest · 2 years ago
Thanks so much!!
@maliknauman3566 · 2 years ago
How amazing is the way you convey complex concepts!
@statquest · 2 years ago
Thank you!
@mashmesh · 3 years ago
Omg, protect this man at all costs, this was pure gold!!! Also, thank you, sir, for talking so slowly, because if my brain squiggles need to work faster they will burn up x)
@statquest · 3 years ago
Glad you enjoyed it!
@jblacktube · 1 year ago
I didn't even get through the jingle before I gave a thumbs up. Thanks for the chuckle; can't wait to watch the rest of this!
@statquest · 1 year ago
BAM! :)
@lukasaudir8 · 10 months ago
I am really glad that people like you exist!! Thank you so much for these incredible lessons.
@statquest · 10 months ago
Glad you like them!
@BlochSphere · 4 months ago
The level of detail in this video is just 🤯 I hope I can make my Quantum Computing videos this clear!
@statquest · 4 months ago
Good luck!
@iliasaarab7922 · 3 years ago
Best explanation that I've seen so far on backpropagation!
@statquest · 3 years ago
Thank you! :)
@madghostek3026 · 1 year ago
9:00 at this moment I realised I'm watching the best math content on earth, because you never see simple stuff like this being given attention. Luckily I already knew how the summation symbol works, but I didn't know it in the past, and nobody cared to explain. But it's not just about the summation symbol; imagine the other 1000 small things somebody might not understand, and doesn't realise they don't understand, because they've been skimmed over.
@statquest · 1 year ago
Thank you so much! I really appreciate it! :)
@nojoodothmanal-ghamdi1026 · 1 year ago
I . JUST . LOVE . YOUR . CHANNEL !! You literally explain things very clearly and step by step! I just cannot thank you enough, really.
@statquest · 1 year ago
Wow, thank you!
@O5MO · 2 years ago
I never understood backpropagation. I knew some things from other tutorials, but as a beginner it was very hard to understand. This video (and probably the series) is the best I could find. Thank you.
@statquest · 2 years ago
Glad it was helpful!
@user-re1bi2bc8b · 3 years ago
Incredible. Sometimes I need a refresher on these topics. There's much to remember as a data scientist. I'm so glad I found your channel!
@statquest · 3 years ago
Bam!
@codeman2 · 3 years ago
I searched neural net and again your video popped up, and it was just 4 months old. I love getting your helpful videos right before my semester.
@statquest · 3 years ago
:)
@deepanjan1234 · 3 years ago
This is really awesome. Thank you for your effort in developing this highly enriched content. BAM !!!
@statquest · 3 years ago
Thank you!
@wong4359 · 2 years ago
I found your explanation far easier to understand than the edX online course I am taking, BAM !!!
@statquest · 2 years ago
bam!
@sarazahoor9133 · 2 years ago
For the first time ever in history, I have understood the concept behind Neural Networks! BAM!!!! :D Thanks Josh, so grateful :)
@statquest · 2 years ago
BAM! :)
@aviknash · 3 years ago
Excellent job Josh!!! Just loved it!!! Thanks a ton for your fun-filled tutorials :)
@statquest · 3 years ago
Glad you like them!
@David5005ful · 2 years ago
The type of in-depth video I've always wanted!
@statquest · 2 years ago
Thank you!
@mrglootie101 · 3 years ago
I've been waiting for this the whole time, checking the notifications haha
@statquest · 3 years ago
Hooray! The wait is over.
@yiliu5403 · 1 year ago
Best Neural Networks lectures! Just ordered the book from Amazon to support!
@statquest · 1 year ago
Wow! Thank you very much! :)
@knt2112 · 9 months ago
Hello sir, thanks for such a simple explanation; I never understood backpropagation at such depth with this much ease. 🎉
@statquest · 9 months ago
Thank you!
@josephif · 1 year ago
The lecture was awesome, more effective and easy to understand. Thanks!
@statquest · 1 year ago
Thank you! :)
@ucanhnguyen4751 · 3 years ago
Thank you for this video. I have been waiting for this all the time. Finally, it appeared just 1 day before my exam. You are a life saver!!
@statquest · 3 years ago
Good luck with your exam! :)
@yonasbefirdu5575 · 1 year ago
You rescued me from the unknown!! Much love from Ethiopia.
@statquest · 1 year ago
Bam! :)
@evie389 · 1 year ago
I was reading an article on backpropagation and did not understand a single word. I had to watch all your videos, starting from the Chain Rule, Gradient Descent, NNs... I re-read the article and understood everything!!! But now I can't get the beep-boop and small/double/triple BAM out of my head lol.
@statquest · 1 year ago
BAM! I'm glad my videos were helpful! :)
@TheClearwall · 3 years ago
Who else is using these videos to put together a semester project? So far, I've put regression trees, k-fold CV, complexity pruning, and now neural networks into my final model construction. Josh is worth a double bam every time.
@statquest · 3 years ago
BAM! Good luck with your project.
@royazullay7556 · 11 months ago
That Josh guy is just awesome!! Definitely will support!!
@statquest · 11 months ago
Thank you!
@chrislee4531 · 1 year ago
I learn more from four of your videos than from 200 pages of textbook gibberish.
@statquest · 1 year ago
Thanks!
@aryabartarout5697 · 7 months ago
You have cleared up my doubts about backpropagation, gradient descent and the chain rule. Triple BAM!
@statquest · 7 months ago
:)
@alexissanchezbro · 3 years ago
You're getting better and better. Thank you.
@alexissanchezbro · 3 years ago
BAAAAAAAMMM
@statquest · 3 years ago
:)
@viethoalam9958 · 1 month ago
With all due respect to my math teacher, this is so much easier to understand.
@statquest · 1 month ago
bam! :)
@preetikharb8283 · 2 years ago
This video made my day, thank you so much, Josh!!
@statquest · 2 years ago
Thanks!
@cthutu · 4 months ago
Excellent content, excellent delivery - just bought your book!
@statquest · 4 months ago
Thank you so much for supporting StatQuest! BAM! :)
@terrepus9856 · 3 years ago
The timing couldn't be more perfect... 3 hours before my machine learning exam!! Thank you!!!!
@statquest · 3 years ago
Good luck with your exam! I hope it goes well.
@JamesWasTakenOhWell · 1 year ago
Thank you for the amazing effort you put into this video, and BAM!!! as always!
@statquest · 1 year ago
Thanks!
@marpin6162 · 3 years ago
Thank you. Now everything is clearer.
@statquest · 3 years ago
BAM! :)
@aminmoghaddam7624 · 2 months ago
I wish our lecturers watched these videos before trying to make their own teaching slides! (With acknowledgement, of course!)
@statquest · 2 months ago
bam!
@jieunboy · 1 year ago
Insane teaching quality, thanks!
@statquest · 1 year ago
Glad you think so!
@amirhossientakeh5540 · 2 years ago
Perfect! You explain complicated things very understandably. It's amazing.
@statquest · 2 years ago
Thank you very much! :)
@ge13r · 23 days ago
Greetings from San Cristóbal, Venezuela!!!
@statquest · 23 days ago
:)
@akashsoni5870 · 3 years ago
Thanks a lot, Sir; I was waiting for this.
@statquest · 3 years ago
Bam! :)
@d_polymorpha · 5 months ago
Hello, thank you for the video! This series has been really helpful for learning about deep learning. I have a couple of questions. 1. When using gradient descent and backpropagation, do we always use the SSR to measure how good a fit the parameter we are estimating is, or are there other ways? 2. When using the chain rule to calculate derivatives, the first part is d SSR / d Predicted. In that first part @ 11:25, are you using the chain rule again within it? And when deriving the inside, Observed - Predicted, @ 11:34, where do you get the 0 and 1 from?
@statquest · 5 months ago
1. The "loss function" we use for gradient descent depends on the problem we are trying to solve. In this case, we can use the SSR. However, another commonly used "loss function" is called Cross Entropy. You can learn more about cross entropy here: kzfaq.info/get/bejne/bKeihtykmtescYk.html and kzfaq.info/get/bejne/rqh1m5lnu5_LiqM.html 2. You can learn how the chain rule works (and understand the 0 and 1) here: kzfaq.info/get/bejne/rdJhoNyp19q1eIU.html
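For question 2, a small SymPy check of where the 0 and the 1 come from (symbol names are mine): the derivative of Observed with respect to Predicted is 0 (it's a constant), the derivative of Predicted with respect to itself is 1, and together they give the inner -1 of the chain rule:

```python
import sympy as sp

observed, predicted, squiggle, b3 = sp.symbols('observed predicted squiggle b_3')

# SSR for one data point: (observed - predicted)^2.
# SymPy applies the chain rule internally: the inner derivative
# d(observed - predicted)/d predicted = 0 - 1 = -1, so the whole thing
# equals -2 * (observed - predicted).
ssr = (observed - predicted) ** 2
d_ssr = sp.diff(ssr, predicted)
print(d_ssr)

# And if predicted = squiggle + b_3, then d predicted / d b_3 = 1:
d_pred_d_b3 = sp.diff(squiggle + b3, b3)
print(d_pred_d_b3)  # 1
```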
@DanielRamBeats · 9 months ago
This is finally all making sense to me, thank you.
@statquest · 9 months ago
Thanks!
@superk9059 · 2 years ago
Thank you very much for your video~ Your videos make me feel that studying English makes so much sense; otherwise I couldn't enjoy such beautiful things~ 👍👍👍❤❤❤
@statquest · 2 years ago
WOW! Thank you very much!!! And thank you for your support!!! :)
@pranjalpatil9659 · 2 years ago
Perfect explanation!
@statquest · 2 years ago
Thank you!
@Luxcium · 11 months ago
Wow 😮 I didn't know I had to watch *Gradient Descent, Step-by-Step!!!* before I could watch the video related to *Neural Networks part 2*, which I must watch before I can watch *The StatQuest Introduction To PyTorch...* before I can watch *Introduction to coding neural networks with PyTorch and Lightning* 🌩️ (it's something related to the cloud, I understand). I am genuinely so happy to learn about this stuff with you, Josh ❤ I will go watch the other videos first and then backpropagate to this video...
@statquest · 11 months ago
Getting warmer...
@mahmoudkhadijeh4885 · 2 years ago
You did a great job! Thank you so much.
@statquest · 2 years ago
Thanks!
@willw4096 · 9 months ago
Thanks for the great video!! My notes: 7:23 8:11 8:48 10:00 10:22❗, 11:13 - 11:48, 11:56 12:08 13:30❗
@statquest · 9 months ago
BAM! :)
@alhaanali2502 · 11 months ago
You have the best way of teaching. Thank you ❤
@statquest · 11 months ago
Thanks!
@miriza2 · 3 years ago
BAM! Thanks Josh! You're the best! Got myself a pink T-shirt 😍😍😍
@statquest · 3 years ago
Hooray! And thank you for supporting StatQuest!!!
@harishbattula2672 · 2 years ago
Great explanation, sir. TRIPLE BAM... kudos to your presentation.
@statquest · 2 years ago
Thank you! :)
@kamaleshsenthilmurugan1561 · 2 years ago
Awesome lecture
@vokoramusyuriy106 · 1 year ago
Thanks a lot, Josh!
@statquest · 1 year ago
My pleasure!
@epistemophilicmetalhead9454 · 6 months ago
Backpropagation (a.k.a. finding the w's and b's): start with b_final = 0. You'll notice that the error, (observed - predicted)^2, is really high, so you use gradient descent on the squared error with respect to b_final to find the value of b_final that minimizes the squared error; that is your optimal b_final. Gradient descent: d(sum of squared errors)/d b_final = d(sum of squared errors)/d predicted * d predicted/d b_final. d(observed - predicted)^2/d predicted = -2*(observed - predicted), and d predicted/d b_final = d(sum of the curves from each node in the layer + b_final)/d b_final = 0 + 0 + ... + 0 + 1 = 1. Evaluate the derivative at the current parameter value to get the slope; step size = slope * learning rate; new b_final = old b_final - step size. Keep repeating until the slope approaches 0. That is how gradient descent works, and you've found your optimal b_final.
@statquest · 6 months ago
double bam
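The recipe in the comment above can be run directly. A minimal sketch with made-up data (the `squiggle` values stand in for the output of the rest of the network, which the comment treats as fixed):

```python
import numpy as np

# Toy data standing in for the video's example (made-up numbers):
observed = np.array([0.0, 1.0, 0.0])
# Output of the rest of the network for each input, before adding b_final:
squiggle = np.array([-0.3, 1.6, 0.4])

b_final = 0.0        # start at 0, as described above
learning_rate = 0.1

for step in range(200):
    predicted = squiggle + b_final
    # d SSR / d b_final = sum over points of -2 * (observed - predicted) * 1
    slope = np.sum(-2 * (observed - predicted))
    step_size = slope * learning_rate
    b_final = b_final - step_size
    if abs(step_size) < 1e-6:  # stop when the steps get tiny
        break

print(round(b_final, 4))
```

For this setup the SSR is minimized when `b_final` equals the mean residual, `mean(observed - squiggle)`, which the loop converges to in a handful of steps.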
@SPLICY · 3 years ago
The understated BAM at 4:40 cracked me up 😂
@statquest · 3 years ago
SPLICY in the house!!! BAM! :)
@igorg4129 · 3 years ago
Thanks Josh! You're simply the best.
@statquest · 3 years ago
Thank you very much. I can't wait to get the other videos out soon.
@louisrose7823 · 2 years ago
Great video!
@statquest · 2 years ago
Thank you!
@mike___-fi5kp · 1 year ago
You are always the best.
@statquest · 1 year ago
Thanks!
@user-re4xt7dd7d · 3 years ago
So good. Can't wait for the next one!
@statquest · 3 years ago
Bam! It should be out soon.
@mauriciobonetti8152 · 2 years ago
This is amazing, thank you very much!
@statquest · 2 years ago
Glad you like it!
@richfilms6307 · 2 months ago
Unbelievable! Thank you!!
@statquest · 2 months ago
Thanks!
@Morais115 · 3 years ago
I'm buying the shirt! Kudos to you, sir.
@statquest · 3 years ago
Awesome! Thank you!
@constantthomas3830 · 3 years ago
Thank you from France
@statquest · 3 years ago
Merci! :)
@VishalKhopkar1296 · 1 year ago
You taught this better than professors at CMU, not kidding.
@statquest · 1 year ago
Thank you! :)
@igorg4129 · 3 years ago
Josh, finished watching. Thank you again. 1. If I, as a researcher, know roughly which range of inputs I am going to insert and which range of outputs I expect to get in the end, would I want to adjust the range of the weights from the very beginning, maybe the distribution of the weights, and the same for the biases and activation functions? Or do we let the algorithm do this job today? 2. Most interesting question: let's say that while finding the prediction curve we in a sense discover some "hidden truth". I think our curve might never be exact, partly because we do not know all of the independent variables that affect our dependent variable in nature. Say we know one, but there is another that we do not know about. If so, would it be right to say that when a neural network with one input splits the input, by different weights, into two neurons of a hidden layer (from which the final output is calculated), it is in a way simulating the presence of another "secret independent variable" without knowing what it is? Thanks
@statquest · 3 years ago
I'll be honest, I'm not sure how to answer question #1. I don't know. I do know that some of the methods used for initializing the weights with random values increase the variation allowed in the values based on how many layers are in the neural network - so that might do the trick. As for the second question: adding the second node in the hidden layer allows the squiggle to go up *and* down. If I had just one node, I would only be able to go up *or* down. So, in some sense, that is sort of like adding a secret independent variable.
@igorg4129 · 3 years ago
@@statquest Also thought this way. Thank you again and again; you are doing a titanic job here, Josh. If not for you, I wouldn't be here asking new questions. :)
@ZachariahRosenberg · 3 years ago
@@igorg4129 It's tempting to want to initialize weights to a target range in the hopes of speeding up convergence; however, this might actually be counterproductive. The weights of individual nodes do not have to conform to the same distribution as your output. When you use an appropriate (adaptive) optimizer, it should be able to tune the weights pretty quickly, considering that the first few passes will likely have larger gradients.
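On the initialization point in the replies above: one common family of schemes (Glorot/Xavier initialization, which is not covered in this video) scales the random initial weights by layer size rather than by the output range. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(42)

def xavier_init(fan_in, fan_out):
    """Glorot/Xavier uniform initialization: draw weights from
    U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)),
    so wider layers get smaller initial weights and the variance of
    the signal stays roughly constant from layer to layer."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# A wider layer gets a tighter range of initial weights:
w_small = xavier_init(2, 2)      # limit is sqrt(6/4)    ~ 1.22
w_large = xavier_init(500, 500)  # limit is sqrt(6/1000) ~ 0.077
print(np.abs(w_small).max() <= np.sqrt(6 / 4))      # True
print(np.abs(w_large).max() <= np.sqrt(6 / 1000))   # True
```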
@masoudheydari2119 · 2 years ago
Your channel is amazing...
@statquest · 2 years ago
Thanks!
@xuantungnguyen9719 · 3 years ago
StatQuest is the best
@statquest · 3 years ago
Thank you very much! :)
@xiaoshengli8205 · 2 years ago
You are amazing! Your efforts are so impressive!
@statquest · 2 years ago
Thank you so much 😀
@sivanschwartz3813 · 2 years ago
Josh, thank you so much, great as usual :) I have a question: would it be more "fully explained" to say dSSR/db_3 = dSSR/dResiduals * dResiduals/dPredicted * dPredicted/db_3? Thanks!
@statquest · 2 years ago
I'm not sure. It doesn't change the derivative, so perhaps it's just a matter of taste or preference.
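A quick symbolic check of the reply above, i.e. that routing the chain rule through the residual leaves the derivative unchanged (symbol names are mine):

```python
import sympy as sp

observed, predicted = sp.symbols('observed predicted')
residual = observed - predicted
ssr = residual ** 2

# Direct route: d SSR / d Predicted.
direct = sp.diff(ssr, predicted)

# Longer route through the residual:
r = sp.Symbol('r')
d_ssr_d_r = sp.diff(r**2, r).subs(r, residual)  # 2 * (observed - predicted)
d_r_d_predicted = sp.diff(residual, predicted)  # -1
longer = d_ssr_d_r * d_r_d_predicted

print(sp.simplify(direct - longer))  # 0: both chains give the same derivative
```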
@richarda1630 · 3 years ago
Where were you 5 years ago?!?! :D Awesome work, man! Keep it up :)
@statquest · 3 years ago
Thanks! I have 4 more neural network videos coming out in the next month.
@richarda1630 · 3 years ago
@@statquest awesome! can't wait :D
@zombieeplays3146 · 1 month ago
I come to this channel for the intros, tbh!
@statquest · 1 month ago
bam! :)
@AG-dt7we · 2 years ago
Thanks; this and the previous intuition on how a neural network can fit a complex squiggle are amazing explanations. At 3:25 we assumed all other parameters except b3 were already optimized, but in practice we would begin with all parameters unknown (randomly initialized / using some initializer). Is that right?
@statquest · 2 years ago
It depends. Sometimes parts of the neural network are pre-trained. Other times they are not.
@abhishekm4996 · 3 years ago
Much awaited... it finally came.
@statquest · 3 years ago
Bam! :)
@SPLICY · 3 years ago
This is what she said
@chandank5266 · 3 years ago
Thanks man ❤️
@statquest · 3 years ago
:)
@gauravpoudel7288 · 9 months ago
Thank you so much
@statquest · 9 months ago
No problem!
Neural Networks Pt. 3: ReLU In Action!!!
8:58
StatQuest with Josh Starmer
243K views
Gradient Descent, Step-by-Step
23:54
StatQuest with Josh Starmer
1.2M views
Backpropagation Details Pt. 1: Optimizing 3 parameters simultaneously.
18:32
StatQuest with Josh Starmer
185K views
What is backpropagation really doing? | Chapter 3, Deep learning
12:47
3Blue1Brown
4.3M views
Watching Neural Networks Learn
25:28
Emergent Garden
1.1M views
The Essential Main Ideas of Neural Networks
18:54
StatQuest with Josh Starmer
870K views
The Chain Rule
18:24
StatQuest with Josh Starmer
231K views
Gradient Descent Explained
7:05
IBM Technology
55K views
Gradient Descent in 3 minutes
3:06
Visually Explained
156K views