I'm sorry this video didn't do so well with clicks; don't let that discourage you from making more beautiful explanations :) There will always be people having their aha moments thanks to you
@mihir777 1 year ago
Intellectual videos don't really get many views. The cat and dog videos will always satisfy the greater population, providing easy dopamine hits to the reptilian brain.
@NovaWarrior77 2 years ago
The prodigal son returns.
@allantouring 9 months ago
Where's part 3? I love this series! ❤
@DelandaBaudLacanian 2 years ago
my mind is blown, this is so simple and elegant, thank you for taking the time to explain neural networks and linear transformations, this is going to be one of those videos I watch over and over until I really grok it!
@elidrissii 1 year ago
Thank you for making these videos, absolute gems. People like you make YouTube worth it.
@saidelcielo4916 1 year ago
Once again WOW this is the best visualization of neural networks I've ever seen, and I've learned tremendously from it. Please make more videos!!
@aleksszukovskis2074 2 years ago
Bruce! It's been a whole year. You still owe me 16 contents
@prometheus7387 2 years ago
Grant Junior returns
@KenanSeyidov 1 year ago
Excited for part 3!
@sortsvane 2 years ago
Hands down THE most lucid explanation of NN I've seen 💯 Sharing it with my CompSci group. Also curious to see how you'll visualise back propagation.
@m4sterbr0s 2 years ago
Awesome, a new video!! Really happy to see you making content again!!
@stevenbacon3878 2 years ago
Thank you for making this video, it's awesome. I look forward to seeing more of your work!
@imranyaaqub1704 2 years ago
Thank you for this informative video. I was one of many waiting for part 2, but didn't get notified as I was only subscribed, and didn't know to also hit the bell notification to get an update on when part 2 was out. I suspect many people will be coming back at odd points into the future to see if part 2 has come out. Hope they enjoy it as much as I have.
@vtrandal 2 years ago
This is a rare occasion where I am fortunate to be witnessing excellent progress in technology as it happens. Thank you!
@JamieSundance 1 year ago
This video series is fantastic; these concepts never land for me until I see visual spatial context. Keep up the great work, you are greatly appreciated!
@BlackM3sh 1 year ago
I'm happy I managed to find this video again. 😄 I suddenly felt an urge to rewatch it. I really like the clear visuals of the video. It's a shame you have yet to come out with a part 3, though.
@judo-rob5197 2 years ago
Very nice explanations of a complicated topic. The visuals make it more intuitive.
@IndyRider 1 year ago
This video has done such a great job of visually breaking down a complex concept with examples!
@waynedeng9604 2 years ago
this is the best video I’ve ever watched, I’m in tears, you’ve changed my life with your beautiful animations and soothing voice
@NotRexButCaesar 2 years ago
Your linked material about using periodic activation functions was very interesting.
@symbolspangaea 1 year ago
I saw this video 11 months after it was published, and it came as a gift. Thank you sooooo much!
@asemhusein7575 2 years ago
Words can't explain how amazing this video is. Finally, a video that clears everything up. Thank you
@arnavvirmani8688 2 years ago
This video makes it easy for non-math folks like me to gain some semblance of an understanding of neural networks. Great job!
@Odisse0 11 months ago
Big up for this outstanding work! As a fellow student of these topics, I want to thank you for the effort put in here. I'm really impressed by both the script and the animations. Much love ❤
@finkelmann 2 years ago
Brilliant stuff. I've watched my share of neural network videos, and this one is truly unique
@polqb3205 2 years ago
Wow, the video is sooo good, the explanations are wonderful and the animations are so beautiful, I just love it 😍😍
@hiewhongliang 2 years ago
This is awesome!!! Keep posting and keep up the great work.
@ChauNguyen-jy3fk 2 years ago
I've been waiting for this video for several months!
@airatvaliullin8420 2 years ago
What a wonderful explanation! I need to know this for my project, and each time I watch something about NNs I'm sure I'm getting better at understanding what's under the cover. But never have I seen such an elegant way to introduce the topic. Bravo!
@vcubingx 2 years ago
Thank you!
@usama57926 2 years ago
What a great explanation. Waiting for part 3
@my_master55 2 years ago
ngl, this is what's called "high-quality content". Thank you very much for your efforts 👏😍 🚀
@hannesstark5024 2 years ago
Awesome job!
@williamharr7338 11 months ago
Excellent Video!
@soumyasarkar4100 2 years ago
this is some extraordinary explanation
@TheBookDoctor 2 years ago
Wow. I've watched a lot of "how do neural networks work" videos, and this is the first one that has offered me any truly new insight in a long time. Excellent!
@vcubingx 2 years ago
Thank you! I appreciate the kind words :)
@arturpodsiady7978 1 year ago
Great video, thank you!
@laurent-minimalisme 2 years ago
Man, this video is a masterpiece! Congrats!
@mourirsilfaut6769 2 years ago
Thank you for making these videos.
@KukaKaz 8 months ago
Amazing video! Keep it up 👍
@Max-fw3qy 1 year ago
Geez man, your video is very good at visualizing what a NN really does! One piece of advice, if I may... after a complicated or very loaded explanation, as with the output of the NN, which is very complex to understand if you know nothing about it, try to summarize it with a simple sentence, just as you did at 7:40. That was beautifully explained, bravo!👍🏻👍🏻👍🏻
@saidelcielo4916 1 year ago
Thanks!
@jasdeepsinghgrover2470 2 years ago
Amazing explanation!!
@woddenhorse 2 years ago
Simply Awesome 🔥🔥🔥🔥
@usama57926 2 years ago
Oh man! Finally the 2nd part is here.....
@vincent2154 1 year ago
Really great 👍
@mohegyux4072 1 year ago
YouTube's algorithm should be ashamed of itself!! How could this video have less than 20k views!!!!!! Thanks, had multiple whoa! moments
@jacobliu760 2 years ago
I enjoyed this video so much.
@vcubingx 2 years ago
Thank you Jacob.
@adriangabriel3219 2 years ago
Really great! Do you have a tutorial on how you created the visualizations of the different layers? Would it be possible to do that in pure Python as well?
@CesarMaglione 2 years ago
Excellent! Take your like! 👍😉
@aaronwtr1150 2 years ago
Thank you for this great video
@Hopeful-zx9wk 2 years ago
return of the king
@vcubingx 2 years ago
But when will hopeful69420 return
@LuddeWessen 2 years ago
Really nice video. However, I think you should mention that you use a binarized (one hot) encoding of argmax and not argmax as it is commonly defined, as viewers (like me) could get confused. Otherwise an excellent video, that conveys the intuition really well! 😀
@vcubingx 2 years ago
Good point, I'll include the terminology next time
@wise_math 1 year ago
Nice video. How do you make the white edge border of a scene? (like in the Recap Part 1 scene)
@MadlipzMarathi 2 years ago
Finally
@skifast_takechances 1 year ago
banger
@RohanDasariMinho 2 years ago
Goat cubing x
@praveenrajab0622 2 years ago
At 10:49, aren't the x and y coordinates of the plot the output values of the second-to-last layer of the NN?
@anwarulbashirshuaib5673 2 years ago
holy shit!
@alexcheng2498 2 years ago
I've missed this.
@jamietea1072 1 year ago
Intro part 1 Funny Galaxy part 2 Swastika part 3 Ending of Evangelion
@ali493beigi5 2 years ago
Great! Can you explain how you produce these animations? Is there any software you used?
@vcubingx 2 years ago
I use manim
@Anujkumar-my1wi 2 years ago
I want to ask: since a neural net approximates a function over a particular domain interval, what happens if it gets an input outside that domain at test time?
@pi-meson7677 2 years ago
When you come back after 2¹⁰ years
@dann_y5319 1 month ago
9:13 grid
@ko-prometheus 1 year ago
Can I use your mathematical apparatus to investigate the physical processes of metaphysics?? I am looking for a mathematical apparatus capable of working with metaphysical phenomena, i.e. metamathematics!!
@dewibatista5752 3 months ago
PART 3 PART 3 PART 3
@anshul.infinity 2 years ago
I am trying to visualise how the neural network transforms the input space into a linearly separable space, layer by layer, on a new basic data set.
@gdash6925 2 years ago
Where were you at 8:50? In university?
@abrahamgomez653 1 month ago
Chaos happens
@TheRmbomo 2 years ago
5:25 When describing that the sum of the array resulting from softmax equals 1, I think the visual is missing that communication too, such as stacking all of the lines on top of each other, up to a value of 1 or 100%. Don't just rely on words. Otherwise great video, thank you.
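For what it's worth, the property the commenter is describing is easy to verify numerically; a minimal sketch, assuming NumPy, with made-up logits:

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability; this doesn't change the result.
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])  # hypothetical raw outputs of a final layer
probs = softmax(logits)
print(probs, probs.sum())  # the entries are positive and sum to 1
```

Stacking the output lines up to 100%, as suggested, would be the visual counterpart of `probs.sum() == 1`.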
@enisten 2 years ago
3:47 Did you mean a range of i̶n̶p̶u̶t̶s̶ outputs?
@omridrori3286 1 year ago
What about part 3!!!
@PapaFlammy69 2 years ago
wb :)
@vcubingx 2 years ago
Thanks :)
@OrenLikes 5 months ago
w12 reads as the first weight of the second input? This is confusing! It should be w21 => from input x2, we look at w1 (which, obviously, goes to output 1)!
@nathannguyen2041 2 years ago
How would a neural network handle categorical variables?
@vcubingx 2 years ago
As inputs? One way is to have each input be a vector of dimension n, where n is the number of categories. Then, for each input, assign the category index 1, and the rest 0. For example, if my input was a 4-category variable of either cat, dog, wolf, tiger. Then the input cat could be {1, 0, 0, 0}. See "one-hot encoding" if you're interested
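The one-hot scheme described in this reply can be sketched in a few lines of plain Python (the category names are just the ones from the example above):

```python
# The example categories from the reply: a 4-category input variable.
categories = ["cat", "dog", "wolf", "tiger"]

def one_hot(category, categories):
    # A vector of zeros with a single 1 at the category's index.
    vec = [0] * len(categories)
    vec[categories.index(category)] = 1
    return vec

print(one_hot("cat", categories))   # [1, 0, 0, 0]
print(one_hot("wolf", categories))  # [0, 0, 1, 0]
```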
@vcubingx 2 years ago
There are plenty of other ways. In the case of NLP (which is my domain atm), we want to be able to encode tokens (some sequence of characters) into input vectors. An older method for this is word2vec, which converts words to vectors based on context. This lets us assign each word to some input vector, and we can pass each vector along as an input to an NN. These days, though, modern neural language models (GPT-3, etc.) have sophisticated embeddings, and word2vec has largely fallen out of favor
@nit235 2 years ago
Very informative video, thank you a lot. Do you have any suggestions for me? I want to learn manim and make videos on how ML algorithms work and their pros and cons. Or, if you have manim learner classes, I can enroll directly.
@TimmacTR 2 years ago
What the.....
@jayantnema9610 2 years ago
Hey, don't you think saying "this is what an NN does under the hood" is an overshoot? All the popular literature in textbooks, and the ML community too, claims it does exactly that, but if that were truly the case, if it behaved that logically, then adversarial attacks would be impossible. We all know that one-pixel attacks and noise-based attacks are quite frequently achievable with GANs.
The interpretation that layers extract features from the input is true provided the features are not the human-interpretable shapes or patterns; to call them that leads to an error. One-pixel and noise-based attacks do not affect the feature as such: the horse is still a horse if you change twenty-ish pixels out of 1000, but the NN suddenly starts saying it is a dog with 99% confidence. If it were really extracting features as patterns that humans understand, it would never make that error. Humans have 100% accuracy and immunity against some twenty pixels changing out of 1000 because we extract patterns. The NN does not; if it did, it should be immune too.
This means the popular understanding is still incomplete, and it would be wrong to claim anything about how the NN works under the hood. You can find multiple completely different sets of weights and still get excellent classification accuracy, which means the NN is interpreting the spiral in its own way, not in the human style of five zones with a nonlinear boundary. In the human style there is only one interpretation logically possible, which fails to explain how we can get multiple sets of weights, not at all close or alike, all still giving solid accuracy.
@MrMehrd 1 year ago
Hm
@revimfadli4666 4 months ago
But salty redditors say this isn't how the thing works at all (they deleted their comments in shame after I asked for elaboration)
@vcubingx 4 months ago
Haha, sorry, but what redditors? What post are you talking about? Kinda curious
@jamesjones8487 1 year ago
I finally realize that I am a useless stupid fool.
@usama57926 2 years ago
When is the 3rd part coming?
@OrenLikes 5 months ago
You said "softmax is not a version of argmax" and then you said "softmax is a smoother version of argmax" - make up your mind!
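For what it's worth, the two statements can be reconciled: softmax never equals a hard one-hot argmax, but scaling its inputs up makes it approach one, which is the sense in which it is a "smoother version". A small illustration, assuming NumPy and made-up numbers:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax.
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([2.0, 1.0, 0.1])
print(softmax(z))        # a smooth distribution over the outputs
print(softmax(100 * z))  # nearly one-hot: the largest entry approaches 1
```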