Comments
@HardikBhakhar 46 minutes ago
Hmmm, Carlin fan.... BAM!!
@mojtabakanani2335 4 hours ago
overrated channel, with all respect
@statquest 2 hours ago
noted
@MidnightMartiniBand-po6ty 5 hours ago
As usual, Josh provides the best explanations available. I've done a bunch of online courses - and I always end up referring back to StatQuest to truly understand. I purchased the StatQuest Illustrated PDF and use it all the time. Highly recommended for a) learning these concepts and b) having an index of all of the key topics to know for work and interviews (at end of PDF).
@statquest 2 hours ago
Thank you very much! :)
@peteradah1045 5 hours ago
God bless you, KZfaqrs.... Really, I can't explain how much I've learnt.
@statquest 2 hours ago
Thanks!
@faridfouad9638 6 hours ago
I have to say, thank you so much, infinite BAM!!
@statquest 2 hours ago
You're welcome!
@gabrielcrone6753 8 hours ago
Hi, Josh. Excellent video! So helpful and clear! 😄 I am using a new version of randomForest, and I cannot seem to locate the err.rate vector within my model object. When I write "model$err.rate", it returns nothing. Do you know if there are equivalent objects now inside the model for extracting the error rate info? Thanks!
@statquest 2 hours ago
What is the exact version you are using? 4.7-1.1 has err.rate. You can see it in the documentation here: cran.r-project.org/web/packages/randomForest/randomForest.pdf
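For anyone else running into this, here is a minimal R sketch (my own example, not code from the video) showing where err.rate lives. One thing to check: err.rate is only filled in for classification forests; for a regression forest it is NULL and the out-of-bag error is stored in model$mse instead.

library(randomForest)
set.seed(42)
# Classification forest: err.rate is a matrix with one row per tree,
# an "OOB" column, and one column per class
model <- randomForest(Species ~ ., data = iris, ntree = 500)
head(model$err.rate)
plot(model$err.rate[, "OOB"], type = "l",
     xlab = "Number of trees", ylab = "OOB error rate")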
@supahotfire8886 15 hours ago
So there's a 6% correlation between sniffing rocks and a mouse's weight? Lol
@statquest 9 hours ago
:)
@Nodgelol1 18 hours ago
Great explanation with outstanding graphical representations and a very funny presentation style ;) thank you very much!
@statquest 16 hours ago
Thanks!
@mousquetaire86 19 hours ago
The book "Weapons of Math Destruction" by Cathy O'Neil is worth reading to understand how data analytics can be misused, with malign consequences.
@statquest 16 hours ago
noted!
@mardavpatel1951 19 hours ago
TRIPPLE BAAMM ❤❤
@statquest 16 hours ago
:)
@zongyili4629 22 hours ago
Thank you for the informative video. I have a question about estimating parameters in statistical analysis. Could you explain the difference between (1) using Maximum Likelihood Estimation (MLE) to estimate the parameters of a normal distribution, where the variance formula has n in the denominator, and (2) calculating an unbiased estimate of the population variance from a sample, where n−1 is used in the denominator? Both methods use measurements to estimate the parameters of the population the measurements came from, yet they calculate variance differently. This can be confusing. Could you clarify?
@statquest 16 hours ago
I have a video about why we use n-1 when we estimate variance from a small set of data here: kzfaq.info/get/bejne/qa6Cdcpnp86vmn0.html
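For anyone who wants the two formulas side by side, here they are in standard textbook notation (not taken from the video):

\hat{\sigma}^2_{\text{MLE}} = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2
\qquad\text{vs.}\qquad
s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \bar{x})^2 = \frac{n}{n-1}\,\hat{\sigma}^2_{\text{MLE}}

Both use the same deviations from the sample mean \bar{x}; only the divisor differs. Because the deviations are measured around \bar{x} rather than the true mean, E[\hat{\sigma}^2_{\text{MLE}}] = \frac{n-1}{n}\sigma^2, so the MLE is biased low, and dividing by n-1 instead of n exactly removes that bias: E[s^2] = \sigma^2.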
@user-ry5zu1wo4e 22 hours ago
What an amazing explanation. Thank you so much
@statquest 16 hours ago
Thanks!
@martinotanasini3716 23 hours ago
Thanks for the great video! Do I understand correctly that the sample size determined from the power analysis depends on the means and variances estimated from the first experiment? How do I deal with the fact that this first experiment could result in estimating means and variances far away from the population's?
@statquest 16 hours ago
You have to start with some general sense of the variation in the data. It doesn't have to be perfect, but it's the best you can do.
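One practical way to handle that uncertainty is to re-run the power analysis over a range of plausible values and see how much the required sample size moves. A sketch using base R's power.t.test (the delta and sd values below are made-up placeholders, not numbers from the video):

# Rough guesses for the difference in means (delta) and the standard deviation
# come from the pilot experiment; vary sd to see how sensitive n is
for (sd.guess in c(2, 3, 4)) {
  res <- power.t.test(delta = 2, sd = sd.guess, sig.level = 0.05, power = 0.8)
  cat("sd =", sd.guess, "-> n per group =", ceiling(res$n), "\n")
}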
@legendary_gameron6760 1 day ago
Can you or anyone help me: how can I use this method to adjust all parameters of a neural network with 2 hidden layers simultaneously? And do I need to calculate everything manually and then plug the values into my network? You may need a video to explain 😅😅......
@statquest 17 hours ago
I've got some code examples for how to use cross entropy in PyTorch here: lightning.ai/lightning-ai/studios/statquest-build-and-train-a-neural-network-with-multiple-inputs-and-outputs
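On the "all parameters simultaneously" part: one backward pass computes the gradient of the loss with respect to every weight at once via the chain rule, and then every weight is updated together, so nothing has to be calculated by hand and plugged back in. In generic notation (a sketch, not tied to any particular network from the videos), for a weight w^{(1)} in the first hidden layer, with hidden-layer outputs h^{(1)}, h^{(2)} and prediction \hat{y}:

\frac{\partial L}{\partial w^{(1)}}
= \frac{\partial L}{\partial \hat{y}}
\cdot \frac{\partial \hat{y}}{\partial h^{(2)}}
\cdot \frac{\partial h^{(2)}}{\partial h^{(1)}}
\cdot \frac{\partial h^{(1)}}{\partial w^{(1)}},
\qquad
w \leftarrow w - \eta\,\frac{\partial L}{\partial w} \quad \text{for every weight } w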
@RahulVerma-Jordan 1 day ago
This should also reach the AI/ML scientists behind these algorithms.
@statquest 17 hours ago
:)
@fisicaparalavida108 1 day ago
Those graphs are excellent. So much work must have gone into making them. Thank you so much!
@statquest 17 hours ago
Thanks! Lots of work goes into these.
@esmeluo6574 1 day ago
This is such a great video! I do have one question: it looks like the encoder calculates self-attention for all words in the input (regardless of ordering), but the decoder only computes self-attention over the words that appeared previously. Is this the correct understanding? It also looks like the encoder's work can be parallelized while the decoder's is sequential (since the second token uses the first token as an input). Is this the correct understanding?
@statquest 17 hours ago
Those are the main ideas. The encoder can see all of the input at the same time, and the decoder can only see the output as it is generated. During inference, the encoder can process the input in parallel and the decoder does its work sequentially. However, during training (which is the hard, time-consuming part), the decoder can do its work in parallel using something called "masked self-attention", which I describe in the "decoder-only transformer" video: kzfaq.info/get/bejne/mLdlddKg0b6dcZs.html
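For reference, masked self-attention is usually written as ordinary scaled dot-product attention plus a mask matrix M added to the scores before the softmax, with -∞ at every "future" position so those positions get zero weight (standard notation, not copied from the video):

\text{MaskedAttention}(Q, K, V) = \text{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}} + M\right)V,
\qquad
M_{ij} = \begin{cases} 0 & \text{if } j \le i \\ -\infty & \text{if } j > i \end{cases}

This is what lets the decoder be trained on all output positions in parallel while each position still only "sees" earlier tokens.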
@RahulVerma-Jordan 1 day ago
If I had watched your videos during college, my career trajectory would be totally different. BIG BAM!!!!
@statquest 17 hours ago
Thanks!
@larissacury7714 1 day ago
@statquest 17 hours ago
:)
@birdost8448 1 day ago
You are the best!!!!!❤🎉😊
@statquest 17 hours ago
Thanks! :)
@alexandradragonstone6015 1 day ago
When getting the average, why is there n-1 in the denominator?
@statquest 1 day ago
For PCA, this is just convention. However, it has roots in how variance is estimated in general, which I try to explain here: kzfaq.info/get/bejne/qa6Cdcpnp86vmn0.html
@jacobamarjan2325 1 day ago
I had been struggling to understand neural networks until I stumbled upon this video. This is the best explanation with the best presentation (I fully agree with using easy-to-understand visualizations instead of the fancy ones). I don't usually write comments, but I feel the need to thank you for this. Thank you so much!
@statquest 1 day ago
Thank you very much! :)
@user-wf2co1fq6s 1 day ago
I found your channel today through my Dad's recommendation.
@statquest 1 day ago
bam!
@mortyk182 1 day ago
Woah, these are some amazing teaching skills, sir; you're totally gifted.
@statquest 1 day ago
Thanks! 😃
@RoyalYoutube_PRO 1 day ago
Has the mystery of 'n-1' been resolved yet?
@statquest 1 day ago
Not yet. The best I can do is give you a link: online.stat.psu.edu/stat415/lesson/1/1.3
@ukkyukang 1 day ago
Thanks!
@statquest 1 day ago
bam! :)
@user-rf8jf1ot3t 1 day ago
I love this video. Simple and clear.
@statquest 1 day ago
Thanks!
@dy8576 1 day ago
I keep coming back to this video every time I forget about the inner workings, and it's always just as easy to pick everything back up. What content!
@statquest 1 day ago
Glad it's helpful! :)
@PuneetMehra 1 day ago
This video specifically was too difficult to understand. For me :(
@statquest 1 day ago
Sorry to hear that. It might be helpful if you watched this one first: kzfaq.info/get/bejne/fqpzgriJqpmsfYE.html
@sidasmad2389 1 day ago
You have an amazing way of breaking things down, and I can't believe how entertaining you made it.
@statquest 1 day ago
Thank you very much!
@sidasmad2389 1 day ago
Just found your channel a few minutes ago and boy am I already questing on and on. Thank you for the amazing lessons!
@statquest 1 day ago
Bam! :)
@sandeeppatra4577 1 day ago
Hey, the lecture was great; I completely understood the concept of ChIP-Seq. I have one doubt: let's say the DNA-binding protein is unknown, for example a novel transcription factor that we don't have much information about. How can we raise antibodies against that protein if it's completely new, and how can we identify the DNA sequence subsequently?
@statquest 1 day ago
There may be methods that can just determine protein-bound regions, in a general sense.
@ertugruledits3349 1 day ago
BAM!:) you are our exam saviour 😅
@statquest 1 day ago
Good luck!
@koustubhmuktibodh4901 1 day ago
Great explanation.
@statquest 1 day ago
Thanks!
@enum4794 1 day ago
I wonder how the LSTM would work with 20 units in this case; I hope you can explain it to me, thank you. Btw, thanks for the great content!
@statquest 1 day ago
I talk about how to stack and use multiple LSTMs in my video on encoder-decoder networks: kzfaq.info/get/bejne/gp54ftqWv6-znZs.html
@TheWayOfNaN 1 day ago
kzfaq.info/get/bejne/fZmVo69ettCwlZc.html
@statquest 1 day ago
bam! :)
@khanghuy7384 2 days ago
u saved my life
@statquest 1 day ago
bam! :)
@amirammar6687 2 days ago
You exemplify what a lecturer should be.
@statquest 1 day ago
Thanks!
@yuvalalmog6000 2 days ago
Will you ever make videos on the subjects of Reinforcement learning, NLP or generative models?
@statquest 1 day ago
I think you could argue that this video is about NLP and is also a generative model, and I'll keep the other topic in mind.
@yuvalalmog6000 1 day ago
@@statquest I'll explain myself better, as I admit I phrased it poorly. For deep learning and machine learning you made amazing videos that covered the subjects from basic aspects to advanced ones, essentially teaching the whole subject in a fun, creative & enjoyable sequence of videos that can help beginners learn it from top to bottom. However, for NLP, for example, you did talk about specific subjects like word embedding or auto-translation, but there are other topics (mostly older ones) in that field that are important to learn, such as n-grams & HMMs. So my question was not only about specific advanced topics that connect to others, but rather about a full course that covers the basics of the subject as well. Sorry for my bad phrasing, and thank you for both your quick answer and your amazing videos! 😄
@statquest 1 day ago
@@yuvalalmog6000 I hope to one day cover HMMs.
@AccessKelly 2 days ago
My cat liked this song.
@statquest 1 day ago
bam! :)
@studentgaming3107 2 days ago
Dislike for the shitty intros, but good video though
@statquest 1 day ago
noted
@sweetlemon4625 2 days ago
I watched it on 2x speed to save time... except I had to repeat it 3 times
@statquest 1 day ago
That's pretty funny. Bam?
@meets8 2 days ago
THANK YOU SOOOOOOOO MUCH!!!! So so so grateful I found you!
@statquest 1 day ago
Glad I could help!
@ID10T_6B 2 days ago
This is the only video on YouTube that explains how such a complicated thing works so simply.
@statquest 2 days ago
bam! :)
@PuneetMehra 2 days ago
This is your 9th video after the intro, and you jumped directly to "large p-value" without explaining what a p-value is, what a "large p-value" is, what a "t-test" is, etc., in any of the previous 8 videos!
@statquest 2 days ago
Sorry about that. You'll start learning about those in the next video.
@andile3982 2 days ago
Been studying for my exams and have really struggled with this section. Thanks Josh. QUADRUPLE BAM !
@statquest 2 days ago
Hahaha! BAM! :)
@halkerable 2 days ago
Thank you, Josh - these videos are so helpful. Is there a statistical test for this question: "Is the slope coefficient of a linear regression model equal to 1?" The context is a quantitative test compared against a series of analytes of known quantity. I'd like to know if the test has a bias, e.g. whether it over-quantitates higher concentrations (or under-quantitates lower-concentration samples). Thanks again!
@statquest 2 days ago
I don't believe that there is one, but you can plot the residuals and see if you see a bias there (the residuals should be normally distributed - and you can test that with something called a K-S Test).
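To make the suggested check concrete, here is a minimal R sketch (assuming a hypothetical data frame d with columns measured and known; this is just one way to do what's described above, not code from a video):

fit <- lm(measured ~ known, data = d)
res <- residuals(fit)
# A trend here (e.g. positive residuals only at high known quantities) suggests bias
plot(d$known, res, xlab = "Known quantity", ylab = "Residual")
abline(h = 0)
# K-S test comparing the standardized residuals to a standard normal
ks.test(as.numeric(scale(res)), "pnorm")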
@jorenmaes498 2 days ago
I just noticed when you said "please subscribe" at the end of the video, the subscribe button lit up:)
@statquest 2 days ago
bam! :)