Cosine Similarity, Clearly Explained!!!

86,540 views

StatQuest with Josh Starmer

A day ago

Comments: 259
@statquest · a year ago
To learn more about Lightning: lightning.ai/ Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

@marchanselthomas · 2 months ago
The explanation is so clean. I was clapping for him from my room. How can someone be so good at their job!

@statquest · 2 months ago
Thank you! :)

@nuridaw9586 · 22 days ago
I clapped too, twice! :)

@jwilliams8210 · a year ago
You are EXCEPTIONALLY good at CLEARLY describing complex topics!!! Thank you!

@statquest · a year ago
Thank you very much! :)
@usamsersultanov689 · a year ago
I think and hope that this video is a preamble to more complex NLP topics such as word embeddings, etc. Many thanks for all of your efforts!

@statquest · a year ago
Yes it is! :)

@xanderortega4359 · 4 months ago
Cosine similarity is used as an evaluation tool on word2vec.

@mattgenaro · 9 months ago
Such a simple, yet beautiful and powerful concept of similarity. Thanks, StatQuest!

@statquest · 9 months ago
bam!

@nossonweissman · a year ago
You literally make it so easy!! I can't help but smile 😊😊😊❤️❤️❤️ By far one of my favorite KZfaq channels!

@statquest · a year ago
Thank you so much! :)
@tysontakayushi8394 · 3 months ago
I usually hate it when people say a video explains something well, because usually that's not the case. But, haha, amazing job! Well done, really nicely explained; it feels like gamification, the way I understand it!

@statquest · 3 months ago
Thanks!

@virenpai9395 · 6 months ago
My love of learning data science and statistics has grown many-fold because of you. Thank you, Josh!! 🙂

@statquest · 6 months ago
bam! :)

@insaiyancvk · 24 days ago
Wonderfully explained, Josh! You've earned a subscriber!

@statquest · 24 days ago
Thank you!

@jasonlough6640 · 5 months ago
Dude, these are so good. I have to watch them several times, and then I try to write some code to reinforce the concept. Your videos are absolutely amazing.

@statquest · 5 months ago
Thank you!
@kforay42 · a year ago
Your videos are such a lifesaver! Could you do one on the difference between PCA and ICA?

@statquest · a year ago
I'll keep that in mind.

@bladongarland8635 · 3 months ago
Hilarious, easy to understand, and entertaining. Bravo!

@statquest · 3 months ago
Glad you enjoyed it!

@torley · a year ago
QUADRUPLE BAM!!! Thanks for such fun yet pragmatic explainers.

@statquest · a year ago
Thank you!

@dukeduke1910 · 4 months ago
This guy is seriously funny. I thought I was the only person who ever watched Gymkata (like 50 times, especially the part in the town where everyone was crazy). This video definitely explains cosine similarity clearly. Thank you!

@statquest · 4 months ago
BAM! :)
@KarthikNaga329 · a year ago
This is another great video, Josh! Question: at 3:51 you talk about having 3 "hello"s, and that still results in a 45-degree angle with "hello world". However, comparing "hello" to "hello world" seems to give a different angle than comparing "hello" to "hello world world". Is there an intuition for why this is the case? That is, adding more "hello"s to "hello" keeps the angle the same, but adding more "world"s to "hello world" seems to change the cosine similarity.

@statquest · a year ago
Two answers: 1) Just plot the points on a 2-dimensional graph for the two pairs of phrases and you'll see that the angles are different. 2) The key difference is that "hello hello hello" only contains the word "hello". If we had included "world", then the angles would be different. Again, you can plot the points to see the differences.
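The exchange above is easy to check numerically. Below is a minimal sketch (my own illustration, not from the video) that builds word-count vectors over the vocabulary ["hello", "world"] and compares the angles:

```python
import math

def cosine_similarity(a, b):
    # cos(angle) = dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# word-count vectors over the vocabulary ["hello", "world"]
hello        = [1, 0]  # "hello"
hello_x3     = [3, 0]  # "hello hello hello" -- same direction as "hello"
hello_world  = [1, 1]  # "hello world"
hello_world2 = [1, 2]  # "hello world world"

print(cosine_similarity(hello, hello_world))     # 0.707..., a 45-degree angle
print(cosine_similarity(hello_x3, hello_world))  # 0.707..., the same angle
print(cosine_similarity(hello, hello_world2))    # 0.447..., a different angle
```

Repeating "hello" only rescales the [1, 0] vector, so its direction (and therefore the angle) is unchanged; adding extra "world"s changes the direction of the other vector, so the angle changes.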
@user-yd8sr9ot9u · 10 months ago
Wow, thank you!!! I didn't know how to calculate it, but after watching this, I've become a mathematician!!

@statquest · 10 months ago
bam!

@olucasharp · a year ago
It all seems so easy when you talk about such complicated things! Huge talent! And so funny ⚡⚡⚡

@statquest · a year ago
Thank you!

@AreyHawUstad · 2 months ago
Holy shit, did I land on a gold mine. Love the explanation (minus the intro, sorry Josh). Thanks a bunch!

@statquest · 2 months ago
Thanks!
@magicfox94 · a year ago
Excellent explanation! I hope it is the first of an NLP series of videos!

@statquest · a year ago
I hope to do word embeddings soon.

@ibrahimogunbiyi4296 · 3 months ago
I came here because I needed to learn something in NLP. Thank you, I understood it clearly.

@statquest · 3 months ago
BAM! :)

@user-yu7ie2em5b · 2 days ago
You really are the best!

@statquest · a day ago
Thank you!

@RaynerGS · 10 months ago
I love you!!!! Salute from Brazil.

@statquest · 10 months ago
Thank you very much! :)
@bjornnorenjobb · a year ago
Awesome video! I had no idea what cosine similarity was, but you explained it super clearly.

@statquest · a year ago
Thanks!

@suzhenkang · a year ago
Pretty good.

@raphaelbonillo2192 · 3 months ago
You democratize mathematics! Schools should teach it this way.

@statquest · 3 months ago
Thank you very much!

@ericvaish · 10 days ago
How am I able to understand this topic? Wasn't it supposed to be difficult? 😭 Seriously, great explanation, Josh.

@statquest · 10 days ago
Thank you!
@Ghulinzer · a year ago
Great video! I've seen in many articles, though, that people treat cosine similarity as the same thing as Pearson's correlation, since the two produce the same outcome when the means of X and Y are both zero. That doesn't make them the same: cosine similarity measures the cosine of the angle between two vectors in a multi-dimensional space and returns a similarity score, as explained in the video, while Pearson's correlation measures the linear relationship between two variables.

@statquest · a year ago
Correct!
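The relationship described above can be verified directly: Pearson's correlation is the cosine similarity of the mean-centered vectors, so the two measures agree only when the data are already centered. A small sketch with made-up numbers:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def pearson(a, b):
    # Pearson's r = cosine similarity after subtracting each vector's mean
    mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
    return cosine_similarity([x - mean_a for x in a], [y - mean_b for y in b])

x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 1.0, 4.0, 3.0]

print(cosine_similarity(x, y))  # 0.933... on the raw data
print(pearson(x, y))            # 0.6 -- different, because x and y are not centered
```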
@abdulrafay2420 · 7 months ago
What a great way of explaining!! Love it ❤

@statquest · 7 months ago
Thanks!

@theedspage · a year ago
Hello! Hello! Hello! Thank you for introducing me to this topic! Subscribed.

@statquest · a year ago
Awesome! Thank you!

@davidmurphy563 · a year ago
Could you cover discrete cosine/Fourier transforms, pretty please? I'd love to know how to break signals up into their component frequencies. If you haven't already!

@statquest · a year ago
I'll keep that in mind.

@Iiochilios1756 · a year ago
Have you seen the 3blue1brown video on this topic? Not sure if it is about the discrete FT.
@infraia · 2 months ago
Excellent explanation!

@statquest · 2 months ago
Thanks!

@spambaconeggspamspam · a year ago
Perfect! I'm trying to figure out how best to present my single-cell data in a UMAP, and I saw that cosine is the default distance metric in Seurat!

@statquest · a year ago
BAM! :)

@FreeMc54 · a month ago
You are insanely good at explaining clearly; btw, you sing really well 😂

@statquest · a month ago
Thanks! 😃

@limebro8833 · a year ago
This video saved me, I cannot thank you enough.

@statquest · a year ago
Bam! :)
@chrisguiney · a year ago
This video also does a good job highlighting how cosine similarity and dot products are related. Unless I'm mistaken, the equation can be written dot(a, b) / (magnitude(a) * magnitude(b)), where magnitude(x) = sqrt(dot(x, x)).

@statquest · a year ago
yep
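Written out in code, the identity from the comment looks like this (a sketch using NumPy; the function names are my own):

```python
import numpy as np

def magnitude(x):
    # a vector's length is the square root of its dot product with itself
    return np.sqrt(np.dot(x, x))

def cosine_similarity(a, b):
    return np.dot(a, b) / (magnitude(a) * magnitude(b))

a = np.array([1.0, 1.0])  # word counts for "hello world"
b = np.array([3.0, 0.0])  # word counts for "hello hello hello"
print(cosine_similarity(a, b))  # 0.707..., i.e. cos(45 degrees)
```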
@user-kp8lw1nz7m · 6 months ago
You are the king, Josh 👏👏👏👏 Wonderful job!!!

@statquest · 5 months ago
Thank you! 😃

@suaridebbarma1255 · 3 months ago
This video was absolutely a BAM!!

@statquest · 3 months ago
Thanks!

@anuj5576 · a year ago
Super simple explanation! Thanks for your effort.

@statquest · a year ago
Thanks!

@DrKnowsMore · 11 days ago
Like most things, it is relatively straightforward once you remove the jargon.

@statquest · 11 days ago
bam! :)
@fazelamirvahedi9911 · 9 months ago
Thank you for making all of these informative, simple, and precise videos. I wondered what happens when two phrases deliver the same meaning but have different lengths, for instance: A) I like Gymkata. B) I really like Gymkata. In this case, doesn't the extra adverb "really" in the second sentence disturb the phrase matrix? And one more question: if three phrases have the same length, and two of them have the same meaning but use different words, like: A) I like Gymkata. B) I love Gymkata. C) I like volleyball. In this case, would the cosine similarity between A and B be higher than between A and C?

@statquest · 9 months ago
In this video we simply count the number of words that are the same in different phrases; however, you can use other metrics to calculate the cosine similarity, and that is often the case. For example, we could calculate "word embeddings" for each word in each phrase and calculate the cosine similarity using the word-embedding values, and that would allow phrases with similar meanings to have larger similarities. To learn more about word embeddings, see: kzfaq.info/get/bejne/rM-KpbKfr8nQiWQ.html

@willw4096 · a year ago
Great video! My notes: 3:52 4:23

@statquest · a year ago
bam!
@exoticcoder5365 · a year ago
I must watch Gymkata! Thanks for the recommendation! And excellent explanation of the topic!

@statquest · a year ago
bam! :)

@azzahamed2063 · 7 months ago
This is an AMAZING explanation!!

@statquest · 7 months ago
Thank you!

@AU-hs6zw · a year ago
You deliver exactly when I need it. Thanks.

@statquest · a year ago
BAM! :)

@MOROCCANFREEMIND · 6 months ago
The quality of your explanation is more than a triple bam!! 😂

@statquest · 6 months ago
Thanks!
@muhammadazeemmohsin5666 · 9 months ago
What an amazing explanation. Thanks for the video.

@statquest · 9 months ago
Thanks!

@RichardGreco · a year ago
Great video. Very interesting. I hope to see you apply this to more examples.

@statquest · a year ago
We'll see it used in CatBoost for sure.

@AmineBELALIA · a year ago
This video needs more views; it is awesome.

@statquest · a year ago
Thank you! :)

@sciab3674 · 5 months ago
Thanks a lot. Easy to understand.

@statquest · 5 months ago
Thanks!
@lifeisbeautifu1 · 5 months ago
Thank you!

@statquest · 5 months ago
Thanks!

@abrahammahanaim3859 · a year ago
Hey Josh, thanks for the video. Nice explanation.

@statquest · a year ago
You bet!

@debatradas1597 · a year ago
Thank you so much.

@statquest · a year ago
You're most welcome!

@bachdx2812 · a year ago
Thanks a lot. These kinds of videos are super helpful for me!!!

@statquest · a year ago
Thanks! :)

@mystmuffin3600 · a year ago
Cool! (in StatQuest voice)

@statquest · a year ago
bam! :)
@millennialm1money500 · 4 months ago
Great video 🎉

@statquest · 4 months ago
Thank you 😁!

@ymperformance · a year ago
Great video and great explanation! Thanks.

@statquest · a year ago
Glad it was helpful!

@kavita8925 · a year ago
Your explanation is great.

@statquest · a year ago
Thanks!

@madhubabukencha5037 · a year ago
Man, you are not human, you are my god 😀

@statquest · a year ago
:)
@CristianoGarcia10 · a year ago
Excellent and clear video! I wonder why NLP applications use cosine similarity more often than other metrics, such as Euclidean distance. Is there a clear reason for that? Thanks in advance.

@statquest · a year ago
I'm not certain, but one factor might be how easy it is to compute (people often omit the denominator, making the calculation even easier), and it might be nice that the cosine similarity is always between -1 and 1 and doesn't need to be normalized.

@gsp_admirador · a year ago
Nice, easy explanation.

@statquest · a year ago
Thanks!

@samrasoli · a year ago
Useful, thanks.

@statquest · a year ago
Thanks!

@dataanalyticswithmichael8931 · a year ago
Superb! Thank you for the explanation.

@statquest · a year ago
Thanks!

@murilopalomosebilla2999 · a year ago
Hello!! Nice video!

@statquest · a year ago
Thank you!
@jonathanramos6690 · 5 months ago
Amazing!!

@statquest · 5 months ago
Thanks!

@smegala3815 · a year ago
Very useful 👍

@statquest · a year ago
Thank you! :)

@pouryajafarzadeh5610 · a year ago
Cosine similarity is a good method for comparing embedding vectors, especially for face recognition.

@statquest · a year ago
Nice!

@edmiltonpeixeira3221 · a year ago
Congratulations on the content. An excellent explanation, unlike any I've found in other videos.

@statquest · a year ago
Thank you very much! :)

@banibratamanna5446 · 4 months ago
The generalized equation of cosine similarity comes from the dot product of 2 vectors in multiple dimensions... by the way, big fan of yours ❤

@statquest · 4 months ago
...scaled to be between -1 and 1. :)
@luizcarlosazevedo9558 · a year ago
Hey, great video as always!! Is cosine similarity good for regression problems in which the targets are pretty close to zero? I'm trying to implement some accuracy metrics for a transformer model.

@statquest · a year ago
Hmm... I bet it would work (if you had a row of predictions and a row of known values).

@Francescoct · a year ago
Great video! Have you made one for word embeddings?

@statquest · a year ago
Coming soon!

@cartulinito · a year ago
Great video, as we have come to expect.

@statquest · a year ago
Thank you! :)

@miltonborges7356 · 8 months ago
Amazing

@statquest · 8 months ago
Thanks!

@Shehab-Codes · 9 months ago
Thank you so much. I had no idea what cosine similarity was and you illustrated it easily; I appreciate it. Btw, how can cosine similarity result in a negative number?

@statquest · 9 months ago
The cosine similarity can be calculated for any 2 sets of numbers, and that can result in a negative value.
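For instance (a quick sketch with made-up numbers): word counts are never negative, but once the data contain negative values, the dot product, and therefore the cosine similarity, can go below zero:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

print(cosine_similarity([1, 2], [2, 4]))    # 1.0: same direction
print(cosine_similarity([1, 0], [0, 1]))    # 0.0: perpendicular vectors
print(cosine_similarity([1, 2], [-1, -2]))  # -1.0: exactly opposite directions
```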
@Levy957 · a year ago
You are amazing.

@statquest · a year ago
Thanks!

@chris-graham · a year ago
"In contrast, this last sentence is from someone who does not like Troll 2" - I was expecting a BOOOO after that lol

@statquest · a year ago
Ha! That would have been great.

@lonok84 · 4 months ago
Wow, I used this to make a WhatsApp bot that puts a client into a flow/menu based on the client's first message.

@statquest · 4 months ago
bam!

@yuan8947 · a year ago
Thank you, as always, for the great and easy-to-understand video! I have a question about totally different words. If there are 2 sentences like "very good" / "super nice", then since "very", "good", "super", and "nice" are totally different, the cosine similarity will be 0. However, they actually have the same meaning! What other preprocessing should we do in such a situation? Thank you so much!

@statquest · a year ago
I think you might need more context (longer phrases) to get a better cosine similarity. I just used 2 words because I could draw them, but in practice you use more.
@AxDhan · a year ago
I'm a native Spanish speaker, and it surprised me when it started speaking Spanish. It will reach more people, but they will miss your motivating silly songs xD

@statquest · a year ago
Thanks! Yeah - I'm not sure what to do about the silly songs. :)

@nidhi_singh9494 · 5 months ago
Hey... so cosine depends only on the angle, not on the lengths... In the case where three "hello"s were shown, how can the two sentences be distinguished, since the similarity is the same for both?

@statquest · 5 months ago
What time point, minutes and seconds, are you asking about?

@s0meus3r · 5 months ago
I got it, BAMM!! 🎉

@statquest · 5 months ago
BAM!

@jainanshu2000 · a year ago
Great video! One question: how is this different from the regular string comparison we use in various programming languages?

@statquest · a year ago
I'm not sure I understand your question. My understanding of string comparison in programming languages is that it just compares the bits to make sure they are equal, and the result is a boolean True/False type thing.

@itSinger · 7 months ago
tysm

@statquest · 7 months ago
Thanks!
@MrJ17J · a year ago
Super interesting! Do you have examples of how these are implemented in practice?

@statquest · a year ago
I talk about that at the start of the video, but it's also used by CatBoost to compare the predicted values for a bunch of samples to their actual values.

@XEQUTE · 5 months ago
You're kind of like Phil from Modern Family, but for data science/statistics.

@statquest · 5 months ago
:)

@SalahMusicOfficial · a year ago
Hi Josh, I'm trying to understand why cosine similarity may be the best metric for finding semantically similar texts (using pretrained embeddings). It sounds like two vectors only have to be directionally similar for the cosine similarity to be high. What about using something like Euclidean or Manhattan distance? Would a distance metric be better for seeing whether two texts are semantically similar?

@statquest · a year ago
That's a good question and, to be honest, I don't know the answer. I do know, however, that most neural networks - when they use "attention" (like in transformers, which are used for ChatGPT) - just use the numerator of the cosine similarity as the "similarity metric". In other words, they just compute the dot product. Maybe they do this because it's super fast, and the speed outweighs the benefits of using another, more sophisticated method. Also, it's worth noting that this is a similarity metric and not a distance. In other words, as the value goes up, things are "more similar" (the angle is smaller). In contrast, the Euclidean and Manhattan distances are... distances. That is, as the value goes up, the things are further away and considered "less similar". Lastly, cool music on your channel! You've got a dynamite voice.

@SalahMusicOfficial · a year ago
@statquest thank you! Let me know if you need another voice in any of your intro jingles 😁

@statquest · a year ago
@SalahMusicOfficial bam!
@rajashreechakraborty747 · 7 months ago
Can you please help me with this? This is my data: A: cosine: 0.58, z-score: 372. B: cosine: 0.63, z-score: 370. How can I find the p-value/significance of the 0.05 change in the cosine similarities?

@statquest · 7 months ago
We didn't cover p-values in the video.

@SystemDesign-pro · a month ago
If you say DOUBLE BAM one more time, I'm gonna LMAO.

@statquest · a month ago
:)

@arvindmathur5556 · 22 days ago
Somewhere it says cosine similarity is a number between -1 and +1, but in other places it is said to be between 0 and 1. Which is true?

@statquest · 22 days ago
The cosine similarity can be between -1 and 1. If all the input data are positive (as they are in a bunch of the examples in this video, since we are just using count data, and count data are positive), then you'll be restricted to values between 0 and 1, but the data don't always have to be positive.
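Both cases are easy to illustrate numerically (a sketch with made-up vectors): nonnegative word counts can never produce a negative dot product, so the similarity stays between 0 and 1, while data with negative values can push it all the way down to -1:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# word counts (all nonnegative), so the result lands somewhere in [0, 1]
print(cosine_similarity([2, 1, 0], [0, 1, 3]))  # 0.141...

# general data with negative values can give a negative similarity
print(cosine_similarity([1.0, -2.0], [-1.0, 2.0]))  # -1.0
```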
@Mrnafuturo · a year ago
Does the cosine similarity equation end up being a normalized projection of one vector onto the other?

@statquest · a year ago
I believe that is correct.

@001kebede · 11 months ago
How can we relate this to the correlation between two continuous random variables?

@statquest · 11 months ago
See: stats.stackexchange.com/questions/235673/is-there-any-relationship-among-cosine-similarity-pearson-correlation-and-z-sc#:~:text=TL%3BDR%20Cosine%20similarity%20is,a%20norm%20of%20%E2%88%9An.&text=To%20convert%20a%20z%2Dscore,function%20for%20a%20Gaussian%20distribution.

@fathan3306 · a month ago
BAMMM!

@statquest · a month ago
:)

@ZOBAER496 · a year ago
Can you please talk about some applications of cosine similarity, like which types of problems it is used for?

@statquest · a year ago
I talk about that at the start of the video, but you can also use it whenever you want to compare two rows of data. For example, CatBoost uses it to compare predicted values for a bunch of data to their actual values.

@aquagardening5803 · 7 months ago
BAM!!!

@statquest · 7 months ago
:)
@kushiiiy1582 · a year ago
Why is it specifically cosine, and not tangent? Since you're collecting the opposite and adjacent lengths?

@statquest · a year ago
The cosine is easy to calculate and, unlike the tangent function, is defined for all possible angles.

@raven-888 · a year ago
Love you

@tophat593 · a year ago
Love you too

@statquest · a year ago
:)

@eddiesec · a year ago
I still don't understand how that works for embeddings, though. Each embedding dimension should loosely represent a grammatical property of the words, so how can one word that is farther than another along a single dimension (as in your "hello hello hello" example) be considered identical?

@statquest · a year ago
I'll do a video on embeddings soon.

@sushi666 · a year ago
Can you please do spherical k-means with cosine similarity as the distance metric?

@statquest · a year ago
I'll keep that in mind.

@PromitiDasgupta-mz7uc · a year ago
Can I use cosine similarity for building a similarity matrix between two different brain regions?

@statquest · a year ago
Probably.
@Olddays100s · 10 months ago
But if the phrases are "Hello World" and "World Hello", the cosine would still be 1. How do you differentiate between them using cosine similarity? Do algorithms introduce another dimension?

@statquest · 10 months ago
Algorithms use other methods to keep track of word order. For example, transformers use positional encoding. To learn more, see: kzfaq.info/get/bejne/sN6BrLd8ndfZqY0.html

@shintaardani6332 · 5 months ago
I am conducting sentiment analysis research and found that some data has a cosine similarity of 0. Are there any methods to make the cosine similarity not equal to 0?

@statquest · 5 months ago
You could pad each phrase with something, so all phrases have at least one thing in common.

@shintaardani6332 · 5 months ago
@statquest Thank you so much 😁

@skiraf · a year ago
"Troll 2" should be considered one word. It refers to only one idea: the Troll sequel movie, which is different from the first Troll movie.

@statquest · a year ago
Sounds good to me!