Principal Component Analysis (PCA) | Part 1 | Geometric Intuition

  79,685 views

CampusX

1 day ago

This video focuses on providing a clear geometric intuition behind PCA. Learn the basics and set the foundation for understanding how PCA works in simplifying and preserving important information in your data.
============================
Do you want to learn from me?
Check my affordable mentorship program at: learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Intro
00:44 - What is PCA
05:16 - Benefits of using PCA
07:33 - Geometric Intuition
25:01 - What is variance and why is it important?

Comments: 85
@aienthu2071
@aienthu2071 1 year ago
So grateful for the videos that you make. I have burnt my pockets and spent hours on various courses just for the sake of effective learning, but most of the time I end up coming back to CampusX videos. Thank you so much.
@hritikroshanmishra3630
@hritikroshanmishra3630 11 months ago
So, did it work out?
@investmentplan880
@investmentplan880 7 months ago
Same
@henrystevens3993
@henrystevens3993 2 years ago
Unbelievable...nobody taught me PCA like this.... Sir 5/5 for your teachings 🙏🙏 god bless you ❤️
@vikramraipure6366
@vikramraipure6366 1 year ago
I am interested in doing group study with you, reply to me bro.
@prasadagalave9762
@prasadagalave9762 4 months ago
04:15 PCA is a feature extraction technique that reduces the curse of dimensionality in a dataset.
08:30 PCA is a technique that transforms higher-dimensional data to lower-dimensional data while preserving its essence.
12:45 Feature selection involves choosing the most important features for predicting the output.
17:00 Feature selection is based on the spread of data on different axes.
21:15 PCA is a feature extraction technique that creates new features and selects a subset of them.
25:30 PCA finds new coordinate axes to maximize variance.
29:45 Variance is a good measure to differentiate the spread between two data sets.
33:54 Variance is important in PCA to maintain the relationship between data points when reducing dimensions.
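A minimal sketch of the idea in the summary above, assuming NumPy and scikit-learn are available; the toy dataset and the choice of two components are illustrative, not from the video:

```python
# Toy illustration of PCA as dimensionality reduction (assumed example, not the video's data).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                              # 200 samples, 5 features
X[:, 3] = 2 * X[:, 0] + rng.normal(scale=0.1, size=200)    # make one feature redundant

pca = PCA(n_components=2)              # keep the 2 directions with the most variance
X_reduced = pca.fit_transform(X)       # 200 x 2: the same data expressed on new axes

print(X_reduced.shape)                 # (200, 2)
print(pca.explained_variance_ratio_)   # fraction of total variance each new axis preserves
```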
@SumanPokhrel0
@SumanPokhrel0 1 year ago
It's like I'm watching an NF series: at first you're introduced to different terms, methods, and their use cases, and in the last 10 minutes of the video everything adds up and you realize what and why these strategies are in use. Amazing.
@makhawar8423
@makhawar8423 1 year ago
Can't have a better understanding of PCA than this. Saved so much time and energy. Thanks a lot.
@rafibasha1840
@rafibasha1840 2 years ago
You have done good research on every topic, bro, nice explanation. I am so happy I found this channel, and at the same time feeling bad for not finding it earlier.
@DataScienceSchools
@DataScienceSchools 2 years ago
Exactly. same here
@krishnakanthmacherla4431
@krishnakanthmacherla4431 2 years ago
Wow, I regret not getting to this channel earlier. Very clear, like a story; I could explain it to a 6-year-old and make him/her understand ❤️👏
@geetikagupta5
@geetikagupta5 1 year ago
I am loving this channel more and more every time I see a video here. The way content is presented and created is really awesome. Keep inspiring and motivating us. I am learning a lot here.
@kirtanmevada6141
@kirtanmevada6141 7 months ago
Totally an Awesome playlist for learning Data Science/Mining or for ML. Thank you so much sir! Means a lot!!
@prankurjoshi9672
@prankurjoshi9672 1 year ago
No words to express how precious your teaching is....
@sidindian1982
@sidindian1982 1 year ago
Sir, the way you explained the Curse of Dimensionality & its solutions in the previous video -- just mind blowing..... YOU ARE GOD
@avinashpant9860
@avinashpant9860 1 year ago
Awesome explanation, and the best part is how he drops important info in between the topic; there is such a good interpretation of the scatter plot in this video that I wouldn't find even in a dedicated scatter plot video. So perfect.
@fahadmehfooz6970
@fahadmehfooz6970 1 year ago
Never have I seen a better explanation of PCA than this!
@nawarajbhujel8266
@nawarajbhujel8266 6 months ago
To have this level of teaching, one should have a deep level of understanding from both theoretical as well as practical aspects. You have proved it again. Thanks for providing such valuable teaching.
@eshandkt
@eshandkt 2 years ago
One of the finest explanations of PCA I have ever seen. Thank you, sir!
@zaedgtr6910
@zaedgtr6910 8 months ago
Amazing explanation.... No one can explain PCA as easily as you have done. Better than IIT professors.
@rahulkumarram5237
@rahulkumarram5237 1 year ago
Beautifully explained !!! Probably the best analogy one could come up with. Thank you, sir.
@narmadaa2106
@narmadaa2106 9 months ago
Excellent, sir. I have listened to different video lectures on PCA, but I didn't understand it properly. Yours is the best one. Thank you so much.
@shubhankarsharma2221
@shubhankarsharma2221 1 year ago
Very nicely explained topics. One of the best teachers on ML.
@balrajprajesh6473
@balrajprajesh6473 1 year ago
I thank God for blessing me with this teacher.
@SameerAli-nm8xn
@SameerAli-nm8xn 1 year ago
First of all, the playlist is amazing; you have done a really good job explaining the concepts and intuitions behind the algorithms. I was wondering, could you create a separate playlist for ARIMA, SARIMAX, and LSTM algorithms? I really want to see those algorithms in a future class.
@DataScienceSchools
@DataScienceSchools 2 years ago
Wow, how simply you did it.
@raghavsinghal22
@raghavsinghal22 2 years ago
Best video for PCA. I'll definitely recommend it to my friends 🙂
@motivatigyan6417
@motivatigyan6417 1 year ago
You are outstanding for me, sir... I'm not able to understand until I watch your video.
@siddiqkawser2153
@siddiqkawser2153 17 days ago
U rock dude! Really appreciate that
@pankajbhatt8315
@pankajbhatt8315 2 years ago
Amazing explanation!!
@ytg6663
@ytg6663 3 years ago
Very beautiful 👍👍🙏❤️🔥
@ParthivShah
@ParthivShah 4 months ago
Thank You Sir.
@qaiserali6773
@qaiserali6773 2 years ago
Great content!!!
@DeathBlade007
@DeathBlade007 1 year ago
Amazing Explanation
@ritesh_b
@ritesh_b 1 year ago
Thanks for the great explanation, please keep explaining in this way.
@harsh2014
@harsh2014 1 year ago
Thanks for the explanations!
@jiteshsingh6030
@jiteshsingh6030 2 years ago
Just Wow 🔥 😍
@jazz5314
@jazz5314 1 year ago
Wowww!!!! Best video
@jawadali1753
@jawadali1753 2 years ago
Your teaching style is amazing, you are a gem.
@vikramraipure6366
@vikramraipure6366 1 year ago
I am interested in doing group study with you, reply to me bro.
@mukeshkumaryadav350
@mukeshkumaryadav350 1 year ago
amazing explanation
@761rishabh
@761rishabh 3 years ago
Nice Presentation sir
@armanmehdikazmi5390
@armanmehdikazmi5390 7 months ago
hats off to you sirrrr
@user-nv9fk2jg5m
@user-nv9fk2jg5m 8 months ago
You are so good at this, I'm like "where have you been all this time".
@VIP-ol6so
@VIP-ol6so 3 months ago
great example
@beautyisinmind2163
@beautyisinmind2163 1 year ago
Damn, you are the Messiah in ML teaching
@yashjain6372
@yashjain6372 1 year ago
best explanation
@aadirawat4230
@aadirawat4230 2 years ago
Such an underrated channel for ML.
@beb57swatimohapatra21
@beb57swatimohapatra21 9 months ago
Best course for ML
@msgupta07
@msgupta07 2 years ago
Amazing explanation... Can you share the OneNote for Windows 10 notes of this entire series, "100 Days of Machine Learning"?
@arpitchampuriya9535
@arpitchampuriya9535 1 year ago
Excellent
@arshad1781
@arshad1781 3 years ago
thanks
@MARTIN-101
@MARTIN-101 11 months ago
Sir, you have no idea how much you are helping data learners like me. Thanks a lot. How can I help you? Is there anywhere I can pay you as a token of appreciation?
@rafibasha4145
@rafibasha4145 2 years ago
Hi bro, please make videos on feature selection techniques.
@sahilkirti1234
@sahilkirti1234 3 months ago
you are the god
@ambarkumar7805
@ambarkumar7805 1 year ago
What is the difference between feature extraction and feature construction, as both reduce the number of features?
@shivendrarajput4413
@shivendrarajput4413 1 month ago
that is what we do
@amanrajdas4540
@amanrajdas4540 1 year ago
Sir, your videos are really amazing, I have learned a lot from them. But I have a doubt about feature construction and feature extraction. They both look similar. So can you please tell me the one major difference between these two?
@vikramraipure6366
@vikramraipure6366 1 year ago
I am interested in doing group study with you, reply to me bro.
@Ishant875
@Ishant875 6 months ago
I have a doubt: if one variable is in the range 0 to 1 and another variable is in the range 0 to 1000 (so it will have more variance/spread), why does choosing the 2nd variable just by looking at variance make sense? It may be a matter of units, like km vs cm. For this problem we use scaling. Am I right?
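A small sketch of the scaling point raised in this comment (toy numbers, assuming scikit-learn; not the video's example): without standardization, the large-range feature dominates the variance and PCA effectively just picks it; after scaling, both features contribute comparably.

```python
# Illustrative sketch: why features are usually standardized before PCA (assumed toy data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
small = rng.uniform(0, 1, size=500)                  # a feature on a 0-1 scale
large = 1000 * small + rng.normal(0, 50, size=500)   # the same signal on a 0-1000 scale
X = np.column_stack([small, large])

raw_pc1 = PCA(n_components=1).fit(X).components_[0]
scaled_pc1 = PCA(n_components=1).fit(StandardScaler().fit_transform(X)).components_[0]

print(raw_pc1)     # ~[0.001, 1.0]: dominated by the large-range feature
print(scaled_pc1)  # ~[0.707, 0.707]: both features contribute comparably after scaling
```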
@sachinahankari
@sachinahankari 4 months ago
The variance of the grocery shop feature is greater than that of the number of rooms, but you have shown the reverse..
@mustafachenine7942
@mustafachenine7942 2 years ago
Is it possible to have an example with pictures, classifying them into two categories, where the dimensions are reduced with PCA and the classification with KNN is better, please?
@surajghogare8931
@surajghogare8931 2 years ago
Clever explanation
@1234manasm
@1234manasm 1 year ago
Very nice explanation. May I know which hardware you use to write on the notepad?
@learnfromIITguy
@learnfromIITguy 1 year ago
solid
@pkr1kajdjasdljskjdjsadjlaskdja
@pkr1kajdjasdljskjdjsadjlaskdja 5 months ago
Brother, why isn't this video going viral? Thank you sir ❤
@kindaeasy9797
@kindaeasy9797 6 months ago
But in PCA's geometric intuition, if I rotate the axis clockwise, then won't the variance of rooms decrease? And if I do the same process by taking washrooms on the x-axis and rooms on the y-axis, then won't washrooms get selected??
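A small NumPy sketch of the rotation question above (made-up rooms/washrooms numbers, not the video's data): the variance of the data projected onto an axis changes as that axis rotates, and PCA simply keeps the direction where the projected variance is largest, regardless of which feature is drawn on which axis.

```python
# Variance of a 1-D projection as the projection axis rotates (illustrative data only).
import numpy as np

rng = np.random.default_rng(1)
rooms = rng.normal(3, 1.5, size=300)
washrooms = 0.5 * rooms + rng.normal(0, 0.3, size=300)
X = np.column_stack([rooms, washrooms])
Xc = X - X.mean(axis=0)                        # center the data first

angles = np.linspace(0, np.pi, 180)            # candidate axis directions
variances = [np.var(Xc @ np.array([np.cos(t), np.sin(t)])) for t in angles]

best = angles[int(np.argmax(variances))]
print(f"direction of maximum variance: {np.degrees(best):.1f} degrees")
# Swapping which feature sits on the x-axis only relabels the angle; the chosen
# direction, and the variance it preserves, is the same line through the data.
```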
@0Fallen0
@0Fallen0 1 year ago
24:24 Aha! So PCA finds an alternate coordinate system and uses the change-of-basis matrix to transform the data.
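A hedged sketch of that change-of-basis view (toy data, assuming scikit-learn): the rows of components_ form the new orthonormal axes, and transforming the data is just centering followed by multiplication with that basis.

```python
# PCA transform as an explicit change of basis (assumed toy data, not the video's code).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
X = rng.normal(size=(100, 3))

pca = PCA(n_components=2).fit(X)
basis = pca.components_                  # 2 x 3: rows are the new coordinate axes
manual = (X - pca.mean_) @ basis.T       # center, then change basis by hand

print(np.allclose(manual, pca.transform(X)))  # True: same coordinates either way
```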
@x2diaries506
@x2diaries506 1 year ago
Dear sir I am confused about the variance formula and your interpretation. Kindly recheck.
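For reference on the point raised above, the usual textbook definition (a general formula, not a transcription of the video's notation): the sample variance of values x_1, ..., x_n with mean x̄ is Var(x) = (1 / (n - 1)) * Σ_i (x_i - x̄)²; some sources divide by n instead of n - 1, which is a common source of exactly this kind of confusion.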
@AyushPatel
@AyushPatel 3 years ago
Sir, I just wanted to ask: can we write our own machine learning algorithms instead of using sklearn and TensorFlow, I mean from scratch? Please make a video about that. I have been following your whole series. Sir, do reply. Thanks for your efforts.
@ytg6663
@ytg6663 3 years ago
Yes, you can write them, yaar... Yes you can...
@vikramraipure6366
@vikramraipure6366 1 year ago
I am interested in doing group study with you, reply to me bro.
@AyushPatel
@AyushPatel 1 year ago
@@vikramraipure6366 Actually, currently I am working on some other project, so.. I am sorry.. thanks for the proposal!
@yashwanthyash1382
@yashwanthyash1382 1 year ago
My suggestion is to use the sklearn library for existing algorithms. If that doesn't work, create your own algorithm.
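Since this thread asks about writing algorithms from scratch, here is a minimal from-scratch PCA sketch in plain NumPy (an illustrative eigen-decomposition version, not code from the video or from sklearn): center the data, eigen-decompose the covariance matrix, and project onto the leading eigenvectors.

```python
# Minimal PCA from scratch with NumPy (illustrative sketch, not the video's implementation).
import numpy as np

def pca_from_scratch(X, n_components):
    Xc = X - X.mean(axis=0)                         # 1. center each feature
    cov = np.cov(Xc, rowvar=False)                  # 2. covariance matrix of the features
    eigvals, eigvecs = np.linalg.eigh(cov)          # 3. eigen-decompose (symmetric matrix)
    order = np.argsort(eigvals)[::-1]               # 4. sort directions by variance, largest first
    components = eigvecs[:, order[:n_components]]   # 5. keep the top directions
    return Xc @ components                          # 6. project data onto the new axes

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 4))
print(pca_from_scratch(X, 2).shape)                 # (150, 2)
```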
@namansethi1767
@namansethi1767 2 years ago
Thank you sir
@vikramraipure6366
@vikramraipure6366 1 year ago
I am interested in doing group study with you, reply to me bro.
@namansethi1767
@namansethi1767 1 year ago
Done.... Give me your mobile no. ..... I will call you when I am free.
@ayushtulsyan4695
@ayushtulsyan4695 1 year ago
Brother, please make a playlist on statistical applications in Data Science.
@vatsalshingala3225
@vatsalshingala3225 1 year ago
❤❤❤❤❤❤❤❤❤❤❤❤❤❤
@Star-xk5jp
@Star-xk5jp 5 months ago
Day3: Date:11/1/24
@devnayyar9536
@devnayyar9536 1 year ago
Sir, will we get your notes?
@murumathi4307
@murumathi4307 2 years ago
Does this work the same as SVM? 🤔
@campusx-official
@campusx-official 2 years ago
Yes
@murumathi4307
@murumathi4307 2 years ago
@@campusx-official Thank you sir.. your class is awesome 🙏
@sahilkirti1234
@sahilkirti1234 3 months ago
04:15 PCA is a feature extraction technique that reduces the curse of dimensionality in a dataset.
08:30 PCA is a technique that transforms higher-dimensional data to lower-dimensional data while preserving its essence.
12:45 Feature selection involves choosing the most important features for predicting the output.
17:00 Feature selection is based on the spread of data on different axes.
21:15 PCA is a feature extraction technique that creates new features and selects a subset of them.
25:30 PCA finds new coordinate axes to maximize variance.
29:45 Variance is a good measure to differentiate the spread between two data sets.
33:54 Variance is important in PCA to maintain the relationship between data points when reducing dimensions.
Crafted by Merlin AI.