Principal Component Analysis (PCA) | Part 2 | Problem Formulation and Step-by-Step Solution

64,579 views

CampusX

1 day ago

This video breaks down the problem formulation and offers a step-by-step solution guide. Enhance your understanding of PCA and master the techniques for dimensionality reduction in your data.
Code used: github.com/campusx-official/1...
About Eigen Vectors:
www.visiondummy.com/2014/04/g....
• Eigenvectors and eigen...
Plotting tool used:
www.geogebra.org/m/YCZa8TAH
============================
Do you want to learn from me?
Check my affordable mentorship program at : learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX' LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Practical Example on MNIST Dataset
00:33 - Problem Formulation
12:55 - Covariance and Covariance Matrix
23:17 - Eigen Vectors and Eigen Values
25:37 - Visualizing Linear Transformations
35:35 - Eigendecomposition of a Covariance Matrix
38:04 - How to solve PCA
43:41 - How to transform points?
48:18 - Code Demo with Visualization
56:00 - Outro

Comments: 143
@akash.deblanq · 2 years ago
This is hands down the best PCA explanation I have seen on the internet. Period!
@abhinavkatiyar6950 · 1 year ago
True
@sidindian1982 · 1 year ago
indeed
@ADESHKUMAR-yz2el · 11 months ago
"Guru is Brahma, Guru is Vishnu, Guru is the god Maheshvara; Guru is verily the supreme Brahman itself; salutations to that revered Guru." You are the real teacher, sir.. 💫
@amartyatalukdar1024 · 4 months ago
I have studied eigenvalues and eigenvectors multiple times, but this video explained their depth to me in a very simple way! One of the best teachers out there.
@sudhanshusingh5594 · 2 years ago
Sir, you are the perfect example of feature extraction: you convert high-dimensional knowledge into 2D and make it easy to understand. Thank you so much, sir.
@avinashpant9860 · 1 year ago
I am always amazed at how important the concepts of eigenvectors and eigenvalues are; they are among the most important concepts of quantum mechanics. Every operator (e.g. energy, momentum) in Q. Mech is a linear operator, and our aim usually is to find the corresponding eigenvectors and eigenvalues. The time-independent Schrödinger equation usually takes the form of an eigenvalue equation, Hψ = Eψ. It's so amazing to see how these concepts find their role in Machine Learning as well. My love for math keeps on growing. As always, thank you for your amazing videos.
@AltafAnsari-tf9nl · 1 year ago
You are simply amazing. I bet no one can teach PCA so well.
@lightyagami7085 · 2 years ago
No one has ever explained eigenvectors in such a simple way. You are awesome!!
@alkalinebase · 1 year ago
Honestly, everything I know I owe to you. Thank you for being the real HERO in need!!!
@vijendravaishya3431 · 1 day ago
For those wondering why there are three eigenvectors every time: the covariance matrix is a symmetric matrix, and real symmetric matrices have n linearly independent, orthogonal eigenvectors. The zero vector is not considered an eigenvector even though it satisfies Ax = λx; likewise, there can be up to n linearly independent eigenvectors for an n×n symmetric matrix.
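The symmetry argument in this comment is easy to check numerically. A minimal sketch with hypothetical random data, using NumPy's `eigh` (which is designed for symmetric matrices):

```python
import numpy as np

# Hypothetical data: 100 samples, 3 features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

cov = np.cov(X, rowvar=False)            # 3x3 real symmetric covariance matrix

# eigh is specialized for symmetric matrices: it returns real eigenvalues
# (ascending) and an orthonormal set of eigenvectors as columns.
values, vectors = np.linalg.eigh(cov)

# Real symmetric matrix => eigenvectors are mutually orthogonal: V^T V = I
print(np.allclose(vectors.T @ vectors, np.eye(3)))   # True
```

The same check passes for any number of features, which is why the video always gets a full set of orthogonal eigenvectors.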
@sudiptahalder423 · 1 year ago
Best explanation of PCA I've ever seen!! ❤️
@bibhutibaibhavbora8770 · 7 months ago
OMG, in one video he explains the most difficult linear algebra topic, applies it to machine learning, and also shows us the code. Hats off!
@Lets_do_code-vl7im · 6 months ago
It's a three-part video, and he explains it in a way that even a layman can understand the concept.
@monicakumar6769 · 8 months ago
This is the best video I have watched on this topic!
@krithwal1997 · 2 years ago
This channel is a goldmine 🙌
@anshuman4mrkl · 3 years ago
Amazingly explained. 🤩👏🏻
@tr-GoodVibes · 1 year ago
This is called teaching! Thanks for this wonderful explanation.
@susamay · 9 months ago
You are the best, Nitish. Thanks for all of this.
@bangarrajumuppidu8354 · 2 years ago
What an amazing, very intuitive explanation. I am following your whole series, sir.
@deepanshugoel3790 · 1 year ago
you have made a complex topic like PCA so easy for us to understand
@pravinshende.DataScientist · 2 years ago
Your content is the best ever! Thank you, sir!
@nishantgoyal6657 · 5 months ago
One of the best videos I found for PCA. You have great skills brother.
@varunahlawat9013 · 1 year ago
The most satisfying machine learning lecture I've seen by far 🤩🤩
@aounhaider8335 · 11 months ago
You have cleared up a concept that no other instructor on YouTube explained well. Great job! ❤❤
@pulimiyashwanth9925 · 9 months ago
This channel is so underrated. By watching this PCA video, anyone can understand dimensionality reduction. Thank you, sir, for the hard work.
@farhansarguroh8680 · 1 year ago
My head hurts; this is so descriptive, apt, and worth all the time. The best. Kudos.
@sameerabanu3115 · 8 months ago
Extremely superb explanation, kudos.
@ashishshejwal8514 · 11 months ago
Speechless ,too good to grasp
@ritugujela8345 · 1 year ago
Thank you so much, sir. You always leave us awestruck with your remarkable explanations and in-depth knowledge. I never knew this topic could be explained with this much clarity. The teacher I never knew I needed in my life ❤️✨
@samikshakolhe5086 · 1 year ago
The most epic PCA explanation ever seen on YouTube, and never yet done by any DS YouTuber. Hats off to your teaching skills, sir.
@singnsoul6443 · 7 months ago
I am blown away by understanding the true meaning of eigenvectors. I always knew the definition, but now I have understood the meaning. You are a savior!
@ssh0059 · 1 month ago
Wow, the best video on PCA on the internet.
@anoopkaur6119 · 9 months ago
Awesome video. OMG, you explained every concept so clearly. Thanks a lot, sir.
@QAMARRAZA-pm6nc · 3 months ago
How can I thank you? What a wonderful teacher, available for free to help so many students.
@satyamgupta4808 · 9 months ago
Sir, no one on YouTube has taught this so intuitively. Even paid courses can't teach in this much depth.
@Rupesh_IITBombay · 5 months ago
Thank you, sir. Such a crisp explanation...
@soumilyade1057 · 1 year ago
45:44 I think it's going to be (3,1), and when transposed it's (1,3), which is then multiplied with the matrix representing the dataset: (1,3) × (3,1000). This representation is valid too.
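The shape bookkeeping in this comment can be sketched as follows (a hypothetical unit vector and random dataset, just to check the dimensions):

```python
import numpy as np

rng = np.random.default_rng(0)
D = rng.normal(size=(3, 1000))       # dataset as a (features x samples) matrix
u = rng.normal(size=(3, 1))          # a (3,1) column vector, e.g. an eigenvector
u = u / np.linalg.norm(u)            # normalize to a unit vector

# (1,3) @ (3,1000) -> (1,1000): one projected coordinate per data point
projected = u.T @ D
print(projected.shape)               # (1, 1000)
```

Either convention works as long as the inner dimensions match; the video's (samples × features) layout just flips both operands.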
@namansethi1767 · 2 years ago
Thanks, sir, for this amazing explanation.
@somanshkumar1325 · 1 year ago
Brilliant explanation! Thank you so much :)
@TheMLMine · 7 months ago
Great step-by-step explanation
@vashugarg2072 · 1 year ago
The best teaching skills I have ever seen, for all machine learning concepts. Hats off to you, sir! 🎉🎉🎊🎊
@osho_magic · 1 year ago
I have watched and understood the linear algebra playlist at 3Blue1Brown, but you clarified my doubts even further. Thanks.
@hello-iq6yz · 1 year ago
Amazing clarity !!!
@bikimaharana9350 · 2 years ago
Loved it, brother. Hardly anyone on YouTube has explained it this well.
@abhijitkumar7831 · 1 month ago
Amazing Tutorial
@world4coding · 1 year ago
Sir, I really enjoyed watching this. What more can I say, sir? This was the best PCA video. Love you, sir.
@pramodshaw2997 · 2 years ago
God bless you. Wonderful session.
@aryastark4064 · 7 months ago
You are a savior to my sinking boat ❣. Thanks a lot.
@bhavikpunmiya9641 · 3 months ago
Thank you so much, sir. You not only cleared my doubts about how PCA works, but also, for the first time, gave me mathematical intuition for eigenvalues, eigenvectors, and even matrix transformations, which I have been trying to learn for so many years. The best explanation I've seen on this topic.
@11aniketkumar · 8 months ago
"Guru is Brahma, Guru is Vishnu, Guru is the god Maheshwara; Guru is verily the supreme Brahman; salutations to that revered Guru." Countless salutations to you, sir, for sharing this knowledge with all of us through YouTube.
@muhammadumair1280 · 2 years ago
Love from KARACHI, PAKISTAN.
@akshaythakor5501 · 6 months ago
I was actually learning PCA for the first time. When I watched the video the first time I didn't understand it, but when I watched it a second time, all the topics became very clear. This video is amazing.
@brajesh2334 · 3 years ago
An excellent explanation...
@lijindurairaj2982 · 2 years ago
This was very useful for me, thank you :)
@harsh2014 · 1 year ago
Thanks for the explanations!
@gauravpundir97 · 1 year ago
Great explanation @Nitish
@RitikaSharma-pt9ox · 1 year ago
Thank you, sir, for this amazing video.
@sushantsingh1133 · 1 month ago
You are the best in the business.
@sarumangla6030 · 1 year ago
Just so awesome! Can't describe it!
@shaan200384 · 1 year ago
Excellent!!
@gauravagrawal8078 · 6 months ago
Excellent explanation.
@ParthivShah · 3 months ago
Thank You Sir.
@ajaykushwaha4233 · 3 years ago
Awesome 👏👍🏻
@DimLightPoetries · 1 year ago
Pure Gold
@ali75988 · 6 months ago
A small note on the shortcut at 16:04: the actual covariance formula includes the means of x and y; here both were zero, which is why the shortcut sum(x*y)/3 works. The full formula is covariance(x, y) = Σ[(x − x̄)(y − ȳ)] / n. For the same reason, the covariance matrix has variances on the diagonal (22:57): when both features are the same x, covariance(x, x) = Σ[(x − x̄)(x − x̄)] / n, which is exactly the formula for variance.
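This shortcut can be verified with a tiny zero-mean example (hypothetical numbers, chosen so both means are exactly 0):

```python
import numpy as np

# Hypothetical zero-mean features, like the centered data in the lecture
x = np.array([1.0, -2.0, 1.0])
y = np.array([2.0, -1.0, -1.0])
n = len(x)

full = np.sum((x - x.mean()) * (y - y.mean())) / n   # general population formula
shortcut = np.sum(x * y) / n                          # valid only because means are 0

print(np.isclose(full, shortcut))                # True

# And cov(x, x) collapses to the variance formula:
print(np.isclose(np.sum(x * x) / n, x.var()))    # True
```

With uncentered data the shortcut would be wrong, which is why PCA implementations mean-center first.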
@roboioters · 1 year ago
The best explanation.
@sahilkirti1234 · 3 months ago
00:02 PCA aims to reduce dimensionality while maintaining data essence.
02:55 Projection and unit vector for PCA
10:27 Principal Component Analysis (PCA) helps to find the direction of maximum variance.
12:48 Variance measures the spread of data.
19:22 Principal Component Analysis (PCA) helps in understanding the spread and orientation of data.
21:56 PCA provides complete information about data spread and orientation.
27:10 Principal Component Analysis involves transformations and changing directions of vectors.
29:39 Linear transformation does not change an eigenvector's direction.
34:24 Principal Component Analysis (PCA) uses eigenvectors for linear transformation.
36:36 Principal Component Analysis (PCA) helps identify vectors with the highest variation in data.
41:55 Principal Component Analysis allows transforming data and creating new dimensions.
44:15 PCA involves transforming the dataset to a new coordinate system.
49:14 Using PCA to find the best two-dimensional representation of 3D data.
52:07 Principal Component Analysis (PCA) involves transforming and transporting the data.
Crafted by Merlin AI.
@amirman6 · 1 year ago
Pretty good explanation of doing PCA computationally without using sklearn.
@VIP-ol6so · 3 months ago
clearly explained
@pravinshende.DataScientist · 2 years ago
"Wow" is my first reaction after watching this video...
@pavangoyal6840 · 1 year ago
Excellent !!!
@rafibasha1840 · 2 years ago
Thanks for the excellent video, bro. @16:21, in covariance we subtract the values from the mean and then multiply, right?
@AbcdAbcd-ol5hn · 1 year ago
😭😭😭😭 Thanks a lot, sir, thank you so much.
@jiteshsingh6030 · 2 years ago
I am going mad 😶; you are truly a legend 🔥
@nitinchityal583 · 1 year ago
Speechless... you deserve a million subscribers at least.
@campusx-official · 1 year ago
Natural Language Processing(NLP): kzfaq.info/sun/PLKnIA16_RmvZo7fp5kkIth6nRTeQQsjfX
@core4032 · 2 years ago
Very, very valuable.
@heetbhatt4511 · 9 months ago
Thank you sir
@lakshityagi684 · 1 year ago
Amazing!
@rashmiranjannayak8965 · 9 months ago
Wow, kudos for the explanation.
@adityabhatt04 · 2 years ago
This is even better than Josh Starmer's video.
@pravinshende.DataScientist · 2 years ago
Wow, what content! You are playing a big role in making me a data scientist. Thank you, sir!
@akshaypatil8155 · 1 year ago
Hi Pravin, if you have got the job, could you guide me a little? I have questions about how the data science department of a company works and how the work gets distributed. Could you please share your email?
@shubhamagrahari9745 · 8 months ago
@akshaypatil8155 no 👎
@pradeepmarpatla5498 · 7 months ago
You are an ML guru 🙏
@pavangoyal6840 · 1 year ago
Request you to continue the deep learning series.
@abrarvlogs6931 · 9 months ago
love you sir ❣ crush updated
@descendantsoftheheroes_660 · 11 months ago
Guru ji, where are your feet (so I may touch them)... God bless you, sir.
@balrajprajesh6473 · 1 year ago
Best!
@kosttavmalhotra5899 · 9 months ago
Brother, what a great explanation.
@krishnendubarman8490 · 1 year ago
You are really a good teacher. I am at IIT Bombay, Environmental Engineering, M.Tech, but I wanted to learn ML, and this playlist is by far the most understandable for me.
@shubhamagrahari9745 · 8 months ago
Brother, you'd have been better off doing CS at a private college.
@bahubaliavenger472 · 3 days ago
Lmao @shubhamagrahari9745
@nitinchityal583 · 1 year ago
Please do a series on time series analysis and NLP.
@core4032 · 2 years ago
Sir, I want to know how the libraries work internally. If you could give a basic explanation of that, it would be great; otherwise, this series is already very awesome.
@Hellow_._ · 1 year ago
One stop for everything.
@yashjain6372 · 1 year ago
best💗
@morancium · 1 year ago
One small correction at 52:55: the eigenvectors are the COLUMNS of the matrix returned by np.linalg.eig(), not the rows, which you have used... please correct me if I am wrong.
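The column-vs-row point is easy to confirm with a small hypothetical matrix, checking Av = λv for each column of the returned matrix (NumPy documents that `vectors[:, i]` pairs with `values[i]`):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])              # small non-symmetric test matrix

values, vectors = np.linalg.eig(A)

# The i-th COLUMN is the eigenvector for values[i]: A v = lambda v holds.
for i in range(len(values)):
    v = vectors[:, i]
    print(np.allclose(A @ v, values[i] * v))   # True for each i

# The i-th ROW is not (for this matrix the rows fail the same check).
print(np.allclose(A @ vectors[0, :], values[0] * vectors[0, :]))
```

For a symmetric covariance matrix the rows can coincidentally look plausible, which is why the mix-up is easy to miss in a demo.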
@islamicinsights6342 · 2 years ago
Wow, sir, thanks, you are the best. But why haven't you made further videos on unsupervised learning? I am waiting, please reply.
@yashjain6372 · 1 year ago
best
@pramodshaw2997 · 2 years ago
One question: do we need to sort the eigenvectors by the highest eigenvalues and then choose the eigenvectors accordingly? Also, the sum of the top-k eigenvalues will show how many eigenvectors we need to take (in the case of high-dimensional data)?
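The ordering step asked about here can be sketched as follows (hypothetical random data; `eigh` returns eigenvalues in ascending order, so we reverse and keep the top k, and the eigenvalue fractions give the explained variance):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                  # hypothetical 4-feature dataset
X = X - X.mean(axis=0)                         # center the data

cov = np.cov(X, rowvar=False)
values, vectors = np.linalg.eigh(cov)          # ascending eigenvalues

order = np.argsort(values)[::-1]               # sort descending by eigenvalue
values, vectors = values[order], vectors[:, order]

k = 2
explained = values[:k].sum() / values.sum()    # variance retained by the top-k PCs
W = vectors[:, :k]                             # (4, k) projection matrix
X_reduced = X @ W                              # (500, k) transformed data
print(X_reduced.shape)                         # (500, 2)
```

This mirrors what sklearn's PCA reports as `explained_variance_ratio_`; choosing k so that `explained` passes a threshold (e.g. 0.95) is the usual heuristic for high-dimensional data.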
@priyadarshichatterjee7933 · 7 months ago
The most elegant explanation of eigenvectors and eigenvalues I have ever seen. Thanks, sir, for this one.
@IRFANSAMS · 2 years ago
@CampusX, sir, please help with the t-SNE algorithm also.
@josebgeorge227 · 3 months ago
Hi sir, just to understand the concept well: when we transform the data D, do we use the matrix of eigenvectors (calculated from the covariance matrix), or do we use the covariance matrix itself? It's the matrix of eigenvectors, right?
@mustafachenine7942 · 2 years ago
Is it possible to have an example of classifying pictures into two categories? If reducing the dimensions with PCA makes KNN classification better, please show it.
@BAMEADManiyar · 8 months ago
I think there can only be n eigenvalues for an n×n matrix, and n unit eigenvectors for it, but there can be as many eigenvectors as you like: just multiply a unit eigenvector by some scalar k to get more eigenvectors. :)
@kindaeasy9797 · 6 months ago
The largest eigenvector will correspond to the largest eigenvalue, but one eigenvalue can correspond to more than one eigenvector; in fact, there is a whole eigenspace (except the 0 vector, of course)!! In the R² plane there are uncountably many eigenvectors corresponding to the largest eigenvalue.
@kindaeasy9797 · 6 months ago
Oh, I figured it out. I think if the eigenvectors are linearly dependent they have the same direction, and direction is what matters; and if we have a linearly independent one, then we have one more u that works equally well.