Machine Learning Series - Revising Complete Linear Regression, Cost Function, Convergence Algorithm

50,386 views

Krish Naik

1 day ago

In this video we revise the entire linear regression algorithm, the cost function, and the convergence algorithm, covering both simple and multiple linear regression. (A minimal code sketch of these ideas follows the contents list below.)
Support me by joining the channel membership so that I can keep uploading these kinds of videos
/ @krishnaik06
▬▬▬▬▬ Contents of this video ▬▬▬▬▬
00:00:00 - Simple Linear Regression Introduction
00:05:52 - Regression Mathematical Equation
00:14:22 - Regression Cost Function
00:34:18 - Convergence Algorithms
00:45:55 - Convergence Algorithms Mathematical Intuition
01:02:30 - Multiple Linear Regression
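As a quick companion to the contents above, here is a minimal sketch of the same pipeline in plain NumPy: the hypothesis h(x) = theta0 + theta1*x, the 1/(2m) squared-error cost, and the gradient-descent ("convergence") updates. The data and the learning-rate/iteration values are made up for illustration; this is not the notebook used in the video.

import numpy as np

# Toy data: y roughly follows 2x + 1, plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=x.shape)
m = len(x)

def predict(theta0, theta1, x):
    # Hypothesis: h(x) = theta0 + theta1 * x
    return theta0 + theta1 * x

def cost(theta0, theta1, x, y):
    # J(theta0, theta1) = 1/(2m) * sum of squared errors
    err = predict(theta0, theta1, x) - y
    return (err ** 2).sum() / (2 * m)

# Gradient descent: repeatedly step both parameters opposite the gradient of J.
theta0, theta1 = 0.0, 0.0
alpha = 0.01  # learning rate
for _ in range(5000):
    err = predict(theta0, theta1, x) - y
    grad0 = err.sum() / m        # dJ/dtheta0
    grad1 = (err * x).sum() / m  # dJ/dtheta1
    theta0 -= alpha * grad0
    theta1 -= alpha * grad1

print(f"theta0 ~ {theta0:.3f}, theta1 ~ {theta1:.3f}, cost ~ {cost(theta0, theta1, x, y):.4f}")

Running it should recover parameters close to the intercept 1 and slope 2 used to generate the data; multiple linear regression follows the same update rule with one theta per feature.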
------------------------------------------------------------------------------------------------------------------
Join the PWSKILLS Data Science Masters Course
Best affordable data science course from PWSkills (6-7 months)
Impact Batch 2.0: Data Science Masters (Full Stack Data Science)
1. Data Science Masters Hindi: bit.ly/3TPdrDz (Hindi)
2. Data Science Masters English: bit.ly/40gZ9hn (English)
Direct call to our team in case of any queries
+918660034247
+919880055539
+918147625763
+918951939425

Comments: 49
@aviralsharma2005
@aviralsharma2005 1 year ago
Giving such high-quality content away for free REGULARLY, despite running a paid programme of his own. What a selfless gem you are, sir! 🙌🏻
@shanky6343
@shanky6343 4 months ago
Excellent video. I was looking for videos that went through the derivations and equations, and this was perfect!
@mjxsh3lm
@mjxsh3lm 1 year ago
Thank you Krish! Great video as always!
@saketmotekar5570
@saketmotekar5570 4 months ago
Best video I found on linear regression on YouTube. Thank you sir for providing this.
@khandakersharminashrafi7086
@khandakersharminashrafi7086 4 months ago
Very impressive lecture, really enjoyed it!
@saurabhsoni6912
@saurabhsoni6912 1 year ago
One of the best videos on Linear Regression that I have found ...❤ Please continue this lecture series up to Deep Learning..🙏
@gohilmilansinh1673
@gohilmilansinh1673 1 year ago
Finally this series is continuing. Thank you sir.
@Tech_Enthusiasts_Shubham
@Tech_Enthusiasts_Shubham 1 year ago
Thanks a lot for starting this series, sir. It is really helpful for me.
@Alien-ik1tt
@Alien-ik1tt 10 months ago
Thank you so much, sir, for giving such valuable content.
@sruthia134
@sruthia134 9 months ago
This video gives much better clarity on linear regression than the other ones! Very simple and clear!
@terminatorgamer97
@terminatorgamer97 12 days ago
You are the teacher I was looking for. I thank you for your effort and knowledge ❤
@saikrishna-pj4db
@saikrishna-pj4db 1 month ago
Best way of teaching. Thanks
@sushilkumar-cw9cw
@sushilkumar-cw9cw 4 months ago
Awesome explanation.
@sivadurga9204
@sivadurga9204 2 months ago
ML demigod for non-technical students 🙏🙏🙏
@TheGuts09
@TheGuts09 1 year ago
Neat and clean explanation. I was confused about the cost function and gradient descent, and now it's clear.
@policevenkatsuhas7982
@policevenkatsuhas7982 10 days ago
Same doubt, but now it's cleared.
@krishj8011
@krishj8011 3 days ago
Great Tutorial...
@24amit
@24amit 2 months ago
Hey Krish, apart from the in-depth knowledge & great way of presenting, you have very good handwriting. This is just awesome!
@priyanshusingh8904
@priyanshusingh8904 6 months ago
Please sir, continue this series, elaborating on more fields in AI like computer vision, fuzzy logic, expert systems, robotics, machine learning, neural networks/deep learning, and natural language processing.
@sivachaitanya6330
@sivachaitanya6330 1 year ago
Hi Krish, nowadays in data science interviews, questions about deployment and model performance monitoring are also being asked, so can you make a video on performance monitoring as well? It would be very helpful. Thank you.
@rubayetalam8759
@rubayetalam8759 11 months ago
PLEASE UPLOAD MORE VIDEOS ASAP!
@hawkeye02292
@hawkeye02292 3 months ago
wow sir thanks a lot
@deepanshuvishwakarma316
@deepanshuvishwakarma316 2 months ago
Amazing video, going through it a third time. By the way, what is the name of the software used for this drawing/writing?
@praveensevenhills
@praveensevenhills 1 year ago
I was just thinking about this and it appeared. Thank you!
@VishalSingh-jc7ep
@VishalSingh-jc7ep 3 months ago
Sir, does this video cover "intrinsically linear regression"? The term is not used in the video. Please confirm if it is the same thing.
@sridineshr6598
@sridineshr6598 1 year ago
Can you please make a video comparing time series vs regression techniques for demand prediction, and the limitations of each?
@ShubhamKumar-tj5jw
@ShubhamKumar-tj5jw 1 day ago
Amazing
@mehuldarak8927
@mehuldarak8927 3 months ago
Quick question: 32:30 Isn't gradient descent used for finding local minima?
@naruto5437
@naruto5437 1 month ago
loved it
@parthavi9702
@parthavi9702 28 days ago
Thankuuuu sirr
@michealdas198
@michealdas198 5 months ago
I really enjoyed the video, but I would really like to have the notebook you created during the explanation. Could you please provide the notebook file?
@anmol9943
@anmol9943 1 month ago
How would the machine know whether the slope is negative or positive? How are we coding for it?
@kesavarapuvenkataramana9880
@kesavarapuvenkataramana9880 11 months ago
When we have both theta 0 and theta 1 for the convergence algorithm, why are J values considered for both 0 and 1? Exactly at the 50:32 mark in your video.
@geekyprogrammer4831
@geekyprogrammer4831 1 year ago
Krish on 🔥🔥🔥
@sanchaythalnerkar9736
@sanchaythalnerkar9736 1 year ago
I liked your Deep Learning series. Can you create a tutorial series for the TensorFlow developer exam?
@geekyprogrammer4831
@geekyprogrammer4831 1 year ago
TensorFlow is useless. PyTorch is more widely used nowadays.
@anmol9943
@anmol9943 1 month ago
How do we calculate and find out whether the slope is positive or negative? The machine needs to calculate it, right? As humans we can look at the graph and say, but how does the machine do it?
@IshanTyagi-fo9ej
@IshanTyagi-fo9ej 1 month ago
What is the difference between theta_j and the slope?
@sandeepgurram8
@sandeepgurram8 2 months ago
Hi, in the MSE formula why did you write 1/2m instead of 1/m? Could you please clarify? (Normally the mean is the sum of the inputs divided by the number of inputs; my question is why the extra 1/2 is multiplied in.)
@kshitijnishant4968
@kshitijnishant4968 1 month ago
Mostly it is a mathematical convenience: the squared errors are averaged as usual, and the extra 1/2 is kept so that the 2 appearing when you differentiate the squared term cancels out, leaving a cleaner gradient. Since multiplying the cost by a constant does not change where its minimum is, the fitted parameters are the same either way.
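For reference, writing out the cost (assuming the standard squared-error form used in the video) and its derivative shows why the 1/2 is only a convenience:

J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)^2,
\qquad
\frac{\partial J}{\partial \theta_1} = \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)\, x^{(i)}

The 2 produced by differentiating the square cancels the 1/2, so the gradient comes out cleaner, and scaling the cost by a constant does not move the location of the minimum, so the fitted line is unchanged.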
@arslanahmadbhatti6267
@arslanahmadbhatti6267 3 months ago
lovely
@MuhammadKhan-ok3hf
@MuhammadKhan-ok3hf 10 months ago
Krish, you are the Asian Andrew Ng. Thanks!
@AL_Rheem1
@AL_Rheem1 18 days ago
Hi, is deep learning covered in this paid series?
@yashwanthyashwanth8080
@yashwanthyashwanth8080 6 months ago
sir, we need to remember the derivation
@_arpanchandra
@_arpanchandra 1 month ago
Sir, I think there may be a mistake at 23:41: for h_theta(x^i) you wrote the term as (2-2)^2, but I think it should be (4-4)^2. Please tell me whether I'm right or wrong. And big fan, sir!
@saikrishna-pj4db
@saikrishna-pj4db 1 month ago
I think it is (2-2)^2 for the cost term of the 2nd data point. Sir took h_theta(x^i) with theta0 (intercept) = 0, theta1 (slope) = 1 and independent variable x = 2; plugging into the cost equation, h_theta(x^i) is the predicted y and y^i is the actual output, so the term is (2-2)^2. That's my understanding.
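A tiny check of that arithmetic (assuming, for illustration only, the toy dataset x = y = [1, 2, 3] with the intercept theta0 fixed at 0; the exact numbers in the video may differ):

x = [1, 2, 3]
y = [1, 2, 3]
m = len(x)

def cost(theta1):
    # J(theta1) = 1/(2m) * sum((theta1*x_i - y_i)^2), with theta0 = 0
    return sum((theta1 * xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

print(cost(1.0))  # 0.0 -- every term is of the (2 - 2)^2 kind, prediction equals label
print(cost(0.5))  # nonzero -- the line no longer passes through the points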
@kaleemullah247
@kaleemullah247 7 months ago
When your YouTube tutorial guy says "nothing but XYZ", you know he is Indian. 😅
@feel4ever769
@feel4ever769 11 months ago
14:22 I don't know how perfect the code is, but I tried a bit:

import matplotlib.pyplot as plt
import numpy as np

# Toy data lying exactly on the line y = x
x = [1, 2, 3, 4, 5, 6]
y = [1, 2, 3, 4, 5, 6]
m = len(x)

# Plot the data itself
graph = plt.figure()
fig1 = graph.add_axes([0, 0, 1, 1])
fig1.plot(x, y, marker='s', color='b', label="Data (y = x)")
fig1.set_xlabel("X", fontsize=14)
fig1.set_ylabel("Y", fontsize=14)
fig1.legend()

# Cost J(theta1) = 1/(2m) * sum((theta1*x_i - y_i)^2), intercept fixed at 0,
# evaluated for candidate slopes 0..5
slopes = [0, 1, 2, 3, 4, 5]
j = []
for k in slopes:
    y_pred = np.array([k * xi for xi in x])
    j.append(np.sum(np.square(y_pred - np.array(y))) / (2 * m))
print(j)

# Cost curve on its own figure: the minimum is at slope = 1
plt.figure()
plt.plot(slopes, j, marker='o', color='r')
plt.xlabel("Slope", fontsize=16)
plt.ylabel("J(theta1)", fontsize=16)

# Data and cost curve together on one figure
graph2 = plt.figure()
fig4 = graph2.add_axes([0, 0, 2, 2])
fig3 = graph2.add_axes([0.1, 0.9, 1, 1])
fig3.plot(x, y, color='g', marker='o', label='Data')
fig4.plot(slopes, j, color='r', marker='o', label='Cost curve')
fig3.set_xlabel("x")
fig3.set_ylabel("y")
fig4.set_xlabel("Slope")
fig4.set_ylabel("J")
fig3.text(3.3, 3.5, "Original", rotation=38, color='b')
fig3.legend()
fig4.legend()
plt.show()
@jatingawri
@jatingawri 5 months ago
Seriously great quality content, man. I am from a commerce background, but you still explained it so well. I used to back off from anything involving all these science terms, derivatives and all, but not anymore after watching this playlist. Just want to say kudos to your effort, man. Seriously, keep it up. 🫡❤❤