PyTorch Tutorial 03 - Gradient Calculation With Autograd

  164,338 views

Patrick Loeber

New Tutorial series about Deep Learning with PyTorch!
⭐ Check out Tabnine, the FREE AI-powered code completion tool I use to help me code faster: www.tabnine.com/?... *
In this part we learn how to calculate gradients using the autograd package in PyTorch.
This tutorial covers the following topics (a short runnable sketch follows the list):
- requires_grad attribute for Tensors
- Computational graph
- Backpropagation (brief explanation)
- How to stop autograd from tracking history
- How to zero (empty) gradients
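To make the list above concrete, here is a minimal sketch (not taken from the video; the tensor values are arbitrary) that touches each topic using the standard autograd API:

import torch

# requires_grad=True tells autograd to record operations on x
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# each operation adds a node to the computational graph (sets grad_fn)
y = x + 2
z = (y * y * 2).mean()       # scalar output

# backpropagation: fills x.grad with dz/dx
z.backward()
print(x.grad)                # tensor([4.0000, 5.3333, 6.6667])

# stop autograd from tracking history
with torch.no_grad():
    y_eval = x + 2           # no grad_fn is recorded here

# gradients accumulate across backward() calls, so zero them between steps
x.grad.zero_()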
Part 03: Gradient Calculation With Autograd
📚 Get my FREE NumPy Handbook:
www.python-engineer.com/numpy...
📓 Notebooks available on Patreon:
/ patrickloeber
⭐ Join Our Discord: / discord
If you enjoyed this video, please subscribe to the channel!
Official website:
pytorch.org/
Part 01:
• PyTorch Tutorial 01 - ...
You can find me here:
Website: www.python-engineer.com
Twitter: / patloeber
GitHub: github.com/patrickloeber
#Python #DeepLearning #Pytorch
----------------------------------------------------------------------------------------------------------
* This is a sponsored link. By clicking on it you will not incur any additional costs; instead, you will support me and my project. Thank you so much for the support! 🙏

Comments: 95
@simonbernard4216 • 3 years ago
I appreciate that you followed the official tutorial from the PyTorch documentation. Also your comments make things much clearer
@patloeber • 3 years ago
Thanks! Glad you like it
@GurpreetSingh-si2gh • 1 year ago
I appreciate your efforts; you explained every detail clearly. Thanks very much. 🙏
@DanielWeikert • 4 years ago
This is great work. I really like the detailed explanation using simple practical examples. Please continue exploring the various options in PyTorch. Happy holidays
@patloeber • 4 years ago
Thank you! Glad you like it. I have more videos planned for the next few days :) Happy holidays!
@kevinw6237 • 1 year ago
This video is very clear in explaining things. Thank you so much sir! Keep up the good work pls!
@user-tz4ms1ki3l • 2 years ago
This tutorial is more than great! Thank you!
@barbaraulitsky8292 • 7 months ago
This video is super helpful! Explanations are very understandable. Thank you so much!!🤩👍🙏
@marcoantonioarmentaarmenta1350 • 4 years ago
This video is extremely useful. Thank you!
@shazinhotech2155 • 1 year ago
what an amazing teacher you are
@haroldsu1696 • 1 year ago
Great work, Sir, and thank you!
@wiktormigaszewski8684 • 4 years ago
very nice, congrats!
@henrygory • 1 year ago
Excellent video. Thanks!
@nishaqiao828 • 3 years ago
Thank you! Your video helped a lot with my undergraduate final project!
@patloeber • 3 years ago
that's nice to hear :)
@warrior_1309 • 2 years ago
Great Work Sir.
@exxzxxe • 4 years ago
Very well done. An excellent video!
@patloeber • 4 years ago
Thanks!
@foodsscenes5891 • 3 years ago
Really love these courses! Many thanks
@patloeber • 3 years ago
Thank you :)
@gordonxie7173 • 3 years ago
Great work!
@alexandredamiao1365 • 3 years ago
Great tutorial! Thanks for this material!
@patloeber • 3 years ago
thanks :)
@andywang4189 • 4 months ago
Very clear, thank you very much
@mihaidumitrescu1325 • 2 months ago
Fantastic work! A slight recommendation for improving it: if you use the same naming scheme in the Jacobian and in the code (l vs. z), we can follow the chain rule more easily!
@user-vm3jn7ih8j • 3 years ago
This tutorial is brilliant. It is super friendly to people who are new to PyTorch!
@patloeber • 3 years ago
glad you like it!
@maithilijoshi796 • 1 year ago
Thank you for your efforts
@dfdiasbr • 3 months ago
awesome! Thank you
@samirelamrany5323 • 1 year ago
Great work, thank you
@charithas90 • 3 years ago
very nice tutorial. thank you!
@patloeber • 3 years ago
thanks :)
@sanjaykrish8719 • 3 years ago
All in one PyTorch.. yeahhh.. fantastic.. thanks a ton 🎉🎊🎊
@patloeber • 3 years ago
glad you like it!
@soumyajitsarkar2372 • 4 years ago
Really amazing and very well explained! Thank you. By the way, what IDE are you using? It looks so cool and handy; I love the output option below.
@TheaterOfDreamss • 4 years ago
Visual Studio Code
@patloeber • 4 years ago
Exactly :)
@yishu4244 • 3 years ago
This video helps me greatly. I like your speaking pace, since English is not my mother tongue. Thank you a lot.
@patloeber • 3 years ago
Nice, glad you like it
@DiegoAndresAlvarezMarin • 2 years ago
Thank you very much for your tutorial. However, I did not fully understand some details, like the variable v or the use of the optimizer.
@Murmur1131 • 2 years ago
@ 8:16 Do you have a simple real-life "example" of why we have to use v for the right size, and why it wouldn't work without it? I know it's a silly question, but I don't really grasp the concept behind it.
@bhomiktakhar8226 • 2 years ago
Just onto my 4th lecture in the PyTorch series. I don't know if it's a complete series on PyTorch, but whatever is there is definitely presented nicely.
@patloeber • 2 years ago
nice, yes it's a complete beginner series
@Muhammad_Abdullah_Sultan • 1 month ago
Amazing
@anurajms • 10 months ago
thank you
@xkaline • 3 years ago
Great PyTorch Tutorial videos! May I know what is the VS Code extension you use to autocomplete the PyTorch line?
@patloeber • 3 years ago
In this video it was only the built-in autocompletion through the official Python extension
@jorgemercadog • 3 years ago
Thank you!!
@patloeber • 3 years ago
thanks for watching
@milk_steak_the_original • 2 years ago
When do you learn the why of things? Why am I making a gradient? When would I use it? I feel like these things are often explained in DS videos/classes.
@alejandrosaenz4551 • 4 years ago
I just discovered your channel and it's really good! One question: I don't totally understand why we should use .detach() or no_grad() when updating weights... are we creating a new graph or something like that? What does "preventing gradient tracking" exactly mean? Hope you can help me with that. Keep up the good job (:
@patloeber • 4 years ago
You should use this, for example, after training when you evaluate your model. Otherwise, for each operation you do with your tensor, PyTorch will calculate the gradients. This is time-consuming and expensive, and we don't want it after training because we no longer need backpropagation. Disabling it reduces memory usage and speeds up computations.
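A minimal sketch of this point (a toy tensor, not the video's exact code):

import torch

w = torch.randn(3, requires_grad=True)

# during evaluation, wrap computations so no graph is built
with torch.no_grad():
    pred = w * 2                     # pred.requires_grad is False

# .detach() returns a tensor on the same data, cut off from the graph
w_frozen = w.detach()
print(pred.requires_grad, w_frozen.requires_grad)   # False False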
@popamaji • 4 years ago
An interesting thing I realized is that even though z = z.mean() changes the grad_fn of z from MulBackward to MeanBackward (so z doesn't have MulBackward anymore), it is still able to track the gradient.
@patloeber • 4 years ago
If requires_grad=True, any operation that we do with z tracks the gradient.
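A small sketch of that behavior (toy values, following the x/z naming from the thread):

import torch

x = torch.ones(3, requires_grad=True)
z = x * 2
print(z.grad_fn)    # <MulBackward0 ...>

z = z.mean()        # rebinds z; the new node keeps the old one as its input
print(z.grad_fn)    # <MeanBackward0 ...>

z.backward()        # still flows through the whole graph back to x
print(x.grad)       # tensor([0.6667, 0.6667, 0.6667])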
@mahdiamrollahi8456 • 1 year ago
If I want to play with it: suppose we have y = 3x, so we should get the answer 3 for the gradient. How can I do that in PyTorch?
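For anyone wondering the same thing, a minimal sketch (scalar example; the value of x is arbitrary):

import torch

x = torch.tensor(2.0, requires_grad=True)
y = 3 * x
y.backward()     # dy/dx = 3, independent of the value of x
print(x.grad)    # tensor(3.)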
@dataaholic • 3 years ago
@ 8:00 What is the difference between z.backward() and z.backward(v)? z.backward() calculates dz/dx; is z.backward(v) calculating dz/dv?
@alanzhu7538 • 2 years ago
Also, how do we decide the value for v?
@gautame • 2 years ago
z.backward(v) calculates dz/dx*v
@peixinwu4631 • 2 years ago
@@alanzhu7538 You got any idea now?😂
@peixinwu4631 • 2 years ago
@@gautame You mean v * (dz/dx)? Or dz/(dx * v)?
@faseehahmed6164 • 2 years ago
@@peixinwu4631 v * (dz/dx)
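A minimal sketch of that vector-Jacobian product (toy values; v is chosen arbitrarily here):

import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
z = x * x                        # non-scalar output, Jacobian J = diag(2x)

v = torch.tensor([0.1, 1.0, 0.001])
z.backward(v)                    # computes J^T @ v, not a plain gradient
print(x.grad)                    # tensor([0.2000, 4.0000, 0.0060])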
@danielpietschmann5294 • 3 years ago
Why can you just specify v as [0.1, 1.0, 0.001]? Why not some other numbers?
@crivatz • 1 year ago
Did you find out?
@minister1005 • 11 months ago
It doesn't matter. You just need to have the same shape
@taku8751 • 4 years ago
Really good tutorial, but I want to know which app you use for drawing.
@patloeber • 4 years ago
I used an iPad for this (some random notes app)
@consistentthoughts826 • 3 years ago
@5:46 In the image it is J^T · v
@cypherecon5989 • 1 month ago
I don't get why you need the 3 methods presented at 8:06 for preventing gradient tracking. Can one not simply put requires_grad=False in the x tensor?
@13579Josiah • 2 years ago
At 9:30, isn't another way to do x.volatile = False?
@vanamaliis7185 • 1 year ago
Getting this error, RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
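That error typically appears when backward() runs twice over the same graph. A minimal sketch of the failure mode and the fix the message suggests (toy values):

import torch

x = torch.tensor(1.0, requires_grad=True)
y = x * x

y.backward(retain_graph=True)   # keep the graph alive for a second pass
y.backward()                    # without retain_graph=True above, this raises
print(x.grad)                   # tensor(4.) -- 2 + 2, gradients accumulated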
@iposipos9342 • 4 years ago
Thanks for the video. For the tensor v, are the values of any importance, or is it only the size that matters?
@patloeber • 4 years ago
Hi, good question! Yes, this can be a bit confusing, so here are some links from the PyTorch forum which might help: discuss.pytorch.org/t/how-to-use-torch-autograd-backward-when-variables-are-non-scalar/4191/4 discuss.pytorch.org/t/clarification-using-backward-on-non-scalars/1059
@sebimoe • 4 years ago
@@patloeber I can't find anyone describing how to use it there; are you able to give a quick summary? Help would be much appreciated, as I can't find anything, including on the values 0.1, 1, 0.001
@jesuispac • 4 years ago
@@sebimoe Apparently he doesn't know either....Typical
@priyanka5516 • 3 years ago
In my understanding, only the size will matter. So you can write v = torch.randn(3), then pass v to the backward function.
@dataaholic • 3 years ago
Did anyone find a good explanation for v? I'm also a bit confused here. @ 8:00 What is the difference between z.backward() and z.backward(v)? z.backward() calculates dz/dx; is z.backward(v) calculating dz/dv?
@yuyangxi3359 • 3 years ago
pretty good
@patloeber • 3 years ago
Thanks!
@my_opiniondemocracy6584 • 1 year ago
How did you decide the values in vector v?
@luanmartins8068 • 1 month ago
The gradient at 5:46 is the multiplication of the Jacobian matrix and a directional vector. When the function whose gradient you want to calculate has a one-dimensional output, there is no need to determine a direction for the derivative, since it is unique; that's why the mean's gradient accepts no arguments. On the other hand, for a multidimensional output such as y = x + 2 (y is a multidimensional output, a vector), you have to specify what direction you want to take your gradient in. That is where the v vector enters. He arbitrarily chose the components just to show that the function requires a vector to define the directional derivative, but usually for statistical learning the direction in which the gradient is steepest is chosen.
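In symbols, and matching the J^T · v slide referenced above, what backward(v) stores in x.grad is the vector-Jacobian product:

\[
J = \frac{\partial z}{\partial x} =
\begin{pmatrix}
\frac{\partial z_1}{\partial x_1} & \cdots & \frac{\partial z_1}{\partial x_n} \\
\vdots & \ddots & \vdots \\
\frac{\partial z_m}{\partial x_1} & \cdots & \frac{\partial z_m}{\partial x_n}
\end{pmatrix},
\qquad
\texttt{x.grad} = J^{T} v,
\]

which reduces to the ordinary gradient when z is scalar and v = 1.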
@andrei642 • 2 years ago
I had a German professor who also pronounced "value" as "ualue". Can you explain why you sometimes pronounce it that way? I am very intrigued. P.S. You have the best PyTorch series on YouTube
@oleholgerson3416 • 4 years ago
That’s good, ja? :)
@user-ss7bm2id5c • 3 years ago
I can't understand why the gradient must accumulate (gradient += gradient) in each epoch. Where can I find a consistent mathematical formula? Can you explain it?
@patloeber • 3 years ago
I'm not exactly sure what you mean. Can you point me to the time in the video where I show this?
@user-ph3qi5to4r • 7 months ago
This is so f**king useful, thank you sooo much!!
@saurabhkumar-wj1nz • 3 years ago
Hey, could you explain what "tracking the gradient" means? I mean, why is it an issue?
@patloeber • 3 years ago
Tracking each operation is necessary for backpropagation. But it is expensive, so after training we should disable it.
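A small sketch of the three ways the video presents to disable tracking (toy tensors, standard API):

import torch

x = torch.randn(3, requires_grad=True)

# 1) flip the flag in place
x.requires_grad_(False)

# 2) get a copy that is detached from the graph
x2 = torch.randn(3, requires_grad=True)
y = x2.detach()

# 3) run a whole block without tracking
with torch.no_grad():
    z = x2 + 2

print(x.requires_grad, y.requires_grad, z.requires_grad)   # False False False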
@chq7759 • 3 years ago
When I run the following code, I encounter an error. Could you help me? Thank you very much!

weights = torch.ones(4, requires_grad=True)
optimizer = torch.optim.SGD(weights, lr=0.01)

The error is:

Traceback (most recent call last):
  File "D:/1pytorch-tutorial/my_try/learning-PythonEngineer/learning.py", line 113, in <module>
    optimizer = torch.optim.SGD(weights, lr=0.01)
  File "D:\Anaconda3\lib\site-packages\torch\optim\sgd.py", line 68, in __init__
    super(SGD, self).__init__(params, defaults)
  File "D:\Anaconda3\lib\site-packages\torch\optim\optimizer.py", line 39, in __init__
    torch.typename(params))
TypeError: params argument given to the optimizer should be an iterable of Tensors or dicts, but got torch.FloatTensor
@LeonardoRocha0 • 1 year ago
torch.optim.SGD([weights], lr=0.01)
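Expanding that fix into a minimal sketch of a full update step (the loss is a toy expression chosen only for illustration):

import torch

weights = torch.ones(4, requires_grad=True)
optimizer = torch.optim.SGD([weights], lr=0.01)   # params must be an iterable

for epoch in range(3):
    loss = (weights * 2).sum()    # toy "loss"
    loss.backward()               # fills weights.grad
    optimizer.step()              # weights -= lr * weights.grad
    optimizer.zero_grad()         # prevent gradients from accumulating
print(weights)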
@my_opiniondemocracy6584 • 1 year ago
in the end...it became heavy
@filosofiadetalhista • 1 year ago
I'm not sure how you managed to be unclear on the third video of the series. What you said about gradients, .backward(), .step(), and .zero_grad() was not clear at all.
@mittalmayankcool • 1 year ago
Too many advertisements
@suecheng3755 • 1 year ago
Great work!