
Numerical Differentiation with Finite Difference Derivatives

41,978 views

Steve Brunton

1 day ago

Approximating derivatives numerically is an important task in many areas of science and engineering, especially for simulating differential equations. In this video, I introduce several approaches to approximate derivatives using finite difference schemes. The error of each method is explored with Taylor series.
Playlist: • Engineering Math: Diff...
Course Website: faculty.washington.edu/sbrunto...
@eigensteve on Twitter
eigensteve.com
databookuw.com
This video was produced at the University of Washington
%%% CHAPTERS %%%
0:00 Numerical differentiation and finite difference
6:58 Understanding error with Taylor series
11:04 Forward difference derivative
15:39 Backward difference derivative
18:18 Central difference derivative
24:01 Matlab code example
32:53 Python code example
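
For reference, a minimal sketch of the three schemes in the chapter list above (this is not the video's own Matlab/Python code; the test function sin(t), the point t = 1, and the step sizes are arbitrary illustrative choices):

```python
import numpy as np

def forward_diff(f, t, dt):
    # Forward difference: O(dt) accurate
    return (f(t + dt) - f(t)) / dt

def backward_diff(f, t, dt):
    # Backward difference: O(dt) accurate
    return (f(t) - f(t - dt)) / dt

def central_diff(f, t, dt):
    # Central difference: O(dt^2) accurate
    return (f(t + dt) - f(t - dt)) / (2 * dt)

f, dfdt, t0 = np.sin, np.cos, 1.0   # test function with a known derivative

for dt in (0.1, 0.01, 0.001):
    errs = [abs(d(f, t0, dt) - dfdt(t0))
            for d in (forward_diff, backward_diff, central_diff)]
    print(f"dt={dt:g}  forward={errs[0]:.2e}  backward={errs[1]:.2e}  central={errs[2]:.2e}")
```

Shrinking Δt by a factor of 10 cuts the forward and backward errors by roughly 10 but the central error by roughly 100, matching the O(Δt) versus O(Δt^2) behavior derived with Taylor series in the lecture.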

Comments: 45
@jamesrav 1 year ago
This was a big part of my MA in Applied Math; considering that not many real-world equations have closed-form solutions, this is the only way to go, and it's a beautiful branch of Math and Computer Science.
@macmos1 1 year ago
Same here... you can get arbitrarily close to any analytical function
@kindoblue 1 year ago
I've watched so many videos from Prof. Brunton that I already hear “welcome back” in my head at the opening 😂
@alexandervassilev8602 1 year ago
A wonderful lecture, as usual. Thank you, Prof. Brunton!
@user-gj8ir4cd1n 9 days ago
Thank you Steven. You are a good teacher and it is teachers like yourself that motivate me to continue on my path of education.
@richardcasey4439 1 year ago
The best math instruction on the Internet
@gooblepls3985 1 year ago
Thank you so much Steve! You're an inspiration
@theludvigmaxis1 1 year ago
I saw Steven at APS DFD yesterday; it was great.
@Pedritox0953 1 year ago
Great video!
@ThePiMan0903 1 year ago
A very nice video sir! Thanks for this.
@AJ-et3vf 1 year ago
Great video. Thank you
@YC_Ch 3 days ago
Perfect explanation!! Thanks
@fIb6914 18 days ago
Thank you for your effort, I appreciate this video and you a lot!
@hoseinzahedifar1562 1 year ago
Thank you very much...❤❤❤.
@fabianaltendorfer11 9 months ago
Awesome explanation! I really enjoy your videos
@Eigensteve 8 months ago
Happy to hear it :) Thanks for watching!
@ScuffedF1 5 months ago
You are the GOAT
@lewisngeno4789 9 months ago
You made life easier for me in my graduate study. Thank you
@Eigensteve 8 months ago
Happy to help! Thanks for watching :)
@dzanc 1 year ago
I'd be looking forward to seeing a treatment of partial derivatives, i.e. stencils for PDEs. I never took the time to properly study this stuff, and now whenever some PDE blows up on me (especially with Python, it happens a lot even for trivial stuff like a slightly modified diffusion equation), I'm left wondering whether I'm misusing functions or the damned thing just doesn't work very well.
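Not from the video, but relevant to the comment above: blow-ups like this are usually a time-step stability issue rather than a misused function. Below is a minimal sketch of the standard explicit (FTCS) stencil for the 1D diffusion equation u_t = D u_xx, with purely illustrative parameters; explicit schemes of this type are only stable when D·Δt/Δx^2 ≤ 1/2.

```python
import numpy as np

D, L, nx = 1.0, 1.0, 101                  # illustrative diffusion coefficient, domain length, grid size
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D                      # respects the stability limit D*dt/dx**2 <= 0.5

x = np.linspace(0, L, nx)
u = np.exp(-100 * (x - 0.5)**2)           # initial Gaussian bump

for _ in range(500):
    # FTCS: central difference in space, forward (explicit Euler) step in time
    u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0                    # simple Dirichlet boundaries

print(u.max())   # stays bounded; raising dt above the stability limit makes it grow without bound
```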
@davidkos5326 1 year ago
Thank you so much
@Pakkids_in_china 1 year ago
Nice to hear this video. I have some basic questions related to extremum seeking control. Although I have watched many of your past videos and read many papers, small things still confuse me. How can I ask them?
@marc-andredesrosiers523 1 year ago
🙂 I'm hoping Functional Data Analysis will be covered. It's a very powerful numerical framework.
@sgjbslover 6 months ago
Thank you
@dr.neetugarg1770 9 months ago
Beautifully explained. 👍👍
@Eigensteve 8 months ago
Thanks!
@chensong254 1 year ago
Thanks for the video! At 3:53, f(x) seems to be a typo for f(t). At 22:17, O(∆t^5) seems to be a typo for O(∆t^4).
@rizkamilandgamilenio9806 11 months ago
Do you know why the error tends to the fourth power?
@klave8511 1 year ago
Would it be true that the central difference approximation is not just dt^2 vs dt in the forward and backward approximations, but also has the divide-by-3 advantage from the 3!? So to get a 100-times-smaller error you would only need a 10/1.7-times-smaller dt? (The 1.7 is sqrt(3).) Really looking forward to differentiating real data to see how you treat the noise from digitization and noisy real data. Thanks for the clear explanations!
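A quick numerical check of the constants raised in the comment above (a hedged sketch; sin evaluated at t = 1 with Δt = 1e-3 is just an illustrative test case): the leading forward-difference error behaves like (Δt/2)·|f''(t)|, while the central-difference error behaves like (Δt^2/6)·|f'''(t)|, so the 1/3! factor does appear on top of the extra power of Δt.

```python
import numpy as np

f, t0, dt = np.sin, 1.0, 1e-3

err_fwd = abs((f(t0 + dt) - f(t0)) / dt - np.cos(t0))
err_ctr = abs((f(t0 + dt) - f(t0 - dt)) / (2 * dt) - np.cos(t0))

print(err_fwd, dt / 2 * abs(np.sin(t0)))      # forward error vs. (dt/2)*|f''|,   f''  = -sin
print(err_ctr, dt**2 / 6 * abs(np.cos(t0)))   # central error vs. (dt^2/6)*|f'''|, f''' = -cos
```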
@ntokozoamandile3253 3 months ago
For the first code example, which IDE did you use, Prof?
@Jackthewolf 2 months ago
I think Prof. Brunton is missing a 2 in the denominator of the delta t squared term at 22:00 (not that it's relevant for the result anyway). Great lecture as always.
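For reference, the standard Taylor bookkeeping behind the central difference (a generic derivation, not a transcript of the board work at 22:00); the ∆t^2/2 terms are the ones with the 2 in the denominator, and they cancel in the subtraction:

f(t ± ∆t) = f(t) ± ∆t f'(t) + (∆t^2/2) f''(t) ± (∆t^3/6) f'''(t) + (∆t^4/24) f''''(t) + O(∆t^5)
f(t + ∆t) − f(t − ∆t) = 2∆t f'(t) + (∆t^3/3) f'''(t) + O(∆t^5)
[f(t + ∆t) − f(t − ∆t)] / (2∆t) = f'(t) + (∆t^2/6) f'''(t) + O(∆t^4)

So the centered estimate is accurate to O(∆t^2), and the remainder left over after dividing by 2∆t is O(∆t^4) rather than O(∆t^5).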
@enisten 1 year ago
Δt is the leading-order error term only if Δt < 1, right? That's the only way Δt is actually smaller than its higher powers. So the unit of t matters. If we use seconds instead of hours, the numerical value of Δt may actually be larger than 1, making the first-order term not necessarily smaller than the higher-order terms and, more specifically, dependent on the numerical values of their products with the derivative terms (rescaled by the factorials) in our chosen system of units. Correct?
@dankodnevic3222 7 months ago
Wouldn't it be better to fit a cubic spline around t and then extract the first and second derivatives from the coefficients?
@sgjbslover 6 months ago
Can you please talk about HIGH ORDER finite difference methods?
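In the meantime, one standard higher-order example (textbook coefficients, not taken from this lecture) is the fourth-order central difference for the first derivative, which uses two points on each side:

f'(t) ≈ [−f(t + 2∆t) + 8 f(t + ∆t) − 8 f(t − ∆t) + f(t − 2∆t)] / (12∆t),  with error O(∆t^4)

The coefficients come from combining the ±∆t and ±2∆t central differences so that the ∆t^2 error term cancels, leaving O(∆t^4).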
@MsE_0 9 months ago
Hi, I'm Kia. I would like to discuss something with you if possible. I have been studying a journal paper about high-order compact finite differences, but I have a problem with my method. Could you help me?
@josesaldivar655 8 days ago
Do you really write backwards, or do you use software to invert it?
@palapapa0201 1 year ago
How did you record these videos? Did you write on a piece of glass and then mirror the video?
@enisten 1 year ago
Probably, so we have the same perspective as him.
@NoamWhy 1 year ago
Call me lazy, but I just take a polynomial regression of N neighboring points, and the polynomial coefficients of this regression give me the first, second, third, etc. derivatives of the function at the neighborhood's origin. Done!
@johnsinclair1447 1 year ago
Wow, very clever. I didn't know fit coefficients were the derivatives. I'm going to go try that out now. Thanks for sharing -- I'll subscribe to your channel too.
@NoamWhy 1 year ago
@@johnsinclair1447 Just remember to multiply the Nth coefficient by N!, and you'll get the Nth derivative of the polynomial at its origin. This is a pretty straightforward result that you can prove by differentiating the polynomial N times and then setting x=0. Another way of deriving it comes from the Taylor expansion of a function.
@NoamWhy 1 year ago
@@johnsinclair1447 And let me know how it went.
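A minimal sketch of the local polynomial-regression idea described in the thread above (numpy.polyfit on a small window of hypothetical sample points; the grid, window width, and degree are arbitrary illustrative choices): fit in a coordinate centered at the point of interest, then multiply the k-th coefficient by k! as noted in the thread.

```python
import numpy as np
from math import factorial

t = np.linspace(0, 2, 41)       # hypothetical sample grid (spacing 0.05)
y = np.sin(t)                   # sampled data; a known function so the result can be checked

i, half = 20, 3                 # estimate derivatives at t[i] = 1.0 from 7 neighboring points
sl = slice(i - half, i + half + 1)

# Fit a cubic in a local coordinate centered at t[i]; np.polyfit returns coefficients
# from the highest power down, so reverse them to get c[k] for the x**k term.
c = np.polyfit(t[sl] - t[i], y[sl], 3)[::-1]

derivs = [c[k] * factorial(k) for k in range(4)]
print(derivs)                   # approximately [sin(1), cos(1), -sin(1), -cos(1)]
```

On noisy data the least-squares fit also acts as a smoother, which is one reason this approach is often preferred to raw finite differences of measured samples.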
@ash9788 1 year ago
Hey, you are right-handed here 😃
@stockwatch9479 1 year ago
Please professor, kindly upload a lot of Python-related videos, from basic to advanced. By Vasanth from Tamil Nadu, India.