Proving a Representer Theorem
10:50
Examples of kernel functions
14:54
Riesz representation theorem
32:12
Hilbert Projection Theorem
47:59
4 months ago
Abstract Hilbert space
23:32
5 months ago
An example of the Hilbert space
24:24
Laurent series vs. Fourier series
7:09
Fourier series (a brief review)
12:52
Positive-definite matrices
11:56
6 months ago
Positive semi-definite matrices
15:22
Symmetric matrices
18:19
6 months ago
Orthogonal Projection
21:18
6 months ago
Inner product in Linear Algebra
16:29
Comments
@tykilee9683 8 days ago
King real King!
@Wingwing-by5me 14 days ago
Sir, please keep this playlist of videos on YT forever. They are very helpful! Thank you!
@BruneiMathClub 13 days ago
Will do (as long as YT lasts)!
@paulsanyang4739 25 days ago
Thank you.
@BruneiMathClub 23 days ago
You're welcome!
@narimanmammadli7169 1 month ago
Perplexity forwarded me here :) Thank you for the proof.
@BruneiMathClub 1 month ago
Welcome!
@angelinausim9863 1 month ago
Geometric processes have discrete time and continuous space (01:35).
@BruneiMathClub 1 month ago
Right! Thanks for pointing it out. Cf. Wikipedia: en.wikipedia.org/wiki/Geometric_process
@ickywitchy4667 2 months ago
Best video I could find for this topic!
@BruneiMathClub 2 months ago
Thanks!
@JaafarJoulakMuhammad 3 months ago
How can I study the uniform convergence of the series of functions ∑(x/(x²+1))^k, where x ranges over ℝ?
@BruneiMathClub 3 months ago
I'm not sure what you mean exactly. Why don't you make a video and let me know when you figure it out?
@JaafarJoulakMuhammad 3 months ago
What I meant is that I want a way to study the uniform convergence of the above series of functions.
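Not part of the original thread, but one standard way to settle the question: |x/(x²+1)| ≤ 1/2 for all real x (since x² + 1 ≥ 2|x| by AM–GM), so the series is dominated by the geometric series ∑(1/2)^k and converges uniformly on ℝ by the Weierstrass M-test. A quick numerical sanity check of both claims (a sketch, not taken from the video):

```python
import numpy as np

# g(x) = x / (x^2 + 1) satisfies |g(x)| <= 1/2 on all of R,
# with the maximum 1/2 attained at x = 1 (AM-GM: x^2 + 1 >= 2|x|).
x = np.linspace(-50.0, 50.0, 200_001)
g = x / (x**2 + 1)
sup_g = np.max(np.abs(g))
print(sup_g)  # 0.5

# Weierstrass M-test: |g(x)^k| <= (1/2)^k, so a tail of the series
# sum_k g(x)^k is uniformly small; check the sup of one tail numerically.
K = 30
tail = sum(g**k for k in range(K, K + 50))
print(np.max(np.abs(tail)))  # bounded by (1/2)^K / (1 - 1/2) = 2^(1-K)
```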
@tjerkharkema7378 3 months ago
Thanks a lot for your excellent explanation, Dr. Akira. Maybe a factor of 2 was lost in the denominator of the exponent; the result should be ρ_z(z) = 1⧸√(2π(σ_x²+σ_y²)) exp[−(z−(μ_x+μ_y))²/(2(σ_x²+σ_y²))]. TJ
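The corrected formula can be checked numerically; a minimal sketch (the parameter values μ_x, σ_x, μ_y, σ_y below are arbitrary illustrations, not from the video):

```python
import numpy as np

# Density of Z = X + Y for independent normals, compared against the
# closed form with the factor 2 in the denominator of the exponent.
mx, sx = 0.5, 1.0
my, sy = -1.0, 2.0

x = np.linspace(-30.0, 30.0, 20_001)
dx = x[1] - x[0]

def gauss(t, mu, sigma):
    return np.exp(-(t - mu) ** 2 / (2.0 * sigma**2)) / np.sqrt(2.0 * np.pi * sigma**2)

# Numerical convolution of the two pdfs; the symmetric grid with an odd
# number of points keeps mode="same" centered correctly.
pz_num = np.convolve(gauss(x, mx, sx), gauss(x, my, sy), mode="same") * dx

# rho_z(z) = N(z; mu_x + mu_y, sigma_x^2 + sigma_y^2)
pz_formula = gauss(x, mx + my, np.sqrt(sx**2 + sy**2))

print(np.max(np.abs(pz_num - pz_formula)))  # tiny
```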
@keanub.1693 4 months ago
King
@lebl3278 4 months ago
Very good thanks
@BruneiMathClub 4 months ago
Most welcome!
@junma3575 4 months ago
Shouldn't P(X) be P(Xi)·P(Xj) in the variance term? Could still using P(X) be a mistake?
@BruneiMathClub 4 months ago
It is P(X) = P(X1, X2, ..., Xn) (joint probability density), not P(Xi)*P(Xj). Note Xi and Xj may not be independent.
@junma3575 4 months ago
@@BruneiMathClub Thank you so much. I finally get it.
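A quick Monte Carlo illustration of the reply (the correlation value and sample size are arbitrary): when components are dependent, the covariance must be computed under the joint density P(X), and a product of marginals would wrongly make the cross term vanish.

```python
import numpy as np

rng = np.random.default_rng(0)

# A correlated pair (X1, X2): not independent, so P(X) != P(X1) * P(X2)
mean = [0.0, 0.0]
cov = [[1.0, 0.8],
       [0.8, 1.0]]
samples = rng.multivariate_normal(mean, cov, size=200_000)

# Covariance computed under the JOINT distribution P(X1, X2); means are zero
cov_joint = np.mean(samples[:, 0] * samples[:, 1])

# Treating X1, X2 as independent would replace E[X1 X2] by E[X1] E[X2]
cov_indep = np.mean(samples[:, 0]) * np.mean(samples[:, 1])

print(cov_joint, cov_indep)  # roughly 0.8 vs roughly 0.0
```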
@linfengdu7636 4 months ago
Why is there a 1/2 multiplied into the covariance constraint? Shouldn't the number of degrees of freedom of the covariance matrix be D(D+1)/2?
@BruneiMathClub 4 months ago
That 1/2 in the covariance constraint is not essential. It's there mostly for an aesthetic reason (it looks nicer after differentiation). You get the same result without the 1/2 factor (try it!), as it can be absorbed in the Lagrange multipliers (γ's).
@linfengdu7636 4 months ago
@@BruneiMathClub Yes indeed. Thank you for your reply and fantastic videos! I’ve been working on the exercise of the Pattern Recognition and Machine Learning book and your videos helped a lot!
@linfengdu7636 4 months ago
@@BruneiMathClub BTW you can also evaluate the stationary point in full matrix form using the trace operator for the quadratic term, which I find is pretty neat.
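A sketch of the absorption argument the reply alludes to (my notation, not necessarily the video's):

```latex
% Lagrangian with the 1/2 factor in the covariance constraint
\mathcal{L}[p] = -\int p \ln p \, dx
  + \lambda \Big( \int p \, dx - 1 \Big)
  + \sum_{i,j} \gamma_{ij} \Big( \tfrac{1}{2} \int p \,(x_i-\mu_i)(x_j-\mu_j)\, dx
    - \tfrac{1}{2}\Sigma_{ij} \Big)

% Stationarity in p gives a Gaussian-shaped density
\frac{\delta \mathcal{L}}{\delta p} = 0
  \quad \Longrightarrow \quad
  p(x) = \exp\!\Big( \lambda - 1
    + \tfrac{1}{2} \sum_{i,j} \gamma_{ij} (x_i-\mu_i)(x_j-\mu_j) \Big)

% Without the 1/2, the same family arises with rescaled multipliers
\tilde{\gamma}_{ij} = \tfrac{1}{2}\gamma_{ij}
```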
@lefteriseleftheriades7381 4 months ago
And what is it used for?
@BruneiMathClub 4 months ago
For example, the regression problem can be cast as finding a projection onto a subspace "generated" by a dataset. Future videos will explain such applications.
@raul1827 5 months ago
Can you please tell me what the reference for this demonstration is?
@BruneiMathClub 5 months ago
It's in quite a few textbooks, for example in "Elements of Information Theory" by Cover and Thomas. See also the Wikipedia page: en.wikipedia.org/wiki/Jensen%27s_inequality
@raul1827 5 months ago
@@BruneiMathClub Thanks a lot.
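For readers who want a hands-on check of the inequality under discussion, here is a minimal sketch of Jensen's inequality f(E[X]) ≤ E[f(X)] for a convex f, on a small discrete distribution (values chosen arbitrarily for illustration):

```python
import numpy as np

xs = np.array([-1.0, 0.0, 2.0])   # support of X
ps = np.array([0.2, 0.5, 0.3])    # probabilities (sum to 1)

f = np.exp                        # a convex function
lhs = f(np.dot(ps, xs))           # f(E[X])
rhs = np.dot(ps, f(xs))           # E[f(X)]
print(lhs, rhs)                   # lhs <= rhs, by Jensen's inequality
```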
@cvtncavidan8351 5 months ago
You are great, man! Thanks, God bless you.
@BruneiMathClub 5 months ago
You're welcome! God bless you, too.
@fynnzentner3964 5 months ago
Great video! I was looking for a video about uniform convergence of the Fourier Series and your video really helped. Thanks.
@BruneiMathClub 5 months ago
Glad it was helpful!
@LETHERL1VE 5 months ago
Thank you for the accessible explanation!
@BruneiMathClub 5 months ago
Glad it was helpful!
@ethanbottomley-mason8447 5 months ago
It's nice to see someone doing proper math in a short. By proper math, I just mean something beyond basic calc/multivariable calc.
@BruneiMathClub 5 months ago
Thanks. Shorts can be helpful sometimes.
@detranquoc2608 5 months ago
Nice, thanks a lot for sharing.
@minabasil 5 months ago
very clear ♥ new fan😍
@BruneiMathClub 5 months ago
Thanks and welcome!
@XKALOS7 5 months ago
I Love it
@omargaber3122 6 months ago
@hyperduality2838 6 months ago
The eigen basis is dual to the standard basis -- conjugacy is dual, spectral decomposition. The integers are self dual as they are their own conjugates. "Always two there are" -- Yoda. Real is dual to imaginary -- complex numbers are dual. Antipodal points identify for the rotation group SO(3) -- stereographic projection.
@hyperduality2838 6 months ago
Projections imply two dual perspectives. Increasing or creating new dimensions or states is an entropic process -- Gram-Schmidt procedure. Decreasing or destroying dimensions or states is a syntropic process. Divergence (entropy) is dual to convergence (syntropy) -- increasing is dual to decreasing. "Always two there are" -- Yoda.
@hyperduality2838 6 months ago
Perpendicularity, orthogonality = Duality! "Perpendicularity in hyperbolic geometry is measured in terms of duality" -- universal hyperbolic geometry. Orthonormality is dual. "Always two there are" -- Yoda. Vectors are dual to co-vectors (forms) -- vectors are dual. Space is dual to time -- Einstein.
@hyperduality2838 6 months ago
Sine is dual to cosine or dual sine -- the word co means mutual and implies duality. "Always two there are" -- Yoda.
@BruneiMathClub 6 months ago
For this and the coming videos, I use typeset notes instead of handwritten notes for presentation. I'd appreciate it if you let me know which one you prefer.
@NaN_000 6 months ago
Thank you
@BruneiMathClub 6 months ago
You're welcome.
@NaN_000 6 months ago
f'(x) is the signum function, right?
@BruneiMathClub 6 months ago
Almost, but not exactly: they differ at x = 0. For f(x) = |x|, the two-sided derivative f'(0) does not exist (the one-sided derivatives are +1 and −1), whereas signum(0) = 0.
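A quick numerical illustration of the point in the reply, assuming f(x) = |x|:

```python
import numpy as np

# One-sided difference quotients of f(x) = |x| at x = 0 disagree,
# so the (two-sided) derivative does not exist there.
f = abs
h = 1e-8
right = (f(0.0 + h) - f(0.0)) / h   # -> +1
left = (f(0.0) - f(0.0 - h)) / h    # -> -1
print(right, left)

# Away from 0, f'(x) agrees with the signum function; at 0, sign(0) = 0.
print(np.sign(0.0))  # 0.0
```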
@omargaber3122 6 months ago
Great thanks
@BruneiMathClub 6 months ago
You are welcome.
@juthisarker3789 6 months ago
Hello sir.. I'm sorry Bangladesh.
@johannesaaen248 6 months ago
I swear I was so confused about the definition of the ball N_epsilon, and about how it is used to determine whether a set is open or closed, before your video. For some reason our course material lacks any visualisation, so your video really, really helped me out :)
@KillerNoam 6 months ago
I prefer orthogonality.
@theblinkingbrownie4654 7 months ago
HUGE!
@BruneiMathClub 7 months ago
Is it?
@user-hr8uj4qw4k 7 months ago
One issue: by the same argument, the density function of the joint r.v. seems to split into a product of the marginal pdfs of the components, implying that the components are automatically independent, which is obviously wrong. The same question is posted on MSE under the title "Conceptual misunderstanding of the multivariate normal distribution" (for some reason a comment containing the original link gets taken down). May I ask what might be the cause of this confusion?
@BruneiMathClub 7 months ago
Do you mean that although x_i and x_j are not independent, after some transformation, why z_i and z_j seem to be independent? That's because, after the orthogonal transformation, z_i and z_j ARE independent. This is one of the peculiar properties of the multivariate normal distribution: We can always transform it into a joint distribution of independent variables by some orthogonal transformation. By the way, regarding the question on MSE (Math StackExchange), I think that's a different issue, and it has a mistake (the Borel set S should also be transformed along with the variables).
@user-hr8uj4qw4k 7 months ago
@@BruneiMathClub Thank you for taking the time to respond; I do see where the derivation went wrong now. But may I still ask how to derive the marginal distribution of each component X_i without resorting to the moment generating function (as most sources do to get the distribution of the linear transformation AX + b)? The initial attempt was to simplify the joint distribution by a sequence of changes of variables and, hopefully, to evaluate P(X_i in S_i) = P(X in R x ... x S_i x ... x R); but since this works only when the components are independent, it doesn't seem to apply in the general case.
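The orthogonal-transformation property mentioned in the reply can be checked empirically; a minimal sketch (the covariance matrix below is an arbitrary example, not from the video):

```python
import numpy as np

# An orthogonal change of variables from the spectral decomposition of
# Sigma makes the components uncorrelated (hence, for a multivariate
# normal, independent).
rng = np.random.default_rng(1)
Sigma = np.array([[2.0, 1.2],
                  [1.2, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], Sigma, size=100_000)

w, Q = np.linalg.eigh(Sigma)   # Sigma = Q diag(w) Q^T, Q orthogonal
Z = X @ Q                      # z = Q^T x, applied row-wise

C = np.cov(Z, rowvar=False)
print(np.round(C, 2))          # approximately diag(w)
```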
@jongxina3595 7 months ago
Very neat derivation!
@BruneiMathClub 7 months ago
Thanks!
@theblinkingbrownie4654 7 months ago
Ty fam!
@user-hr8uj4qw4k 7 months ago
It would be nice to include the derivation of some of the basic properties of the multivariate normal distribution: for example, that the expectation of each component r.v. is the corresponding component of the mean vector, that the entries of the positive-definite real symmetric matrix Sigma actually give the covariances, the equivalence between uncorrelatedness and independence, etc.
@theblinkingbrownie4654 7 months ago
How would i derive the multivariate normal distribution anyways?
@BruneiMathClub 7 months ago
That's an interesting question! There are many ways to answer this, but it's basically the same as the univariate case. I will make a video about this in the near future. Stay tuned.
@theblinkingbrownie4654 7 months ago
@@BruneiMathClub Sure! I figured they would be similar, I just wanted to know where covariance comes into play
@nick45be 7 months ago
About mean-square (L²) convergence: why, if there are some points x at which the difference fn − f does not converge to zero, can the integral still converge to zero? Maybe because those particular points don't give any contribution? But why don't they give any contribution?
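One standard illustration (not from the video): a single point has Lebesgue measure zero, so it contributes nothing to the integral, and a sequence can fail to converge at such a point while still converging in L². A minimal numerical sketch with f_n(x) = x^n on [0, 1]:

```python
import numpy as np

# f_n(x) = x^n on [0, 1]: f_n(1) = 1 for every n, so f_n does not tend to
# 0 at the single point x = 1, yet ||f_n||_2^2 = 1/(2n+1) -> 0, i.e.
# f_n -> 0 in L^2. The bad point has measure zero.
x = np.linspace(0.0, 1.0, 100_001)
dx = x[1] - x[0]
for n in (1, 10, 100):
    fn = x**n
    l2_sq = np.sum(fn**2) * dx   # Riemann sum approximating 1/(2n+1)
    print(n, fn[-1], l2_sq)
```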
@cmdcs1 7 months ago
Looking forward to following this series 😊
@BruneiMathClub 7 months ago
I hope this helps.
@dhruvbisht7844 8 months ago
The way of teaching is a bit casual.
@javierweeb4428 8 months ago
Very nice demonstration of the problem 🙏🙏🙏
@BruneiMathClub 8 months ago
Many many thanks!
@YechenHan 8 months ago
Teacher, our Zihan says she won't study stochastic processes anymore.
@BruneiMathClub 8 months ago
Well, good luck!
@thorblessing4015 9 months ago
Thank you for the help! Best video I have found about this problem!
@BruneiMathClub 8 months ago
Glad it helped!
@josephdays07 9 months ago
Excellent video. I have demonstrated and solved it in a different way: kzfaq.info/get/bejne/eL5hqcqf2bKXmn0.htmlsi=dRkM3NxYzbb43zTB. When I saw this video, I remembered my equation. You can solve this with the methodology I have developed, using the identity x ≈ 2sin(x/2) for small angles: (1−cos(x))/x = 2sin(x/2)·sin(x/2)/(2sin(x/2)) = sin(x/2); applying the limit, it equals 0. You can also use it to integrate: x/tan(x) = 4sin(x/2) − 2Ln|sec(x/2)+tan(x/2)| + b.
@jakeaustria5445 9 months ago
I want to use Gaussian kernel pdf estimation, so I need a multivariate version of it for my use case. I'm just not familiar with vectors, haha, so it's hard.
@BruneiMathClub 9 months ago
Yes, the multivariate version looks scary at first. But it becomes easier if you carefully compare and recognize the similarities with and differences from the univariate version. After all, it's the most natural and straightforward generalization.
@jakeaustria5445 9 months ago
@@BruneiMathClub Is there a great visualisation for the 4D Gaussian, haha? I know it's very hard, almost impossible; probably just assign the 4th dimension to color and it will work. Just wondering: is there a simpler kernel than the Gaussian that produces similar results? I know the bandwidth matters a lot in this case, but the paper I read on this topic did not elaborate on the alternative kernels we can use. There are square kernels, triangular kernels, etc. The paper also said that the shape of the kernel doesn't matter, since the pdf will still be approximated with enough data; the exception is the bandwidth, as different bandwidths result in different limiting functions. How do we compute the bandwidth for the triangular kernel, square kernel, and others?
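The alternative-kernel question above can be played with directly; a numpy-only sketch (not from the video or the paper mentioned; the data, grid, and bandwidth values h are hypothetical choices): both kernels below are densities, so both estimates integrate to 1, and the kernel shape changes the estimate only mildly while h controls the smoothing.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(0.0, 1.0, 1000)

def gaussian_kernel(u):
    return np.exp(-u**2 / 2.0) / np.sqrt(2.0 * np.pi)

def triangular_kernel(u):
    return np.clip(1.0 - np.abs(u), 0.0, None)   # support [-1, 1]

def kde(x, data, h, kernel):
    # Kernel density estimate: average of scaled kernels centered at the data
    u = (x[:, None] - data[None, :]) / h
    return kernel(u).mean(axis=1) / h

x = np.linspace(-4.0, 4.0, 201)
dx = x[1] - x[0]
est_g = kde(x, data, h=0.4, kernel=gaussian_kernel)
est_t = kde(x, data, h=0.8, kernel=triangular_kernel)

# Both estimates put (almost) unit mass on the grid
print(np.sum(est_g) * dx, np.sum(est_t) * dx)
```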
@jakeaustria5445 9 months ago
Yey