Deep Learning Lecture 6: Optimization
58:19
Deep Learning Lecture 1: Introduction
52:16
Machine learning - Deep learning I
1:15:05
Machine learning - Neural networks
1:04:24
Machine learning - Logistic regression
1:13:47
Machine learning - Random forests
1:16:55
Machine learning - Decision trees
1:06:06
Machine learning - Gaussian processes
1:17:28
Comments
@user-nr3ej2ud5j 2 months ago
Isn't the formula on the right side at 22:19 for x1|x2, not for x2|x1?
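The conditional-Gaussian formula under discussion is easy to sanity-check numerically. This is a minimal numpy sketch with made-up numbers (not the lecture's example): for a bivariate Gaussian over (x1, x2), the distribution of x1 given x2 = a is again Gaussian.

```python
import numpy as np

# Conditional of a bivariate Gaussian: with mean (mu1, mu2) and covariance
# [[s11, s12], [s21, s22]], the distribution of x1 | x2 = a is Gaussian with
#   mean = mu1 + s12 / s22 * (a - mu2)
#   var  = s11 - s12**2 / s22
mu = np.array([0.0, 1.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
a = 2.0  # observed value of x2

cond_mean = mu[0] + Sigma[0, 1] / Sigma[1, 1] * (a - mu[1])
cond_var = Sigma[0, 0] - Sigma[0, 1] ** 2 / Sigma[1, 1]
print(cond_mean, cond_var)  # 0.8 1.36
```

Swapping the roles of the two indices gives the conditional for x2 | x1, which is exactly the distinction the comment is asking about.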
@Sheriff_Schlong 3 months ago
At 1:02:40 I knew this teacher was a legend. 11 years later and still able to gain much valuable knowledge from these lectures!
@crestz1 4 months ago
Beautifully linked the idea of maximising likelihood by illustrating the 'green line' @ 51:41.
@crestz1 4 months ago
Amazing lecturer
@forughghadamyari8281 4 months ago
Hi. Thanks for the wonderful videos. Please recommend a book to study for this course.
@ratfuk9340 4 months ago
Thank you for this
@bottomupengineering 5 months ago
Great explanation and pace. Very legit.
@terrynichols-noaafederal9537 5 months ago
For the noisy GP case, we assume the noise is sigma^2 times the identity matrix, which assumes iid noise. What if the noise is correlated? Can we incorporate the true covariance matrix?
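In the standard GP posterior equations, nothing forces the noise term to be sigma^2 * I; any symmetric positive-definite noise covariance can take its place. A minimal numpy sketch with made-up data and a made-up noise matrix (an illustration under those assumptions, not the lecture's code):

```python
import numpy as np

def rbf(A, B, ell=1.0):
    # Squared-exponential kernel between 1-D input arrays A and B.
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Made-up training data and a single test input.
X = np.array([-1.0, 0.0, 1.0])
y = np.array([0.5, 1.0, 0.2])
Xs = np.array([0.5])

# Correlated noise: any symmetric positive-definite matrix works here,
# not just sigma^2 * I. This one is invented for illustration.
Sigma_noise = np.array([[0.10, 0.05, 0.00],
                        [0.05, 0.10, 0.05],
                        [0.00, 0.05, 0.10]])

K = rbf(X, X)
Ks = rbf(Xs, X)
# Posterior mean: K(X*, X) (K(X, X) + Sigma_noise)^{-1} y
mean = Ks @ np.linalg.solve(K + Sigma_noise, y)
print(mean)
```

The only change relative to the iid case is which matrix gets added to K(X, X) before the solve.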
@m0tivati0n71 6 months ago
Still great in 2023
@huuducdo143 6 months ago
Hello Nando, thank you for your excellent course. Following the bell example, the mu_12 and sigma_12 you wrote should be for the case where we are given X2 = x2 and try to find the distribution of X1 given X2 = x2. Am I correct? Other interpretations are welcome. Thanks a lot!
@newbie8051 7 months ago
It amazes me that people were discussing these topics when I was studying the water cycle lol.
@ScieLab 8 months ago
Hi Nando, is it possible to access the code that you mentioned in the lecture?
@S25plus 8 months ago
Thanks Prof. Freitas, this is extremely helpful.
@TheDeatheater3 9 months ago
super good
@marcyaudrey6608 10 months ago
This lecture is amazing, Professor. From the bottom of my heart, I say thank you.
@concoursmaths8270 10 months ago
Professor Nando, thank you a lot!!
@bodwiser100 11 months ago
One thing that remained confusing for me for a long time, and which I don't think he clarified in the video, was that the N in the summation from i = 1 to i = N does not refer to the number of data points in our dataset but to the number of times we run the Monte Carlo simulation.
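To make that distinction concrete, here is a minimal numpy sketch of a Monte Carlo estimate in which N counts simulation draws rather than dataset points (the target expectation is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # number of Monte Carlo samples (the N in the summation),
             # not the size of any dataset

# Estimate E[x^2] for x ~ N(0, 1); the exact value is 1.
samples = rng.standard_normal(N)
mc_mean = (samples ** 2).mean()
print(mc_mean)
```

Increasing N shrinks the estimator's variance at the usual 1/sqrt(N) rate, independently of how much data any model was trained on.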
@truongdang8790 1 year ago
Amazing example!
@guliyevshahriyar 1 year ago
Thank you very much.
@bingtingwu8620 1 year ago
Thanks!!! Easy to understand👍👍👍
@subtlethingsinlife 1 year ago
He is a hidden gem. I have gone through a lot of his videos; they are great in terms of removing jargon and bringing clarity.
@fuat7775 1 year ago
This is absolutely the best explanation of the Gaussian!
@nikolamarkovic9906 1 year ago
49:40 str 46
@el-ostada5849 1 year ago
Thank you for everything you have given to us.
@charlescoult 1 year ago
This was an excellent lecture. Thank you.
@cryptogoth 1 year ago
Great lecture, abrupt ending. I believe this is the short (but dense) book on decision forests mentioned by Criminisi: www.microsoft.com/en-us/research/wp-content/uploads/2016/02/CriminisiForests_FoundTrends_2011.pdf
@chenqu773 1 year ago
It looks like the axis notation in the graph on the right side of the presentation, at around 20:39, is not correct. It should probably be x1 on the x-axis; i.e., it would make sense if μ12 referred to the mean of variable x1 rather than x2, judging from the equation shown on the next slide.
@kianbehdad 1 year ago
You can only "die" once. That is how I remember die is singular :D
@hohinng8644 1 year ago
The use of notation at 23:00 is confusing for me.
@rikki146 2 years ago
Learning advanced ML concepts for free! What a time to be alive. Thanks a lot for the vid!
@marouanbelhaj7881 2 years ago
To this day, I keep coming back to your videos to refresh ML concepts. Your courses are a masterpiece!
@emmanuelonyekaezeoba6346 2 years ago
Very elaborate and simple presentation. Thank you.
@Gouda_travels 2 years ago
This is when it got really interesting (22:02): typically, I'm given points and I am trying to learn the mu's and the sigma's.
@MrStudent1978 2 years ago
1:12:24 What is mu(x)? Is that different from mu?
@ahmed_mohammed_1 2 years ago
I wish I had discovered your courses a bit earlier.
@adamtran5747 2 years ago
Love the content. <3
@michaelcao9483 2 years ago
Thank you! Really great explanation!!!
@augustasheimbirkeland4496 2 years ago
5 minutes in and it's already better than all 3 hours in class earlier today!
@truptimohanty9386 2 years ago
This is the best video for understanding Bayesian optimization. It would be a great help if you could post a video on multi-objective Bayesian optimization, specifically on expected hypervolume improvement. Thank you.
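For the single-objective building block behind such methods, here is a minimal sketch of closed-form expected improvement under a Gaussian posterior (for maximization; the inputs are made-up numbers, not from the lecture). Expected hypervolume improvement generalizes this idea to multiple objectives.

```python
import math

def expected_improvement(mu, sigma, f_best):
    # Closed-form EI for maximization under a Gaussian posterior N(mu, sigma^2):
    #   EI = (mu - f_best) * Phi(z) + sigma * phi(z),  z = (mu - f_best) / sigma
    # where Phi/phi are the standard normal CDF/PDF.
    if sigma == 0.0:
        return max(mu - f_best, 0.0)
    z = (mu - f_best) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (mu - f_best) * Phi + sigma * phi

# Hypothetical posterior at a candidate point, with current best value 1.0.
ei = expected_improvement(mu=1.2, sigma=0.5, f_best=1.0)
print(ei)
```

EI is always non-negative, and it rewards both a high posterior mean and high posterior uncertainty, which is what makes it a useful acquisition function.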
@htetnaing007 2 years ago
Don't stop sharing this knowledge, for it is vital to the progress of humankind!
@jeffreycliff922 2 years ago
Access to the source code to do this would be useful.
@gottlobfreige1075 2 years ago
So, basically, it's partial derivatives?
@gottlobfreige1075 2 years ago
I don't understand; it's basically a lot of derivatives within the layers, correct?
@jx4864 2 years ago
After 30 minutes, I am sure that he is one of the top 10 teachers of my life.
@cicik57 2 years ago
The best way to explain the gamma function is that it is a continuous factorial. You should point out that the P(theta) you write is a probability density function here.
@jhn-nt 2 years ago
Great lecture!
@gottlobfreige1075 2 years ago
How do you understand the math part in depth? Anyone? Help me!
@xinking2644 2 years ago
Is there a mistake at 21:58? Should it be conditioned on x1 instead of x2?
@FabulusIdiomas 2 years ago
People are scared because your explanation sucks. You should do a better job as a teacher.
@austenscruggs8726 2 years ago
This is an amazing video! Clear and digestible.