Machine learning - Random forests

237,704 views

Nando de Freitas

11 years ago

Random forests, aka decision forests, and ensemble methods.
Slides available at: www.cs.ubc.ca/~nando/540-2013/...
Course taught in 2013 at UBC by Nando de Freitas

Comments: 78
@lwoltersyt 7 years ago
I simply love all of Nando de Freitas's explanatory videos: clear, effective, and straight to the point.
@siddarthjay3787 8 years ago
Excellent video. Professors like him are the reason learning is fun. They throw exciting ideas at you as if they were nothing.
@meganmaloney192 8 years ago
Incredibly helpful! Thank you for posting.
@comadano 11 years ago
Thanks for making these great lectures publicly available. I hope the remaining lecture videos get posted soon!
@ApiolJoe 7 years ago
The resource that helped me the most in the least amount of time. Thanks for sharing.
@LuisFelipeZeni 7 years ago
Excellent professor, thanks for sharing your knowledge with us.
@GoBlue7171 7 years ago
Wow, this is amazing. Such a clear and informative lecture.
@alefranc100 10 years ago
Thanks Nando ... this lecture is really good and useful!
@antonosipov100 9 years ago
Very good introduction to random forests. Thank you!
@kellyli1920 9 years ago
This is great, so clear and easy to understand! Thank you so much!
@junfu8695 8 years ago
+Kelly Li It is.
@jiansenxmu 9 years ago
A perfect and amazing lecture on random forests, thank you very much!
@EmreOzanAlkan 9 years ago
Thank you! It's really nice and easy to understand!
@ravimadhavan1984 8 years ago
Great lecture!
@BangsterDK 8 years ago
Amazing. Thank you so much!
@dinofranceschelli3969 7 years ago
Excellent! Really easy to understand, thanks for sharing!
@rodrigo100kk 4 years ago
This course is amazing!
@aadilraf 6 years ago
Amazing lecture!
@drbhojrajghimire3908 8 years ago
Very good classroom video: simple, interesting, and clearly explained.
@BoredFOMOape 11 years ago
Thanks for posting! Great lecture.
@agnerraphael 7 years ago
Thanks for publishing, this was a great help for my project.
@naiden100 7 years ago
Thanks for a great lecture!
@aidaelkouri7050 7 years ago
Hi! Loved the video. I was wondering how the problem would work if you chose more than 2 features to look at. Would you plot them in 3D then? Thank you.
@d0msch 7 years ago
Thanks for publishing, this helped me a lot :)
@fatemehsaki9020 11 years ago
Really great lecture and teacher! Thanks so much.
@helenlundeberg 8 years ago
Around 1:08:25, the professor talks about Bayesian optimization using a GP prior. What I'm not clear on is: what is the prior on?
@AbhijeetSachdev 9 years ago
Brilliant lecture :)
@PravinMaske1 8 years ago
Excellent video, very helpful and easily understood.
@ronthomas8331 8 years ago
Excellent lecture!
@PrakashMatthew 6 years ago
Thank you, @Nando de Freitas. Is the second part of the lecture, the one to be given by the grad student, available online?
@jihoonkim6819 7 years ago
Thank you for the nice lecture.
@tonyperez8878 9 years ago
How do you compute the information gain? What is its relation to entropy and mutual information?
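For anyone with the same question, a minimal sketch (not from the lecture; the data below is made up): the information gain of a split is the entropy of the parent node minus the weighted average entropy of its children, which is exactly the mutual information between the split indicator and the class label.
```python
import numpy as np

def entropy(labels):
    """Shannon entropy H(Y) of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(labels, left_mask):
    """Info gain of a binary split = H(parent) - weighted entropy of the children.
    Equivalently, the mutual information between the split indicator and the labels."""
    n = len(labels)
    left, right = labels[left_mask], labels[~left_mask]
    children = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - children

# Toy example: 6 points, a split that sends the first three to the left child.
y = np.array([0, 0, 1, 1, 1, 0])
split = np.array([True, True, True, False, False, False])
print(information_gain(y, split))  # ~0.08 bits
```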
@BudiMulyo 7 years ago
Thank you! The posted video is all very clear!
@andreiherasimau7800 6 years ago
Excellent video!
@sameenatasneemshaikh8349 7 years ago
Thank you, sir, for an excellent lecture. Please share the other videos you mentioned, such as the Bayesian optimization lecture.
@satter87henne 5 years ago
I looked up the resource (Criminisi) and I wonder: is there an R package or a Python package where this specific algorithm is implemented? There are many R packages, and I know there is scikit-learn in Python. However, I want to make sure I use the more general model as outlined by Criminisi.
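For what it's worth (this is my own note, not something stated in the lecture): scikit-learn's RandomForestClassifier implements the standard Breiman-style classification forest rather than the fully general framework in the Criminisi et al. report, but it is probably the closest widely used Python implementation. A minimal sketch, assuming scikit-learn is installed:
```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

forest = RandomForestClassifier(
    n_estimators=100,     # number of trees in the ensemble
    max_features="sqrt",  # number of features sampled at each split
    max_depth=None,       # grow trees until the leaves are (nearly) pure
    random_state=0,
)
forest.fit(X_train, y_train)
print(forest.score(X_test, y_test))
print(forest.predict_proba(X_test[:3]))  # averaged per-tree class histograms
```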
@hyperzoanoid 11 years ago
Do random forests need classification trees with more than 1 node to be effective?
@rezayousufi4200 4 years ago
Super explanation!
@sheikhsaqib6323 9 years ago
Can anyone here please help me out? I'm trying to classify accelerometer and gyroscope values of a robot using random forests. The classification labels are walking and under perturbation by different forces. From what I understand, this is data that evolves over time as the robot walks. My question is: can I use the technique described in the video to do the classification, or do I need to do it a different way? (If so, please point me to the right material where I can read about how to classify such data.)
@kaleeswaranm2679 7 years ago
Very good lecture, but I would suggest watching at 1.5x speed.
@aaronbrinker2613 6 years ago
Nando for President
@jennifermew8386 7 years ago
Thank you
@shubhamjha1 8 years ago
The next lecture (the one about Kinect and related applications) seems pretty interesting. Could I have a link to that?
@zenzafine4500 5 years ago
Did you get the link?
@zenzafine4500 5 years ago
I think it is this one: kzfaq.info/get/bejne/l76hfKaXrZq-nHU.html
@reneveloso 9 years ago
Thank you! Great lecture! You have a very Brazilian name... :-)
@tpinto9 6 years ago
Portuguese?
@abhijeet24patil 11 years ago
In step 2 of the random forest algorithm, "until the minimum node size n_min is reached": I didn't understand this line. When should we stop selecting m variables at random from the p variables?
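As I read it, "until the minimum node size n_min is reached" refers to the recursion: you keep splitting nodes (sampling m of the p variables each time you split) and stop splitting a node, turning it into a leaf, once it holds n_min samples or fewer. A rough sketch of that stopping rule follows; note that the split itself is simplified here to a median split on one sampled feature, whereas the real algorithm picks the feature and threshold that maximize the information gain:
```python
import numpy as np

def grow_tree(X, y, m, n_min, rng):
    # Stop splitting once the node holds n_min samples or fewer (or is pure):
    # this node becomes a leaf storing its class histogram.
    if len(y) <= n_min or len(np.unique(y)) == 1:
        return {"leaf": True, "counts": np.bincount(y, minlength=2)}
    # Sample m of the p variables at random for this split.
    feats = rng.choice(X.shape[1], size=m, replace=False)
    # Simplification: split on the first sampled feature at its median.
    f, t = feats[0], np.median(X[:, feats[0]])
    mask = X[:, f] <= t
    if mask.all() or not mask.any():  # degenerate split: just make a leaf
        return {"leaf": True, "counts": np.bincount(y, minlength=2)}
    return {"leaf": False, "feature": f, "threshold": t,
            "left":  grow_tree(X[mask],  y[mask],  m, n_min, rng),
            "right": grow_tree(X[~mask], y[~mask], m, n_min, rng)}

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
tree = grow_tree(X, y, m=2, n_min=5, rng=rng)
```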
@tadinglesby8997 7 years ago
Can one have a greater tree depth in their model (say D=10) with only 8 features? Or does it have to be a maximum depth of 8, corresponding to the 8 features?
@ecemilgun9867 7 years ago
You can actually use a feature in more than one split. It sometimes causes overfitting but is sometimes needed: for instance, both deficiency and excess of glucose in the blood indicate a disease.
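To illustrate the reply above with a small sketch (assuming scikit-learn; the data is synthetic): a tree of depth 3 grown on a single feature, where that one feature ends up being split on at several nodes, so the depth is not capped by the number of features.
```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(300, 1))
# Class 1 only for mid-range values: separating it needs at least two
# thresholds on the same (and only) feature.
y = ((x[:, 0] > 3) & (x[:, 0] < 7)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(x, y)
print(export_text(tree, feature_names=["x0"]))  # "x0" appears at several splits
```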
@user-qt5mw5xd2e 7 years ago
Great!
@fathiafaraj479 6 years ago
Hello sir, I am a graduate student in computer science. My research is on object detection using random forests and local binary patterns. I hope you can help with RF code; I am using MATLAB.
@cfelixcfelix6568 9 years ago
I did not get the part with the 3 trees and the histograms. Doesn't each tree give a definite answer?
@AbhijeetSachdev 9 years ago
+Cfelix Cfelix Of course it will, but in the case of classification you just take the sign of the average value, which is exactly the same as a majority vote. In the case of regression, "averaging" actually comes into play.
@nicooteiza 8 years ago
+Cfelix Cfelix Actually, not necessarily. In many real-life cases the data is not separable on the features, so even a very big tree would not be able to give definite answers. Moreover, each individual tree could be "pruned" to a certain depth, so the answers would not be definite classes but "probabilities" for each class. For example, in a 3-class problem, each tree would give a [p1, p2, p3] result, where p1 + p2 + p3 = 1, and the final prediction can be obtained by adding all the trees' vectors element-wise and dividing by the number of trees.
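A tiny sketch of that last step, with made-up per-tree histograms for one test point in a 3-class problem:
```python
import numpy as np

tree_outputs = np.array([
    [0.70, 0.20, 0.10],  # tree 1's leaf histogram for the test point
    [0.50, 0.40, 0.10],  # tree 2
    [0.60, 0.10, 0.30],  # tree 3
])
forest_probs = tree_outputs.mean(axis=0)  # element-wise sum / number of trees
print(forest_probs)           # [0.6, 0.233..., 0.166...]
print(forest_probs.argmax())  # final prediction: class 0
```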
@alkodjdjd 8 years ago
I sent you an email and did not get a reply. Thank you.
@fangliren 11 years ago
I'm not sure I understand your question; with only one node (which would have to be the "parent" node, by definition), the tree wouldn't have any "child" nodes into which to sort the data based on the criterion in the parent node. The tree wouldn't do anything at all... Perhaps you are asking a different question?
@R1CH4RDGZA 5 years ago
Loquacious; now that's a good obscure word just off the cuff.
@KrishnaDN 4 years ago
My dream is to work with a professor like him.
@leicaandrei 8 years ago
Why is the bias very low at 40:20?
@shervin0 8 years ago
I'm not completely sure, but it could be because of the way the trees are created: they try to maximize information gain with simple decisions, and those simple, information-gain-driven splits cannot produce very biased divisions. (As I said, I'm not sure if this is the reason! :) )
@deepakk1944 6 years ago
What is bagging?
@sourceschaudes9587 5 years ago
boosting
@theamazingjonad9716 5 years ago
Bagging is an ensemble technique whereby you build k models from k random subsets of the original data. The subsets are drawn using bootstrap resampling (sampling with replacement).
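A minimal sketch of bagging along those lines, with decision trees as the base models (scikit-learn assumed, synthetic data, majority vote at prediction time):
```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_fit(X, y, k, rng):
    """Fit k trees, each on a bootstrap resample (with replacement) of the data."""
    n = len(y)
    return [DecisionTreeClassifier().fit(X[idx], y[idx])
            for idx in (rng.integers(0, n, size=n) for _ in range(k))]

def bagging_predict(models, X):
    """Majority vote over the k models' predictions."""
    votes = np.stack([m.predict(X) for m in models])  # shape (k, n_test)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] * X[:, 1] > 0).astype(int)
models = bagging_fit(X, y, k=25, rng=rng)
print(bagging_predict(models, X[:5]))
```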
@yuzhou1 6 years ago
Great video; watch at 1.5x speed.
@melaxxl 9 years ago
walla shaza.
@donnasara70 8 years ago
Thank you for speaking without an accent.
@kikokimo2 6 years ago
Play at 2x speed!
@muneebaadil1898 8 years ago
Although I appreciate the effort to share this knowledge freely, I'm sorry, but it's a VERY slow lecture, which makes for a boring class.
@prathameshaware7433 7 years ago
Yeah, it was slow, but anyway there is an option in KZfaq to increase the playback speed, so you can use it next time :)