10 Tree Models and Ensembles: Decision Trees, Boosting, Bagging, Gradient Boosting (MLVU2018)

11,483 views

MLVU

A day ago

slides: mlvu.github.io/lecture10/
2019 version: • 10 Tree Models and Ens...
In this lecture we (finally) show how decision trees are trained. We also discuss ensembling: a technique that can be used with any machine learning model, but is often combined with tree learning. Lecturer: Peter Bloem. See the PDF for image credits.

Comments: 8
@sreejithmunthikodu 5 years ago
Thanks for such a great video. The concepts come across very clearly.
@mariav1234 3 years ago
Well done! Thanks for this great video!
@BidyutPatranitrkl 6 years ago
How does AdaBoost use weights to pick samples from one iteration to the next? I understand that it emphasizes wrongly classified examples. Say there are 10 examples, and after the first iteration the weights of examples 1, 2 and 3 are 0.331 each, while the rest are 0.01. Which samples would be picked, and in what systematic way? Please explain.
@riskone1 6 years ago
As I understand it, this depends on the model you're using. Some models, like linear classifiers and decision trees, can train directly on a weighted dataset: they simply let the first three examples weigh much more heavily in training than the rest. If your classifier can't do this, you can resample a new dataset with replacement: each instance is sampled according to its weight and then put back. You keep doing this until the new dataset is the same size as the original, and then you retrain on it. In your example, the new dataset would have 10 instances, probably with roughly equal amounts of examples 1, 2 and 3, and a small probability of one of the other examples occurring.
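The weighted resampling described in this reply can be sketched in a few lines of Python (a toy illustration; the weights, seed, and dataset size are made up for the example):

```python
import random

# Toy dataset: 10 example indices with AdaBoost-style weights.
# Examples 0, 1, 2 were misclassified, so they carry most of the weight.
weights = [0.331, 0.331, 0.331, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01]
examples = list(range(10))

random.seed(0)  # fixed seed just so the sketch is reproducible

# Sample a new dataset of the same size, with replacement, where each
# example is drawn with probability proportional to its weight.
resampled = random.choices(examples, weights=weights, k=len(examples))
print(resampled)  # dominated by examples 0, 1, 2
```

`random.choices` normalizes the weights internally, so they don't need to sum to 1; the retrained classifier then sees the heavily weighted examples many times over.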
@maheshbhosale1838 4 years ago
Generally speaking, don't we have separate classification trees, which are a subtype of general decision trees?
@SarahFL 5 years ago
Hello, thank you so much for the video, it really helped me understand what boosting, bagging and gradient boosting mean 😃. There's something I really want to ask that's related to my final project in college: what do you think about using XGBoost for text classification? Thanks in advance 😊
@riskone1 5 years ago
Hi Sarah. There's certainly nothing wrong with that choice. XGBoost is known for working pretty well out of the box. I would recommend comparing with a few simpler algorithms, like naive Bayes and logistic regression, to see if the complexity of XGBoost is warranted. To be honest, the choice of classifier will probably matter a lot less than _which features_ you choose to extract to convert your text to feature vectors. If you want to try something more involved, you could also go for a model that is specifically designed to consume text (and doesn't require feature extraction). See lecture 11 for some examples.
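A minimal sketch of the baseline comparison suggested in this reply (assuming scikit-learn is available; the tiny corpus and labels below are invented for illustration): extract TF-IDF features once, then try the simpler classifiers before reaching for XGBoost.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB

# Invented toy corpus for a binary sentiment task (1 = positive).
texts = ["great movie, loved it", "terrible film, boring",
         "wonderful acting", "awful plot, fell asleep",
         "loved the soundtrack", "boring and terrible pacing"]
labels = [1, 0, 1, 0, 1, 0]

# Feature extraction: this step often matters more than the classifier.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(texts)

# Simple baselines; an XGBoost model could be dropped in the same way
# (e.g. xgboost.XGBClassifier) if its extra complexity proves warranted.
for clf in (MultinomialNB(), LogisticRegression()):
    clf.fit(X, labels)
    pred = clf.predict(vectorizer.transform(["loved it", "boring film"]))
    print(type(clf).__name__, pred)
```

The same TF-IDF matrix feeds every model, so any accuracy gap on held-out data isolates the effect of the classifier choice.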
@SarahFL 5 years ago
Thank you so much @riskone1 👍