Decision trees for classification. Slides available at: www.cs.ubc.ca/~nando/540-2013/... Course taught in 2013 at UBC by Nando de Freitas
Comments: 53
@Technoslerphile • 10 years ago
Excellent! This is how a teacher should teach.
@michaelturniansky7959 • 10 years ago
Thank you very much for this and the following session's lecture. I got my CS degree 25 years ago, and it's nice to learn about things like how to automatically decide which questions to ask first.
@nitinat3590 • 9 years ago
Superb lecture. Thank you very much for sharing it. I was struggling with the subject before watching this video, but now I am quite comfortable and I think I'll be able to manage using decision trees in my project. Thank you again :)
@prajwalshenoy9117 • 5 years ago
Tremendous explanation! This is what courses should focus on. Instead of just skimming the surface and then importing packages and implementing for the viewer's satisfaction, it is more fruitful to start from scratch, dig into the mathematics and intuition behind it, and appreciate the concept.
@GatoNordico • 10 years ago
Nice lecture! I came here for Decision Trees but I think I'll have a look at your other videos as well
@alhoqani2750 • 9 years ago
Great lecture. I have a question: is there any session on building a decision tree manually?
@chandrabhatt • 9 years ago
Great lecture. Crystal clear!
@kevinsluder3711 • 8 years ago
Excellent! Can subsequent levels in the tree use the same attribute for the decision at a node? For instance, in the 4-color, 2-dimension example, if the root-level split is based on x_i, can the next-level node use a rule based on x_i (obviously with a different split)?
@newbie8051 • 6 months ago
It amazes me that people were discussing these topics when I was still studying the water cycle lol.
@SahibzadaIrfanUllahNaqshbandi • 8 years ago
Thank you very much. It really helped, sir. And one thing I want to say: you have a sweet voice.
@rahulchandra759 • 7 years ago
Does anyone know where the data file is available, or do we just type it in from the slide the professor has?
@oreoluwa24 • 11 years ago
Your lectures are very clear; even as an undergrad I understood them. Thanks! I was wondering if you covered multivariate decision trees in any of your lectures.
@snehotoshbanerjee1938 • 9 years ago
Best lecture on decision trees. Which measure is the best: entropy or Gini?
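For reference, a minimal Python sketch (not from the lecture; the function names are mine) of the two impurity measures the question is about. In practice they usually choose very similar splits; entropy is measured in bits, while Gini is slightly cheaper to compute.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits; terms with p == 0 contribute nothing."""
    return sum(-p * log2(p) for p in probs if p > 0)

def gini(probs):
    """Gini impurity: 1 minus the sum of squared class probabilities."""
    return 1.0 - sum(p * p for p in probs)

# Both measures vanish for a pure node and peak at a 50/50 mix.
for probs in ([1.0, 0.0], [0.9, 0.1], [0.5, 0.5]):
    print(probs, "entropy:", round(entropy(probs), 3), "gini:", round(gini(probs), 3))
```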
@0S0L0 • 11 years ago
Hey Ore! Did you find any lecture on multivariate decision trees?
@ayushrastogi6089 • 5 years ago
Can you provide the link to the report by Antonio Criminisi that you refer to at 52:50?
@yuanyuan3056 • 7 years ago
The clearest ML course I've had.
@GoyalMrManish • 8 years ago
Nice explanation of decision trees :)
@TheHarperad • 10 years ago
"To understand what a forest is we first need to understand the tree" :D
@olegstolyar6127 • 10 years ago
Thank you.
@AoAo-mt4dl • 8 years ago
Thank you so much..!!!
@zxxNikoxxz • 8 years ago
I suppose this is how Akinator guesses who you are thinking of.
@thungp • 8 years ago
When I did the calculation for I(Patrons) at roughly 46:36 for the number of bits of information, I got 0.541, not the 0.0541 in his slide deck. Also, I had to learn from a different reference that when you have a log(0), which is normally undefined, they assume it is 0.
@ashimgupta9538 • 8 years ago
I think they don't assume log(0) to be zero, but rather 0*log(0) to be zero.
@ayushrastogi6089 • 5 years ago
Yes, it is 0*log(0), and also all log calculations are in base 2.
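To make the convention concrete, a tiny Python sketch (not from the lecture): each entropy term has the form -p*log2(p), and the whole product is taken to be 0 when p = 0.

```python
from math import log2

def plogp(p):
    """One entropy term, -p*log2(p), with the convention that it is 0 when p == 0."""
    return 0.0 if p == 0 else -p * log2(p)

# A pure branch (all one class) carries 0 bits; a 2-yes / 4-no branch carries ~0.918 bits.
print(plogp(0.0) + plogp(1.0))    # 0.0
print(plogp(2/6) + plogp(4/6))    # 0.918...
```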
@sonilshrivastava1428 • 9 years ago
Nice lecture. Thank you very much, sir. Can anybody share a link to the referenced 'Criminisi et al., 2011' paper?
@user-oq1pc8oc3t • 6 years ago
Excellent
@mercurichinc • 10 years ago
Great sharing, thank you.
@ZestyCrunchy • 11 years ago
Over 200kg? That's a whale! Awesome lecture by the way :)
@jobsamuel • 9 years ago
Could you help me with the calculations at 48:23? I haven't figured out why I(Patrons) is equal to 0.541 bits :(
@woowooNeedsFaith • 9 years ago
Jobsamuel Núñez: Remember to use base-2 logarithms. Most calculators use the natural logarithm by default.
@tobiaspahlberg1506 • 8 years ago
+Jobsamuel Núñez Only the last term within the brackets contributes, because 0*log2(0) is taken as 0 and 1*log2(1) = 0. The expression simplifies to 1 - [6/12 * (-2/6*log2(2/6) - 4/6*log2(4/6))] = 0.5409...
@zwep • 8 years ago
+Tobias Pahlberg Exactly, so that means there is still a typo in the lecture, right? Since he states 0.0541... Edit: whoops, never mind.
@tobiaspahlberg1506 • 8 years ago
zwep Yes, but I think someone in the audience pointed that out later
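Putting the thread together, a short Python sketch (not from the lecture itself; the branch counts assume the Patrons split from the restaurant example on the slides: 0 of 2 positive for None, 4 of 4 for Some, 2 of 6 for Full) that reproduces the 0.541 bits discussed above.

```python
from math import log2

def H(p):
    """Binary entropy in bits, using the 0*log2(0) = 0 convention."""
    return sum(0.0 if q == 0 else -q * log2(q) for q in (p, 1.0 - p))

# Branches of the Patrons split as (number of examples, number of positives).
branches = [(2, 0), (4, 4), (6, 2)]
n = sum(size for size, _ in branches)          # 12 examples in total
positives = sum(pos for _, pos in branches)    # 6 positive examples overall

# Information gain = entropy before the split minus the weighted entropy after it.
remainder = sum(size / n * H(pos / size) for size, pos in branches)
gain = H(positives / n) - remainder
print(round(gain, 4))   # 0.5409 bits, i.e. the 0.541 discussed above
```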
@ayushrastogi6089 • 5 years ago
Are all log calculations for entropy in base 2?
@sarahjamal86 • 5 years ago
Yes!
@mohammadkamruddin6399 • 8 years ago
Good lecture on decision trees. Can you please share the link to Antonio Criminisi's technical report here? Thank you.
@saadorj • 8 years ago
+Mohammad Kamruddin Google this: "Decision Forests for Classification, Regression, Density Estimation, Manifold Learning and Semi-Supervised Learning"
@KrishnaDN • 8 years ago
fantastic............:)
@TheHarperad • 10 years ago
"If you go to the left, you are 100% red"
@deepakk1944 • 6 years ago
Thanks
@whiteshadow3000 • 8 years ago
22:08 Square yards? Awesome lectures by this teacher, by the way.
@Rokel1993 • 7 years ago
This is excellent, but I want to learn the M5 model tree. Can anyone help me learn it, or give me a link?
@peterv.276 • 7 years ago
Is it allowed to share the video on, e.g., social platforms?
@chuckiechuckster349 • 7 years ago
The video is hosted on KZfaq. As long as only an HTTP reference (URL) is used, yes, of course.
@funkyweezy8071 • 8 years ago
Patron is pronounced "pay-tren" :)
@ulmermanfred4 • 6 years ago
Did I hear a Freudian slip? Around 22:55 he said "in a Greece... a greedy fashion" :-). Greece is not greedy, but the media make us believe it is?
@lynnwilliam • 7 years ago
It's hard to make money in AI. No restaurant or builder can afford to hire someone to do AI. Only a small fraction of AI developers get a job; sadly, AI is not really used everywhere.