Hidden Markov models. The slides are available here: www.cs.ubc.ca/~nando/340-2012/... This course was taught in 2012 at UBC by Nando de Freitas
Comments: 68
@jmcarter9t, 6 years ago
Nando's lectures are certainly some of the best on the web. Mastery is step one. Communicating that mastery is a real gift!
@blaznsmasher, 6 years ago
Amazing! I've watched a couple of videos on HMMs before this, and this is by far the clearest and easiest to understand.
@htetnaing007, 2 years ago
Don't stop sharing this knowledge, for it is vital to the progress of humankind!
@qorod123, 6 years ago
Having teachers like Prof. Nando makes me fall in love with science. Thank you so much, Professor.
@azkasalsabila5328, 6 years ago
The best lectures I have ever watched on YouTube!!! Great professor. The explanation is easy to follow. Thank you.
@jmrjmr8254, 9 years ago
Great! Now the paper I'm reading finally starts to make sense! Most helpful video on this topic!
@mehr1methanol, 8 years ago
Very, very helpful!! Unfortunately, by the time I joined UBC you had already gone to Oxford. But I'm so glad you have the lectures here.
@23karthikb, 6 years ago
Fantastic explanation Nando - great lecture! Thank you!
@OmarCostillaReyes, 11 years ago
Great presentation, Nando. You achieved teaching excellence in this lecture: you made your presentation interesting, funny, and knowledgeable. Congratulations!
@user-eh5wo8re3d, 7 years ago
Really nice lecture. Very engaging and informative as well.
@tzu-minghuang7100, 8 years ago
Great video for understanding HMMs, worth every minute.
@PoyanNabati, 9 years ago
This is fantastic, thank you Nando!
@noorsyathirahmohdisa2720, 6 years ago
Best explanation of all. Thank you, you helped my research on speech recognition.
@MrA0989741818, 9 years ago
Very good lecture!!! Thanks so much for saving me a large amount of time!
@ilnurgazizov2959, 4 years ago
Excellent! A great and clear explanation of HMM!
@acltm, 9 years ago
Hi, could you also tell us how to estimate the transition matrix?
@nargeschinichian6286, 7 years ago
He is an amazing teacher! I suggest you watch his class!
@qureshlokhandwala3866, 9 years ago
Amazing, intuitive explanation. Thanks!
@amizan8653, 10 years ago
Thank you very much for posting this!
@doggartwuttus1082, 11 years ago
Thank you! This was really engaging and helpful.
@wackyfeet, 7 years ago
I am so glad someone told him to fix the colour on the screen. I was losing it hoping he would fix the issue.
@abdulrahmansattar2873, 6 years ago
Thank you so much for these lectures!
@ThuleTheKing, 10 years ago
It is by far the best presentation of HMMs out there. However, I miss an example going from x0 to x2, so that both cases (initial and t-1) are illustrated. It would also be nice if you uploaded the assignments and answers.
@allisonzhang6527, 8 years ago
Awesome! Thanks, Nando!
@IndrianAmalia, 9 years ago
Thanks for the amazing lecture! It helped me a lot :)
@asharigamage7486, 6 years ago
Very clear explanation and the best examples, from GoT... Thank you so much :-D
@phillipblunt, 5 years ago
Really fantastic lecture, thanks a lot!
@pandalover555, 7 years ago
"You were so happy to see those dragons" LOL this guy is hilarious
@upinsanity, 7 years ago
absolute masterpiece!
@fisherh9111, 4 years ago
This guy has a great sense of humour.
@aashishraina2831, 7 years ago
I loved the material. Thanks a lot.
@DrINTJ, 8 years ago
Most lectures seem to go on at length, repeating the obvious parts ad infinitum, then jump over the important bits.
@ghufranghuzlan4404, 6 years ago
OMG, the best explanation ever. Very helpful, thank you so much.
@ralphsaymoremakuyana7126, 8 years ago
Great. Well explained!!
@shineminem, 11 years ago
OMG this is so helpful!
@AnekwongYoddumnern, 7 years ago
Dear sir, if I use a PIR sensor with a Markov chain, how many states should I set?
@riyasamanta3236, 5 years ago
Simple, easy to understand, real-world problems. Thank you for this video. Can you upload more about the applications and extensions of HMMs?
@gopalnarayanan4217, 7 years ago
Very good explanation.
@hsinyang1796, 4 years ago
I'm a student at UBC; we need you back teaching this course :
@pebre79, 11 years ago
wow. thanks for posting!
@bhagzz, 6 years ago
Really good one :)
@yuvrajsingh-wn3up, 10 years ago
If the events W, S, C, and F are not mutually exclusive, then what changes do we need to make to the HMM?
@engomasri, 8 years ago
Thanks so much !
@saijaswanth5085, 2 years ago
Can I know the reference book for the above lecture?
@Michael-kt3tf, 4 years ago
Just wondering: since we only care about the posterior, why does the forward algorithm compute the joint distribution? What is the point of that?
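A small sketch may clarify the question above (the numbers are made up, not from the lecture): the forward pass computes the joint alpha_t(x) = P(x_t, y_1:t), and normalizing it at any step gives the posterior P(x_t | y_1:t); keeping the joint also yields the likelihood P(y_1:t) for free.

```python
import numpy as np

# Made-up 2-state HMM parameters (rows sum to 1).
A = np.array([[0.8, 0.2],   # transition P(x_t | x_{t-1})
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],   # emission P(y_t | x_t)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])   # initial distribution P(x_0)

def forward(obs):
    """Return the unnormalized forward messages alpha_t(x) = P(x_t, y_{1:t})."""
    alpha = pi * B[:, obs[0]]          # initial step
    alphas = [alpha]
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]  # predict, then weight by evidence
        alphas.append(alpha)
    return alphas

alphas = forward([0, 1, 0])
posterior = alphas[-1] / alphas[-1].sum()  # P(x_T | y_{1:T})
likelihood = alphas[-1].sum()              # P(y_{1:T}): why the joint is useful
```

So the joint is not wasted work: one extra normalization gives the posterior, and its sum is the observation likelihood.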
@abcborgess, 8 years ago
brilliant.
@tina3829, 6 years ago
Super!
@ivansorokin8054, 9 years ago
If node x0 does not have a y0 in the graph, then we assume P(x0|y0) = P(x0) when predicting P(x1|y0), or we don't count the transition from x0 to x1. I think the graph should have a node y0, and we could use Bayes' rule to compute P(x0|y0) as at the beginning of the lecture. Anyway, thanks for sharing the lectures.
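The point above can be sketched in a few lines (made-up numbers, not from the lecture): when x0 has no observation attached, the prediction step simply pushes the prior P(x0) through the transition matrix, P(x1) = sum over x0 of P(x1 | x0) P(x0).

```python
import numpy as np

# Made-up 2-state transition matrix P(x1 | x0); each row sums to 1.
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])

# With no y0 observed, the prior P(x0) is used directly.
p_x0 = np.array([0.9, 0.1])

# Prediction step: P(x1) = sum_{x0} P(x1 | x0) * P(x0).
p_x1 = p_x0 @ A
```

If y0 were in the graph, one would first apply Bayes' rule to turn P(x0) into P(x0|y0) and then do the same matrix-vector product.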
@_dhruvawasthi, 2 years ago
At 43:22, why is it outside the Markov blanket?
@youssefdirani, 3 years ago
I didn't know about the smoothing assignment ... What was it?
@arnabpaul6996, 4 years ago
Me before this video: X(0): Sad, Y(0): Crying. Me after this video: X(1): Happy, Y(1): Watching GoT. Me after watching GoT: X(2): Sad, Y(2): Crying, because the last season sucks.
@ddarhe, 7 years ago
At the beginning of the lecture: shouldn't the columns of the table add up to 1 instead of the rows? P(y|x) + P(y|~x) = 1, right?
@gggrow, 6 years ago
No... P(y|x) + P(~y|x) = 1, whereas P(y|x) + P(y|~x) means "the probability of y given x plus the probability of y given not x". That could be more than 1 if y is likely in both cases, or less than 1 if y is unlikely in both cases.
@gggrow, 6 years ago
So... P(sad|crying) + P(sad|not crying) doesn't have to equal 1, because maybe I'm not likely to be sad either way, but P(sad|crying) + P(not sad|crying) = 1, because that exhausts the list of possible states: I have to be either sad or not!
@charliean9237, 6 years ago
That's what I thought too. Summing rows to 1 means this puppy always does one of the 4 things, and the puppy never eats. However, summing cols to 1 means the puppy is either happy or sad, which makes more sense.
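The normalization rule discussed in this thread can be checked with a toy emission table (the states, behaviors, and probabilities below are made up for illustration): in a table of P(y|x), each row conditions on one hidden state x, so each row must sum to 1, while columns need not.

```python
# Toy emission table P(behavior | mood); rows are hidden states.
emission = {
    "happy": {"play": 0.5, "sleep": 0.3, "cry": 0.1, "laugh": 0.1},
    "sad":   {"play": 0.1, "sleep": 0.4, "cry": 0.4, "laugh": 0.1},
}

# Each row is a full distribution over behaviors, so it sums to 1.
for state, dist in emission.items():
    assert abs(sum(dist.values()) - 1.0) < 1e-9

# A column (same behavior across states) has no such constraint.
cry_column = sum(emission[s]["cry"] for s in emission)  # need not be 1
```

Whether rows or columns sum to 1 in the slides just depends on which axis holds the conditioning variable; the constraint always sits on the distribution over outcomes.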
@dr.loucifhemzaemmysnmoussa7686, 7 years ago
Great!
@yatmoparni993, 5 years ago
Dr. Loucif Hemza Emmys n Moussa
@shashanksagarjha2807, 5 years ago
Someone please let me know: can HMMs be used for anomaly detection? If yes, do they work better than techniques such as SMOTEENN and weighted classes?
@yuezhao8657, 5 years ago
I don't feel either HMM or SMOTE is a major anomaly detection technique. The more common approaches are LOF, Isolation Forest, OCSVM, ABOD, LOCI, and so on.
@shashanksagarjha2807, 5 years ago
@@yuezhao8657 As far as I know, LOF and Isolation Forest work better for unsupervised learning, but techniques such as weighted classes or SMOTEENN work better when we have labels. How much accuracy can we get with an HMM?
@noorsyathirahmohdisa2720, 6 years ago
Where can I find his next video on HMMs?
@shirishbajpai9973, 3 years ago
Were you able to find it? It seems the HMM part is missing from this lecture; there is a slide present, though, which is not covered in this lecture.
@WilliamStevenDonald, 7 years ago
Very good.
@samidelhi6150, 4 years ago
YuePeng Guo why
@kunwaravikalnathmathur2003, 5 years ago
This video has Bayes' theorem applied in full form.
@RelatedGiraffe, 10 years ago
6:37 We are all gonna be there one day? Speak for yourself! :P