undergraduate machine learning 9: Hidden Markov models - HMM

166,489 views

Nando de Freitas

11 years ago

Hidden Markov models.
The slides are available here: www.cs.ubc.ca/~nando/340-2012/...
This course was taught in 2012 at UBC by Nando de Freitas

Comments: 68
@jmcarter9t 6 years ago
Nando's lectures are certainly some of the best on the web. Mastery is step one. Communicating that mastery is a real gift!
@blaznsmasher 6 years ago
Amazing! I've watched a couple of videos on HMMs before this, and this is by far the clearest and easiest to understand.
@htetnaing007 2 years ago
Don't stop sharing this knowledge, for it is vital to the progress of humankind!
@qorod123 6 years ago
Having teachers like Prof. Nando makes me fall in love with science. Thank you so much, Professor.
@azkasalsabila5328 6 years ago
The best lectures I have ever watched on YouTube!!! Great professor. The explanation is easy to follow. Thank you.
@jmrjmr8254 9 years ago
Great! Now the paper I'm reading finally starts to make sense! Most helpful video on this topic!
@mehr1methanol 8 years ago
Very, very helpful!! Unfortunately, by the time I joined UBC you had already left for Oxford. But I'm so glad you have the lectures here.
@23karthikb 6 years ago
Fantastic explanation Nando - great lecture! Thank you!
@OmarCostillaReyes 11 years ago
Great presentation, Nando. You achieved teaching excellence in this lecture. You made your presentation interesting, funny, and knowledgeable. Congratulations!
@user-eh5wo8re3d 7 years ago
Really nice lecture. Very engaging and informative as well.
@tzu-minghuang7100 8 years ago
Great video for understanding HMMs, worth every minute.
@PoyanNabati 9 years ago
This is fantastic, thank you Nando!
@noorsyathirahmohdisa2720 6 years ago
Best explanation of all. Thank you, you helped my research on speech recognition.
@MrA0989741818 9 years ago
Very good lecture!!! Thanks so much for saving me a large amount of time!
@ilnurgazizov2959 4 years ago
Excellent! A great and clear explanation of HMM!
@acltm 9 years ago
Hi - could you also explain how to estimate the transition matrix?
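One standard answer to this question, when the state sequences are fully observed, is maximum-likelihood counting: tally each observed transition and normalize each row. A minimal sketch (the sequences and state labels here are hypothetical, not from the lecture):

```python
import numpy as np

# Hypothetical labelled state sequences (0 = sad, 1 = happy).
sequences = [[0, 0, 1, 1, 1], [1, 0, 0, 1]]

n_states = 2
counts = np.zeros((n_states, n_states))
for seq in sequences:
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1  # count each observed transition a -> b

# Row i of A becomes P(x_t = j | x_{t-1} = i): normalized counts.
A = counts / counts.sum(axis=1, keepdims=True)
print(A)  # each row sums to 1
```

When the states are hidden rather than observed, the same counts are replaced by expected counts computed by the forward-backward algorithm (the EM / Baum-Welch approach).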
@nargeschinichian6286 7 years ago
He is an amazing teacher! I suggest you watch his class!
@qureshlokhandwala3866 9 years ago
Amazing, intuitive explanation. Thanks!
@amizan8653 10 years ago
Thank you very much for posting this!
@doggartwuttus1082 11 years ago
Thank you! This was really engaging and helpful.
@wackyfeet 7 years ago
I am so glad someone told him to fix the colour on the screen. I was losing it hoping he would fix the issue.
@abdulrahmansattar2873 6 years ago
Thank you so much for these lectures!
@ThuleTheKing 10 years ago
It is by far the best presentation of HMMs out there. However, I miss an example going from x0 to x2, so that both cases (initial and t-1) are illustrated. It would also be nice if you uploaded the assignments and answers.
@allisonzhang6527 8 years ago
Awesome! Thanks, Nando!
@IndrianAmalia 9 years ago
Thanks for the amazing lecture! It helped me a lot :)
@asharigamage7486 6 years ago
Very clear explanation, and the best examples, from GoT. Thank you so much :-D
@phillipblunt 5 years ago
Really fantastic lecture, thanks a lot!
@pandalover555 7 years ago
"You were so happy to see those dragons" LOL this guy is hilarious
@upinsanity 7 years ago
absolute masterpiece!
@fisherh9111 4 years ago
This guy has a great sense of humour.
@aashishraina2831 7 years ago
I loved the material. Thanks a lot!
@DrINTJ 8 years ago
Most lectures seem to go on at length and repeat the obvious parts ad infinitum, then jump over the important bits.
@ghufranghuzlan4404 6 years ago
OMG, the best explanation ever. Very helpful, thank you so much!
@ralphsaymoremakuyana7126 8 years ago
Great. Well explained!!
@shineminem 11 years ago
OMG this is so helpful!
@AnekwongYoddumnern 7 years ago
Dear sir, if I use a PIR sensor with a Markov chain, how many states should I set?
@riyasamanta3236 5 years ago
Simple, easy to understand, real-world problems. Thank you for this video. Can you upload more about the applications and extensions of HMMs?
@gopalnarayanan4217 7 years ago
Very good explanation.
@hsinyang1796 4 years ago
I'm a student at UBC; we need you back teaching this course.
@pebre79 11 years ago
wow. thanks for posting!
@bhagzz 6 years ago
Really good one :)
@yuvrajsingh-wn3up 10 years ago
If the events W, S, C, and F are not mutually exclusive, what changes do we need to make to the HMM presented here?
@engomasri 8 years ago
Thanks so much!
@saijaswanth5085 2 years ago
Could I know the reference book for the above lecture?
@Michael-kt3tf 4 years ago
Just wondering: since we only care about the posterior, why does the forward algorithm compute the joint distribution? What is the point of that?
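One way to see the answer to this question: the forward recursion on the joint alpha_t(x) = p(x_t, y_1:t) stays linear and simple, and the filtering posterior p(x_t | y_1:t) is only one normalization away at any step. A minimal sketch with a toy two-state HMM (all numbers hypothetical, in the spirit of the lecture's happy/sad example):

```python
import numpy as np

A = np.array([[0.8, 0.2],   # transition p(x_t | x_{t-1})
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],   # emission p(y_t | x_t); columns index y
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])   # initial distribution p(x_0)

def forward_filter(ys):
    """Return the filtering posteriors p(x_t | y_{1:t}) for each t."""
    alpha = pi * B[:, ys[0]]           # joint p(x_0, y_0)
    posts = [alpha / alpha.sum()]      # normalize -> posterior
    for y in ys[1:]:
        alpha = (alpha @ A) * B[:, y]  # predict, then weight by evidence
        posts.append(alpha / alpha.sum())
    return posts

print(forward_filter([0, 0, 1]))
```

A side benefit of carrying the joint: the normalizers multiply up to the likelihood p(y_1:T), which is what you need for parameter learning and model comparison.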
@abcborgess 8 years ago
brilliant.
@tina3829 6 years ago
Super!
@ivansorokin8054 9 years ago
If node x0 does not have a y0 in the graph, then we assume P(x0|y0) = P(x0) when predicting P(x1|y0), or we don't count the transition from x0 to x1. I think the graph should have a node y0, so we can use Bayes' rule to compute P(x0|y0) as at the beginning of the lecture. Anyway, thanks for sharing the lectures.
@_dhruvawasthi 2 years ago
At 43:22, why is it outside the Markov blanket?
@youssefdirani 3 years ago
I didn't know about the smoothing assignment ... What was it?
@arnabpaul6996 4 years ago
Me before this video: X(0): Sad, Y(0): Crying. Me after this video: X(1): Happy, Y(1): Watching GoT. Me after watching GoT: X(2): Sad, Y(2): Crying, because the last season sucks.
@ddarhe 7 years ago
At the beginning of the lecture, shouldn't the columns of the table add up to 1 instead of the rows? P(y|x) + P(y|~x) = 1, right?
@gggrow 6 years ago
No... P(y|x) + P(~y|x) = 1, whereas P(y|x) + P(y|~x) means "the probability of y given x plus the probability of y given not x". That could equal more than 1 if y is likely in both cases, or less than 1 if y is unlikely in both cases.
@gggrow 6 years ago
So P(sad|crying) + P(sad|not crying) doesn't have to equal one, because maybe I'm not likely to be sad either way; but P(sad|crying) + P(not sad|crying) = 1, because that exhausts the list of possible states: I have to be either sad or not!
@charliean9237 6 years ago
That's what I thought too. Rows summing to 1 means this puppy always does one of the 4 things, and the puppy never eats. However, columns summing to 1 would mean the puppy is either happy or sad, which makes more sense.
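The normalization convention in this thread is easy to check numerically: in a conditional table P(y|x) with one row per hidden state x, each row is a distribution over y and must sum to 1, while columns sum over different conditioning events and need not. A minimal sketch (the table entries and labels are hypothetical, not the lecture's exact numbers):

```python
import numpy as np

# Hypothetical emission table p(y | x): rows are hidden states
# (happy, sad), columns are observations (e.g. watching TV,
# sleeping, crying, playing).
P_y_given_x = np.array([
    [0.4, 0.3, 0.1, 0.2],   # p(y | happy): a distribution over y
    [0.1, 0.3, 0.4, 0.2],   # p(y | sad):   a distribution over y
])

# Rows sum to 1: given the state, something must be observed.
print(P_y_given_x.sum(axis=1))   # [1. 1.]
# Columns mix different conditioning events, so they need not sum to 1.
print(P_y_given_x.sum(axis=0))
```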
@dr.loucifhemzaemmysnmoussa7686 7 years ago
Great!
@yatmoparni993 5 years ago
Dr. Loucif Hemza Emmys n Moussa
@shashanksagarjha2807 5 years ago
Someone please let me know: can HMMs be used for anomaly detection? If yes, do they work better than techniques such as SMOTEENN and weighted classes?
@yuezhao8657 5 years ago
I don't feel either HMM or SMOTE is a major anomaly detection technique. The more common approaches are LOF, Isolation Forest, OCSVM, ABOD, LOCI, and so on.
@shashanksagarjha2807 5 years ago
@@yuezhao8657 As far as I know, LOF and Isolation Forest work better in the unsupervised case, but techniques such as weighted classes or SMOTEENN work better when we have labels. How much accuracy can we get with an HMM?
@noorsyathirahmohdisa2720 6 years ago
Where can I find his next video on HMMs?
@shirishbajpai9973 3 years ago
Were you able to find it? It seems the HMM part is missing from this lecture; there is a slide for it, though, that is not covered.
@WilliamStevenDonald 7 years ago
Very good.
@samidelhi6150 4 years ago
YuePeng Guo why
@kunwaravikalnathmathur2003 5 years ago
This video has Bayes' theorem applied in full form.
@RelatedGiraffe 10 years ago
6:37 We are all gonna be there one day? Speak for yourself! :P
@roseb2105 7 years ago
I don't understand the equation.
@nickizcool20 3 years ago
HMM 🤔