17 Probabilistic Graphical Models and Bayesian Networks

93,568 views

Bert Huang


Virginia Tech
Machine Learning
Fall 2015

Comments: 27
@perkelele 4 years ago
The course at my university is well taught but goes into a lot of detail. Your videos help me see the forest for the trees while still being complete and correct. Thank you for the quality content.
@markregev1651 4 years ago
perkele torille
@tradertim 3 years ago
Wow... this is amazing. Why can't my university professors teach this clearly? In this age where we don't read textbooks much (and rely on lecture after lecture), there need to be major improvements in teaching...
@juliocardenas4485 4 years ago
This was EXCELLENT! Thank you.
@sanyuktasuman4993 3 years ago
I finally understood the concept of conditional independence, thanks a lot!
@sanjaykrish8719 5 years ago
Extremely good. Thanks a ton Bert
@kaiyongong1894 4 years ago
Excellent way of explaining. Perhaps you could also show the table-size reduction from variable elimination, to benefit those who are still not familiar with computing the table sizes.
@johng5295 5 years ago
Thanks beyond measurement in money, Bert!
@ytpah9823 8 months ago
🎯 Key Takeaways for quick navigation:
00:00 📊 Probabilistic graphical models, such as Bayesian networks, represent probability distributions through graphs, enabling the visualization of conditional independence structures.
01:34 🎲 Bayesian networks consist of nodes (variables) and directed edges representing conditional dependencies, allowing the representation of full joint probability distributions.
03:21 🔀 Bayesian network structures reveal conditional independence relationships, simplifying the calculation of conditional probabilities and inference.
09:10 🧠 Naive Bayes and logistic regression can be viewed as specific Bayesian networks, with the former relying on conditional independence assumptions.
11:55 📜 Conditional independence is a key concept in Bayesian networks: each variable is independent of its non-descendants given its parents.
15:15 ⚖️ Inference in Bayesian networks often involves calculating marginal probabilities efficiently, which can be achieved through variable elimination, avoiding full enumeration.
23:54 ⚙️ Variable elimination replaces summations over variables with intermediate functions, reducing the computational complexity of inference.
24:05 🧮 Variable elimination computes marginal probabilities efficiently by eliminating variables one by one.
28:07 ⏱️ In tree-structured Bayesian networks, variable elimination achieves linear-time exact inference.
29:02 📊 Learning in a fully observed Bayesian network is straightforward, involving counting probabilities based on training data.
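The variable-elimination idea from the takeaways above can be sketched in a few lines. This is a toy chain network A → B → C with binary variables; the probability tables and names (`p_a`, `p_b_given_a`, ...) are made up for illustration, not taken from the lecture:

```python
# Variable elimination on a chain A -> B -> C (binary variables).
# Each factor is a dict mapping an assignment to a probability.

def eliminate_chain(p_a, p_b_given_a, p_c_given_b):
    """Compute the marginal P(C) by summing out A first, then B."""
    # Sum out A: intermediate factor f(b) = sum_a P(a) * P(b|a)
    f_b = {b: sum(p_a[a] * p_b_given_a[(a, b)] for a in (0, 1)) for b in (0, 1)}
    # Sum out B: P(c) = sum_b f(b) * P(c|b)
    return {c: sum(f_b[b] * p_c_given_b[(b, c)] for b in (0, 1)) for c in (0, 1)}

p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}
p_c_given_b = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.5, (1, 1): 0.5}

p_c = eliminate_chain(p_a, p_b_given_a, p_c_given_b)
print(p_c)  # a valid distribution: the two values sum to 1
```

Each elimination step produces a factor over only the remaining variables, which is why the work stays linear in the chain length instead of exponential in the number of variables.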
@user-kdl495 2 years ago
That was a super clear explanation for me. Thanks!
@rezaqorbani1327 1 year ago
Great explanation! Thank you for the video!
@catherineiniguez2449 2 years ago
Thanks, Great explanation!!!
@arthurzhang8759 4 years ago
great explanation!
@hongkyulee9724 1 year ago
Thank you for the good explanation :D
@JazzLispAndBeer 1 year ago
Great for getting up to speed again!
@dianajumaili1572 3 years ago
Thanks!!!
@adityanjsg99 2 years ago
"Thank you" is a small word...
@AllTheFishAreDead 4 years ago
Sorry, why doesn't the f function depend on r at 21:50? I mean, I know it's in the conditional, but to me that means if r changes the function would change (so yeah, it's a parameter not a variable, but it should still be there?)
@berty38 4 years ago
Yeah that's totally correct and a mistake in my notation. I should have written f_C(w, r), and then we could eliminate w and end up with just a function of r, which is what we're looking for.
@harcourtpameela9444 3 years ago
Please, can you give a more personal tutorial on how to carry out calculations on a Bayesian network?
@Raventouch 4 years ago
I have a question about independence in Bayes nets: at 13:14 you say that C is independent of B and D (because of the first rule). At 14:30 you say that a variable is independent of every variable outside its Markov blanket, but there D is included. Does that still mean that C is independent of D because of the first rule, or not? I'm a little confused at this point. Anyway, great videos, great explanations, thank you very much for creating them.
@user-qh8zx7zo2u 3 years ago
At 13:14 the first rule applies, and hence C is conditionally independent of B and D, since they are the non-descendants and we have observed C's parent, which is A. The second rule stipulates that given a variable's Markov blanket (so given that we have observed every variable in the Markov blanket), that variable is conditionally independent of all other variables not belonging to the blanket. Imagine this was a bigger network but the Markov blanket remained A, D, E and we observed every variable in the blanket: (C|A,D,E) would be independent of all other variables in that network. Someone correct me if I'm wrong, please.
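The "independent of non-descendants given parents" rule discussed above can be verified numerically on a tiny network. This sketch uses a common-parent structure A → B, A → C with made-up binary tables (all names are hypothetical): given A, observing B should not change the distribution of C.

```python
from itertools import product

p_a = {0: 0.3, 1: 0.7}
p_b_a = {(0, 0): 0.6, (0, 1): 0.4, (1, 0): 0.1, (1, 1): 0.9}  # P(B|A)
p_c_a = {(0, 0): 0.8, (0, 1): 0.2, (1, 0): 0.25, (1, 1): 0.75}  # P(C|A)

# Full joint P(a, b, c) = P(a) * P(b|a) * P(c|a)
joint = {(a, b, c): p_a[a] * p_b_a[(a, b)] * p_c_a[(a, c)]
         for a, b, c in product((0, 1), repeat=3)}

def cond_c(a, b):
    """P(C=1 | A=a, B=b), computed by brute force from the joint."""
    num = joint[(a, b, 1)]
    den = joint[(a, b, 0)] + joint[(a, b, 1)]
    return num / den

# Given the parent A, the non-descendant B carries no extra information about C:
for a in (0, 1):
    assert abs(cond_c(a, 0) - cond_c(a, 1)) < 1e-12
```

The asserts pass because the joint factorizes so that B only influences C through A; once A is observed, that path is blocked.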
@ghady31 2 years ago
How can I use Bayesian networks for machine learning, and what suitable software is available for that?
@LouisChiaki 4 years ago
Sorry, I am confused by the first rule for independence in Bayes nets: "Each variable is conditionally independent of its non-descendants given its parents." What do the non-descendants of a node have to do with its parents?
@huangbinapple 3 years ago
I still don't understand the difference between enumeration and variable elimination.
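The difference the comment above asks about can be shown side by side. In this toy chain A → B → C (made-up binary tables, hypothetical names), enumeration sums the full joint over every assignment of the hidden variables, while elimination sums out A once into an intermediate factor and reuses it; both give the same marginal:

```python
from itertools import product

p_a = {0: 0.6, 1: 0.4}
p_b_a = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}  # P(B|A)
p_c_b = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.5, (1, 1): 0.5}  # P(C|B)

# Enumeration: P(c) = sum over all (a, b) of P(a) P(b|a) P(c|b).
# The number of terms grows exponentially with the number of hidden variables.
p_c_enum = {c: sum(p_a[a] * p_b_a[(a, b)] * p_c_b[(b, c)]
                   for a, b in product((0, 1), repeat=2))
            for c in (0, 1)}

# Elimination: sum out A once into f(b), then reuse f for every value of c.
# Each variable is summed out exactly once, so the work stays linear on a chain.
f = {b: sum(p_a[a] * p_b_a[(a, b)] for a in (0, 1)) for b in (0, 1)}
p_c_elim = {c: sum(f[b] * p_c_b[(b, c)] for b in (0, 1)) for c in (0, 1)}

assert all(abs(p_c_enum[c] - p_c_elim[c]) < 1e-12 for c in (0, 1))
```

The saving is not in skipping states of one variable but in never recomputing the inner sum: the factor `f` is built once and shared across all outer terms, which is exactly what "eliminating" a variable means.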
@reanwithkimleng 1 month ago
❤❤❤❤❤
@UsefulMotivation365 6 months ago
With all due respect to you, but not to the people who created this "variable elimination" thing: it sounds like bullshit, because you are already computing all the possible states of the variable that you are going to eliminate, meaning that you aren't eliminating anything. Or am I wrong?