L02.7 Total Probability Theorem

80,095 views

MIT OpenCourseWare

6 years ago

MIT RES.6-012 Introduction to Probability, Spring 2018
View the complete course: ocw.mit.edu/RES-6-012S18
Instructor: John Tsitsiklis
License: Creative Commons BY-NC-SA
More information at ocw.mit.edu/terms
More courses at ocw.mit.edu

Comments: 15
@cycla 3 months ago
Thank you, Professor Tsitsiklis!
@SecretEscapist 1 year ago
Thanks a lot for this amazing lecture ☺
@richardgamor9959 4 years ago
Very understandable
@louerleseigneur4532 3 years ago
Thanks MIT
@SD-uf4ng 2 years ago
Shouldn't the weight be P(B|Ai) and the probability be P(Ai), since the sum of P(Ai) = 1?
@therasmataz2168 1 year ago
Not correct. We are adding up the contributions P(Ai)P(B|Ai): each term is the probability of B within an Ai, weighted by P(Ai). The Ai's are collectively exhaustive, so you can imagine some Aj that contains no part of B; its contribution to B's total probability would be zero even though its weight P(Aj) is positive.
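This reply's point can be checked with a minimal numerical sketch (the numbers are hypothetical, chosen for illustration; A3 plays the role of the Aj above that contains no part of B):

```python
# Total probability theorem: P(B) = sum_i P(Ai) * P(B|Ai),
# where the Ai are disjoint and collectively exhaustive.
# All numbers below are hypothetical, for illustration only.

priors = [0.5, 0.3, 0.2]   # P(A1), P(A2), P(A3) -- must sum to 1
cond   = [0.4, 0.1, 0.0]   # P(B|Ai); A3 contains no part of B

assert abs(sum(priors) - 1.0) < 1e-12  # the partition's weights sum to 1

# A3 has positive weight P(A3) = 0.2 but contributes 0.2 * 0.0 = 0 to P(B).
p_b = sum(p * c for p, c in zip(priors, cond))
print(abs(p_b - 0.23) < 1e-9)  # True: 0.5*0.4 + 0.3*0.1 + 0.2*0.0 = 0.23
```

Dropping A3 from the sum would not change P(B), but A3 is still needed for the Ai's to cover the whole sample space.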
@berkcancosar6313 4 years ago
Can someone explain why he says "weighted average" at 3:52? We are not dividing by 3, so shouldn't it not be an average?
@hugol354 4 years ago
It's not the average as we usually think of it; it's a weighted average of the conditional probabilities P(B|Ai). He says "weighted" because no division is needed: the weights P(Ai) already sum to 1.
@berkcancosar6313 4 years ago
@hugol354 Yeah, I got it, thanks a lot.
@SD-uf4ng 2 years ago
@hugol354 Shouldn't the weight be P(B|Ai) and the probability be P(Ai), since the sum of P(Ai) = 1?
@hugol354 2 years ago
@SD-uf4ng No, the P(Ai) are the weights on the conditional probabilities P(B|Ai) that B happens given that Ai happens.
@tarunpahuja3443 2 years ago
@hugol354 I think the probability of each scenario is already a fraction, so there is no need to divide.
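A quick numerical check of this thread's point (hypothetical numbers): since the weights P(Ai) of a partition sum to 1, dividing by their sum changes nothing, so the weighted average of the P(B|Ai) is already the total probability P(B).

```python
# A weighted average is sum(w_i * x_i) / sum(w_i). When the weights are
# the P(Ai) of a partition, sum(w_i) = 1, so the division is a no-op
# and the weighted average equals P(B) directly.
# All numbers below are hypothetical, for illustration only.

weights = [0.5, 0.3, 0.2]   # P(Ai), summing to 1
values  = [0.4, 0.1, 0.6]   # P(B|Ai)

total_prob   = sum(w * x for w, x in zip(weights, values))
weighted_avg = total_prob / sum(weights)  # dividing by 1 -- same number

print(abs(total_prob - weighted_avg) < 1e-12)  # True
```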
@capowang2446 4 years ago
day2
@MrGustavier 6 months ago
Is that a Greek accent?
@ProfessionalTycoons 5 years ago
1