Probability isn’t always about randomness

22,418 views

Syst3ms · 2 years ago

CORRECTION: at 3:15, the edges of the parallelotope must of course have length less than 1 for the result to hold, my bad.
vvv NO MUSIC, SCROLL DOWN FOR INFO vvv
An excursion into a problem of higher-dimensional geometry and its surprising probabilistic solution.
Special thanks to my brother for greatly helping me out with the audio side of this project, and to the SoME2 Discord server for being a nice community in which to have worked on this.
Image credit :
Photo at 06:01 : Myrabella / Wikimedia Commons / CC BY-SA 3.0 & GFDL
Chapters :
00:00 Parallelotopes
05:53 Probability??
08:34 A Monte Carlo proof
16:55 …and it works!
19:16 Tying up loose ends
20:43 Conclusion : Gratitude
Music :
I didn't bother adding music because the contest is really stringent when it comes to copyright, and also because it isn't that necessary. Though if you feel like silence is going to be too heavy, here's something to listen to at low volume: • Klaus Schulze - Sequen...

Comments: 59
@maxwellhunt3732 · 2 years ago
Wow, that's an incredible video! One of the best SoME2 submissions in my opinion.
@michaelzumpano7318 · 1 year ago
Beautiful video and brilliantly done. I paused at points so I could really drink it in. Great job. I’m in awe. And that was a heartwarming tribute to a great teacher.
@NR-ft6cj · 1 year ago
Makes it so beautiful once you see it all come together. Brilliant work!
@ashwinjain5566 · 1 year ago
lately been reading more about probabilistic proofs and honestly they are so beautiful.
@markusboeck · 1 year ago
17:45 Since the expected value of V is p, on the left-hand side we have the variance of V, which can be computed by summing the variances of the individual independent Bernoulli variables C_i, namely p_i(1 - p_i), which is what we have at 18:43. Nice video! (Or something like that. We have random vectors, but I think we can make an argument along these lines.)
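The identity this comment leans on can be checked exactly by brute force. Below is a minimal sketch (not from the video; the p_i values are arbitrary illustrative choices): enumerate all outcomes of n independent Bernoulli variables, compute the variance of their sum, and compare with the closed form Σ p_i(1 − p_i).

```python
from itertools import product

# Exact check that for independent Bernoulli variables C_i ~ Be(p_i),
# Var(C_1 + ... + C_n) = sum_i p_i * (1 - p_i).
# The p_i below are arbitrary illustrative values.
p = [0.2, 0.5, 0.7, 0.9]

mean = 0.0
second_moment = 0.0
# Enumerate all 2^n outcomes together with their probabilities.
for outcome in product([0, 1], repeat=len(p)):
    prob = 1.0
    for x, pi in zip(outcome, p):
        prob *= pi if x == 1 else 1 - pi
    s = sum(outcome)
    mean += prob * s
    second_moment += prob * s * s

variance = second_moment - mean ** 2
closed_form = sum(pi * (1 - pi) for pi in p)
print(round(variance, 6), round(closed_form, 6))  # both ≈ 0.71
```

The exhaustive enumeration agrees with the closed form because independence makes the covariance terms vanish, which is exactly the step the comment uses.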
@NoNTr1v1aL · 1 year ago
Absolutely amazing video! Subscribed.
@RodniOcomments · 1 year ago
Each time we add a new vector to the basis, the longest distance between opposite vertices grows to at least sqrt(n): by induction there were two opposite vertices at distance at least sqrt(n-1); adding the new basis vector to both of them gives two more points, which together form a parallelogram whose longer diagonal (the one opposite the obtuse angle) is at least sqrt((sqrt(n-1))^2 + 1^2) = sqrt(n). Now take any point together with these two opposite vertices: it lies at least sqrt(n)/2 away from one of them (if both distances were shorter, the triangle inequality would fail). So the maximum distance is at least sqrt(n)/2.
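The induction step above can be probed numerically. Here is a small sketch (my own illustration, assuming basis vectors of length exactly 1, with a seeded random choice of vectors): enumerate all main diagonals Σ ±v_i of the parallelotope and confirm the longest one has length at least sqrt(n).

```python
import math
import random

# Numeric check of the induction argument: for n basis vectors of
# length 1, the longest main diagonal max over signs of |sum_i ±v_i|
# is at least sqrt(n). Vectors are random but seeded for reproducibility.
random.seed(0)

def random_unit_vector(n):
    v = [random.gauss(0, 1) for _ in range(n)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

n = 6
vectors = [random_unit_vector(n) for _ in range(n)]

# Fixing the first sign to +1 halves the work, since opposite
# diagonals have equal length.
longest = 0.0
for mask in range(2 ** (n - 1)):
    signs = [1] + [1 if (mask >> i) & 1 else -1 for i in range(n - 1)]
    diag = [sum(s * v[k] for s, v in zip(signs, vectors)) for k in range(n)]
    longest = max(longest, math.sqrt(sum(x * x for x in diag)))

print(longest, math.sqrt(n))  # longest is never below sqrt(n)
```

The key fact the loop relies on is the one in the comment: for any diagonal d and unit vector v, one of ||d + v||, ||d − v|| is at least sqrt(||d||² + 1).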
@dannycrytser7268 · 1 year ago
Very nice video. I always enjoyed probabilistic proofs when they arose in surprising areas. Erdős' bound on Ramsey numbers is a simple one, but the argument is still charming. More complicated subjects in discrete math often hinge on probabilistic tools like the Lovász local lemma or Szemerédi's regularity lemma (used in the proof of the Green-Tao theorem).
@cmilkau · 1 year ago
Basically, probability is used to make the proof more intuitive. But it's not really probability, it's just measuring some abstract volumes.
@pamdemonia · 1 year ago
What a lovely epilogue!
@MooImABunny · 1 year ago
@13:28 a pop team epic screenshot, I see you're a man of culture
@arianj2863 · 1 year ago
Awesome! I remember reading some stuff about how the Kolmogorov probability axioms are just a set of rules, which we link to randomness because of the way we often use them. This video showed me at least what the writers of that statement meant: our analysis of the axioms of probability can be applied in any situation where those axioms hold, even if the object of interest has nothing to do with randomness!
@gideonk123 · 1 year ago
At 17:40 Jensen’s inequality should probably be mentioned. The move from proving a bound on E[d] to proving one on E[d²] is not trivial, since in general (E[d])² does not equal E[d²]. Fortunately for this problem, using Jensen’s inequality, we have (E[d])² ≤ E[d²].
@syst3ms413 · 1 year ago
Actually, the reason for the equivalence is simpler and hinted at in the video. If 𝔼[D²] ≤ A², then there exists a vertex for which D² ≤ A² (I can't be bothered to properly typeset the bound), which means that D ≤ A, which is what we had set out to prove. So the two formulations are equivalent because "D² ≤ A² ⇔ D ≤ A" in this context.
@gideonk123 · 1 year ago
@@syst3ms413 Yes, I agree the two formulations are equivalent for the proof of this particular problem. Of course the two expectations are not equal, just wanted to highlight that. I’m sure you know this 🙂 Nice video and great proof!
@AssemblyWizard · 1 year ago
Thanks for mentioning it, I came looking for such a comment. Btw you can also use Variance instead of Jensen: we know Var(d) ≥ 0, and Var(d) = E[d²] - E[d]², so E[d²] ≥ E[d]². We can even get a strict inequality because d is not constant, cheers
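The variance argument in this thread is easy to see on a sample. A minimal sketch (my own illustration; the uniform sample is an arbitrary stand-in for the distance d):

```python
import random

# Illustration that E[d^2] - (E[d])^2 = Var(d) >= 0, so bounding
# E[d^2] also bounds (E[d])^2. The sample here is an arbitrary
# stand-in for the distance d, seeded for reproducibility.
random.seed(1)
d = [random.uniform(0.0, 2.0) for _ in range(10_000)]

mean = sum(d) / len(d)
second_moment = sum(x * x for x in d) / len(d)
var_direct = sum((x - mean) ** 2 for x in d) / len(d)

# Jensen's direction for the convex map x -> x^2, and the identity
# Var(d) = E[d^2] - (E[d])^2 computed two ways.
print(second_moment >= mean ** 2)
print(abs((second_moment - mean ** 2) - var_direct) < 1e-9)
```

The gap between the two sides is exactly the (sample) variance, which is why the inequality is strict whenever d is not constant.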
@victorscarpes · 1 year ago
I think the best way to describe probability to a layperson is that it is a way to quantify how much we do not know about a system. It could be actually random, or it could just be that we do not know enough, so we perceive it as random. This way of thinking ties in nicely with information theory and with direct applications such as telecommunications.
@syst3ms413 · 1 year ago
The Bayesian interpretation of probability is very reasonable, I just felt like it wasn't "hands-on" enough for my purposes here, I'm just more comfortable with the frequentist approach for an explanation.
@handsanitizer2457 · 1 year ago
Love this, it's very visual.
@amaarquadri · 1 year ago
What a cool proof!
@RSLT · 2 years ago
Very interesting, great job!
@lior_shiboli · 1 year ago
Well, this is a video I'll have to watch a second time to fully understand. But it is really cool.
@nestorv7627 · 1 year ago
Deserves more views!
@johnchessant3012 · 2 years ago
Great video, and your professor sounds amazing! Sort of unrelated, but one of my favorite counterintuitive facts: Which point inside the unit square maximizes the product of distances to the four vertices? (It's not the center.)
@iamrepairmanman · 1 year ago
Halfway in between two of the vertices?
@iamrepairmanman · 1 year ago
As in, on the perimeter
@columbus8myhw · 1 year ago
@@iamrepairmanman I agree. Center gives you 1/4, midpoint of a side gives you 5/16 (which is 1/16 more).
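The two values quoted in this thread are quick to verify. A small sketch (my own check of the commenters' numbers, not code from the video):

```python
import math

# Product of distances from a point to the four corners of the unit
# square, checking the thread's values: center gives 1/4, the midpoint
# of a side gives 5/16.
corners = [(0, 0), (1, 0), (0, 1), (1, 1)]

def product_of_distances(x, y):
    prod = 1.0
    for cx, cy in corners:
        prod *= math.hypot(x - cx, y - cy)
    return prod

center = product_of_distances(0.5, 0.5)    # four distances of sqrt(1/2)
edge_mid = product_of_distances(0.5, 0.0)  # 0.5 * 0.5 * sqrt(5)/2 * sqrt(5)/2
print(round(center, 12), round(edge_mid, 12))  # 0.25 0.3125
```

So the side midpoint indeed beats the center by 1/16, confirming that the maximizer is not the center.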
@activeactor9728 · 2 years ago
Excellent video and effort! One note: at 3:30 the conjecture should be "the furthest a point inside of it can be from its CLOSEST vertex is sqrt(n)/2", right?
@syst3ms413 · 2 years ago
You can interpret it as being kind of the same thing. If a point is at distance x from its closest vertex, then it's at distance at least x from every other vertex. Maybe that wasn't really explicit though.
@mathlitmusic3687 · 1 year ago
Nice video!
@viniciuskenji4445 · 1 year ago
Bravo!
@iamtraditi4075 · 1 year ago
Awesome video! +1 sub
@MrRyanroberson1 · 1 year ago
simple: the maxidistant point is always equidistant to opposing vertices, and the bisector point between opposing vertices of a parallelotope is by definition the center; the average position of all vertices. Proof by symmetry
@ethancheung1676 · 1 year ago
Good video. Around 4:30 it would be better to place the equations elsewhere to make room for subtitles; it would greatly benefit the hearing impaired to be able to read the equations at the same time as the subtitles.
@benjamingoldstein14 · 1 year ago
Should the conjecture at 3:30 also specify a maximal edge length of 1 in the parallelotope?
@syst3ms413 · 1 year ago
Yes, it should, my bad.
@Elitekross · 2 years ago
Without getting very far into this, I'm gonna guess that you get an unexpected answer in higher dimensions due to the same effect that can cause the hyperspheres at the corners of a hypercube to become bigger than the cube itself. I forget the full description of the phenomenon but I do know there are a few videos about it.
@syst3ms413 · 2 years ago
Well no, for once the result in higher dimensions is actually a clean generalization of the lower-dimensional cases.
@Elitekross · 2 years ago
@@syst3ms413 found that out lol 😆
@neur303 · 2 years ago
1:13 Wouldn't there be two off-center points where the minimum distance is maximized? The distance from the points on the shorter diagonal would increase if I moved the point along the longer diagonal, so the center can't be a maximum. What did I misunderstand?
@spacenoodles5570 · 2 years ago
You are right, but the max distance is less than in a square
@neur303 · 2 years ago
Aaah, the argument was about the upper bound being sqrt(2)/2. Now the sides being
@lexinwonderland5741 · 1 year ago
alright aside from a brilliant illustration of a topic i've almost never seen discussed, your memes are fucking HILARIOUS keep it up m8
@soyoltoi · 1 year ago
cool
@the_hidden_library · 1 year ago
I'd say this can be understood as a rounding argument. Instead of rounding to the closest vertex, round the coordinates probabilistically using Bernoulli random variables: a value closer to 1 has a higher probability of becoming 1, and so on.
@syst3ms413 · 1 year ago
I think that here by rounding you actually mean "finding the closest vertex", which yes, is the object of the problem.
@the_hidden_library · 1 year ago
@@syst3ms413 No, I meant a random rounding. Let p = (p_1, ..., p_n) and let X_i ~ Be(p_i) be independent. Consider the random vertex V = (X_1, ..., X_n) of the parallelotope. Then E[|V - p|^2] can be computed as you did. The interpretation is that V is a "random rounding" of p: the i-th coordinate becomes 1 with probability p_i, so a coordinate close to 1 has a high propensity to be rounded to 1. It makes intuitive sense to me that this way you get a vertex close (not necessarily closest) to p with high probability. As a general philosophy, I think of probability as the mathematics of the very complex rather than the mathematics of randomness (of course this is just philosophy). Thus it's no wonder it appears as a method rather than just as a subject.
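The random-rounding computation described above can be carried out exactly for the unit cube (where vertices are 0/1 vectors). A minimal sketch (my own illustration; the point p is an arbitrary choice):

```python
from itertools import product

# "Random rounding" on the unit cube: round coordinate p_i to 1 with
# probability p_i, and compute E[|V - p|^2] exactly by enumerating all
# vertices. Independence gives the closed form sum_i p_i(1 - p_i),
# which is at most n/4. The point p is an arbitrary illustration.
p = [0.1, 0.4, 0.8, 0.95]
n = len(p)

expected_sq_dist = 0.0
for vertex in product([0, 1], repeat=n):
    prob = 1.0
    for v, pi in zip(vertex, p):
        prob *= pi if v == 1 else 1 - pi
    expected_sq_dist += prob * sum((v - pi) ** 2 for v, pi in zip(vertex, p))

closed_form = sum(pi * (1 - pi) for pi in p)
print(expected_sq_dist, closed_form, n / 4)
```

Each coordinate contributes Var(X_i) = p_i(1 − p_i) ≤ 1/4, which is where the n/4 bound, and hence sqrt(n)/2, comes from.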
@syst3ms413 · 1 year ago
Okay, I see your point. Sure, why not!
@buidelrat132 · 1 year ago
Funny yellow dog does wacky and uncharacteristic math problem
@MaxG628 · 1 year ago
At 11:08, without looking up the formula somewhere else, should there be a division by n? And maybe a limit as n goes to infinity?
@MaxG628 · 1 year ago
Wait, I bet n is the number of dimensions we’re using.
@RomanNumural9 · 1 year ago
I follow the proof but I don't quite follow how having at least one point with distance less than √n/4 shows that the maximum distance is achieved at √n/4. Wouldn't you want to show there are no points with distance greater than √n/4?
@DarGViD · 1 year ago
You fix an arbitrary point inside the parallelotope, then choose a vertex of the parallelotope at random. If on average the distance to the fixed point is at most sqrt(n)/2, then at least one of the vertices is close enough. So for an arbitrary point we have shown there is a vertex within distance sqrt(n)/2 of it.
@syst3ms413 · 1 year ago
As the previous comment says, the point I fixed at the beginning is perfectly arbitrary. If I can show the theorem for an arbitrary point, then it's true for all points.
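The averaging step in this thread ("the expectation is small, so some vertex is close") can be made concrete on the unit cube. A small sketch (my own illustration; the point p is random but seeded):

```python
import math
import random
from itertools import product

# The averaging argument: if the expected squared distance from p to a
# random vertex is at most n/4, then at least one vertex has squared
# distance at most n/4, since a minimum never exceeds a weighted mean.
random.seed(2)
n = 5
p = [random.random() for _ in range(n)]

sq_dists, weights = [], []
for vertex in product([0, 1], repeat=n):
    w = 1.0
    for v, pi in zip(vertex, p):
        w *= pi if v == 1 else 1 - pi
    weights.append(w)
    sq_dists.append(sum((v - pi) ** 2 for v, pi in zip(vertex, p)))

expected = sum(w * d for w, d in zip(weights, sq_dists))
closest = math.sqrt(min(sq_dists))
print(min(sq_dists) <= expected <= n / 4, closest <= math.sqrt(n) / 2)
```

This is exactly the probabilistic-method pattern: prove a bound in expectation, then conclude that some specific outcome achieves it.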
@adissentingopinion848 · 1 year ago
Probability is real. The math you use to operate on it is hard and true. As long as you don't actually have to work with a dataset somewhere, you can give perfectly accurate analysis. The expected value of a Bernoulli distribution *is* p, because that's what it's defined as. You only get unsure if your sample size is anything below infinity. Just because matrix operations are all over video processing doesn't mean we can't use Graphics Processing Units for machine learning. It's just branding at that point.
@tinymints3134 · 2 years ago
There's promise here, but honestly I was very lost. Lots of confusing words. This seems interesting, but like something for someone with a bachelor's degree, or maybe working on a master's in mathematics. Or maybe I'm just dumb; I'm no genius, lol. I just felt like things were difficult to relate to each other.
@bobh6728 · 1 year ago
Don’t feel bad. I majored in math years ago and I didn’t follow everything. There are a lot of prerequisites to be able to follow everything.
@mathlitmusic3687 · 1 year ago
Yes, this is indeed slightly more advanced than many of the other submissions.