Lecture 1: Introduction to Information Theory

342,084 views

Jakob Foerster

Comments: 140
@eul3rr • a day ago
As a math student interested in information theory and neural networks, I discovered this gem of a lecture series when I was looking for videos to fall asleep to! In fact I'd finished the lectures while I was sleeping :D Now I've decided to start it properly and just finished watching this lecture and taking notes. I would love to send David a mail when I finish the course. Thanks for leaving this behind, my man, rest in peace.
@mohamedrabie4663 • a year ago
RIP Prof. David, you were, and still are, a great inspiration to us.
@eddiehazel1259 • 7 months ago
ah man sad to hear 😔
@prajwolgyawali6770 • 4 years ago
Fundamental Problem: Reliable communication over an unreliable channel
Binary Symmetric Channel: 9:35
Disk Drive Problem: 11:30
Redundancy: 23:00
Repetition Code: 24:35
Decoding is inference
Inverse Probability: 31:33
Forward Probability: 40:30
Hamming Code: 47:10 (kzfaq.info/get/bejne/m8-odqqiydKrqIU.html%29)
Capacity of channel: 58:10
Shannon Noisy Channel Coding Theorem: 58:45
The Weighing Problem: 1:00:50
@wampwamp1458 • 2 years ago
thank you :)
@fjs1111 • 2 years ago
Unreliable channel = Unreliable information
@lesliefontenelle7224 • 8 years ago
I am not involved in information technology but this lecturer is making a difficult subject like information theory look so easy. You really must see this...
@shellingf • 2 years ago
kinda boring though
@the_anuragsrivastava • 3 years ago
One of the best lectures on "information theory and coding" I have ever seen... love from India 🇮🇳🇮🇳🇮🇳🇮🇳
@linlinzhao9085 • 5 years ago
Dr. MacKay is a great explainer. Anyone interested in machine learning and Bayesian statistics can also read his doctoral thesis.
@iwtwb8 • 8 years ago
I was saddened to read that David MacKay passed away earlier this year.
@JTMoustache • 8 years ago
what a great teacher.. he will live on :)
@wunanzeng7051 • 7 years ago
He was a truly great teacher! Very articulate!
@yousify • 7 years ago
I was shocked when I read your comment. I'm following his lectures and his book for this course; it is sad news.
@jimmylovesyouall • 7 years ago
what a great teacher.. he will live on
@bayesianlee6447 • 6 years ago
RIP to a great teacher of human beings. His passion and dedication to technology will live on for those who come after him.
@JerryFrenchJr • 7 years ago
How am I just now discovering this lecture series??! This is awesome!
@JakobFoerster • 7 years ago
better late than never! glad you are finding it useful
@anantk2675 • 5 years ago
i got it now bro, i am later than ya ; )
@siweiliu9925 • 2 years ago
@@anantk2675 I'm later than you, hhhh
@trueDeen911 • 11 months ago
@@siweiliu9925 i am later than you
@dragonfly3139 • 7 years ago
Thank you for these great lectures; your memory will live on... RIP
@IrfanAli-jl7vb • 5 years ago
Thank you, thank you, thank you for sharing these excellent video lectures. Dr MacKay is amazing at teaching complicated topics. These lectures are a great supplement to his excellent book on information theory, which has so many excellent plots and graphs that enable one to visualize information theory. Information theory comes alive in pictures. Thank you for sharing these.
@AvindraGoolcharan • 4 years ago
This may be the longest blackboard I've ever seen
@hyperduality2838 • 4 years ago
Repetition (redundancy) is dual to variation -- music. Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle. Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics. Randomness (entropy) is dual to order (predictability) -- "Always two there are" -- Yoda. Teleological physics (syntropy) is dual to non teleological physics. Duality: two sides of the same coin.
@a_user_from_earth • 8 months ago
what an amazing lecture and great teacher. May you rest in peace. You will not be forgotten.
@baganatube • 7 years ago
I don't mind the slowness. With some decoding, my brain is receiving a cleaner signal with that extra redundancy.
@sadimanesadimane6746 • 4 years ago
@baganatube also, does he have to write EVERYTHING?
@monazy11 • 7 years ago
It was amazing. Finally I learned the Shannon noisy channel theorem: there exists an encoding and decoding system that can reach the capacity of the channel. So an error-correcting and error-detecting course is about learning these encoding and decoding systems. Wow, amazing. Lots of thanks to the teacher.
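For reference, the capacity the theorem refers to has a simple closed form for the binary symmetric channel used throughout this lecture (a standard result, not a quote from the video): C = 1 - H_2(f), where H_2(f) = f*log2(1/f) + (1-f)*log2(1/(1-f)) is the binary entropy of the flip probability f. For f = 0.1, H_2(0.1) ≈ 0.469, so C ≈ 0.53 bits per channel use, and the theorem says any rate below C is achievable with arbitrarily small error probability.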
@AlexandriaRohn • 5 years ago
00:30 Information Theory invented by Claude Shannon to solve communication problems. Fundamental problem: Reliable communication over an unreliable channel. e.g. [Voice->(Air)->Ear], [Antenna->(Vacuum)->Mars Rover], [Self->(magnetized film)->Later Self]
04:00 Received signal is approximately equal to the transmitted signal because of added noise.
05:45 What solutions are there for having received and transmitted signal be the same? Either physical solutions or system solutions.
07:30 Source message -> [Encoder] -> Coded transmission -> [Channel (noise introduced)] -> Received message -> [Decoder] -> Best guess at original source
08:45 Encoder is some system that adds redundancy. Decoder makes use of this known system to try to infer both the source message and n.
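A minimal sketch of that encoder/channel/decoder pipeline, using the R3 repetition code from the lecture (illustrative Python; the function names and parameters are my own, not from the video):

```python
import random

def encode_r3(bits):
    # Repetition code R3: transmit each source bit three times.
    return [b for b in bits for _ in range(3)]

def bsc(bits, f, rng):
    # Binary symmetric channel: flip each bit independently with probability f.
    return [b ^ 1 if rng.random() < f else b for b in bits]

def decode_r3(received):
    # Majority vote over each block of three received bits.
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

rng = random.Random(0)
source = [rng.randint(0, 1) for _ in range(100000)]
guess = decode_r3(bsc(encode_r3(source), f=0.1, rng=rng))
error_rate = sum(s != g for s, g in zip(source, guess)) / len(source)
print(error_rate)  # around 3*f^2 - 2*f^3 = 0.028 for f = 0.1
```

Majority voting is the "decoding is inference" point in miniature: with a uniform prior, the most probable source bit is the one agreeing with the majority of its received block.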
@george5120 • 5 years ago
So nice to watch a video like this that does not have music.
@fireflystar5333 • 2 years ago
This is a great lecture. The design of the case problems is really helpful here. Thanks for the lecture.
@kikirizki4318 • 4 years ago
Thank you very much, Prof. David MacKay; your lecture helps me understand information theory. Btw, I live in a country where toilets are not so common.
@the_anuragsrivastava • 3 years ago
Where are you from?
@TheNiro87 • 2 years ago
This is great, thank you! The lecture is far more entertaining than just reading the book.
@HeMan-tm8wl • 2 months ago
Which book is it?
@palfers1 • 5 years ago
Genius camera work
@rafaelespericueta734 • 5 years ago
Indeed so. It's so frustrating and irritating when the camera focuses on the lecturer while you really want to look at the slide with the plots and equations.
@oscarbergqvist4992 • 5 years ago
Thank you for a great lecture; looking forward to following the rest and studying the book!
@abhishekpal5871 • 8 years ago
This really helps. I really wanted to learn information theory. This video series is really easy to understand and awesome.
@reyazali2768 • 6 years ago
awesome example of teaching style
@leduran • 2 years ago
These lectures are great. Thanks for sharing.
@jedrekwrzosek6918 • 2 years ago
I looove the lectures! Thank you for the upload!
@dr.alaaal-ibadi8644 • 3 years ago
I like this channel. I'm already teaching this topic to students in Iraq, in Arabic.
@qeithwreid7745 • 4 years ago
This is fantastic I love it.
@Handelsbilanzdefizit • 8 years ago
I would use SUDOKUS for communication. Only a few numbers have to be transmitted correctly, and the other numbers/information can be restored by the decoder :-D
@Hastur876 • 5 years ago
The problem is that only the sudoku numbers that you transmit count as information: the rest of the numbers are constrained by having to follow the rules of Sudoku, and can't be any numbers you want, thus they can't be information. So you're still having to get 100% data transmission.
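A number makes this concrete (standard Sudoku counting, not from the video): there are roughly 6.67 x 10^21 valid completed Sudoku grids, so a full grid carries only about log2(6.67 x 10^21) ≈ 72.5 bits of information, far less than the 81 * log2(9) ≈ 257 bits you might naively assign to 81 free digits. The constraints soak up the difference, which is exactly why they cannot be used to smuggle extra data through the channel.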
@RippleAnt • 9 years ago
Ah! thanks, was looking for a good series... this is just the one. A great cliffhanger at the end to be precise... ^_^
@GOODBOY-vt1cf • 2 years ago
15:52
@GOODBOY-vt1cf • 2 years ago
9:50
@gadepalliabhilash7575 • 14 days ago
Could anyone explain how the flip probability is 10^-13 and the 1% failure is 10^-15? Video reference is at 22:10.
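A hedged sketch of the arithmetic behind those two numbers (my reading of the example, assuming the drive transfers on the order of N = 10^13 bits): for small f, the probability of at least one flip is roughly N*f. With f = 10^-13 that product is about 1, so errors are essentially guaranteed; demanding that the whole transfer fail with probability of only about 1% means N*f ≈ 0.01, i.e. f ≈ 10^-15, a hundred times smaller.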
@GOODBOY-vt1cf • 2 years ago
32:13
@spring74light • 9 years ago
Cheering, in these days of advanced educational gizmology, to see Professor MacKay making extensive and effective use of a stick of chalk and a polychromatically absorbent surfaced board. [Tyneside, England.]
@ciceroaraujo5183 • 5 years ago
Thank you professor
@ozgeozcelik8921 • 9 years ago
awesome! thanks for sharing
@papatyavanroode2329 • 2 years ago
3/8, 3/8, 2/8 for the cake and siblings: that's the most equal way to share. In the third cake they are closest to equal; 4/8, 2/8, 2/8 is far from equal.
@christopheguitton7523 • 11 months ago
Now I understand why repeating the same information to my children three times in a row wasn't necessarily effective :-)
@deeplearningpartnership • 4 years ago
Thanks for these.
@oyindaowoeye467 • 10 years ago
I wonder if he was making a joke when he said "Mr. Binomial" or if he actually meant to say "Bernoulli".
@vik24oct1991 • 6 years ago
He is being sarcastic, I think: he finds it silly that distributions are named after the people who discovered them, when it would have been more logical to name them after what they represent (which "binomial" does).
@PierrePark • 4 years ago
@@vik24oct1991 He doesn't find it silly; he is just being silly himself. Because so many things are named after someone, he's joking by extending that to the binomial.
@Omar-th5vv • 2 years ago
I'm new to information theory. At 4:24, why is the received signal "approximately" and not "equal" to the transmitted signal + noise? Like, what else is there other than the transmitted signal and the noise? Thanks for sharing such helpful lectures.
@caribbeansimmer7894 • 2 years ago
It gets extremely difficult to model everything that affects the signal, but it's relatively easy to add noise to it. And not just any noise: the usual assumption is that the noise is additive (hence the plus sign), white, and Gaussian (normally distributed). As you would know from statistics, a normal distribution has nice properties for estimation. This stuff gets really complex, but I guess you get the idea. Note that we can make other assumptions about the characteristics of the noise.
@nishantarya98 • a year ago
Received = Transmitted + Noise makes a *lot* of simplifying assumptions! In the real channel, the noise might be multiplied, not added. The transmitted signal might go through weird transformations, like you might receive log(Transmitted), or it might be modeled as a filter which changes different parts of your Transmitted signal in different ways!
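A tiny simulation of the simplest of those models, the additive Gaussian assumption discussed above (illustrative only; the symbol mapping and noise level are my own choices, and as noted, real channels can be multiplicative or filtered instead):

```python
import numpy as np

rng = np.random.default_rng(0)

t = rng.choice([-1.0, 1.0], size=100_000)    # transmitted symbols
sigma = 0.5                                  # assumed noise standard deviation
r = t + sigma * rng.standard_normal(t.size)  # received = transmitted + Gaussian noise

t_hat = np.where(r >= 0, 1.0, -1.0)          # simple threshold decoder
print("symbol error rate:", np.mean(t_hat != t))  # about 2.3% for sigma = 0.5
```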
@cupteaUG • 9 years ago
Any idea about the final puzzle? My answer is 3. :-)
@deepkushagra • 4 years ago
At 46:50, what is "rate"? I guess it is 1/(number of repetitions), but what does it mean in layman's terms?
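For what it's worth (the standard definition, which matches how I read the lecture's usage): the rate is R = K/N, the number of source bits conveyed per transmitted bit. For the repetition code R3, K = 1 and N = 3, so R = 1/3; for the (7,4) Hamming code introduced later, R = 4/7. In layman's terms, it is the fraction of what you send that is actual message rather than redundancy.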
@Rockyzach88 • 2 years ago
"People need 20 GB drives nowadays" Laughs in Call of Duty
@artmaknev3738 • 10 months ago
After listening to this lecture, my IQ went up 20 points!
@terrythibodeau9265 • 2 years ago
I am curious to know how Shannon would have interpreted the internet as part of his theory... noise, perhaps.
@motikumar1442 • 6 years ago
Helpful lecture
@forheuristiclifeksh7836 • 5 months ago
2:00
@ncckdr • 9 years ago
very great theory
@sahhaf1234 • a year ago
Towards 56:00 he uses the terms "bit error" and "block error" but doesn't define them properly.
@derekcrone4679 • 10 years ago
is dis how u mak a plane
@AtharvSingh-vj1kp • 7 months ago
I'm a bit curious about when these Lectures were recorded. Was it 2003??
@aritraroygosthipaty3662 • 5 years ago
38:19 I am unsure about the 1/2 in the denominator. According to my calculations, P(r=011) = (1-f)f^2 + f(1-f)^2, by the sum rule. The numerator should have a 1/2 due to the P(s=1) term, so my final answer is P(s=1|r=011) = (1-f)/2. Could anybody help me with my concerns?
@dlisetteb • 3 years ago
When you add up the probability of r given s=1 and given s=0, you must include the probability of each of those events. It results as:
P(r=011) = P(r=011, s=0) + P(r=011, s=1)
P(r=011) = P(r=011|s=0) * P(s=0) + P(r=011|s=1) * P(s=1)
P(r=011) = (1-f)f^2 * 1/2 + f(1-f)^2 * 1/2
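Carrying that one step further (standard Bayes' rule algebra, using the uniform prior P(s=0) = P(s=1) = 1/2 from the lecture), the 1/2 appears in both numerator and denominator and cancels:

P(s=1|r=011) = [f(1-f)^2 * 1/2] / [(1-f)f^2 * 1/2 + f(1-f)^2 * 1/2]
             = (1-f) / (f + (1-f))
             = 1 - f

So for f = 0.1 the posterior is 0.9, not (1-f)/2; the prior factor does belong in the numerator, it just cancels with the same factors in the denominator.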
@brandnatkinson5981 • 10 years ago
K
@ridwanwase7444 • a year ago
The link for getting the free book is not working; can anybody tell me where I can get that free book? Thanks in advance.
@driyagon • 3 years ago
can someone explain how to solve the homework problems?
@minglee5164 • 4 years ago
I have read the extraordinary book.
@dharmendrakamble6282 • 4 months ago
🎉
@arthurk7270 • 7 years ago
I'm a bit confused. If we're trying to estimate the mean of the Binomial distribution, wouldn't we use the sigma/sqroot(n) formula for the +- bound? In other words, the mean +- the standard deviation of the estimator: 1000 +- 30/sqroot(10000) = 1000 +- 0.3? My statistics is a bit rusty.
@JakobFoerster • 7 years ago
Arthur K thanks for the comment. Which part of the lecture are you referring to?
@arthurk7270 • 7 years ago
Hi. I was referring to the discussion at 14:00.
@JakobFoerster • 7 years ago
Arthur K I believe you are confusing the standard deviation of the sample mean rate with the standard deviation of the mean total count. The standard deviation of the mean rate does indeed drop as 1 / (N)^0.5, while the standard deviation of the total count increases by (N)^0.5. Since we care about the total number of bits flipped, it's about the total count rather than the rate. You can see that multiplying the standard deviation of the rate, 1 / (N)^0.5, with the total number, N, results in a standard deviation of the total ~(N)^0.5. Please let me know if that clarifies.
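To put numbers on the two quantities in that reply (standard binomial formulas, with the lecture's N = 10,000 and f = 0.1): the expected number of flips is N*f = 1,000 with standard deviation sqrt(N*f*(1-f)) = sqrt(900) = 30, hence the 1,000 ± 30 being discussed. The sample *rate*, by contrast, has standard deviation sqrt(f*(1-f)/N) = 0.003, which is the 1/sqrt(N) shrinkage Arthur had in mind.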
@turkiym2 • 8 years ago
Was "Mr. Binomial" a joke that fell on deaf ears, or was it a genuine confusion with Bernoulli?
@MarkChimes • 8 years ago
+omniflection I laughed when I heard it. I take it it was just a very dry joke.
@yltfy • 8 years ago
+Mark Chimes Hahah, me too. This is when I clicked pause and checked the comments...
8 years ago
+Mark Chimes Well hello Mr. Chimes :)
@dharmendrakamble6282 • 2 years ago
Information theory
@izzyr9590 • 5 years ago
I'm learning this at school... yet I'm here watching a lecture on KZfaq... I don't know why... I should pay attention in my class, I guess.
@javatoday5002 • 5 years ago
you're definitely exercising inference
@keshavmittal1077 • 2 years ago
Hey, are there any prerequisites for it?
@vedantjhawar7553 • 3 years ago
Hi, I just wanted to ask what was meant at 24:20 when he says that "1 is the same as 5, and if there was a 4, there would be a 0."
@MN-sc9qs • 3 years ago
1 and 5 are both odd, so they are assigned 1, and 4 is even, so it would be assigned 0.
@vedantjhawar7553 • 3 years ago
@@MN-sc9qs Thanks.
@mustafabagasrawala7790 • 6 years ago
I'm fairly new to this. What is a "flip" ?
@aritraroygosthipaty3662 • 5 years ago
A flip is changing a bit from 1 to 0 or from 0 to 1.
@carloslopez7204 • 4 years ago
What are the requirements to understand this lecture?
@AndreyAverkiev • 7 months ago
As it says, the only piece of mathematics needed is the binomial distribution (16:08).
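For completeness, the distribution being referred to (the standard definition): the probability of exactly r flips among N transmitted bits, each flipped independently with probability f, is P(r | N, f) = C(N, r) * f^r * (1-f)^(N-r), with mean N*f and variance N*f*(1-f). Those two moments are all that the "10,000 bits at f = 0.1 gives 1,000 ± 30" estimate discussed elsewhere in this thread uses.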
@stevealexander6425 • 7 years ago
Good lecture except for the highly distracting camera work.
@pauldacus4590 • 7 years ago
You know he's a jokester cuz at 1:00:13 the slide says his textbook weighs "~35 lbs".
@rarulis • 7 years ago
XD, that's the price: 35 British pounds.
@Hussain1Salman • 7 years ago
Thanks for the lectures. I am just curious about what happened to lecture 2. It says it was deleted.
@JakobFoerster • 7 years ago
Hi Hussain, thanks for catching this. I have reached out to youtube support to find out. Hopefully will be resolved soon.
@JakobFoerster • 7 years ago
Hi Hussain, apparently there was a bug in the YouTube system and they deleted it by accident. The video is back online now.
@MDAZHAR100 • 5 years ago
Binomial is more related to probability topics. Bernoulli is about hydraulics.
@JakobFoerster • 5 years ago
en.wikipedia.org/wiki/Bernoulli_distribution
@circlesinthenight3141 • 7 years ago
RIP David
@afarro • 4 years ago
I was able to achieve f=0.1 for this video with x1.5 speed ...
@GSSIMON1 • 8 years ago
The fact is the received signal is not identical to the sent signal due to corruption and distortion in the process. So how much of the original signal is received, and what would the measurement be, in what unit? ...That's why I do drugs and don't give a damn!
@javatoday5002 • 5 years ago
The measurement would be in bits, and I am not joking.
@user-rl7wg8tp8w • a year ago
channel immigration
@pandasworld4168 • 6 years ago
Hehe, if you're like me, then you have that exam in one week.
@tonewreck1 • 3 years ago
Professor MacKay is certainly extremely competent in the subject, but this really is the most unintuitive way of introducing information theory. We are given the answer right from the start and work our way backwards to see that it is an effective system, instead of trying to understand the problem and find the adequate solution. We are never trying to understand the nature of the problem, but instead made to test the effectiveness of the solution. Typical of classical academic philosophy: let's make knowledge as boring and abstruse as possible so the riff-raff is kept out of our little club!
@vedantjhawar7553 • 3 years ago
Do you know of any sources to check out that might teach the content in the method you are talking about? Thank you.
@tonewreck1 • 3 years ago
@@vedantjhawar7553 try this ...kzfaq.info/get/bejne/htqdrcmhu5yndHk.html
@vedantjhawar7553 • 3 years ago
@@tonewreck1 Thank you. I took a look at it and felt that it described the act of measuring information very thoroughly. Are there any videos you recommend for learning about system solutions for transmission errors?
@tonewreck1 • 3 years ago
@@vedantjhawar7553 I am glad you found it helpful. I suggest to see the other videos in this series. Once you get a feel for what information theory is really about, you can go back to MacKay for error correction and all the nitty gritty, he does the details very effectively.
@renduvo • 2 years ago
@tonewreck1 For some reason I'm unable to see the reply in which you've suggested the source that @Vedant Jhawar has requested. Could you please share the source again?
@PseudoAccurate • 8 years ago
This is painfully slow. I'm sure the information is fantastic but you have to watch him write everything he says on the chalkboard. When I got to him telling the class to discuss how much 10% of 10,000 is I couldn't take it anymore. Can anyone suggest a similar lecture that moves more quickly?
@Meeran • 8 years ago
use the youtube speed feature, set it to 2x, done
@PseudoAccurate • 8 years ago
Lol, nice, good idea.
@unorthodoxresident7532 • 8 years ago
A lecturer's goal ultimately is to share information and for the audience to absorb as much as possible. There are different ways people can do that, and it's different for each person. Some people are better at listening; others absorb a visual representation better (either looking at the chalkboard or writing notes themselves) :)
@PseudoAccurate • 8 years ago
That's completely understandable - that's what it was like when I went to school. It's just that most lecturers now give you the notes themselves, so they don't have to take the time to write much and can spend the lecture explaining the material.
@MlokKarel • 8 years ago
Now, the question wasn't aimed at the 10% of 10k but rather at the +- part, i.e. the std. dev., IMO. Did you get that correctly as well?
@seweetgirlnay • 3 years ago
Where in the fock am I?
@jabbatheplutocrat1074 • 6 years ago
There is no information here, and when will it be realized? Never!!!
@csaracho2009 • 4 years ago
In the first example, p is equal to f? p should be 1-f = 0.90
@abdullahmertozdemir9437 • a year ago
p is equal to f, and q is equal to 1 - f. Since we assumed f to be 0.1, 0.1 * (1 - 0.1) = 0.09, thus giving us the variance of 900 after multiplying 10,000 by 0.09.
@GOODBOY-vt1cf • 2 years ago
13:24
@GOODBOY-vt1cf • 2 years ago
34:43
@alyssag8099 • 4 years ago
8:22