MIT 6.S191 (2023): Convolutional Neural Networks

242,216 views

Alexander Amini

1 day ago

MIT Introduction to Deep Learning 6.S191: Lecture 3
Convolutional Neural Networks for Computer Vision
Lecturer: Alexander Amini
2023 Edition
For all lectures, slides, and lab materials: introtodeeplearning.com
Lecture Outline
0:00 - Introduction
2:37 - Amazing applications of vision
5:35 - What computers "see"
12:38 - Learning visual features
17:51 - Feature extraction and convolution
22:23 - The convolution operation
27:30 - Convolutional neural networks
34:29 - Non-linearity and pooling
40:07 - End-to-end code example
41:23 - Applications
43:18 - Object detection
51:36 - End-to-end self-driving cars
54:08 - Summary
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us @MITDeepLearning on Twitter and Instagram to stay fully-connected!!

Comments: 110
@nynaevealmeera · 1 year ago
We are so lucky to be alive at a time when we can attend these types of lectures for free
@CharleyMusselman · 10 months ago
Yeah! What an age for self-education, MITx to Wikipedia to ArXiv!
@gondwana6303 · 1 year ago
Here's what I love about your lectures: You give the intuition and logic behind the architectures and this helps a lot as opposed to the stone tablet thrown down from the heavens approach. Not only is this important for learning but it also stimulates intuition for the next set of innovations!
@naumbtothepaine0 · 8 months ago
Totally true. I only learned about CNNs yesterday; my prof talked for an hour and a half and I didn't understand anything at all, partly because I was tired, but this MIT lecture makes it so easy to grasp all these concepts.
@axel1rose · 1 year ago
This entire series on Deep Learning is a great pleasure to listen to and brainstorm about. There are limitless possibilities for AI applications, and I'm highly inspired for some of them.
@hoami8320 · 1 year ago
I'm self-studying deep learning without going through any school, so I need sharers like you. Thank you very much!
@Rashminagpal · 1 year ago
Such a brilliant session! I am totally in awe of this course, and loved the way Dr. Alex dissects the concepts in a simplified way!
@AAmini · 1 year ago
Thank you!
@labjujube · 1 year ago
Thank you very much for sharing!
@Nestorghh · 1 year ago
The videos, slides, and explanations keep getting better.
@nikteshy9131 · 1 year ago
Thanks, Alex Amini and MIT! 🥰😊
@hchattaway · 1 year ago
This free course on YouTube is WAY better than a $2k course I took online from Carnegie Mellon University on CV... These MIT lectures are far more in-depth and provide much better examples.
@MuhammadAltaf146 · 1 year ago
I am in awe. You have delivered these concepts so beautifully that I didn't need to look at other resources. I recently made a switch to this field, and you happened to be my biggest motivator to pursue it further. Thank you.
@ayanah4821 · 20 days ago
I really appreciate you posting this material!! Thank you 🙏
@shahidulislamzahid · 1 year ago
We are waiting. Thanks, Alex Amini!
@bhairavphukan3267 · 1 year ago
Hello Alex! It’s great to join your class here 👍
@nbharwad4588 · 2 months ago
Thank you so much Alex. So much learning from you. God Bless you. 😊😊
@nizarnizo7225 · 1 year ago
Convolutional neural networks are one of my passions, and with MIT it's an art.
@md.sabbirrahmanakash7083 · 1 year ago
Thank you for uploading this video ❤
@opalpearl3051 · 18 days ago
Thank you for sharing this wonderfully put-together course for the general public's benefit. I would love to get some insight into what goes on in the lab work the students go through as an adjunct to the course lectures. Will that be possible in the future?
@avivg643 · 7 months ago
Thank you so much for this lecture!
@SukhdeepSingh-bj1sl · 10 months ago
Love from India. I'm not able to study at MIT, but this series helps me a lot, and I hope it helps lots of people. If you could add lab lectures showing how to build this practically, it would be a great honor.
@manutube500 · 10 months ago
Great Lecture. Thank you very much!
@aritraroy4275 · 1 year ago
Wow!! Really awesome lecture, Alex sir. Nice explanation with perfect slides.
@akashmechanical · 1 year ago
It's unbelievable that you're doing this for free. Thanks a lot, Sir. Your explanation is very clear and easy to follow. Thanks again, Sir.
@bestnews576 · 1 year ago
Thanks sir for this wonderful explanation.
@Antagon666 · 8 months ago
This presentation is really well put together.
@jennifergo2024 · 5 months ago
Thanks for sharing!
@karakusali · 1 year ago
We are waiting excitedly 😀
@ethereum_go_zero_toyear · 4 months ago
Thank you so much! I hope to see the 2024 version as soon as possible (I will brush up on it again).
@brahimferjani3147 · 2 months ago
Great. Thanks for sharing
@drelahej · 7 months ago
Thank you, Alex, for these amazing lessons from you & Ava. Could you please share a little more about how I can learn more about the vision system example you used in the lectures, which helps the visually impaired run the trail?
@abusalehaligh.2745 · 6 months ago
I can just say many thanks! I've been taking online courses on these topics, but none of them made sense to me; now I understand it. Many thanks!
@BruWozniak · 1 year ago
Wow, it's ridiculous: the further it goes, the better it gets. I love every single minute of this course. A huge thank you!
@FalguniDasShuvo · 1 year ago
Awesome!
@monome3038 · 5 months ago
Greatly thankful for your efforts in making these great lectures free and so easily accessible. Thank you, Alexander Amini.
@EngRiadAlmadani · 1 year ago
I hope you'll explain backpropagation in the conv layer.
@jongxina3595 · 5 months ago
There are two formulas: the gradient w.r.t. the weights/filter and the gradient w.r.t. the input. The gradient w.r.t. the weights is just the upstream gradient convolved with the input. The gradient w.r.t. the input is more complex; it's an operation similar to convolution but a bit different, done between the weights and the upstream gradient.
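These two gradients can be sketched in a few lines of NumPy (a minimal single-channel, stride-1 "valid" illustration; function names are ours, and `conv2d` is the cross-correlation that deep-learning frameworks call convolution):

```python
import numpy as np

def conv2d(x, w):
    """Stride-1 'valid' cross-correlation (what DL frameworks call convolution)."""
    k = w.shape[0]
    oh, ow = x.shape[0] - k + 1, x.shape[1] - k + 1
    out = np.zeros((oh, ow))
    for p in range(oh):
        for q in range(ow):
            out[p, q] = np.sum(x[p:p + k, q:q + k] * w)
    return out

def grad_filter(x, dout, k):
    """dL/dw: the input cross-correlated with the upstream gradient dout."""
    g = np.zeros((k, k))
    oh, ow = dout.shape
    for i in range(k):
        for j in range(k):
            g[i, j] = np.sum(x[i:i + oh, j:j + ow] * dout)
    return g

def grad_input(dout, w, x_shape):
    """dL/dx: a 'full' convolution of dout with the flipped filter,
    written here as a scatter-add of each output position's contribution."""
    g = np.zeros(x_shape)
    k = w.shape[0]
    for p in range(dout.shape[0]):
        for q in range(dout.shape[1]):
            g[p:p + k, q:q + k] += dout[p, q] * w
    return g
```

Both gradients can be checked against finite differences, which is the usual sanity test for hand-written layers.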
@gabeohlsen3711 · 2 months ago
@jongxina you have the greatest username on the internet.
@user-ov8gi2oh7w · 5 months ago
What an insightful lecture! Much appreciated, Prof. Alexander.
@eee8 · 9 months ago
Alexander Amini has splendid presentation skills
@AbulHassankakakhel · 1 year ago
Now I understand how the whole CNN works. Great explanation!
@arohawrami8132 · 6 months ago
Thanks a lot.
@muhannedalsaif153 · 3 months ago
Thank you!
@ShaidaMuhammad · 1 year ago
Hello Alexander, please make a dedicated video on reinforcement learning from human feedback.
@SuddenlySubtle · 8 months ago
Damet garm (Persian: "well done"), Professor Amini. What a pleasure to take these sessions.
@RajabNatshah · 11 months ago
Thank you :)
@ramanraguraman · 1 year ago
Thank you, Dr.
@mehdismaeili3743 · 1 year ago
Excellent.
@ee96072 · 4 months ago
The DL MIT classes are great overall, but there are three small errors in this lecture; please correct them if you can:
- As mentioned in earlier comments, fully connected networks do keep spatial relationships; they actually retain spatial structure even more rigidly than CNNs.
- A CNN can be seen as a fully connected network with weight sharing, and the great advantage is forcing the network to produce the same feature for the same input anywhere in the image (this makes the network spatially equivariant, sometimes loosely called spatially invariant). Of course, CNNs also require less compute.
- Pooling, while it effectively reduces image size, has the main objective of spatial invariance, meaning we can shift the image and still get the same feature at some latent level (up to a point).
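The pooling point above can be seen in a toy NumPy sketch (our own illustration, assuming non-overlapping 2x2 max pooling): a one-pixel shift of a feature that stays inside the same pooling window leaves the pooled map unchanged.

```python
import numpy as np

def maxpool_2x2(x):
    """Non-overlapping 2x2 max pooling of a 2-D feature map."""
    h, w = x.shape
    return x.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

feature_map = np.zeros((8, 8))
feature_map[2, 2] = 1.0                      # a detected feature
shifted = np.roll(feature_map, 1, axis=1)    # the same feature, one pixel right

# the shift stays inside one 2x2 window, so pooling absorbs it
print(np.array_equal(maxpool_2x2(feature_map), maxpool_2x2(shifted)))  # True
```

A shift that crosses a window boundary would change the pooled map, which is why the invariance only holds "up to a point".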
@L4ky13 · 1 year ago
Great lecture! But last week Ava said this year's CV lecture would be about Vision Transformers.
@johnpaily · 1 month ago
It calls for knowing the root of consciousness and creativity in life
@user-cu2ze2jn1n · 4 months ago
Sir, I am fond of deep learning, and these lectures are amazing. Could you please share some of what you do in the lab? I get really curious about that. It would be amazing if those algorithms could be used directly in the form of code.
@ArunKumar-eu4sc · 5 months ago
Thanks a lot.
@lestatdelamora · 4 months ago
Great lectures. Are the lab portions of the course going to be available?
@kirankumar31 · 1 year ago
I get so excited about the use cases and various possibilities of using CNNs. Excellent presentation. A master class in simplifying a complex subject.
@abdullahiabdislaan8907 · 1 year ago
Alex, I want to ask: the last lecture was on sequence modeling, and on the website there's a code lab related to that lecture. Can I walk through it on my own, or are you going to assign it?
@yurykalinin384 · 10 months ago
Super 👍
@johnpaily · 1 month ago
It is time we have to go further to sense, smell and feel. For this we need to look deep into life. The future exists in mimicking life. Knowing life beyond the mind and going inward.
@NeerajSharma-yf4ih · 9 months ago
Hi, after the CNN runs and the pixels are flattened, can we add a VAE with a GAN to model the probability distribution of the flattened input and also produce alternative distributions, like CycleGAN's road-to-map translation? Am I connecting this correctly, or should I watch the videos again? Thank you for the videos.
@alexe3332 · 6 months ago
So the random-box approach is an O(n^2) algorithm, and the picture's parameters are essentially just color density and location.
@RahulGupta-sj8fn · 7 months ago
Great lecture and amazing teaching, but I am having difficulty grasping the lab code. Are there any resources or a better way to approach it?
@salmataha4127 · 1 year ago
Where can I find the TensorFlow labs to practice?
@mudasserahmad6076 · 4 months ago
Is converting audio to mel spectrograms and classifying them with image-classification models the right approach?
@joshuarodriguez2219 · 1 year ago
Min 41:17: why do we use 1024 units in the layer right before the output?
@doctorshadow2482 · 1 year ago
Thank you to the author. Did anybody get from this video how all this works under shift/rotation/scale of the image?
@saliexplore3094 · 11 months ago
Thanks, Alex, for sharing these lectures online. A quick comment about the fully connected layer causing loss of spatial information @14:40: I don't think fully connected layers result in spatial information loss. All your network has to do is identify that certain indices in the flattened vector correspond to specific locations in the spatial map. We may lose some translation/spatial invariance, but not necessarily spatial information.
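A tiny NumPy check of that claim (our own illustration): flattening is a fixed, invertible re-indexing, so the spatial layout is recoverable; what a dense layer gives up is weight sharing and built-in equivariance, not the information itself.

```python
import numpy as np

img = np.arange(16).reshape(4, 4)   # a 4x4 "image"
flat = img.flatten()                # what a fully connected layer sees

# pixel (r, c) always lands at index r*4 + c, so location is preserved...
assert flat[2 * 4 + 3] == img[2, 3]

# ...and the mapping is invertible: nothing spatial was destroyed
assert np.array_equal(flat.reshape(4, 4), img)
```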
@swerve-dz4cr · 1 month ago
Wow, how I wish I could study at MIT.
@sriharinakerakanti2193 · 1 year ago
I'm waiting.
@sriharinakerakanti2193 · 1 year ago
Hello Alexander, I'm a big fan of your explanations. I'm doing a data science and machine learning course from the University of Maryland via upGrad. Thanks for posting these videos on YouTube; they will help many students who want to learn AI and ML.
@emanuelthiagodeandradedasi5918 · 1 year ago
Hello, I'm giving a course at my university in Brazil about machine learning, and I would like to ask permission to use some of your slides and translate your material for the next lesson, which is about CNNs.
@user-kk5cv1rs5r · 1 month ago
Should we understand all of this as software developers? Do we need all this theoretical stuff?
@vitalispyskinas5595 · 1 year ago
The math at 32:13, a double sum, is incorrect. First, the filter is indexed with i, j starting at 1, so the input and output matrices are presumably also indexed from 1. This means the first output has p = 1, but since this is added to i, the row index of x would start at 2; we should be adding p-1 rather than p, and q-1 rather than q. Second, the stride is meant to be 2, so the filter starts at double the output position: instead of (p-1) we need to add 2(p-1). In conclusion, the subscript of x should be i + 2(p-1), j + 2(q-1) ... unless it shouldn't and I made a mistake 💀 Otherwise, loving the lectures 👍
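The corrected subscript can be checked numerically (a quick sketch with made-up toy values; in 0-based Python indexing the 1-based x_{i+2(p-1), j+2(q-1)} becomes x[2p + i, 2q + j]):

```python
import numpy as np

x = np.arange(36, dtype=float).reshape(6, 6)   # toy 6x6 input
w = np.arange(9, dtype=float).reshape(3, 3)    # toy 3x3 filter
stride = 2

# reference: slide the 3x3 window explicitly with stride 2
ref = np.zeros((2, 2))
for p in range(2):
    for q in range(2):
        ref[p, q] = np.sum(x[stride * p:stride * p + 3,
                             stride * q:stride * q + 3] * w)

# the comment's index formula, written element-wise
out = np.zeros((2, 2))
for p in range(2):
    for q in range(2):
        for i in range(3):
            for j in range(3):
                out[p, q] += w[i, j] * x[stride * p + i, stride * q + j]

print(np.allclose(out, ref))  # True
```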
@jamesperry4470 · 1 year ago
Aren't NNs always fully connected? I just assumed they were from the math, unless a given weight is zero.
@hilbertcontainer3034 · 1 year ago
Waiting for the third lesson~
@laminsesay8299 · 1 year ago
😍
@deepakspace · 1 year ago
Can we get access to the software labs for some hands-on learning? I know the code is available, but it would be great to have something where we can learn from scratch.
@ahsenali7050 · 1 year ago
The best CNN tutorial on earth.
@forheuristiclifeksh7836 · 1 month ago
7:00
@forheuristiclifeksh7836 · 1 month ago
7:38
@hoami8320 · 1 year ago
I was very impressed when I heard that the transformer model was created by a Vietnamese person
@forheuristiclifeksh7836 · 1 month ago
12:00
@jeschelliah9968 · 5 months ago
Hi Alexander Amini: a vivid, comprehensible, learner-sensitive presentation! Thanks! The AI sector is currently the exclusive domain of a mighty minority elite, compared to the billions of individuals who are undoubtedly its future users: the fisherman, the English teacher, engineers, the ballet dancer, a diverse clientele at all levels of humanity, all on mobile. MIT and the AI startups should urgently expedite these kinds of classes to ensure literacy in AI use! Please, ASAP. 15.12.2023
@justinfleagle · 6 months ago
42:00
@mehmetaliyavuz5023 · 8 months ago
29:00
@linfan619 · 29 days ago
"How to drive and steer this car into... the future"😄
@steel_gaming847 · 5 months ago
Hello, can anyone help me with this please? (Restructured below; the first TODO should multiply the input by the weight matrix, i.e. tf.matmul(x, self.W), not tf.matmul([x, self.n_output_nodes]).)

### Defining a network Layer ###
# n_output_nodes: number of output nodes
# input_shape: shape of the input
# x: input to the layer
class OurDenseLayer(tf.keras.layers.Layer):
    def __init__(self, n_output_nodes):
        super(OurDenseLayer, self).__init__()
        self.n_output_nodes = n_output_nodes

    def build(self, input_shape):
        d = int(input_shape[-1])
        # Define and initialize parameters: a weight matrix W and bias b
        # Note that parameter initialization is random!
        self.W = self.add_weight("weight", shape=[d, self.n_output_nodes])  # note the dimensionality
        self.b = self.add_weight("bias", shape=[1, self.n_output_nodes])    # note the dimensionality

    def call(self, x):
        '''TODO: define the operation for z (hint: use tf.matmul)'''
        z = tf.matmul(x, self.W)   # fixed: was tf.matmul([x, self.n_output_nodes])
        '''TODO: define the operation for out (hint: use tf.sigmoid)'''
        y = tf.sigmoid(z + self.b)
        return y

# Since layer parameters are initialized randomly, we will set a random seed for reproducibility
tf.random.set_seed(1)
layer = OurDenseLayer(3)
layer.build((1, 2))
x_input = tf.constant([[1, 2.]], shape=(1, 2))
y = layer.call(x_input)

# test the output!
print(y.numpy())
mdl.lab1.test_custom_dense_layer_output(y)
@TheMortimor2 · 1 year ago
You try to figure out how to program the human mind, but you can't until you are able to create that spark of consciousness, that divine particle that makes a brain a brain.
@user-pi2db1ss2h · 5 months ago
Can you develop an AI that will teach me how to learn deep learning?
@laminsesay8299 · 1 year ago
I think I was the only one waiting 😅
@TheMortimor2 · 1 year ago
Lamin Sesay @laminsesay8299, two videos ago you asked me if we can stay in touch. My answer is: everyone can find me.
@TheMortimor2 · 1 year ago
The raw data comes from the universe where we live.
@TheMortimor2 · 1 year ago
The spark is the drive, but the drive changes according to the input. And it is exhaustible; it is the human body that dies. That's your second problem hahaha. And now tell me whether the program that runs in a person is created by genes or by that spark. I would describe it as a synopsis.
@alexanderskusnov5119 · 1 year ago
Don't use a dark theme for code: many characters are barely visible.
@TheMortimor2 · 1 year ago
I wonder if two AIs can argue with each other, haha. And there could be a problem if you have 100 AIs. 😂
@mysteriouscommentator · 7 months ago
In reinforcement learning there is a concept called a "multi-agent environment", where multiple agents, each with its own neural network (or "AI"), interact with each other directly.
@MohammedSaqib1 · 8 months ago
Why is Lena still used in these lectures?
@nandadulalbakshi3121 · 3 months ago
Spider net
@TheMortimor2 · 1 year ago
If I were to describe to you how I perceive the world, you would turn the brown thing into a textile.
@TheMortimor2 · 1 year ago
you're still only describing how the human eye works and what it sees turning into sex. 😂
@TheMortimor2 · 1 year ago
All man yone 2 videos blady chicken hahah
@agustinvillagra5172 · 9 months ago
Where do I find the labs for the practice?
@forheuristiclifeksh7836 · 1 month ago
40:00