[CVPR'21 WAD] Keynote - Andrej Karpathy, Tesla

44,725 views

WAD at CVPR

Talk given on 2021/06/20.
Andrej is the Senior Director of AI at Tesla, where he leads the team responsible for all neural networks on the Autopilot. Previously, Andrej was a Research Scientist at OpenAI working on Deep Learning in Computer Vision, Generative Modeling and Reinforcement Learning. Andrej received his PhD from Stanford, where he worked with Fei-Fei Li on Convolutional/Recurrent Neural Network architectures and their applications in Computer Vision, Natural Language Processing and their intersection. Over the course of his PhD, Andrej squeezed in two internships at Google, where he worked on large-scale feature learning over YouTube videos, and in 2015 he interned at DeepMind and worked on Deep Reinforcement Learning. Together with Fei-Fei, Andrej designed and taught a new Stanford class on Convolutional Neural Networks for Visual Recognition (CS231n). The class was the first Deep Learning course offering at Stanford and has grown from 150 students enrolled in 2015 to 330 students in 2016 and 750 students in 2017.

Comments: 37
@mydutube 3 years ago
Wow!! Autopilot and Active Safety features are already using the vision-only stack! That's a giant vote of confidence in the viability of their fundamental strategy. There's nothing more appealing for ML and Computer Vision engineers than seeing their work get rapidly deployed in the real world and immediately start adding value to customers.
@my_temporary_name 3 years ago
Amazing work; a pleasure to listen to!
@camelcai 3 years ago
Mind-blowing
@dennissat 3 years ago
It was a great speech, thanks for sharing.
@lixunbai9610 3 years ago
Thanks for sharing
@nightlessbaron 3 years ago
Amazing!
@maximilianosenberg890 1 year ago
I love how Andrej calls it "test time" when you're driving with Autopilot engaged in the real world and hoping it won't mess up :D
@galileo3431 2 years ago
Andrej being the only person on YouTube I can't watch at 1.5x speed 😂
@Martinit0 1 year ago
I double-checked the YouTube speed setting several times, thinking it was stuck at 1.5x. Frankly, I suspect they sped it up 1.25x before upload; Andrej's data rate seems just too high.
@howardlo9040 2 years ago
Tesla's AI Day brought me here. Mind-blowing engineering effort. Hands up.
@foodmaker5771 1 year ago
Thanks for your valuable info
@boomperson818 2 years ago
Very interesting
@josy26 3 years ago
How do they deal with multiple Tesla car sizes?
@XorAlex 3 years ago
Andryukha is a champ, for real
@veloc1tyTV 1 year ago
22:45 is the stuff I find so interesting. I wish I knew more about what's going on under the hood.
@anshidar7104 2 years ago
Does anyone know where I can get the slides?
@jacksnotty2318 2 years ago
I kept waiting during this extremely long video to have this gentleman address the very serious concern about fog. Driving in the fog (which millions of people do every year) will be impossible using vision-only systems. This is where radar is superior. So why are you removing radar from the Model 3 and Y and leaving the radar sensors in the Model S and X? It's as simple as this: the combination of radar and vision is more expensive to install. The more affluent Tesla clients who can afford an S or an X will be safer. I just put a deposit down on a 2021 Tesla Model 3 Performance and then found out from Tesla, after I placed the order, that the radar sensors were removed for the 2021 model. It's unbelievable to me when a company makes a car "less safe" and then charges more money. 🤷🏼‍♂️
@Teslavangelist 2 years ago
Driving in fog isn't possible with radar either, unless vision can see well enough to override radar's errors. But, to your point, cameras might be able to see better in fog if the computer adjusts the images in real time, for example giving them more contrast.
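For concreteness, the kind of per-frame contrast adjustment alluded to here could look something like the minimal NumPy sketch below. It is purely illustrative: the function name and percentile thresholds are invented for the example, and this says nothing about Tesla's actual image pipeline.

```python
import numpy as np

def stretch_contrast(frame: np.ndarray, low_pct: float = 2.0, high_pct: float = 98.0) -> np.ndarray:
    """Toy per-frame contrast stretch: map the [low, high] percentile range of
    pixel intensities onto the full 0-255 range. Foggy frames tend to cluster
    in a narrow grey band, so stretching spreads that band out."""
    lo, hi = np.percentile(frame, [low_pct, high_pct])
    if hi <= lo:                        # nearly flat frame, nothing to stretch
        return frame.copy()
    out = (frame.astype(np.float32) - lo) * (255.0 / (hi - lo))
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: a synthetic low-contrast "foggy" grayscale frame
foggy = np.random.normal(loc=128, scale=8, size=(480, 640)).clip(0, 255).astype(np.uint8)
enhanced = stretch_contrast(foggy)
print(foggy.std(), enhanced.std())      # the enhanced frame has a much wider spread
```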
@HuckWeed 2 years ago
I drove in fog today using FSD Beta in my vision-only Model 3 and could barely see in front of me at times, but the car was able to drive just fine, with the FSD visualization reliably displaying the environment, other cars and lanes, with a smooth tentacle and no problems.
@Martinit0 1 year ago
Just slow down to the appropriate speed, like a human would (should) do. Bingo. Solved. Ideally you want traffic to move along at one speed, especially in difficult visual conditions. You DON'T want part of the traffic to speed along because they have radar.
@jukkamchannel 2 years ago
FSD Beta 9 hasn't worked very reliably, but Mobileye's test in NYC has gone well. Mobileye uses automatic mapping: streets driven by a car using Mobileye are automatically mapped (static objects are mapped and auto-labeled). The mapping is done by cameras. That AV map increases safety and reduces the load on the neural network (NN), so Mobileye does not need computers as powerful as Tesla's for NN training. A Mobileye car can focus only on moving objects. Andrej always shows the old-fashioned lidar + HD-map concept, where the HD map was used only for localization. It would also be interesting to hear his opinion of the new auto-generated maps used by Mobileye and others.
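The "map the static scenery once, spend compute on moving objects" idea described in this comment can be sketched in a few lines. The sketch below is a toy illustration only: the Detection class, grid size and static_map contents are invented, and it does not reflect how Mobileye or Tesla actually implement mapping.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Detection:
    x: float          # position in some world frame, metres (illustrative)
    y: float
    label: str

GRID = 2.0  # metres per map cell (made-up value)
# A prebuilt map of static objects keyed by coarse grid cell; in the
# commenter's description these would come from prior auto-labeled drives.
static_map = {(0, 5): "traffic_light", (3, 5): "stop_sign"}

def cell(det: Detection) -> tuple[int, int]:
    return (int(det.x // GRID), int(det.y // GRID))

def needs_full_processing(det: Detection) -> bool:
    """Skip heavy per-frame work for objects already in the static map;
    everything else (cars, pedestrians, ...) still gets full attention."""
    return static_map.get(cell(det)) != det.label

detections = [
    Detection(0.7, 10.2, "traffic_light"),   # already mapped -> cheap path
    Detection(6.1, 11.0, "car"),             # not mapped     -> full path
]
for d in detections:
    print(d.label, "->", "full NN pass" if needs_full_processing(d) else "use map")
```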
@haowang7598 2 years ago
I have the same question. I did not hear any information about whether Tesla uses crowd-sourced map technology or not.
@vicen1090 1 year ago
Maybe it is too late; even if that approach could work, their resources are focused on vision and they keep doubling down on it. But it would be interesting to know his opinion on it.
@colinmaharaj 2 years ago
Wish I could be on that team that sent the email.
@Bmmhable 3 years ago
If you take input from real drivers, how do you know which of those inputs are actually good driving? Your training includes driving behaviors you want to avoid, not emulate.
@Jack2511 2 years ago
My AP drove into a stationary security gate.
@aresdilin 1 year ago
Why 36 frames per second? 1/36 == 0.027777777777777777... How do you process elapsed time?
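On the elapsed-time part of the question: the fact that 1/36 s is a repeating decimal does not prevent accurate timing; you either timestamp each frame or keep the interval in exact form rather than accumulating a truncated float. A minimal Python sketch of that idea (the 36 fps figure is the commenter's, everything else is illustrative):

```python
from fractions import Fraction

FPS = 36
DT = Fraction(1, FPS)           # exactly 1/36 s, no rounding error

# Accumulating a hand-truncated float drifts; accumulating the exact
# fraction (or computing t = frame_index / FPS each time) does not.
t_float, t_exact = 0.0, Fraction(0)
for _ in range(36 * 60 * 60):   # one hour of frames
    t_float += 0.027777         # truncated interval
    t_exact += DT

print(float(t_exact))           # 3600.0 exactly
print(t_float)                  # noticeably short of 3600 after an hour
```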
@razaenterprisesinc2077 3 years ago
A bit misleading, I guess, because he mostly focuses on CV rather than lidar, which somehow looks awkward, and there is no discussion of dedicated FPGAs.
@jayanthkumar9637 3 years ago
Yes, the thing they were building is called Dojo: clusters to run their vision stack and process 4096 images per second, but it's an confidential 🤐🤐🤐
@GeneralKenobi69420 2 years ago
@@jayanthkumar9637 an confidential
@Jack2511 2 years ago
@@GeneralKenobi69420 la confidential
@posthocprior 2 years ago
The incremental approach, I think, is the reason Tesla will never reach Level 5.
@kdub1666 2 years ago
It would be cool, but paint me skeptical. Company reports are always optimistic but never match reality. I get a kick out of how radar drops are an insurmountable issue, yet logic can easily be applied when an FSD lead vehicle disappears and reappears. And from the sound of those stationary bridge returns, they needed to hire a better radar antenna designer.
@tringalij 2 years ago
Personally, I have to disagree with you here, coming from a pilot (particularly a combat pilot) with some engineering pedigree. The idea of just ignoring the radar stack because of its limits and going to vision only is (hopefully not) kind of short-sighted. Although the software engineering to get the biases right for a blend of EO (electro-optical) and radar would take a bit, you'd get a better product. Many military sensor arrays now use a blend of available sensors (IR, EO, radar and networked data) to come up with a picture based on the best sensor data per target. Radar definitely has limits and errors, but so does the visual spectrum, particularly in the dark, rain and fog. Without radar, unless you add infrared cameras, the cars will have all the same human visual limits for seeing distant objects, no?

For example, the jet I used to fly had a missile defensive system that amounted to electronic eyes looking for a missile launch plume and then firing off flares when it saw one. Unfortunately it would get fooled by all sorts of things and pop flares without a missile launch (essentially phantom braking off the radar). The upgrade added a second visual battery of sensors tied together with software: the OSC saw what it thought was a plume, but instead of going straight to the flares it went to the IRCM turrets, which looked at the event, determined whether there was relative motion (if something is on a collision course with you like a tracking missile, it's a threat; if not, it's a false alarm), and that system would either defend the jet or ignore the input. That is what you should have done here: radar says "ahhh! a thing! brake!" but the cameras look and go "no, it's a bridge, don't brake."

Anyway, nobody asked me and it's too late to do anything about it, so I hope it works out. I really don't like the idea that I have a system in both of my Teslas that I bought and you're about to shut it off without paying me back, though. Just saying.
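The cross-cueing logic this commenter describes, where one sensor's alarm is only acted on after another sensor confirms it, can be sketched very compactly. The snippet below is a toy illustration of that gating idea only: the classes, field names and thresholds are invented, and it does not represent Tesla's (or any military system's) actual implementation.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float
    closing_speed_mps: float      # positive = approaching

@dataclass
class VisionCheck:
    is_overhead_structure: bool   # e.g. classified as a bridge or gantry
    on_collision_path: bool

def should_brake(radar: RadarReturn, vision: VisionCheck) -> bool:
    """Toy cross-check: a radar return alone never triggers braking;
    vision must confirm the object is in-path and not an overhead structure."""
    radar_alarm = radar.range_m < 80.0 and radar.closing_speed_mps > 5.0
    if not radar_alarm:
        return False
    if vision.is_overhead_structure:
        return False              # "no, it's a bridge, don't brake"
    return vision.on_collision_path

# Classic phantom-braking case: strong stationary return from a bridge.
print(should_brake(RadarReturn(60.0, 25.0),
                   VisionCheck(is_overhead_structure=True, on_collision_path=False)))   # False
# Genuine stopped obstacle in lane.
print(should_brake(RadarReturn(60.0, 25.0),
                   VisionCheck(is_overhead_structure=False, on_collision_path=True)))   # True
```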
@Teslavangelist 2 years ago
It's a good point. I think that's what they were trying to do by having radar "report" to vision, but apparently they think it will work more smoothly with just vision.
@Martinit0 1 year ago
As you probably know, regular (silicon-based) cameras have a bit of IR sensitivity unless IR is blocked by an IR filter (which is standard for photographic CCDs).
@fredharris929 11 months ago
They are hackable.