Spiking Neural Networks for More Efficient AI Algorithms

  60,999 views

WaterlooAI

4 years ago

Spiking neural networks (SNNs) have received little attention from the AI community, although they compute in a fundamentally different -- and more biologically inspired -- manner than standard artificial neural networks (ANNs). This can be partially explained by the lack of hardware that natively supports SNNs. However, several groups have recently released neuromorphic hardware that supports SNNs. I will describe example SNN applications that my group has built that demonstrate superior performance on neuromorphic hardware, compared to ANNs on ANN accelerators. I will also discuss new algorithms that outperform standard RNNs (including GRUs, LSTMs, etc.) in both spiking and non-spiking applications.
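The "fundamentally different manner" of computation can be made concrete with a leaky integrate-and-fire (LIF) neuron, the standard neuron model in this line of work: rather than passing continuous activations forward, it integrates input over time and communicates only through discrete spikes. A minimal sketch in plain Python; the time constants, threshold, and input current are illustrative defaults, not values from the talk:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: integrates input current
# over time and emits a discrete spike whenever the membrane voltage
# crosses threshold, then resets and stays silent for a refractory period.

def lif_run(current, dt=0.001, tau_rc=0.02, tau_ref=0.002, v_th=1.0, steps=1000):
    """Simulate one LIF neuron for `steps` timesteps; return spike times (s)."""
    v = 0.0           # membrane voltage
    refractory = 0.0  # time left in refractory period
    spikes = []
    for i in range(steps):
        if refractory > 0.0:
            refractory -= dt
        else:
            # Leaky integration of the input current toward `current`.
            v += dt / tau_rc * (current - v)
            if v >= v_th:              # threshold crossed -> emit a spike
                spikes.append(i * dt)
                v = 0.0                # reset membrane voltage
                refractory = tau_ref   # enter refractory period
    return spikes

spikes = lif_run(current=2.0)  # constant suprathreshold input for 1 s
print(len(spikes), "spikes in 1 s of simulated time")
```

A subthreshold input (below the firing threshold) produces no spikes at all, which is exactly the event-driven sparsity that neuromorphic hardware exploits: silent neurons cost almost nothing.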
Speaker Bio:
Professor Chris Eliasmith is currently Director of the Centre for Theoretical Neuroscience at the University of Waterloo and holds a Canada Research Chair in Theoretical Neuroscience. He has authored or co-authored two books and over 90 publications in philosophy, psychology, neuroscience, computer science, and engineering. His book, 'How to build a brain' (Oxford, 2013), describes the Semantic Pointer Architecture for constructing large-scale brain models. His team built what is currently the world's largest functional brain model, 'Spaun,' for which he received the coveted NSERC Polanyi Prize. In addition, he is an expert on neuromorphic computation, writing algorithms for, and designing, brain-like hardware. His team has shown state-of-the-art efficiency on neuromorphic platforms for deep learning, adaptive control, and a variety of other applications.

Comments: 46
@arefpar3465
@arefpar3465 1 year ago
I've always thought about new types of hardware that compute the way the brain does. This exciting and wonderful talk gave me the impression that my thinking wasn't off base. Looking forward to hearing more about it.
@lachlangray8120
@lachlangray8120 2 years ago
Very excited for the future of hardware!
@PedramNG
@PedramNG 3 years ago
Truly, a fascinating talk! I enjoyed it.
@jayp6955
@jayp6955 1 year ago
This is the single most important problem in ML right now. Data uncertainty and a lack of generalizing power make traditional ML brittle. Online learning (OL) is done at the data-pipeline level rather than intrinsically in the model, which won't scale or get us closer to AGI. In the future we'll look back at OL pipelines and see them as primitive. A sound basis for AI must incorporate time/OL, which is something traditional ANNs ignore, as they are stationary solutions. ANNs need to be re-evaluated from first principles, with time/OL baked in. Time-dependent networks like spike/phase oscillators are a promising way forward if time/OL is intrinsic, but the ML community has been seduced by traditional ANNs.
@jayp6955
@jayp6955 1 year ago
Interestingly, I just came across this clip where Hopfield talks about the limitations of offline-first networks, and what I perceive as a flaw in simple feed-forward ANN design. The ideas behind Hopfield networks are extremely fascinating. Hopfield also touched on emergent large-scale oscillatory behavior in this talk. There are differential equations that can be used to study this (Kuramoto). kzfaq.info/get/bejne/erGprcaTs9ich3k.html
@mywaimarketing
@mywaimarketing 1 year ago
Great video intro for our research fellows starting hardware-based spiking neural network research in MYWAI Labs.
@SuperGhostRider
@SuperGhostRider 3 years ago
great lecture!
@mapy1234
@mapy1234 2 years ago
great work...
@roryhector6581
@roryhector6581 3 years ago
I wonder how SNNs on Loihi compare to an iteratively pruned ANN on something like EIE from Han et al. (2016). Is it mainly the sparsity of the network, and hardware built for it, that gives the good performance with less energy vs. a GPU? Or does the efficiency benefit come more from spikes and asynchrony?
@BrianMantel
@BrianMantel 3 years ago
You're doing absolutely amazing work here. Can you point me to a simple example or information about how to perform online learning in a spiking neural network?
@solaweng
@solaweng 3 years ago
I believe the Nengo package that Prof. Eliasmith mentioned is capable of doing online training. Training with spiking neurons is tricky, though, as backpropagation is not usually available (I guess you can still do it with backprop if you are working with a rate code). The only biologically feasible rule that I know of is PES.
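For anyone landing here: PES (Prescribed Error Sensitivity) adjusts a population's decoding weights in the direction that reduces a supervised error signal, roughly Δd_i = −κ·E·a_i, where a_i is neuron i's activity. A rough sketch of the idea in plain Python with rate neurons; the random tuning curves, learning rate, and target function are all illustrative, and this glosses over the filtered spike trains Nengo actually uses:

```python
import random

# PES-style error-driven update: each decoder moves opposite the error,
# in proportion to its neuron's activity:  delta_d_i = -kappa * error * a_i
def pes_step(decoders, activities, error, kappa=1e-4):
    return [d - kappa * error * a for d, a in zip(decoders, activities)]

random.seed(0)
n = 50
# Fixed random "tuning": each neuron's rate is a rectified linear function of x.
gains = [random.uniform(0.5, 2.0) for _ in range(n)]
biases = [random.uniform(-1.0, 1.0) for _ in range(n)]

def rates(x):
    return [max(0.0, g * x + b) for g, b in zip(gains, biases)]

decoders = [0.0] * n
target = lambda x: 2.0 * x   # function we want the population to decode

for _ in range(5000):
    x = random.uniform(-1.0, 1.0)
    a = rates(x)
    y = sum(d * ai for d, ai in zip(decoders, a))   # current decoded estimate
    error = y - target(x)
    decoders = pes_step(decoders, a, error, kappa=1e-3)

# After learning, the decoded estimate should approximate the target.
x = 0.5
y = sum(d * ai for d, ai in zip(decoders, rates(x)))
print(round(y, 2))
```

The appeal for neuromorphic hardware is that the update for each weight uses only locally available signals (the presynaptic activity and a broadcast error), unlike backpropagation.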
@BrianMantel
@BrianMantel 3 years ago
@@solaweng Do you think maybe they're using Hebbian learning? Thanks for responding, I was waiting five months for that. :-)
@solaweng
@solaweng 3 years ago
@@BrianMantel I just checked the new Nengo docs, and it seems it has different learning rules now. The Oja learning rule has something to do with Hebbian coactivity, so I guess the answer is yes. You can check it out here: www.nengo.ai/nengo/examples/learning/learn-unsupervised.html
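For reference, Oja's rule is exactly that: a Hebbian coactivity term plus a normalizing decay, Δw = η·y·(x − y·w), which keeps the weight vector bounded and drives it toward the input's first principal component. A quick sketch in plain Python; the dimensions, learning rate, and input statistics are illustrative and unrelated to Nengo's implementation:

```python
import random, math

# Oja's rule: Hebbian term (y * x) with a decay (y^2 * w) that normalizes
# the weights instead of letting pure Hebbian learning grow them unboundedly.
def oja_step(w, x, eta=0.01):
    y = sum(wi * xi for wi, xi in zip(w, x))      # linear neuron output
    return [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

random.seed(1)
w = [random.uniform(-0.1, 0.1) for _ in range(2)]
for _ in range(20000):
    s = random.gauss(0.0, 1.0)
    # 2-D inputs strongly correlated along the (1, 1) direction.
    x = [s + random.gauss(0.0, 0.1), s + random.gauss(0.0, 0.1)]
    w = oja_step(w, x)

norm = math.sqrt(sum(wi * wi for wi in w))
print([round(wi / norm, 2) for wi in w])  # near the (1,1)/sqrt(2) axis, up to sign
```

The weight vector settles to unit length aligned with the dominant input correlation, which is the "coactivity" behavior the Nengo example demonstrates.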
@postnubilaphoebus96
@postnubilaphoebus96 3 years ago
Very good lecture. Was looking for material for my master's thesis, and I found lots of interesting pointers.
@PedramNG
@PedramNG 3 years ago
What is your master's thesis about, if you don't mind my asking?
@postnubilaphoebus96
@postnubilaphoebus96 3 years ago
I'm still discussing it with my supervisors, so I can't say yet. But I can come back and comment here once I know 😄
@PedramNG
@PedramNG 3 years ago
@@postnubilaphoebus96 good luck with it 😁
@postnubilaphoebus96
@postnubilaphoebus96 3 years ago
Thanks! The project also only starts around January, so there's still some time.
@PedramNG
@PedramNG 3 years ago
@@postnubilaphoebus96 Do let me know about it. 😁 Btw, I have a BCI mind map; the link is available in the description of my KZfaq channel. I recently started adding material to its computational neuroscience section.
@phasor50
@phasor50 26 days ago
Every time someone says "compute" instead of computation or computing, I say to myself "life, liberty and the pursuit of happy".
@kanesoban
@kanesoban 2 years ago
I am wondering what training algorithm is used for these networks. Does backpropagation work with SNNs?
@lucamaxmeyer
@lucamaxmeyer 1 year ago
Is it possible to download the slides somewhere? Thanks for the video!
@abby-fichtner
@abby-fichtner 3 years ago
SO helpful, thank you so much! Other than the math part (which I fear I may have fallen asleep for), everything made sense to me except for the part about your shirt. Hmmmm. 🙃
@oraz.
@oraz. 5 months ago
It's weird, it seems like there are parallels between the LMU and HiPPO.
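The parallel is real: the Legendre Memory Unit (Voelker et al., 2019) defines its memory with fixed state-space matrices (A, B) derived from a Padé approximation of a continuous-time delay, closely related to HiPPO's LegT construction. A sketch of the standard (A, B) matrices and a Euler-discretized update in plain Python; the window length, dimensions, and test input are illustrative:

```python
# The LMU memory m(t) evolves as theta * dm/dt = A m + B u(t), where the
# fixed matrices (A, B) are chosen so that m holds an optimal Legendre-
# polynomial compression of the last `theta` seconds of the input u.

def lmu_matrices(d):
    # Standard (A, B) construction from Voelker et al. (2019).
    A = [[(2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
          for j in range(d)] for i in range(d)]
    B = [(2 * i + 1) * (-1.0) ** i for i in range(d)]
    return A, B

def lmu_step(m, u, A, B, dt, theta):
    # Simple Euler discretization; reasonable here since theta/dt >> d.
    d = len(m)
    return [m[i] + (dt / theta) * (sum(A[i][j] * m[j] for j in range(d)) + B[i] * u)
            for i in range(d)]

d, dt, theta = 6, 0.001, 0.1      # 6 Legendre coefficients, 100 ms window
A, B = lmu_matrices(d)
m, peak = [0.0] * d, 0.0
for k in range(1000):
    u = 1.0 if k < 500 else 0.0   # step input: on for 0.5 s, then off
    m = lmu_step(m, u, A, B, dt, theta)
    peak = max(peak, max(abs(x) for x in m))
print([round(x, 3) for x in m])   # the input has slid out of the window, so m decays
```

Because (A, B) are fixed rather than learned, the memory is cheap, well-conditioned over long sequences, and maps well onto spiking implementations, which is part of why the talk closes with LMUs.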
@ritikanarwal8147
@ritikanarwal8147 2 years ago
Sir, I want to write a research paper on spiking neural networks. Would you please suggest some applications? That would make it easier for me to choose a focus within this area.
@zheyuanlin6397
@zheyuanlin6397 3 months ago
Amazing talk. Absolutely no way it's free.
@stanislav4607
@stanislav4607 3 months ago
48:50 that aged well
@henrilemoine3953
@henrilemoine3953 2 years ago
If this is all true, why isn't everyone working on neuromorphic computing and SNNs?? This is really confusing to me, because I'd expect researchers to turn toward this area if they knew these results were accurate.
@nullbeyondo
@nullbeyondo 1 year ago
Because I don't think neuromorphic chips are easily debuggable, since the neurons become physical. Also, any use of backpropagation (which is the industry's focus right now) defeats the purpose of spiking neural networks in the first place, IMO; how are you going to calculate a gradient for neurons whose synapses form loops? The only choice is probably to have no loops, which takes SNNs further and further from how the brain actually works. It's worth noting that the brain doesn't use backpropagation.
@TruePeaceSeeker
@TruePeaceSeeker 1 year ago
Because to get traction, it must be viable in industry as well.
@diffpizza
@diffpizza 10 months ago
This is fucking amazing and I really want to do research on it
@parsarahimi335
@parsarahimi335 3 years ago
I came here to see spiking networks, and then he finishes by presenting LMUs. How is that even related?
@chriseliasmith5387
@chriseliasmith5387 3 years ago
LMUs (unlike many other RNNs, esp. LSTMs) run well on spiking hardware.