The Simplest Neural Model and a Hypothesis for Language

2,687 views

MITCBMM

6 months ago

Daniel Mitropolsky, Columbia University
Abstract: How do neurons, in their collective action, beget cognition, as well as intelligence and reasoning? As Richard Axel recently put it, we do not have a logic for the transformation of neural activity into thought and action; discerning this logic as the most important future direction of neuroscience. I will present a mathematical neural model of brain computation called NEMO, whose key ingredients are spiking neurons, random synapses and weights, local inhibition, and Hebbian plasticity (no backpropagation). Concepts are represented by interconnected co-firing assemblies of neurons that emerge organically from the dynamical system of its equations. It turns out it is possible to carry out complex operations on these concept representations, such as copying, merging, completion from small subsets, and sequence memorization. NEMO is a neuromorphic computational system that, because of its simplifying assumptions, can be efficiently simulated on modern hardware. I will present how to use NEMO to implement an efficient parser of a small but non-trivial subset of English, and a more recent model of the language organ in the baby brain that learns the meaning of words, and basic syntax, from whole sentences with grounded input. In addition to constituting hypotheses as to the logic of the brain, we will discuss how principles from these brain-like models might be used to improve AI, which, despite astounding recent progress, still lags behind humans in several key dimensions such as creativity, hard constraints, energy consumption.
cbmm.mit.edu/news-events/even...
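The assembly operations sketched in the abstract can be illustrated with a toy simulation. Below is a minimal, assumed implementation of one "projection" step in the spirit of NEMO's assembly calculus: random sparse synapses, top-k winner-take-all standing in for local inhibition, and multiplicative Hebbian weight updates. All names and parameters here (`project`, `beta`, `p`, the self-projecting single area) are illustrative choices for this sketch, not details from the talk.

```python
import random

def project(stimulus, weights, n, k, beta=0.1):
    """Fire `stimulus` (a set of source neuron indices) into an area of n neurons.

    The k neurons receiving the most weighted input fire (winner-take-all,
    modeling local inhibition), and synapses from firing sources to winners
    are strengthened (Hebbian plasticity). Returns the winner set.
    """
    inputs = [0.0] * n
    for src in stimulus:
        for tgt, w in weights[src].items():
            inputs[tgt] += w
    # Local inhibition: only the k most-excited neurons fire.
    winners = sorted(range(n), key=lambda j: inputs[j], reverse=True)[:k]
    # Hebbian update: synapses from firing sources to winners grow.
    for src in stimulus:
        for tgt in winners:
            if tgt in weights[src]:
                weights[src][tgt] *= (1.0 + beta)
    return set(winners)

def random_synapses(n, p=0.05, rng=None):
    """Random sparse connectivity: each directed synapse exists with prob. p."""
    rng = rng or random.Random(0)
    return [{j: 1.0 for j in range(n) if rng.random() < p} for _ in range(n)]

# Repeatedly firing the same stimulus lets a stable winner set emerge:
# that stable, strongly interconnected set is the "assembly" for the concept.
rng = random.Random(1)
n, k = 500, 30
weights = random_synapses(n, rng=rng)
stimulus = set(rng.sample(range(n), k))
prev = set()
for _ in range(10):
    winners = project(stimulus, weights, n, k)
    overlap = len(winners & prev)
    prev = winners
```

Because only winners are strengthened, their inputs grow while everyone else's stay fixed, so the winner set converges after a few rounds; the other operations mentioned in the abstract (merge, completion from small subsets) are built from variations of this same projection primitive.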

Comments: 3
@jumpstar9000 · 5 months ago
I really like the way Daniel thinks. It's very intuitive and that works for me. Very good. Thanks a lot for sharing.
@AlgoNudger · 6 months ago
Thanks.
@wengemurphy · 3 months ago
You can make up anything you like, but what does this model have to do with biology?