Language Understanding and LLMs with Christopher Manning - 686

2,453 views

The TWIML AI Podcast with Sam Charrington

1 day ago

Today, we're joined by Christopher Manning, the Thomas M. Siebel Professor in Machine Learning at Stanford University and a recent recipient of the 2024 IEEE John von Neumann Medal. In our conversation with Chris, we discuss his contributions to foundational research areas in NLP, including word embeddings and attention. We explore his perspectives on the intersection of linguistics and large language models, their ability to learn human language structures, and their potential to teach us about human language acquisition. We also dig into the concept of “intelligence” in language models, as well as the reasoning capabilities of LLMs. Finally, Chris shares his current research interests, alternative architectures he anticipates emerging beyond the LLM, and opportunities ahead in AI research.
🎧 / 🎥 Listen or watch the full episode on our page: twimlai.com/go/686.
🔔 Subscribe to our channel for more great content just like this: kzfaq.info?sub_confi...
🗣️ CONNECT WITH US!
===============================
Subscribe to the TWIML AI Podcast: twimlai.com/podcast/twimlai/
Follow us on Twitter: / twimlai
Follow us on LinkedIn: / twimlai
Join our Slack Community: twimlai.com/community/
Subscribe to our newsletter: twimlai.com/newsletter/
Want to get in touch? Send us a message: twimlai.com/contact/
📖 CHAPTERS
===============================
00:00 - Introduction
02:10 - Emergence of LLMs
03:15 - Perspectives of a linguist in AI
07:43 - LLMs and human language acquisition
14:09 - Relationship of intelligence and LLMs
20:02 - Reasoning in LLMs
27:37 - Breakthroughs in world models
29:19 - The GloVe paper
32:39 - Contextual representations and word vectors
34:49 - Embeddings vs. transformer architectures in retrieval
35:51 - Attention as a mechanism
38:44 - Evolution of attention and transformers
40:21 - Current areas of interest
45:51 - New architectural ideas
50:30 - Recap and future directions
🔗 LINKS & RESOURCES
===============================
GloVe: Global Vectors for Word Representation - nlp.stanford.edu/pubs/glove.pdf
2024 IEEE John von Neumann Medal: Christopher D. Manning Video - ieeetv.ieee.org/channels/ieee...
Direct Preference Optimization: Your Language Model is Secretly a Reward Model - arxiv.org/abs/2305.18290
Pushdown Layers: Encoding Recursive Structure in Transformer Language Models - arxiv.org/abs/2310.19089
For a COMPLETE LIST of links and references, head over to twimlai.com/go/686.
📸 Camera: amzn.to/3TQ3zsg
🎙️Microphone: amzn.to/3t5zXeV
🚦Lights: amzn.to/3TQlX49
🎛️ Audio Interface: amzn.to/3TVFAIq
🎚️ Stream Deck: amzn.to/3zzm7F5

Comments: 8
@sachinelearning · 1 month ago
Prof Manning is a Legend in the NLP space! Thanks a lot for the amazing interview!
@twimlai · 28 days ago
Couldn't agree more!
@420_gunna · 1 month ago
I love this little man and his big ideas
@sanesanyo · 1 month ago
Really nice talk. Thanks a lot for doing this.
@markryan2475 · 1 month ago
Awesome discussion. Really interesting to hear a perspective grounded in linguistics.
@RalphDratman · 27 days ago
Where does Prof. Manning see the LLM's acquisition of grammar? He seemed to be implying that it is sort of obvious that LLMs understand grammar. I'd like to learn how to see that. I am hoping to find out by means of small language models how to learn grammar without semantics.
@twimlai · 22 days ago
Understand in the sense that they can respond to and produce it, but I think he'd argue against "understanding" in some deeper senses.
@RalphDratman · 22 days ago
@twimlai Respond to and produce is enough for me. I want to learn something about how they do it.