Memory in LangChain | Deep dive (python)

8,882 views

Eden Marco

A year ago

ConversationBufferMemory
ConversationBufferMemory is a memory utility in the LangChain package that stores messages in a buffer and can return them either as a single string or as a list of messages. It is useful for keeping the conversation history of a chatbot or conversational AI system.
It simply keeps a buffer of every interaction in the conversation and places no limit on how many interactions it can store.
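A minimal sketch of how this looks in code, assuming a 2023-era LangChain release (import paths and the toy inputs are illustrative and may differ in newer versions):

from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory()
# Store one exchange: the user's input and the model's output
memory.save_context({"input": "Hi, I'm Eden"}, {"output": "Hello Eden, how can I help?"})
# Everything ever saved comes back as one "history" string
print(memory.load_memory_variables({}))
# {'history': "Human: Hi, I'm Eden\nAI: Hello Eden, how can I help?"}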
ConversationBufferWindowMemory
ConversationBufferWindowMemory is a type of memory in the LangChain package that keeps a list of the interactions in a conversation over time but only uses the last K of them. This sliding window keeps the most recent exchanges available without letting the buffer grow too large. By contrast, ConversationBufferMemory keeps every interaction and has no limit on how many it can store.
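A minimal sketch under the same assumptions, showing the sliding window (the k value and example turns are made up for illustration):

from langchain.memory import ConversationBufferWindowMemory

# k=1 keeps only the single most recent exchange
memory = ConversationBufferWindowMemory(k=1)
memory.save_context({"input": "Hi"}, {"output": "Hello!"})
memory.save_context({"input": "What is LangChain?"}, {"output": "A framework for building LLM apps."})
# Only the last interaction survives the window
print(memory.load_memory_variables({}))
# {'history': 'Human: What is LangChain?\nAI: A framework for building LLM apps.'}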
ConversationTokenBufferMemory
ConversationTokenBufferMemory keeps a buffer of recent interactions in memory but uses token length, rather than the number of interactions, to decide when to flush older ones. The class lives in the langchain.memory.token_buffer module and is used in conversational AI applications to store and retrieve conversation history.
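A minimal sketch, assuming an OpenAI API key is configured; the LLM here is only used to count tokens, and max_token_limit is an arbitrary illustrative value:

from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationTokenBufferMemory

llm = ChatOpenAI(temperature=0)
# Oldest interactions are pruned once the buffer exceeds max_token_limit tokens
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=50)
memory.save_context({"input": "Hi"}, {"output": "Hello, how can I help?"})
memory.save_context({"input": "Explain LangChain memory"}, {"output": "It stores past turns for the model."})
print(memory.load_memory_variables({}))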
ConversationSummaryMemory
ConversationSummaryMemory is a memory utility in the LangChain package that builds a summary of the conversation as it progresses. It condenses the conversation's information over time, which is useful for tracking a long history without storing every message. It differs from the other memory utilities in that it focuses specifically on maintaining that summary, whereas the others keep a buffer of recent interactions or use token length to decide when to flush them.
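A minimal sketch under the same assumptions; every save_context call asks the LLM to fold the new turn into the running summary (the example exchange is invented):

from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationSummaryMemory

llm = ChatOpenAI(temperature=0)
memory = ConversationSummaryMemory(llm=llm)
memory.save_context(
    {"input": "I'm building a support chatbot for my bakery"},
    {"output": "Great, I can help with menu questions and order tracking."},
)
# "history" now contains an LLM-written summary instead of the raw messages
print(memory.load_memory_variables({}))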
ConversationSummaryBufferMemory
ConversationSummaryBufferMemory is a memory utility in the LangChain package that keeps a buffer of recent interactions in memory and compiles the older ones into a summary, using token length rather than the number of interactions to decide when to flush. It combines keeping a verbatim buffer of recent turns with summarizing the rest of the conversation over time. By comparison, ConversationSummaryMemory summarizes the whole conversation, and ConversationBufferMemory keeps every interaction verbatim.
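A minimal sketch under the same assumptions (max_token_limit is deliberately tiny so summarization kicks in quickly):

from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationSummaryBufferMemory

llm = ChatOpenAI(temperature=0)
# Recent turns stay verbatim; once the buffer passes max_token_limit tokens,
# the oldest turns are folded into a running summary.
memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=40)
memory.save_context({"input": "Hi, I'm planning a trip to Japan"}, {"output": "Nice! When are you going?"})
memory.save_context({"input": "In April, for two weeks"}, {"output": "Cherry blossom season, great choice."})
# The output mixes a summary of older turns with the verbatim recent ones
print(memory.load_memory_variables({}))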
Entity Memory
Entity Memory, implemented by the ConversationEntityMemory class, is a memory module in LangChain that remembers information about specific entities. It uses an LLM to extract facts about the entities mentioned in a conversation and builds up its knowledge of each entity over time. It is designed to be used inside chains and provides convenient hooks for doing so. Compared with the other memory utilities, which manage and manipulate the raw stream of previous chat messages, Entity Memory focuses on what has been said about particular people, places, or things.
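A minimal sketch under the same assumptions (the example sentence and names are made up):

from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationEntityMemory

llm = ChatOpenAI(temperature=0)
memory = ConversationEntityMemory(llm=llm)

user_input = {"input": "Deven and Sam are working on a hackathon project"}
# load_memory_variables runs entity extraction on the incoming input...
memory.load_memory_variables(user_input)
# ...and save_context updates what is known about each extracted entity
memory.save_context(user_input, {"output": "Sounds exciting, what are they building?"})
# The entity store now holds LLM-written facts about "Deven" and "Sam"
print(memory.entity_store.store)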
VectorStore-Backed Memory
VectorStore-Backed Memory, implemented by the VectorStoreRetrieverMemory class, stores memories in a vector database and queries the top-K most "salient" documents every time it is called. It differs from most of the other memory classes in that it does not explicitly track the order of interactions. Here the "documents" are previous conversation snippets, which makes it useful for surfacing relevant pieces of information the AI was told earlier in the conversation, no matter how long ago.
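A minimal sketch backed by an empty FAISS index, assuming faiss-cpu is installed and an OpenAI key is configured (1536 is the dimension of OpenAI's ada-002 embeddings; the example turns are invented):

import faiss
from langchain.docstore import InMemoryDocstore
from langchain.embeddings import OpenAIEmbeddings
from langchain.memory import VectorStoreRetrieverMemory
from langchain.vectorstores import FAISS

embedding = OpenAIEmbeddings()
index = faiss.IndexFlatL2(1536)
vectorstore = FAISS(embedding.embed_query, index, InMemoryDocstore({}), {})

# Every call retrieves the k most similar past snippets, regardless of order
memory = VectorStoreRetrieverMemory(retriever=vectorstore.as_retriever(search_kwargs={"k": 2}))
memory.save_context({"input": "My favorite sport is soccer"}, {"output": "Noted!"})
memory.save_context({"input": "I work as a data engineer"}, {"output": "Interesting."})
# Similarity to the query, not recency, decides what comes back
print(memory.load_memory_variables({"prompt": "What sport should I watch?"}))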

Comments: 17
@MariusHeier1 · a year ago
This was an awesome way of explaining it. Basically learned as much as 4 other videos by leaving out the code samples. Love it. Subbed.
@user-uc2bl5jm6o · 4 months ago
Hi Eden. I'm really happy I took your Udemy class on Langchain. Well worth the money. Lots of great information. Thanks for making it.
@williamtian566 · a year ago
the best tutorial so far about LangChain, well done, makes everything about memory very clear, i love u!
@reinerheiner1148 · a year ago
That was a great lecture, thank you. I like it that you focus more on how it works instead of just going through a notebook with code. This channel is seriously underrated. Please keep up the good work and you'll gain more subscribers in no time. You earned my sub!
@EdenMarco · a year ago
you made my day :) thank you!
@reinerheiner1148 · a year ago
@EdenMarco My pleasure. Just stating the facts though. :) Looking forward to your next videos!
@mchinnyc · 11 months ago
This is great! Answers some questions that have come up for me recently. Thank you.
@wehappyfewkd · 8 months ago
very nice explanation of all supported memories. Thank you and keep up the good work!
@mdyoonus83 · 9 months ago
Very neatly put together brother
@yazanrisheh5127 · 2 months ago
Hey Eden, I know it's late to ask, but have you ever used ReadOnlyMemory, and how is it different from standard memory like ConversationBufferMemory for agents/tools? I'm reading that ReadOnlyMemory is better for agents/tools and standard memory is better for non-agents, but I'm confused as to why. Does that mean my agent can delete or change the chat history if I don't use ReadOnlyMemory?
@kylemallah57 · a year ago
At 13:49 when you're referencing the SUMMARY_PROMPT, you read _DEFAULT_ENTITY_EXTRACTION_TEMPLATE but it's actually using _DEFAULT_SUMMARIZER_TEMPLATE
@edgarcc9350 · 9 months ago
How can I pre-load a memory? I'm deploying this on a backend, but the buffer memory ends up being the same for two user sessions, for example.
@keenanfernandes1130 · 8 months ago
Is there a way to carry the system message forward as the conversation goes on?
@LiminalStvte · 7 months ago
I noticed LM Studio has a memory that only cuts out the middle; it keeps the first few messages and the most recent history.
@ShivamGupta-qh8go · 2 months ago
Can someone please tell me what course is being talked about here? I really loved how you explained instead of just showing, and I want to buy that course.
@EdenMarco · 2 months ago
Here is a discount coupon :) www.udemy.com/course/langchain/?referralCode=D981B8213164A3EA91AC