LangChain (4): Memory

LangChain module architecture diagram [1]

1. Memory Overview

The Memory module helps save and manage chat history and build up knowledge about specific entities. Memory components can store information across multiple conversation turns and keep track of particular facts and context during a conversation. This post covers the following types:

  • Conversation buffer memory (ConversationBufferMemory)
  • Conversation buffer window memory (ConversationBufferWindowMemory)
  • Conversation token buffer memory (ConversationTokenBufferMemory)
  • Conversation summary buffer memory (ConversationSummaryBufferMemory)

2. Conversation Buffer Memory: ConversationBufferMemory

Saving the conversation context

from langchain.chains import ConversationChain
from langchain_openai import ChatOpenAI
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(temperature=0.0)

memory = ConversationBufferMemory()

# Create a new ConversationChain instance.
# With verbose=True the chain prints more detailed information (such as the
# formatted prompt) for debugging; with verbose=False it only returns the result.
conversation = ConversationChain(llm=llm, memory=memory, verbose=False)

conversation.predict(input="Hi, my name is XiaoMing")
print(memory.load_memory_variables({}))
conversation.predict(input="What is 1+1?")
print(memory.load_memory_variables({}))

memory.save_context({"input": "Hi"}, {"output": "What's up"})  # add an input/output pair directly to the buffer
print("=====")
print(memory.load_memory_variables({}))

Output

{'history': "Human: Hi, my name is XiaoMing\nAI: Hello XiaoMing! It's nice to meet you. How can I assist you today?"}
{'history': "Human: Hi, my name is XiaoMing\nAI: Hello XiaoMing! It's nice to meet you. How can I assist you today?\nHuman: What is 1+1?\nAI: 1+1 equals 2. Is there anything else you would like to know?"}
=====
{'history': "Human: Hi, my name is XiaoMing\nAI: Hello XiaoMing! It's nice to meet you. How can I assist you today?\nHuman: What is 1+1?\nAI: 1+1 equals 2. Is there anything else you would like to know?\nHuman: Hi\nAI: What's up"}

When a large language model is used for chat, the model itself is stateless: it does not remember the conversation so far, and every API call is independent. Memory stores the conversation up to the current point and supplies it as context to the LLM so the next response can take the history into account.
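
To make this concrete, here is a minimal sketch (reusing ConversationBufferMemory, with a hypothetical follow-up question) of what ConversationChain effectively does on every call: load the stored history as text, prepend it to the new input, and send the combined prompt to the stateless LLM.

from langchain.memory import ConversationBufferMemory

# The memory object is just storage: it returns the accumulated history as text.
memory = ConversationBufferMemory()
memory.save_context({"input": "Hi, my name is XiaoMing"},
                    {"output": "Hello XiaoMing! How can I assist you today?"})

# Roughly what the chain does each turn: history + new input -> one prompt.
history = memory.load_memory_variables({})["history"]
prompt = f"{history}\nHuman: What is my name?\nAI:"
print(prompt)  # the full conversation so far, followed by the new question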

3. Conversation Buffer Window Memory: ConversationBufferWindowMemory

As a conversation gets longer, the amount of history to store grows, and sending that many tokens to the LLM becomes increasingly expensive. Conversation buffer window memory keeps only a window over the conversation: the most recent n exchanges. This gives a sliding window of recent interactions so the buffer does not grow without bound.

from langchain.memory import ConversationBufferWindowMemory

window = ConversationBufferWindowMemory(k=2)
window.save_context({"input": "第一轮问"}, {"output": "第一轮答"})
window.save_context({"input": "第二轮问"}, {"output": "第二轮答"})
window.save_context({"input": "第三轮问"}, {"output": "第三轮答"})
print(window.load_memory_variables({}))

Output

{'history': 'Human: 第二轮问\nAI: 第二轮答\nHuman: 第三轮问\nAI: 第三轮答'}

k is the window size; with k=1 only the most recent exchange is kept, i.e. a single round of question and answer.
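
For example, a sketch with k=1 (using hypothetical English placeholder strings) keeps only the latest exchange:

from langchain.memory import ConversationBufferWindowMemory

window = ConversationBufferWindowMemory(k=1)
window.save_context({"input": "first question"}, {"output": "first answer"})
window.save_context({"input": "second question"}, {"output": "second answer"})

# Only the most recent exchange survives:
# {'history': 'Human: second question\nAI: second answer'}
print(window.load_memory_variables({}))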

4. Limiting Context Length by Token Count: ConversationTokenBufferMemory

Conversation token buffer memory limits the number of tokens that are kept. When the stored history exceeds the specified limit, the earlier parts of the conversation are trimmed so that the retained history corresponds to the most recent exchanges and stays within the token limit.

from langchain.memory import ConversationTokenBufferMemory
from langchain_openai import ChatOpenAI

# The llm is used to count tokens when deciding how much history to keep.
memory = ConversationTokenBufferMemory(
    llm=ChatOpenAI(),
    max_token_limit=40,
)
memory.save_context(
    {"input": "你好啊"}, {"output": "你好,我是你的AI助手。"})
memory.save_context(
    {"input": "你会干什么"}, {"output": "我什么都会"})

print(memory.load_memory_variables({}))

Output

{'history': 'AI: 你好,我是你的AI助手。\nHuman: 你会干什么\nAI: 我什么都会'}
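
As a sketch (not code from the original post, but following the same pattern as the ConversationBufferMemory example above), the token buffer memory plugs into a ConversationChain in exactly the same way; only the pruning strategy differs:

from langchain.chains import ConversationChain
from langchain.memory import ConversationTokenBufferMemory
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(temperature=0.0)
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=100)

conversation = ConversationChain(llm=llm, memory=memory, verbose=False)
conversation.predict(input="Hi, my name is XiaoMing")
conversation.predict(input="What is 1+1?")

# Older turns are dropped once the stored history exceeds max_token_limit.
print(memory.load_memory_variables({}))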

5. Conversation Summary Buffer Memory: ConversationSummaryBufferMemory

Conversation summary buffer memory uses an LLM to write a summary of the conversation so far and stores that summary instead of the full history: recent turns are kept verbatim up to the token limit, while older turns are folded into the summary.

from langchain.memory import ConversationSummaryBufferMemory
from langchain_openai import ChatOpenAI

# Create a long string
schedule = "There is a meeting at 8am with your product team. \
You will need your powerpoint presentation prepared. \
9am-12pm have time to work on your LangChain \
project which will go quickly because LangChain is such a powerful tool. \
At Noon, lunch at the italian restaurant with a customer who is driving \
from over an hour away to meet you to understand the latest in AI. \
Be sure to bring your laptop to show the latest LLM demo."

# Use conversation summary buffer memory
llm = ChatOpenAI(temperature=0.0)
memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=100)
memory.save_context({"input": "Hello"}, {"output": "What's up"})
memory.save_context({"input": "Not much, just hanging"}, {"output": "Cool"})
memory.save_context(
    {"input": "What is on the schedule today?"}, {"output": f"{schedule}"}
)
print(memory.load_memory_variables({})['history'])

Output

System: The human and AI exchange greetings and discuss the day's schedule. The AI informs the human of a morning meeting with the product team, work on the LangChain project, and a lunch meeting with a customer interested in AI. The AI emphasizes the importance of being prepared for the day's events.
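
As a follow-up sketch (reusing the llm and memory objects from the listing above; the exact summary wording will vary between runs), the summary memory can drive a ConversationChain: each new turn is appended verbatim, and once the buffer exceeds max_token_limit the older turns are folded into the running summary.

from langchain.chains import ConversationChain

conversation = ConversationChain(llm=llm, memory=memory, verbose=False)
conversation.predict(input="What would be a good demo to show?")

# The history now holds the updated summary plus the most recent exchange.
print(memory.load_memory_variables({})['history'])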

6. More Memory Types

LangChain also provides other memory types, such as ConversationSummaryMemory and ConversationEntityMemory, which follow the same save_context / load_memory_variables interface as the types shown above.
