```python
from llama_index.core import PromptTemplate

# create a prompt template
template = PromptTemplate(
    template=(
        "We have provided context information below.\n"
        "\n"
        "---------------------\n"
        "{context_str}\n"
        "---------------------\n"
        "Given this information, please answer the question:\n"
        "{query_str}\n"
    )
)

context_str = (
    "In this work, we develop and release Llama 2, a collection of "
    "pretrained and fine-tuned large language models (LLMs) ranging "
    "in scale from 7 billion to 70 billion parameters"
)
query_str = "How many params does llama 2 have"

# text prompt (for completion API)
prompt = template.format(context_str=context_str, query_str=query_str)
print(prompt)
print("==========================")
# message prompts (for chat API)
messages = template.format_messages(context_str=context_str, query_str=query_str)
print(messages)
```

Output

```
We have provided context information below.

---------------------
In this work, we develop and release Llama 2, a collection of pretrained and fine-tuned large language models (LLMs) ranging in scale from 7 billion to 70 billion parameters
---------------------
Given this information, please answer the question:
How many params does llama 2 have

==========================
[ChatMessage(role=<MessageRole.USER: 'user'>, content="We have provided context information below.\n\n---------------------\nIn this work, we develop and release Llama 2, a collection of pretrained and fine-tuned large language models (LLMs) ranging in scale from 7 billion to 70 billion parameters\n---------------------\nGiven this information, please answer the question:\nHow many params does llama 2 have\n", additional_kwargs={})]
```
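Under the hood, the `{context_str}` / `{query_str}` placeholders behave like ordinary Python `str.format` fields. A minimal stdlib-only stand-in for `PromptTemplate.format` (illustrative only; the real class also handles metadata, partial formatting, and message conversion):

```python
# Minimal stand-in for PromptTemplate.format using plain str.format.
template_str = (
    "We have provided context information below.\n"
    "\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given this information, please answer the question:\n"
    "{query_str}\n"
)

prompt = template_str.format(
    context_str="Llama 2 ranges in scale from 7 billion to 70 billion parameters",
    query_str="How many params does llama 2 have",
)
print(prompt)
```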
You can also build a template from chat messages:
```python
from llama_index.core import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

message_templates = [
    ChatMessage(content="You are an expert system.", role=MessageRole.SYSTEM),
    ChatMessage(
        content="Generate a short story about {topic}",
        role=MessageRole.USER,
    ),
]
chat_template = ChatPromptTemplate(message_templates=message_templates)

# you can create message prompts (for chat API)
messages = chat_template.format_messages(topic="bear")
print(messages)
print("==========================")
# or easily convert to text prompt (for completion API)
prompt = chat_template.format(topic="bear")
print(prompt)
```
Output
```
[ChatMessage(role=<MessageRole.SYSTEM: 'system'>, content='You are an expert system.', additional_kwargs={}), ChatMessage(role=<MessageRole.USER: 'user'>, content='Generate a short story about bear', additional_kwargs={})]
==========================
system: You are an expert system.
user: Generate a short story about bear
assistant: 
```
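The text output above shows how a chat message list gets flattened into a single completion-style prompt: each message becomes a `role: content` line, with a trailing `assistant:` left open for the model's reply. A stdlib-only sketch of that conversion:

```python
# Stdlib-only sketch of flattening chat messages into a completion prompt,
# mirroring what ChatPromptTemplate.format does with its message templates.
messages = [
    ("system", "You are an expert system."),
    ("user", "Generate a short story about {topic}"),
]

def to_text_prompt(messages, **kwargs):
    # substitute template variables, then join as "role: content" lines
    lines = [f"{role}: {content.format(**kwargs)}" for role, content in messages]
    lines.append("assistant: ")  # leave room for the model's reply
    return "\n".join(lines)

print(to_text_prompt(messages, topic="bear"))
```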
```python
from IPython.display import Markdown, display

# define prompt viewing function
def display_prompt_dict(prompts_dict):
    for k, p in prompts_dict.items():
        text_md = f"**Prompt Key**: {k}<br>" f"**Text:** <br>"
        display(Markdown(text_md))
        print(p.get_template())
        display(Markdown("<br><br>"))


qa_prompt_tmpl_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query in the style of a Shakespeare play.\n"
    "Query: {query_str}\n"
    "Answer: "
)
```
```
Context information is below.
---------------------
{context_str}
---------------------
Given the context information and not prior knowledge, answer the query in the style of a Shakespeare play.
Query: {query_str}
Answer: 
```
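The viewing function only needs objects that expose `get_template()`. A dependency-free sketch of the same loop, using a hypothetical `StubPrompt` stand-in and `print` instead of IPython's `display` (the dict key is illustrative):

```python
class StubPrompt:
    """Hypothetical stand-in for a prompt object exposing get_template()."""
    def __init__(self, template: str):
        self._template = template

    def get_template(self) -> str:
        return self._template

def show_prompt_dict(prompts_dict):
    # plain-print variant of display_prompt_dict (no IPython needed)
    out = []
    for k, p in prompts_dict.items():
        out.append(f"Prompt Key: {k}")
        out.append(p.get_template())
    return "\n".join(out)

prompts_dict = {
    "text_qa_template": StubPrompt("Query: {query_str}\nAnswer: "),
}
print(show_prompt_dict(prompts_dict))
```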
```
The original query is as follows: {query_str}
We have provided an existing answer: {existing_answer}
We have the opportunity to refine the existing answer (only if needed) with some more context below.
------------
{context_msg}
------------
Given the new context, refine the original answer to better answer the query. If the context isn't useful, return the original answer.
Refined Answer: 
```
```python
def format_context_fn(**kwargs):
    # format context with bullet points
    context_list = kwargs["context_str"].split("\n\n")
    fmtted_context = "\n\n".join([f"- {c}" for c in context_list])
    return fmtted_context
```
```
'Context information is below.\n---------------------\n- context\n---------------------\nGiven the context information and not prior knowledge, answer the query in the style of a Shakespeare play.\nQuery: query\nAnswer: '
```
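A function mapping simply routes a template variable through a function before ordinary substitution, which is why `context_str="context"` comes out as `- context` above. A stdlib-only sketch of the mechanism (`format_with_mappings` and the shortened template are illustrative, not LlamaIndex API):

```python
def format_context_fn(context_str: str) -> str:
    # format context with bullet points
    return "\n\n".join(f"- {c}" for c in context_str.split("\n\n"))

tmpl = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Query: {query_str}\n"
    "Answer: "
)

def format_with_mappings(tmpl, mappings, **kwargs):
    # apply each variable's mapping function, then format as usual
    for var, fn in mappings.items():
        kwargs[var] = fn(kwargs[var])
    return tmpl.format(**kwargs)

result = format_with_mappings(
    tmpl, {"context_str": format_context_fn},
    context_str="context", query_str="query",
)
print(result)
```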