from langchain.llms import OpenAI
llm = OpenAI(openai_api_key="...")
Otherwise, you can initialize it without any arguments:
from langchain.llms import OpenAI
llm = OpenAI()
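When initialized without arguments, the OpenAI wrapper reads the key from the OPENAI_API_KEY environment variable. A minimal sketch of that setup (the key value is a placeholder):
import os
os.environ["OPENAI_API_KEY"] = "..."  # placeholder; alternatively export the variable in your shell
from langchain.llms import OpenAI
llm = OpenAI()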
__call__: string in -> string out
The simplest way to use an LLM is to call it like a function: pass in a string and get back a string completion.
llm("Tell me a joke")
'Why did the chicken cross the road?\n\nTo get to the other side.'
generate: batch calls, richer output
generate lets you call the model with a list of strings and returns a more complete response than just text. This full response can include things like multiple top generations and other LLM-provider-specific information:
llm_result = llm.generate(["Tell me a joke", "Tell me a poem"] * 15)
len(llm_result.generations)
# 30
llm_result.generations[0]
[Generation(text='\n\nWhy did the chicken cross the road?\n\nTo get to the other side!'),
Generation(text='\n\nWhy did the chicken cross the road?\n\nTo get to the other side.')]
llm_result.generations[-1]
[Generation(text="\n\nWhat if love neverspeech\n\nWhat if love never ended\n\nWhat if love was only a feeling\n\nI'll never know this love\n\nIt's not a feeling\n\nBut it's what we have for each other\n\nWe just know that love is something strong\n\nAnd we can't help but be happy\n\nWe just feel what love is for us\n\nAnd we love each other with all our heart\n\nWe just don't know how\n\nHow it will go\n\nBut we know that love is something strong\n\nAnd we'll always have each other\n\nIn our lives."),
Generation(text='\n\nOnce upon a time\n\nThere was a love so pure and true\n\nIt lasted for centuries\n\nAnd never became stale or dry\n\nIt was moving and alive\n\nAnd the heart of the love-ick\n\nIs still beating strong and true.')]
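The provider-specific information mentioned above lives on the result's llm_output attribute; for OpenAI this typically includes aggregate token usage (the exact fields depend on the provider):
llm_result.llm_output
# e.g. {'token_usage': {'prompt_tokens': ..., 'completion_tokens': ..., 'total_tokens': ...}, 'model_name': ...}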
LangChain provides an optional caching layer for LLM calls, so that identical prompts can be answered from the cache instead of hitting the provider again.
import langchain
from langchain.llms import OpenAI
# To make the caching really obvious, let's use a slower model.
llm = OpenAI(model_name="text-davinci-002", n=2, best_of=2)
In-Memory Cache
from langchain.cache import InMemoryCache
langchain.llm_cache = InMemoryCache()
# The first time, it is not yet in cache, so it should take longer
llm("Tell me a joke")
CPU times: user 35.9 ms, sys: 28.6 ms, total: 64.6 ms
Wall time: 4.83 s
"\n\nWhy couldn't the bicycle stand up by itself? It was...two tired!"
# The second time it is, so it goes faster
llm("Tell me a joke")
CPU times: user 238 µs, sys: 143 µs, total: 381 µs
Wall time: 1.76 ms
'\n\nWhy did the chicken cross the road?\n\nTo get to the other side.'
SQLite Cache
rm .langchain.db  # remove any existing cache file so the first call below is uncached
# We can do the same thing with a SQLite cache
from langchain.cache import SQLiteCache
langchain.llm_cache = SQLiteCache(database_path=".langchain.db")
# The first time, it is not yet in cache, so it should take longer
llm("Tell me a joke")
CPU times: user 17 ms, sys: 9.76 ms, total: 26.7 ms
Wall time: 825 ms
'\n\nWhy did the chicken cross the road?\n\nTo get to the other side.'
# The second time it is, so it goes faster
llm("Tell me a joke")
CPU times: user 2.46 ms, sys: 1.23 ms, total: 3.7 ms
Wall time: 2.67 ms
'\n\nWhy did the chicken cross the road?\n\nTo get to the other side.'
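Caching can also be switched off for an individual LLM object while the global cache stays enabled. A minimal sketch, assuming the cache constructor flag exposed by this version of LangChain:
from langchain.llms import OpenAI
# This instance ignores langchain.llm_cache entirely.
no_cache_llm = OpenAI(model_name="text-davinci-002", n=2, best_of=2, cache=False)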
Optional Caching in Chains
You can also turn off caching for particular nodes in a chain. Note that, because of certain interfaces, it is often easier to construct the chain first and then edit the LLM afterwards. As an example, we will load a summarization map-reduce chain. We will cache the results of the map step, but not those of the combine step.
from langchain.docstore.document import Document
# `texts` is assumed to be a list of strings produced earlier by splitting a longer document.
docs = [Document(page_content=t) for t in texts[:3]]
from langchain.chains.summarize import load_summarize_chain
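The text omits the chain construction that produces the timings below. A plausible sketch, assuming the reduce_llm parameter of load_summarize_chain in this LangChain version and the no_cache_llm instance defined above:
# The map step uses the cached `llm`; the combine/reduce step uses the non-cached LLM.
chain = load_summarize_chain(llm, chain_type="map_reduce", reduce_llm=no_cache_llm)
chain.run(docs)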
CPU times: user 452 ms, sys: 60.3 ms, total: 512 ms
Wall time: 5.09 s
'\n\nPresident Biden is discussing the American Rescue Plan and the Bipartisan Infrastructure Law, which will create jobs and help Americans. He also talks about his vision for America, which includes investing in education and infrastructure. In response to Russian aggression in Ukraine, the United States is joining with European allies to impose sanctions and isolate Russia. American forces are being mobilized to protect NATO countries in the event that Putin decides to keep moving west. The Ukrainians are bravely fighting back, but the next few weeks will be hard for them. Putin will pay a high price for his actions in the long run. Americans should not be alarmed, as the United States is taking action to protect its interests and allies.'
# The second run is much faster because the map-step results are already cached.
CPU times: user 11.5 ms, sys: 4.33 ms, total: 15.8 ms
Wall time: 1.04 s
'\n\nPresident Biden is discussing the American Rescue Plan and the Bipartisan Infrastructure Law, which will create jobs and help Americans. He also talks about his vision for America, which includes investing in education and infrastructure.'
Streaming
Some LLMs provide streaming responses: instead of waiting for the full completion, you can process the output token by token with a callback handler such as StreamingStdOutCallbackHandler, which simply prints each new token to standard output.
from langchain.llms import OpenAI
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
llm = OpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)
resp = llm("Write me a song about sparkling water.")
Verse 1
I'm sippin' on sparkling water,
It's so refreshing and light,
It's the perfect way to quench my thirst
On a hot summer night.
Chorus
Sparkling water, sparkling water,
It's the best way to stay hydrated,
It's so crisp and so clean,
It's the perfect way to stay refreshed.
Verse 2
I'm sippin' on sparkling water,
It's so bubbly and bright,
It's the perfect way to cool me down
On a hot summer night.
Chorus
Sparkling water, sparkling water,
It's the best way to stay hydrated,
It's so crisp and so clean,
It's the perfect way to stay refreshed.
Verse 3
I'm sippin' on sparkling water,
It's so light and so clear,
It's the perfect way to keep me cool
On a hot summer night.
Chorus
Sparkling water, sparkling water,
It's the best way to stay hydrated,
It's so crisp and so clean,
It's the perfect way to stay refreshed.
We still have access to the final LLMResult when using generate. Note, however, that token usage is not reported while streaming is enabled:
llm.generate(["Tell me a joke."])
Q: What did the fish say when it hit the wall?
A: Dam!
LLMResult(generations=[[Generation(text='\n\nQ: What did the fish say when it hit the wall?\nA: Dam!', generation_info={'finish_reason': 'stop', 'logprobs': None})]], llm_output={'token_usage': {}, 'model_name': 'text-davinci-003'})
Chat Models
Chat models are a variation on language models: rather than a plain text-in / text-out interface, they take a list of chat messages as input and return an AIMessage.
from langchain.chat_models import ChatOpenAI
from langchain.schema import (
    AIMessage,
    HumanMessage,
    SystemMessage
)
# temperature=0 is assumed here so the translations below are deterministic.
chat = ChatOpenAI(temperature=0)
You can get a chat completion by passing a single human message to the model:
chat([HumanMessage(content="Translate this sentence from English to French: I love programming.")])
You can also pass in multiple messages, for example a system message followed by a human message:
messages = [
    SystemMessage(content="You are a helpful assistant that translates English to French."),
    HumanMessage(content="I love programming.")
]
chat(messages)
You can go one step further and generate completions for several message lists at once with generate, which returns an LLMResult:
batch_messages = [
    [
        SystemMessage(content="You are a helpful assistant that translates English to French."),
        HumanMessage(content="I love programming.")
    ],
    [
        SystemMessage(content="You are a helpful assistant that translates English to French."),
        HumanMessage(content="I love artificial intelligence.")
    ],
]
result = chat.generate(batch_messages)
result
Prompt Templates
You can use templating with chat models by building one or more MessagePromptTemplates and combining them into a ChatPromptTemplate:
from langchain import PromptTemplate
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
template="You are a helpful assistant that translates {input_language} to {output_language}."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template="{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])# 获取来自格式化消息的聊天完成。
chat(chat_prompt.format_prompt(input_language="English", output_language="French", text="I love programming.").to_messages())
AIMessage(content="J'adore la programmation.", additional_kwargs={})
If you want to construct a MessagePromptTemplate more directly, you can create a PromptTemplate yourself and then pass it in:
prompt = PromptTemplate(
    template="You are a helpful assistant that translates {input_language} to {output_language}.",
    input_variables=["input_language", "output_language"],
)
system_message_prompt = SystemMessagePromptTemplate(prompt=prompt)
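As a quick check (an illustrative sketch, not part of the original walkthrough), formatting this message prompt template yields the same kind of SystemMessage as the from_template version above:
system_message_prompt.format(input_language="English", output_language="French")
# -> SystemMessage(content='You are a helpful assistant that translates English to French.', additional_kwargs={})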
Streaming is also supported for chat models, using the same callback handler:
from langchain.chat_models import ChatOpenAI
from langchain.schema import (
    HumanMessage,
)
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
chat = ChatOpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)
resp = chat([HumanMessage(content="Write me a song about sparkling water.")])
Verse 1:
Bubbles rising to the top
A refreshing drink that never stops
Clear and crisp, it's pure delight
A taste that's sure to excite
Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe
Verse 2:
No sugar, no calories, just pure bliss
A drink that's hard to resist
It's the perfect way to quench my thirst
A drink that always comes first
Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe
Bridge:
From the mountains to the sea
Sparkling water, you're the key
To a healthy life, a happy soul
A drink that makes me feel whole
Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe
Outro:
Sparkling water, you're the one
A drink that's always so much fun
I'll never let you go, my friend
Sparkling