
AzureOpenAIGenerator

This component enables text generation with OpenAI's large language models (LLMs) through Azure services.

Most common position in a pipeline: after a PromptBuilder
Mandatory init variables: "api_key": the Azure OpenAI API key. Can be set with the AZURE_OPENAI_API_KEY environment variable.

"azure_ad_token": the Microsoft Entra ID token. Can be set with the AZURE_OPENAI_AD_TOKEN environment variable.
Mandatory run variables: "prompt": a string containing the prompt for the LLM
Output variables: "replies": a list of strings with all the replies generated by the LLM

"meta": a list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on
API reference: Generators
GitHub link: https://github.com/deepset-ai/haystack/blob/main/haystack/components/generators/azure.py

Overview

AzureOpenAIGenerator supports OpenAI models deployed through Azure services. For the list of supported models, see the Azure documentation. The default model for this component is gpt-4o-mini.

To use Azure components, you will need an Azure OpenAI API key, as well as an Azure OpenAI endpoint. You can learn more about them in the Azure documentation.

The component uses the AZURE_OPENAI_API_KEY and AZURE_OPENAI_AD_TOKEN environment variables. Otherwise, you can pass api_key and azure_ad_token at initialization:

from haystack.utils import Secret
from haystack.components.generators import AzureOpenAIGenerator

client = AzureOpenAIGenerator(
    azure_endpoint="<Your Azure endpoint, e.g. https://your-company.azure.openai.com>",
    api_key=Secret.from_token("<your-api-key>"),
    azure_deployment="<a model name>",
)

📘

We recommend using environment variables instead of initialization parameters.

Then, the component needs a prompt to run, but you can pass any text generation parameters valid for the openai.ChatCompletion.create method directly to this component using the generation_kwargs parameter, both at initialization and to the run() method. For more details on the supported parameters, see the Azure documentation.

You can also specify a model for this component through the azure_deployment init parameter.

Streaming

AzureOpenAIGenerator supports streaming the tokens from the LLM directly in output. To do so, pass a function to the streaming_callback init parameter. Note that streaming the tokens is only compatible with generating a single response, so n must be set to 1 for streaming to work.
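A lambda works for quick experiments; for anything stateful, a callable object is handy. The sketch below is plain Python, not a Haystack API, and assumes only that each streamed chunk exposes a .content string (as the chunk objects passed to the callback do):

```python
# A minimal stateful streaming callback: prints each chunk as it arrives
# and accumulates the full text for later use.
class ChunkCollector:
    def __init__(self):
        self.text = ""

    def __call__(self, chunk):
        self.text += chunk.content
        print(chunk.content, end="", flush=True)

# Usage (requires Azure credentials):
# collector = ChunkCollector()
# client = AzureOpenAIGenerator(streaming_callback=collector)
# client.run("What's Natural Language Processing? Be brief.")
# print(collector.text)
```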

📘

This component is designed for text generation, not for chat. If you want to use the LLM for chat, use AzureOpenAIChatGenerator instead.

Usage

On its own

Basic usage:

from haystack.components.generators import AzureOpenAIGenerator
client = AzureOpenAIGenerator()
response = client.run("What's Natural Language Processing? Be brief.")
print(response)

>> {'replies': ['Natural Language Processing (NLP) is a branch of artificial intelligence that focuses on
>> the interaction between computers and human language. It involves enabling computers to understand, interpret,
>> and respond to natural human language in a way that is both meaningful and useful.'], 'meta': [{'model':
>> 'gpt-4o-mini', 'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 16,
>> 'completion_tokens': 49, 'total_tokens': 65}}]}

With streaming:

from haystack.components.generators import AzureOpenAIGenerator

client = AzureOpenAIGenerator(streaming_callback=lambda chunk: print(chunk.content, end="", flush=True))
response = client.run("What's Natural Language Processing? Be brief.")
print(response)

>>> Natural Language Processing (NLP) is a branch of artificial
	intelligence that focuses on the interaction between computers and human
  language. It involves enabling computers to understand, interpret,and respond
  to natural human language in a way that is both meaningful and useful.
>>> {'replies': ['Natural Language Processing (NLP) is a branch of artificial
	intelligence that focuses on the interaction between computers and human
  language. It involves enabling computers to understand, interpret,and respond
  to natural human language in a way that is both meaningful and useful.'],
  'meta': [{'model': 'gpt-4o-mini', 'index': 0, 'finish_reason':
  'stop', 'usage': {'prompt_tokens': 16, 'completion_tokens': 49,
  'total_tokens': 65}}]}

In a pipeline

from haystack import Pipeline
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.generators import AzureOpenAIGenerator
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack import Document

docstore = InMemoryDocumentStore()
docstore.write_documents([Document(content="Rome is the capital of Italy"), Document(content="Paris is the capital of France")])

query = "What is the capital of France?"

template = """
Given the following information, answer the question.

Context: 
{% for document in documents %}
    {{ document.content }}
{% endfor %}

Question: {{ query }}?
"""
pipe = Pipeline()

pipe.add_component("retriever", InMemoryBM25Retriever(document_store=docstore))
pipe.add_component("prompt_builder", PromptBuilder(template=template))
pipe.add_component("llm", AzureOpenAIGenerator())
pipe.connect("retriever", "prompt_builder.documents")
pipe.connect("prompt_builder", "llm")

res = pipe.run({
    "prompt_builder": {
        "query": query
    },
    "retriever": {
        "query": query
    }
})

print(res)
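pipe.run returns a nested dictionary keyed by component name, so the generated answer lives under the "llm" key. A sketch of unpacking it, using a stand-in result with the same shape as the real output (the reply text here is illustrative, not an actual model response):

```python
# Stand-in for the dictionary returned by pipe.run(); the real one has
# the same {"llm": {"replies": [...], "meta": [...]}} shape.
res = {
    "llm": {
        "replies": ["Paris is the capital of France."],
        "meta": [{"model": "gpt-4o-mini", "finish_reason": "stop"}],
    }
}

answer = res["llm"]["replies"][0]
finish_reason = res["llm"]["meta"][0]["finish_reason"]
print(answer, finish_reason)
```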

Related Links

See the parameters details in our API reference.