CohereGenerator
CohereGenerator enables text generation using Cohere's large language models (LLMs).
| Most common position in a pipeline | After a PromptBuilder |
| Mandatory init variables | "api_key": the Cohere API key. Can be set with the COHERE_API_KEY or CO_API_KEY environment variable. |
| Mandatory run variables | "prompt": a string containing the prompt for the LLM |
| Output variables | "replies": a list of strings with all the replies generated by the LLM<br>"meta": a list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on |
| API reference | Cohere |
| GitHub link | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/cohere |
This integration supports Cohere models such as command, command-r, and command-r-plus. See the Cohere documentation for the most up-to-date full list.
Overview
CohereGenerator needs a Cohere API key to work. You can provide this key in:
- the api_key init parameter using the Secret API
- the COHERE_API_KEY environment variable (recommended)
Then, the component needs a prompt to run, but you can pass any text generation parameters directly to this component through the generation_kwargs parameter at initialization. For more details on the parameters supported by the Cohere API, refer to the Cohere documentation.
Streaming
This Generator supports streaming the tokens from the LLM directly into output. To do so, pass a function to the streaming_callback init parameter.
Usage
You need to install the cohere-haystack package to use the CohereGenerator:

```shell
pip install cohere-haystack
```
On its own
Basic usage:
```python
from haystack_integrations.components.generators.cohere import CohereGenerator

client = CohereGenerator()
response = client.run("Briefly explain what NLP is in one sentence.")
print(response)

>>> {'replies': ["Natural Language Processing (NLP) is a subfield of artificial intelligence and computational linguistics that focuses on the interaction between computers and human languages..."],
'meta': [{'finish_reason': 'COMPLETE'}]}
```
With streaming:
```python
from haystack_integrations.components.generators.cohere import CohereGenerator

client = CohereGenerator(streaming_callback=lambda chunk: print(chunk.content, end="", flush=True))
response = client.run("Briefly explain what NLP is in one sentence.")
print(response)

>>> Natural Language Processing (NLP) is the study of natural language and how it can be used to solve problems through computational methods, enabling machines to understand, interpret, and generate human language.
>>> {'replies': [' Natural Language Processing (NLP) is the study of natural language and how it can be used to solve problems through computational methods, enabling machines to understand, interpret, and generate human language.'], 'meta': [{'index': 0, 'finish_reason': 'COMPLETE'}]}
```
In a pipeline
In a RAG pipeline:
```python
from haystack import Pipeline
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack_integrations.components.generators.cohere import CohereGenerator
from haystack import Document

docstore = InMemoryDocumentStore()
docstore.write_documents([Document(content="Rome is the capital of Italy"), Document(content="Paris is the capital of France")])

query = "What is the capital of France?"

template = """
Given the following information, answer the question.

Context:
{% for document in documents %}
    {{ document.content }}
{% endfor %}

Question: {{ query }}?
"""

pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=docstore))
pipe.add_component("prompt_builder", PromptBuilder(template=template))
pipe.add_component("llm", CohereGenerator())
pipe.connect("retriever", "prompt_builder.documents")
pipe.connect("prompt_builder", "llm")

res = pipe.run({
    "prompt_builder": {
        "query": query
    },
    "retriever": {
        "query": query
    }
})

print(res)
```
