
GoogleGenAIChatGenerator

This component enables chat completion using Google Gemini models through the Google Gen AI SDK.

Most common position in a pipeline: After a ChatPromptBuilder
Mandatory init variables: "api_key": A Google API key. Can be set with the GOOGLE_API_KEY environment variable.
Mandatory run variables: "messages": A list of ChatMessage objects representing the chat history
Output variables: "replies": A list of alternative replies from the model to the input chat
API reference: Google GenAI
GitHub link: https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/google_genai

Overview

GoogleGenAIChatGenerator supports the gemini-2.0-flash (default), gemini-2.5-pro-exp-03-25, gemini-1.5-pro, and gemini-1.5-flash models.
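
To use one of the non-default models, pass its name at initialization. A minimal sketch, assuming the model init parameter accepts the model names listed above:

from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator

# Example (assumed parameter): pick a specific Gemini model instead of the default
chat_generator = GoogleGenAIChatGenerator(model="gemini-1.5-flash")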

Streaming

This Generator supports streaming the tokens from the LLM directly into the output. To do so, pass a function to the streaming_callback init parameter.
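
For example, Haystack's built-in print_streaming_chunk utility can be used as the callback; any function that accepts a StreamingChunk works. A minimal sketch:

from haystack.components.generators.utils import print_streaming_chunk
from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator

# Print tokens to stdout as they arrive
chat_generator = GoogleGenAIChatGenerator(streaming_callback=print_streaming_chunk)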

Authentication

Google Gen AI is compatible with both the Gemini Developer API and the Vertex AI API.

To use this component with the Gemini Developer API and get an API key, visit Google AI Studio.
To use this component with the Vertex AI API, visit Google Cloud > Vertex AI.

By default, the component uses the GOOGLE_API_KEY or GEMINI_API_KEY environment variable. Otherwise, you can pass an API key at initialization as a Secret, using the Secret.from_token static method:

from haystack.utils import Secret

chat_generator = GoogleGenAIChatGenerator(api_key=Secret.from_token("<your-api-key>"))

The following examples show how to use the component with the Gemini Developer API and the Vertex AI API.

Gemini Developer API (API Key Authentication)

from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator

# set the environment variable (GOOGLE_API_KEY or GEMINI_API_KEY)
chat_generator = GoogleGenAIChatGenerator()

Vertex AI (Application Default Credentials)

from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator

# Using Application Default Credentials (requires gcloud auth setup)
chat_generator = GoogleGenAIChatGenerator(
    api="vertex",
    vertex_ai_project="my-project",
    vertex_ai_location="us-central1",
)

Vertex AI (API Key Authentication)

from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator

# set the environment variable (GOOGLE_API_KEY or GEMINI_API_KEY)
chat_generator = GoogleGenAIChatGenerator(api="vertex")

Usage

To start using this integration, install the package with:

pip install google-genai-haystack

On its own

from haystack.dataclasses.chat_message import ChatMessage
from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator

# Initialize the chat generator
chat_generator = GoogleGenAIChatGenerator()

# Generate a response
messages = [ChatMessage.from_user("Tell me about movie Shawshank Redemption")]
response = chat_generator.run(messages=messages)
print(response["replies"][0].text)

You can also easily use function calling. First, define the function locally and convert it into a Tool:

from typing import Annotated
from haystack.tools import create_tool_from_function

# example function to get the current weather
def get_current_weather(
    location: Annotated[str, "The city for which to get the weather, e.g. 'San Francisco'"] = "Munich",
    unit: Annotated[str, "The unit for the temperature, e.g. 'celsius'"] = "celsius",
) -> str:
    return f"The weather in {location} is sunny. The temperature is 20 {unit}."

tool = create_tool_from_function(get_current_weather)

Create a new instance of GoogleGenAIChatGenerator to set the tools, and use a ToolInvoker to invoke them:

import os
from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator
from haystack.components.tools import ToolInvoker

os.environ["GOOGLE_API_KEY"] = "<MY_API_KEY>"

genai_chat = GoogleGenAIChatGenerator(tools=[tool])

tool_invoker = ToolInvoker(tools=[tool])

Then, ask a question:

from haystack.dataclasses import ChatMessage

messages = [ChatMessage.from_user("What is the temperature in celsius in Berlin?")]
res = genai_chat.run(messages=messages)

print(res["replies"][0].tool_calls)
>>> [ToolCall(tool_name='get_current_weather',
>>>           arguments={'unit': 'celsius', 'location': 'Berlin'}, id=None)]

# Run the tool calls returned by the model and append the results to the chat history
tool_messages = tool_invoker.run(messages=res["replies"])["tool_messages"]
messages = messages + res["replies"] + tool_messages

final_replies = genai_chat.run(messages=messages)["replies"]
print(final_replies[0].text)
>>> The temperature in Berlin is 20 degrees Celsius.

With streaming

from haystack.dataclasses.chat_message import ChatMessage
from haystack.dataclasses import StreamingChunk
from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator

def streaming_callback(chunk: StreamingChunk):
    print(chunk.content, end='', flush=True)

# Initialize with streaming callback
chat_generator = GoogleGenAIChatGenerator(
    streaming_callback=streaming_callback
)

# Generate a streaming response
messages = [ChatMessage.from_user("Write a short story")]
response = chat_generator.run(messages=messages)
# Text will stream in real-time through the callback

In a pipeline

import os
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack import Pipeline
from haystack_integrations.components.generators.google_genai import GoogleGenAIChatGenerator

# no parameter init, we don't use any runtime template variables
prompt_builder = ChatPromptBuilder()

os.environ["GOOGLE_API_KEY"] = "<MY_API_KEY>"
genai_chat = GoogleGenAIChatGenerator()

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("genai", genai_chat)
pipe.connect("prompt_builder.prompt", "genai.messages")

location = "Rome"
messages = [ChatMessage.from_user("Tell me briefly about {{location}} history")]
res = pipe.run(data={"prompt_builder": {"template_variables":{"location": location}, "template": messages}})

print(res)
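
The pipeline result is keyed by component name, so the generated reply can be read from the genai component's output (a small usage sketch based on the pipeline above):

# Access the generated text from the "genai" component's replies
print(res["genai"]["replies"][0].text)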