WeaveConnector

Learn how to use the Weights & Biases Weave framework to trace and monitor your pipeline components.

Most common position in a pipeline: Anywhere, since it does not connect to other components
Mandatory init variables: "pipeline_name": the name of the pipeline; this name also appears in the Weave dashboard
Output variables: "pipeline_name": the name of the pipeline that was just run
API reference: Weights & Biases
GitHub link: https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/weights_and_biases_weave

Overview

This integration allows you to trace and visualize your pipeline executions in Weights & Biases.

Information captured by Haystack's tracing tooling, such as API calls, context data, and prompts, is sent to Weights & Biases, where you can inspect the complete trace of your pipeline execution.

Prerequisites

You need a Weave account to use this feature. You can sign up for free on the Weights & Biases website.

You then need to set the WANDB_API_KEY environment variable to your Weights & Biases API key. Once logged in, you can find your API key on your homepage.

You also need to set the HAYSTACK_CONTENT_TRACING_ENABLED environment variable to true so that Haystack content tracing is enabled for your pipeline.
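
If you prefer to set these from Python rather than in your shell, a minimal sketch (the key value below is only a placeholder, not a real credential) might look like this:

import os

# Placeholder: substitute your actual Weights & Biases API key, or export it in your shell instead
os.environ["WANDB_API_KEY"] = "<your-wandb-api-key>"

# Enable Haystack content tracing so prompts and context data are captured
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"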

Usage

First, install the weights_biases-haystack package to use this connector:

pip install weights_biases-haystack

Then add it to your pipeline without any connections, and it will automatically start sending traces to Weights & Biases.

import os

# Enable Haystack content tracing (see Prerequisites)
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

from haystack_integrations.components.connectors.weave import WeaveConnector

pipe = Pipeline()
pipe.add_component("prompt_builder", ChatPromptBuilder())
pipe.add_component("llm", OpenAIChatGenerator(model="gpt-3.5-turbo"))
pipe.connect("prompt_builder.prompt", "llm.messages")

connector = WeaveConnector(pipeline_name="test_pipeline")
pipe.add_component("weave", connector)

messages = [
    ChatMessage.from_system(
        "Always respond in German even if some input data is in other languages."
    ),
    ChatMessage.from_user("Tell me about {{location}}"),
]

response = pipe.run(
    data={
        "prompt_builder": {
            "template_variables": {"location": "Berlin"},
            "template": messages,
        }
    }
)

You can then go to https://wandb.ai/<user_name>/projects and see the complete trace for your pipeline under the pipeline name you specified when creating the WeaveConnector.
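
Because the WeaveConnector is not connected to any other component, its pipeline_name output is also returned by Pipeline.run, alongside the generator's replies. A minimal sketch, assuming the default behavior of returning outputs from unconnected components:

# Outputs of unconnected components are included in the run result
print(response["weave"]["pipeline_name"])  # "test_pipeline"
print(response["llm"]["replies"][0].text)  # the generated reply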

Usage with an Agent

import os

# Enable Haystack content tracing
os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

from typing import Annotated

from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.tools import tool
from haystack import Pipeline

from haystack_integrations.components.connectors.weave import WeaveConnector


@tool
def get_weather(city: Annotated[str, "The city to get weather for"]) -> str:
    """Get current weather information for a city."""
    weather_data = {
        "Berlin": "18°C, partly cloudy",
        "New York": "22°C, sunny",
        "Tokyo": "25°C, clear skies"
    }
    return weather_data.get(city, f"Weather information for {city} not available")


@tool
def calculate(operation: Annotated[str, "Mathematical operation: add, subtract, multiply, divide"], 
              a: Annotated[float, "First number"], 
              b: Annotated[float, "Second number"]) -> str:
    """Perform basic mathematical calculations."""
    if operation == "add":
        result = a + b
    elif operation == "subtract":
        result = a - b
    elif operation == "multiply":
        result = a * b
    elif operation == "divide":
        if b == 0:
            return "Error: Division by zero"
        result = a / b
    else:
        return f"Error: Unknown operation '{operation}'"

    return f"The result of {a} {operation} {b} is {result}"


# Create the chat generator
chat_generator = OpenAIChatGenerator()

# Create the agent with tools
agent = Agent(
    chat_generator=chat_generator,
    tools=[get_weather, calculate],
    system_prompt="You are a helpful assistant with access to weather and calculator tools. Use them when needed.",
    exit_conditions=["text"]
)

# Create the WeaveConnector for tracing
weave_connector = WeaveConnector(pipeline_name="Agent Example")

# Build the pipeline
pipe = Pipeline()
pipe.add_component("tracer", weave_connector)
pipe.add_component("agent", agent)

# Run the pipeline
response = pipe.run(
    data={
        "agent": {
            "messages": [
                ChatMessage.from_user("What's the weather in Berlin and calculate 15 + 27?")
            ]
        },
        "tracer": {}
    }
)

# Display results
print("Agent Response:")
print(response["agent"]["last_message"].text)
print(f"\nPipeline Name: {response['tracer']['pipeline_name']}")
print("\nCheck your Weights & Biases dashboard at https://wandb.ai/<user_name>/projects to see the traces!")