This page will help you get started with IBM watsonx.ai chat models. For detailed documentation of all IBM watsonx.ai features and configurations, head to the IBM watsonx.ai API reference.

Overview

Integration details

Class: ChatWatsonx
Package: @langchain/community

Model features

Tool calling · Structured output · JSON mode · Image input · Audio input · Video input · Token-level streaming · Token usage · Logprobs

Setup

To access IBM watsonx.ai models you'll need to create an IBM watsonx.ai account, get an API key, and install the @langchain/community integration package.

Credentials

Head to IBM Cloud to sign up for IBM watsonx.ai and generate an API key, or provide any other form of authentication as shown below.

IAM authentication

export WATSONX_AI_AUTH_TYPE=iam
export WATSONX_AI_APIKEY=<YOUR-APIKEY>

Bearer token authentication

export WATSONX_AI_AUTH_TYPE=bearertoken
export WATSONX_AI_BEARER_TOKEN=<YOUR-BEARER-TOKEN>

IBM watsonx.ai software authentication

export WATSONX_AI_AUTH_TYPE=cp4d
export WATSONX_AI_USERNAME=<YOUR_USERNAME>
export WATSONX_AI_PASSWORD=<YOUR_PASSWORD>
export WATSONX_AI_URL=<URL>
Once these are placed in your environment variables and the object is initialized, authentication will proceed automatically. Authentication can also be accomplished by passing these values as parameters to a new instance.

IAM authentication

import { WatsonxLLM } from "@langchain/community/llms/ibm";

const props = {
  version: "YYYY-MM-DD",
  serviceUrl: "<SERVICE_URL>",
  projectId: "<PROJECT_ID>",
  watsonxAIAuthType: "iam",
  watsonxAIApikey: "<YOUR-APIKEY>",
};
const instance = new WatsonxLLM(props);

Bearer token authentication

import { WatsonxLLM } from "@langchain/community/llms/ibm";

const props = {
  version: "YYYY-MM-DD",
  serviceUrl: "<SERVICE_URL>",
  projectId: "<PROJECT_ID>",
  watsonxAIAuthType: "bearertoken",
  watsonxAIBearerToken: "<YOUR-BEARERTOKEN>",
};
const instance = new WatsonxLLM(props);

IBM watsonx.ai software authentication

import { WatsonxLLM } from "@langchain/community/llms/ibm";

const props = {
  version: "YYYY-MM-DD",
  serviceUrl: "<SERVICE_URL>",
  projectId: "<PROJECT_ID>",
  watsonxAIAuthType: "cp4d",
  watsonxAIUsername: "<YOUR-USERNAME>",
  watsonxAIPassword: "<YOUR-PASSWORD>",
  watsonxAIUrl: "<url>",
};
const instance = new WatsonxLLM(props);
If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:
# export LANGSMITH_TRACING="true"
# export LANGSMITH_API_KEY="your-api-key"

Installation

The LangChain IBM watsonx.ai integration lives in the @langchain/community package:
npm install @langchain/community @langchain/core

Instantiation

Now we can instantiate our model object and generate chat completions:
import { ChatWatsonx } from "@langchain/community/chat_models/ibm";
const props = {
  maxTokens: 200,
  temperature: 0.5
};

const instance = new ChatWatsonx({
  version: "YYYY-MM-DD",
  serviceUrl: process.env.API_URL,
  projectId: "<PROJECT_ID>",
  // spaceId: "<SPACE_ID>",
  // idOrName: "<DEPLOYMENT_ID>",
  model: "<MODEL_ID>",
  ...props
});
Note:
  • You must provide a spaceId, projectId, or idOrName (deployment id) unless you use a lightweight engine, which works without specifying either (refer to the watsonx.ai documentation for more details).
  • Depending on the region of your provisioned service instance, use the correct serviceUrl, as in the sketch below.
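
For instance, here is a minimal sketch of instantiation against a deployment space rather than a project, assuming a service instance provisioned in the Dallas region; the space ID placeholder is illustrative:

import { ChatWatsonx } from "@langchain/community/chat_models/ibm";

// Sketch only: the serviceUrl must match the region your instance was provisioned in,
// e.g. https://us-south.ml.cloud.ibm.com (Dallas) or https://eu-de.ml.cloud.ibm.com (Frankfurt).
const spaceInstance = new ChatWatsonx({
  version: "YYYY-MM-DD",
  serviceUrl: "https://us-south.ml.cloud.ibm.com",
  spaceId: "<SPACE_ID>", // deployment space used instead of projectId
  model: "<MODEL_ID>",
  maxTokens: 200,
  temperature: 0.5,
});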

Using the model gateway

import { ChatWatsonx } from "@langchain/community/chat_models/ibm";
const props = {
  maxTokens: 200,
  temperature: 0.5
};

const instance = new ChatWatsonx({
  version: "YYYY-MM-DD",
  serviceUrl: process.env.API_URL,
  model: "<ALIAS_MODEL_ID>",
  modelGateway: true,
  ...props
});
To use the model gateway with LangChain, you need to create the provider and add the model beforehand via the @ibm-cloud/watsonx-ai SDK or the watsonx.ai API. Please follow this documentation.

Invocation

const aiMsg = await instance.invoke([{
  role: "system",
  content: "You are a helpful assistant that translates English to French. Translate the user sentence.",
},
{
  role: "user",
  content: "I love programming."
}]);
console.log(aiMsg)
AIMessage {
  "id": "chat-c5341b2062dc42f091e5ae2558e905e3",
  "content": " J'adore la programmation.",
  "additional_kwargs": {
    "tool_calls": []
  },
  "response_metadata": {
    "tokenUsage": {
      "completion_tokens": 10,
      "prompt_tokens": 28,
      "total_tokens": 38
    },
    "finish_reason": "stop"
  },
  "tool_calls": [],
  "invalid_tool_calls": [],
  "usage_metadata": {
    "input_tokens": 28,
    "output_tokens": 10,
    "total_tokens": 38
  }
}
console.log(aiMsg.content)
 J'adore la programmation.
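
As with other LangChain chat models, the instance can also be composed with a prompt template into a chain. Below is a minimal sketch using the generic ChatPromptTemplate helper from @langchain/core; the prompt wording and variables are illustrative, not part of the IBM integration itself:

import { ChatPromptTemplate } from "@langchain/core/prompts";

const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a helpful assistant that translates {input_language} to {output_language}.",
  ],
  ["human", "{input}"],
]);

// Pipe the prompt into the ChatWatsonx instance created above.
const chain = prompt.pipe(instance);
const chainedMsg = await chain.invoke({
  input_language: "English",
  output_language: "German",
  input: "I love programming.",
});
console.log(chainedMsg.content);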

Streaming the model output

import { HumanMessage, SystemMessage } from "@langchain/core/messages";

const messages = [
  new SystemMessage("You are a helpful assistant that gives short info about the provided topic."),
  new HumanMessage("moon"),
];

const stream = await instance.stream(messages);
for await (const chunk of stream) {
  console.log(chunk.content);
}
 The
 Moon
 is
 Earth
'
s
 only
 natural
 satellite
 and
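
If you need the full response rather than individual chunks, message chunks can be merged as they arrive with the chunk's concat method. A minimal sketch reusing the messages defined above:

import { AIMessageChunk } from "@langchain/core/messages";

// Accumulate the streamed chunks into a single AIMessageChunk.
let merged: AIMessageChunk | undefined;
for await (const chunk of await instance.stream(messages)) {
  merged = merged === undefined ? chunk : merged.concat(chunk);
}
console.log(merged?.content);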

Tool calling

import { tool } from "@langchain/core/tools";
import * as z from "zod";

const calculatorSchema = z.object({
  operation: z
    .enum(["add", "subtract", "multiply", "divide"])
    .describe("The type of operation to execute."),
  number1: z.number().describe("The first number to operate on."),
  number2: z.number().describe("The second number to operate on."),
});

const calculatorTool = tool(
  async ({ operation, number1, number2 }) => {
    if (operation === "add") {
      return `${number1 + number2}`;
    } else if (operation === "subtract") {
      return `${number1 - number2}`;
    } else if (operation === "multiply") {
      return `${number1 * number2}`;
    } else if (operation === "divide") {
      return `${number1 / number2}`;
    } else {
      throw new Error("Invalid operation.");
    }
  },
  {
    name: "calculator",
    description: "Can perform mathematical operations.",
    schema: calculatorSchema,
  }
);

const instanceWithTools = instance.bindTools([calculatorTool]);

const res = await instanceWithTools.invoke("What is 3 * 12");
console.log(res)
AIMessage {
  "id": "chat-d2214d0bdb794483a213b3211cf0d819",
  "content": "",
  "additional_kwargs": {
    "tool_calls": [
      {
        "id": "chatcmpl-tool-257f3d39532141b89178c2120f81f0cb",
        "type": "function",
        "function": "[Object]"
      }
    ]
  },
  "response_metadata": {
    "tokenUsage": {
      "completion_tokens": 38,
      "prompt_tokens": 177,
      "total_tokens": 215
    },
    "finish_reason": "tool_calls"
  },
  "tool_calls": [
    {
      "name": "calculator",
      "args": {
        "number1": 3,
        "number2": 12,
        "operation": "multiply"
      },
      "type": "tool_call",
      "id": "chatcmpl-tool-257f3d39532141b89178c2120f81f0cb"
    }
  ],
  "invalid_tool_calls": [],
  "usage_metadata": {
    "input_tokens": 177,
    "output_tokens": 38,
    "total_tokens": 215
  }
}
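
To close the tool-calling loop, you can run the requested tool and pass its output back to the model as a tool message. Below is a minimal sketch using standard LangChain core messages; it assumes a recent @langchain/core where invoking a tool with a tool call returns a ToolMessage, and the exact wording of the final answer will vary:

import { HumanMessage } from "@langchain/core/messages";

// Execute the tool call the model requested; this returns a ToolMessage.
const toolCall = res.tool_calls![0];
const toolMessage = await calculatorTool.invoke(toolCall);

// Send the conversation, including the tool result, back to the model.
const finalMsg = await instanceWithTools.invoke([
  new HumanMessage("What is 3 * 12"),
  res, // the AIMessage containing the tool call
  toolMessage,
]);
console.log(finalMsg.content);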

API reference

For detailed documentation of all IBM watsonx.ai features and configurations, head to the API reference: API docs.