Note: The LangChain Ollama integration package has official support for tool calling. Click here to view the documentation.
LangChain provides an experimental wrapper around open-source models run locally via Ollama that exposes the same API as OpenAI Functions. Note that more powerful and capable models will perform better with complex schemas and/or multiple functions. The examples below use Mistral.
This is an experimental wrapper that attempts to bolt tool-calling support onto models that do not natively support it. Use with caution.

Setup

Follow these instructions to set up and run a local Ollama instance.
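
As a quick sanity check before wiring up LangChain, you can hit the local Ollama server's REST API directly. This is a minimal sketch, assuming the default endpoint at http://localhost:11434 and Node 18+ for the built-in fetch; the /api/tags route lists the models you have pulled locally:

// Sanity check: list locally pulled models via Ollama's REST API.
// Assumes the default server address; adjust if you run Ollama elsewhere.
const res = await fetch("http://localhost:11434/api/tags");
const { models } = await res.json();
console.log(models.map((m: { name: string }) => m.name));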

Initialize model

You can initialize this wrapper the same way you'd initialize a standard ChatOllama instance:
import { OllamaFunctions } from "@langchain/community/experimental/chat_models/ollama_functions";

const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
});
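
Before binding any functions, the wrapper behaves like an ordinary chat model, so a plain invocation is a quick way to confirm everything is wired up. A minimal sketch, assuming the "mistral" model has already been pulled locally:

// Smoke test: invoke the wrapper with no functions bound.
// It should answer conversationally like a normal chat model.
const reply = await model.invoke("Say hello in one short sentence.");
console.log(reply.content);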

Passing in functions

Now you can pass in functions, just as you would with OpenAI:
import { OllamaFunctions } from "@langchain/community/experimental/chat_models/ollama_functions";
import { HumanMessage } from "@langchain/core/messages";

const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
}).bind({
  functions: [
    {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA",
          },
          unit: { type: "string", enum: ["celsius", "fahrenheit"] },
        },
        required: ["location"],
      },
    },
  ],
  // You can set the `function_call` arg to force the model to use a function
  function_call: {
    name: "get_current_weather",
  },
});

const response = await model.invoke([
  new HumanMessage({
    content: "What's the weather in Boston?",
  }),
]);

console.log(response);

/*
  AIMessage {
    content: '',
    additional_kwargs: {
      function_call: {
        name: 'get_current_weather',
        arguments: '{"location":"Boston, MA","unit":"fahrenheit"}'
      }
    }
  }
*/
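
Because the arguments come back as a JSON string under additional_kwargs, a typical next step is to parse them before dispatching to your actual implementation. A minimal sketch; getCurrentWeather is a hypothetical application function, not part of LangChain:

// Parse the function call out of the response before acting on it.
const { function_call } = response.additional_kwargs;
if (function_call?.name === "get_current_weather") {
  const args = JSON.parse(function_call.arguments ?? "{}");
  // e.g. await getCurrentWeather(args.location, args.unit); (hypothetical)
  console.log(args.location, args.unit);
}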

Using for extraction

import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

import { OllamaFunctions } from "@langchain/community/experimental/chat_models/ollama_functions";
import { PromptTemplate } from "@langchain/core/prompts";
import { JsonOutputFunctionsParser } from "@langchain/core/output_parsers/openai_functions";

const EXTRACTION_TEMPLATE = `Extract and save the relevant entities mentioned in the following passage together with their properties.

Passage:
{input}
`;

const prompt = PromptTemplate.fromTemplate(EXTRACTION_TEMPLATE);

// Use Zod for easier schema declaration
const schema = z.object({
  people: z.array(
    z.object({
      name: z.string().describe("The name of a person"),
      height: z.number().describe("The person's height"),
      hairColor: z.optional(z.string()).describe("The person's hair color"),
    })
  ),
});

const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
}).bind({
  functions: [
    {
      name: "information_extraction",
      description: "Extracts the relevant information from the passage.",
      // Convert the Zod schema to JSON Schema for the model
      parameters: zodToJsonSchema(schema),
    },
  ],
  // Force the model to call the extraction function
  function_call: {
    name: "information_extraction",
  },
});

// Use a JsonOutputFunctionsParser to get the parsed JSON response directly.
const chain = prompt.pipe(model).pipe(new JsonOutputFunctionsParser());

const response = await chain.invoke({
  input:
    "Alex is 5 feet tall. Claudia is 1 foot taller than Alex and jumps higher than him. Claudia has orange hair and Alex is blonde.",
});

console.log(JSON.stringify(response, null, 2));

/*
{
  "people": [
    {
      "name": "Alex",
      "height": 5,
      "hairColor": "blonde"
    },
    {
      "name": "Claudia",
      "height": {
        "$num": 1,
        "add": [
          {
            "name": "Alex",
            "prop": "height"
          }
        ]
      },
      "hairColor": "orange"
    }
  ]
}
*/
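
As the Claudia entry above shows, a smaller local model can emit values that do not match the requested schema (here, a nested object where a number was asked for). Since the schema is already declared with Zod, one way to catch this is to validate the parsed output before using it. A minimal sketch:

// Validate the parser's output against the same Zod schema so that
// malformed fields (like Claudia's "height" above) fail loudly.
const validated = schema.safeParse(response);
if (validated.success) {
  console.log(validated.data.people);
} else {
  console.error(validated.error.issues);
}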
You can see a simple LangSmith trace of this here.

Customization

Behind the scenes, this uses Ollama's JSON mode to constrain the output to JSON, then passes the tool schemas into the prompt as JSON Schema. Because different models have different strengths, it may be helpful to pass in your own system prompt. Here's an example:
import { OllamaFunctions } from "@langchain/community/experimental/chat_models/ollama_functions";
import { HumanMessage } from "@langchain/core/messages";

// Custom system prompt to format tools. You must encourage the model
// to wrap output in a JSON object with "tool" and "tool_input" properties.
const toolSystemPromptTemplate = `You have access to the following tools:

{tools}

To use a tool, respond with a JSON object with the following structure:
{{
  "tool": <name of the called tool>,
  "tool_input": <parameters for the tool matching the above JSON schema>
}}`;

const model = new OllamaFunctions({
  temperature: 0.1,
  model: "mistral",
  // Pass the custom prompt template into the wrapper
  toolSystemPromptTemplate,
}).bind({
  functions: [
    {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA",
          },
          unit: { type: "string", enum: ["celsius", "fahrenheit"] },
        },
        required: ["location"],
      },
    },
  ],
  // You can set the `function_call` arg to force the model to use a function
  function_call: {
    name: "get_current_weather",
  },
});

const response = await model.invoke([
  new HumanMessage({
    content: "What's the weather in Boston?",
  }),
]);

console.log(response);

/*
  AIMessage {
    content: '',
    additional_kwargs: {
      function_call: {
        name: 'get_current_weather',
        arguments: '{"location":"Boston, MA","unit":"fahrenheit"}'
      }
    }
  }
*/
