MiniMax is a Chinese startup that provides natural language processing models for companies and individuals. This example demonstrates how to use LangChain.js to interact with MiniMax.

Setup

To use MiniMax models, you will need a MiniMax account, an API key, and a Group ID.
For general instructions on installing LangChain packages, see this section.
npm
npm install @langchain/community @langchain/core
We are unifying model parameters across all packages. We now suggest using model instead of modelName, and apiKey for API keys.
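
If you prefer not to rely on environment variables, you can pass the credentials directly to the constructor. The following is a minimal sketch with hypothetical placeholder values; by default, in Node.js, the key and Group ID are read from MINIMAX_API_KEY and MINIMAX_GROUP_ID (as shown in the examples below).

import { ChatMinimax } from "@langchain/community/chat_models/minimax";

// Sketch: passing credentials explicitly instead of relying on
// environment variables. Replace the placeholder strings with your own values.
const chatWithCredentials = new ChatMinimax({
  model: "abab5.5-chat",
  minimaxApiKey: "YOUR_MINIMAX_API_KEY", // placeholder
  minimaxGroupId: "YOUR_MINIMAX_GROUP_ID", // placeholder
});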

Basic usage

import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { HumanMessage } from "@langchain/core/messages";

// Use abab5.5
const abab5_5 = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
});
const messages = [
  new HumanMessage({
    content: "Hello",
  }),
];

const res = await abab5_5.invoke(messages);
console.log(res);

/*
AIChatMessage {
  text: 'Hello! How may I assist you today?',
  name: undefined,
  additional_kwargs: {}
}
*/

// use abab5
const abab5 = new ChatMinimax({
  proVersion: false,
  model: "abab5-chat",
  minimaxGroupId: process.env.MINIMAX_GROUP_ID, // In Node.js defaults to process.env.MINIMAX_GROUP_ID
  minimaxApiKey: process.env.MINIMAX_API_KEY, // In Node.js defaults to process.env.MINIMAX_API_KEY
});

const result = await abab5.invoke([
  new HumanMessage({
    content: "Hello",
    name: "XiaoMing",
  }),
]);
console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: {
    content: 'Hello! Can I help you with anything?',
    additional_kwargs: { function_call: undefined }
  },
  lc_namespace: [ 'langchain', 'schema' ],
  content: 'Hello! Can I help you with anything?',
  name: undefined,
  additional_kwargs: { function_call: undefined }
}
 */

Chaining model calls

import { LLMChain } from "@langchain/classic/chains";
import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  SystemMessagePromptTemplate,
} from "@langchain/core/prompts";

// We can also construct an LLMChain from a ChatPromptTemplate and a chat model.
const chat = new ChatMinimax({ temperature: 0.01 });

const chatPrompt = ChatPromptTemplate.fromMessages([
  SystemMessagePromptTemplate.fromTemplate(
    "You are a helpful assistant that translates {input_language} to {output_language}."
  ),
  HumanMessagePromptTemplate.fromTemplate("{text}"),
]);
const chainB = new LLMChain({
  prompt: chatPrompt,
  llm: chat,
});

const resB = await chainB.invoke({
  input_language: "English",
  output_language: "Chinese",
  text: "I love programming.",
});
console.log({ resB });
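
LLMChain is the legacy chain interface; the same composition can also be expressed with the runnable piping API, which both prompt templates and chat models support. A minimal sketch, reusing chatPrompt and chat from above:

// Equivalent composition using the runnable piping API.
const pipedChain = chatPrompt.pipe(chat);

const pipedRes = await pipedChain.invoke({
  input_language: "English",
  output_language: "Chinese",
  text: "I love programming.",
});
console.log(pipedRes.content);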

With function calls

import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { HumanMessage } from "@langchain/core/messages";

const functionSchema = {
  name: "get_weather",
  description: " Get weather information.",
  parameters: {
    type: "object",
    properties: {
      location: {
        type: "string",
        description: " The location to get the weather",
      },
    },
    required: ["location"],
  },
};

// Bind function arguments to the model.
// All subsequent invoke calls will use the bound parameters.
// "functions.parameters" must be formatted as JSON Schema
const model = new ChatMinimax({
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).withConfig({
  functions: [functionSchema],
});

const result = await model.invoke([
  new HumanMessage({
    content: " What is the weather like in NewYork tomorrow?",
    name: "I",
  }),
]);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
  lc_namespace: [ 'langchain', 'schema' ],
  content: '',
  name: undefined,
  additional_kwargs: {
    function_call: { name: 'get_weather', arguments: '{"location": "NewYork"}' }
  }
}
*/
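
// The function arguments come back as a JSON string in
// additional_kwargs.function_call, so parse them before dispatching to your
// own implementation. (Sketch based on the output shown above.)
const functionCall = result.additional_kwargs.function_call;
if (functionCall) {
  const args = JSON.parse(functionCall.arguments);
  console.log(functionCall.name); // "get_weather"
  console.log(args.location); // "NewYork"
}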

// Alternatively, you can pass function call arguments as an additional argument as a one-off:

const minimax = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
});

const result2 = await minimax.invoke(
  [new HumanMessage("What is the weather like in NewYork tomorrow?")],
  {
    functions: [functionSchema],
  }
);
console.log(result2);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: { content: '', additional_kwargs: { function_call: [Object] } },
  lc_namespace: [ 'langchain', 'schema' ],
  content: '',
  name: undefined,
  additional_kwargs: {
    function_call: { name: 'get_weather', arguments: '{"location": "NewYork"}' }
  }
}
 */

Functions with Zod
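
Instead of writing the JSON Schema by hand, you can define the function parameters with a Zod schema and convert it to JSON Schema. A minimal sketch, assuming the zod and zod-to-json-schema packages are installed:

import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { HumanMessage } from "@langchain/core/messages";
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

// Describe the function parameters with Zod, then convert to JSON Schema.
const weatherSchema = z.object({
  location: z.string().describe("The location to get the weather"),
});

const zodModel = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).withConfig({
  functions: [
    {
      name: "get_weather",
      description: "Get weather information.",
      parameters: zodToJsonSchema(weatherSchema),
    },
  ],
});

const zodResult = await zodModel.invoke([
  new HumanMessage("What is the weather like in NewYork tomorrow?"),
]);
console.log(zodResult);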

With glyph

This feature helps users force the model to return content in a specified format.
import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
} from "@langchain/core/prompts";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).withConfig({
  replyConstraints: {
    sender_type: "BOT",
    sender_name: "MM Assistant",
    glyph: {
      type: "raw",
      raw_glyph: "The translated text:{{gen 'content'}}",
    },
  },
});

const messagesTemplate = ChatPromptTemplate.fromMessages([
  HumanMessagePromptTemplate.fromTemplate(
    " Please help me translate the following sentence in English: {text}"
  ),
]);

const messages = await messagesTemplate.formatMessages({ text: "我是谁" });
const result = await model.invoke(messages);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: {
    content: 'The translated text: Who am I\x02',
    additional_kwargs: { function_call: undefined }
  },
  lc_namespace: [ 'langchain', 'schema' ],
  content: 'The translated text: Who am I\x02',
  name: undefined,
  additional_kwargs: { function_call: undefined }
}
*/

// use json_value

const modelMinimax = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).withConfig({
  replyConstraints: {
    sender_type: "BOT",
    sender_name: "MM Assistant",
    glyph: {
      type: "json_value",
      json_properties: {
        name: {
          type: "string",
        },
        age: {
          type: "number",
        },
        is_student: {
          type: "boolean",
        },
        is_boy: {
          type: "boolean",
        },
        courses: {
          type: "object",
          properties: {
            name: {
              type: "string",
            },
            score: {
              type: "number",
            },
          },
        },
      },
    },
  },
});

const result2 = await modelMinimax.invoke([
  new HumanMessage({
    content:
      " My name is Yue Wushuang, 18 years old this year, just finished the test with 99.99 points.",
    name: "XiaoMing",
  }),
]);

console.log(result2);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: {
    content: '{\n' +
      '  "name": "Yue Wushuang",\n' +
      '  "is_student": true,\n' +
      '  "is_boy": false,\n' +
      '  "courses":   {\n' +
      '    "name": "Mathematics",\n' +
      '    "score": 99.99\n' +
      '   },\n' +
      '  "age": 18\n' +
      ' }',
    additional_kwargs: { function_call: undefined }
  }
}

 */
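
Since the json_value glyph returns the content as a JSON string, you can parse it back into a plain object. A minimal sketch based on the output above:

// Parse the JSON string returned by the json_value glyph.
const student = JSON.parse(result2.content as string);
console.log(student.name); // "Yue Wushuang"
console.log(student.courses.score); // 99.99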

With sample messages

This feature helps the model better understand the response the user wants, including but not limited to its content, format, and response pattern.
import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { AIMessage, HumanMessage } from "@langchain/core/messages";

const model = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).withConfig({
  sampleMessages: [
    new HumanMessage({
      content: " Turn A5 into red and modify the content to minimax.",
    }),
    new AIMessage({
      content: "select A5 color red change minimax",
    }),
  ],
});

const result = await model.invoke([
  new HumanMessage({
    content:
      ' Please reply to my content according to the following requirements: According to the following interface list, give the order and parameters of calling the interface for the content I gave. You just need to give the order and parameters of calling the interface, and do not give any other output. The following is the available interface list: select: select specific table position, input parameter use letters and numbers to determine, for example "B13"; color: dye the selected table position, input parameters use the English name of the color, for example "red"; change: modify the selected table position, input parameters use strings.',
  }),
  new HumanMessage({
    content: " Process B6 to gray and modify the content to question.",
  }),
]);

console.log(result);

With plugins

This feature supports calling tools such as a search engine to obtain additional data that can assist the model.
import { ChatMinimax } from "@langchain/community/chat_models/minimax";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatMinimax({
  model: "abab5.5-chat",
  botSetting: [
    {
      bot_name: "MM Assistant",
      content: "MM Assistant is an AI Assistant developed by minimax.",
    },
  ],
}).withConfig({
  plugins: ["plugin_web_search"],
});

const result = await model.invoke([
  new HumanMessage({
    content: " What is the weather like in NewYork tomorrow?",
  }),
]);

console.log(result);

/*
AIMessage {
  lc_serializable: true,
  lc_kwargs: {
    content: 'The weather in Shanghai tomorrow is expected to be hot. Please note that this is just a forecast and the actual weather conditions may vary.',
    additional_kwargs: { function_call: undefined }
  },
  lc_namespace: [ 'langchain', 'schema' ],
  content: 'The weather in Shanghai tomorrow is expected to be hot. Please note that this is just a forecast and the actual weather conditions may vary.',
  name: undefined,
  additional_kwargs: { function_call: undefined }
}
*/
