For detailed documentation of all MistralAIEmbeddings features and configuration options, head to the API reference.
Overview
Integration details
| Class | Package | Local | Py support | Downloads | Version |
|---|---|---|---|---|---|
| MistralAIEmbeddings | @langchain/mistralai | ❌ | ✅ | | |
Setup
To access MistralAI embedding models you'll need to create a MistralAI account, get an API key, and install the @langchain/mistralai integration package.
Credentials
Head to console.mistral.ai to sign up for MistralAI and generate an API key. Once you've done this, set the MISTRAL_API_KEY environment variable:
```bash
export MISTRAL_API_KEY="your-api-key"
```
If you want automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:
```bash
# export LANGSMITH_TRACING="true"
# export LANGSMITH_API_KEY="your-api-key"
```
Installation
The LangChain MistralAIEmbeddings integration lives in the @langchain/mistralai package:
```bash
npm install @langchain/mistralai @langchain/core
```
Instantiation
Now we can instantiate our model object and generate embeddings:
```typescript
import { MistralAIEmbeddings } from "@langchain/mistralai";

const embeddings = new MistralAIEmbeddings({
  model: "mistral-embed", // Default value
});
```
Indexing and Retrieval
Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data and for later retrieving it. For more detailed instructions, see the RAG tutorials under the Learn tab.

Below, we show how to index and retrieve data using the embeddings object we initialized above. In this example, we will index and retrieve a sample document using the demo MemoryVectorStore.
```typescript
// Create a vector store with a sample text
import { MemoryVectorStore } from "@langchain/classic/vectorstores/memory";

const text = "LangChain is the framework for building context-aware reasoning applications";

const vectorstore = await MemoryVectorStore.fromDocuments(
  [{ pageContent: text, metadata: {} }],
  embeddings
);

// Use the vector store as a retriever that returns a single document
const retriever = vectorstore.asRetriever(1);

// Retrieve the most similar text
const retrievedDocuments = await retriever.invoke("What is LangChain?");

retrievedDocuments[0].pageContent;
```
```text
LangChain is the framework for building context-aware reasoning applications
```
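To see why this query matched, note that the retriever ranks stored vectors by their closeness to the query vector; MemoryVectorStore uses cosine similarity by default. A minimal, self-contained sketch of that metric (independent of the embeddings library):

```typescript
// Cosine similarity between two equal-length vectors: dot(a, b) / (|a| * |b|).
// Higher values mean the vectors point in more similar directions.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("Vectors must have the same length");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Vectors with the same direction score 1, orthogonal vectors score 0.
console.log(cosineSimilarity([1, 0], [2, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```

A vector store applies this metric between the query embedding and every stored document embedding, returning the top-scoring documents.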
Direct Usage
Under the hood, the vector store and retriever implementations call embeddings.embedDocuments(...) and embeddings.embedQuery(...) to create embeddings for the text(s) used in fromDocuments and the retriever's invoke operations, respectively. You can call these methods directly to get embeddings for your own use cases.

Embed single text

You can embed queries for search with embedQuery. This generates a vector representation specific to the query:
```typescript
const singleVector = await embeddings.embedQuery(text);

console.log(singleVector.slice(0, 100));
```
```text
[
  -0.04443359375, 0.01885986328125, 0.018035888671875,
  -0.00864410400390625, 0.049652099609375, -0.001190185546875,
  0.028900146484375, -0.035675048828125, -0.00702667236328125,
  0.00016105175018310547, -0.027587890625, 0.029388427734375,
  -0.053253173828125, -0.0003020763397216797, -0.046112060546875,
  0.0258026123046875, -0.0010776519775390625, 0.02703857421875,
  0.040985107421875, -0.004547119140625, -0.020172119140625,
  -0.02606201171875, -0.01457977294921875, 0.01220703125,
  -0.0078582763671875, -0.0084228515625, -0.02056884765625,
  -0.071044921875, -0.0404052734375, 0.00923919677734375,
  0.01407623291015625, -0.0210113525390625, 0.0006284713745117188,
  -0.01465606689453125, 0.0186309814453125, -0.015838623046875,
  0.0007920265197753906, -0.04437255859375, 0.008758544921875,
  -0.0172119140625, 0.01312255859375, -0.01358795166015625,
  -0.0212860107421875, -0.000035822391510009766, -0.0226898193359375,
  -0.01390838623046875, -0.007659912109375, -0.016021728515625,
  0.025909423828125, -0.034515380859375, -0.0372314453125,
  0.020355224609375, -0.02606201171875, -0.0158843994140625,
  -0.037994384765625, 0.00450897216796875, 0.0142822265625,
  -0.012725830078125, -0.0770263671875, 0.02630615234375,
  -0.048614501953125, 0.006072998046875, 0.00417327880859375,
  -0.005138397216796875, 0.02557373046875, 0.0311279296875,
  0.026519775390625, -0.0103607177734375, -0.0108489990234375,
  -0.029510498046875, 0.022186279296875, 0.0256500244140625,
  -0.0186309814453125, 0.0443115234375, -0.0304107666015625,
  -0.03131103515625, 0.007427215576171875, 0.0234527587890625,
  0.0224761962890625, 0.00463104248046875, -0.0037021636962890625,
  0.0302581787109375, 0.0733642578125, -0.0121612548828125,
  -0.0172576904296875, 0.019317626953125, 0.029052734375,
  -0.0024871826171875, 0.0174713134765625, 0.026092529296875,
  0.04425048828125, -0.0004563331604003906, 0.0146026611328125,
  -0.00748443603515625, 0.06146240234375, 0.02294921875,
  -0.016845703125, -0.0014057159423828125, -0.01435089111328125,
  0.06097412109375
]
```
Embed multiple texts
You can embed multiple texts for indexing with embedDocuments. The internals used for this method may (but do not have to) differ from those used for embedding queries:
```typescript
const text2 = "LangGraph is a library for building stateful, multi-actor applications with LLMs";

const vectors = await embeddings.embedDocuments([text, text2]);

console.log(vectors[0].slice(0, 100));
console.log(vectors[1].slice(0, 100));
```
```text
[
  -0.04443359375, 0.01885986328125, 0.0180511474609375,
  -0.0086517333984375, 0.049652099609375, -0.00121307373046875,
  0.0289154052734375, -0.03570556640625, -0.007015228271484375,
  0.0001499652862548828, -0.0276641845703125, 0.0294036865234375,
  -0.05322265625, -0.0002808570861816406, -0.04608154296875,
  0.02581787109375, -0.0011072158813476562, 0.027099609375,
  0.040985107421875, -0.004547119140625, -0.0201873779296875,
  -0.0260772705078125, -0.0146026611328125, 0.0121917724609375,
  -0.007843017578125, -0.0084381103515625, -0.0205535888671875,
  -0.07110595703125, -0.04046630859375, 0.00931549072265625,
  0.01409912109375, -0.02099609375, 0.0006232261657714844,
  -0.014678955078125, 0.0186614990234375, -0.0158233642578125,
  0.000812530517578125, -0.04437255859375, 0.00873565673828125,
  -0.0172119140625, 0.013092041015625, -0.0135498046875,
  -0.0212860107421875, -0.000006735324859619141, -0.0226898193359375,
  -0.01389312744140625, -0.0076751708984375, -0.0160064697265625,
  0.0259246826171875, -0.0345458984375, -0.037200927734375,
  0.020355224609375, -0.0260009765625, -0.0159149169921875,
  -0.03802490234375, 0.004489898681640625, 0.0143280029296875,
  -0.01274871826171875, -0.07708740234375, 0.0263214111328125,
  -0.04864501953125, 0.00608062744140625, 0.004192352294921875,
  -0.005115509033203125, 0.0255889892578125, 0.0311279296875,
  0.0265045166015625, -0.0103607177734375, -0.01084136962890625,
  -0.0294952392578125, 0.022186279296875, 0.0256500244140625,
  -0.0186767578125, 0.044342041015625, -0.030426025390625,
  -0.03131103515625, 0.007396697998046875, 0.0234527587890625,
  0.0224609375, 0.004634857177734375, -0.003643035888671875,
  0.0302886962890625, 0.07342529296875, -0.01221466064453125,
  -0.017303466796875, 0.0193023681640625, 0.029052734375,
  -0.0024890899658203125, 0.0174407958984375, 0.026123046875,
  0.044219970703125, -0.0004944801330566406, 0.01462554931640625,
  -0.007450103759765625, 0.06146240234375, 0.022979736328125,
  -0.016845703125, -0.001445770263671875, -0.0143890380859375,
  0.06097412109375
]
[
  -0.02032470703125, 0.02606201171875, 0.051605224609375,
  -0.0281982421875, 0.055755615234375, 0.001987457275390625,
  0.031982421875, -0.0131378173828125, -0.0252685546875,
  0.001010894775390625, -0.024017333984375, 0.053375244140625,
  -0.042816162109375, 0.005584716796875, -0.04132080078125,
  0.03021240234375, 0.01324462890625, 0.016876220703125,
  0.041961669921875, -0.004299163818359375, -0.0273895263671875,
  -0.039642333984375, -0.021575927734375, 0.0309295654296875,
  -0.0099945068359375, -0.0163726806640625, -0.00968170166015625,
  -0.07733154296875, -0.030364990234375, -0.003864288330078125,
  0.016387939453125, -0.0389404296875, -0.0026702880859375,
  -0.0176544189453125, 0.0264434814453125, -0.01226806640625,
  -0.0022220611572265625, -0.039703369140625, -0.00907135009765625,
  -0.0260467529296875, 0.03155517578125, -0.0004324913024902344,
  -0.019500732421875, -0.0120697021484375, -0.008544921875,
  -0.01654052734375, 0.00067138671875, -0.0134735107421875,
  0.01080322265625, -0.034759521484375, -0.06201171875,
  0.012359619140625, -0.006237030029296875, -0.0168914794921875,
  -0.0183563232421875, 0.0236053466796875, -0.0021419525146484375,
  -0.0164947509765625, -0.052581787109375, 0.022125244140625,
  -0.045745849609375, -0.0009088516235351562, 0.0097808837890625,
  -0.0009326934814453125, 0.041656494140625, 0.0269775390625,
  0.016845703125, -0.0022335052490234375, -0.0182342529296875,
  -0.0245208740234375, 0.0036602020263671875, -0.0188751220703125,
  -0.0023956298828125, 0.0238800048828125, -0.034942626953125,
  -0.033782958984375, 0.0046234130859375, 0.0318603515625,
  0.0251007080078125, -0.0023288726806640625, -0.0225677490234375,
  0.0004394054412841797, 0.064208984375, -0.0254669189453125,
  -0.0234222412109375, 0.0009264945983886719, 0.01464080810546875,
  0.006626129150390625, -0.007450103759765625, 0.02642822265625,
  0.0260009765625, 0.00536346435546875, 0.01479339599609375,
  -0.0032253265380859375, 0.0498046875, 0.048248291015625,
  -0.01519012451171875, 0.00605010986328125, 0.019744873046875,
  0.0296478271484375
]
```
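One way the internals of document embedding commonly differ from query embedding is batching: a client may split a long list of texts into fixed-size chunks before issuing API requests. A minimal, self-contained sketch of that idea (the batch size is illustrative, not the library's documented internals):

```typescript
// Split an array of texts into fixed-size batches, as an embeddings client
// might do before sending them to the API. The batch size is hypothetical.
function toBatches<T>(items: T[], batchSize: number): T[][] {
  if (batchSize <= 0) throw new Error("batchSize must be positive");
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

const sampleTexts = ["a", "b", "c", "d", "e"];
console.log(toBatches(sampleTexts, 2)); // [["a", "b"], ["c", "d"], ["e"]]
```

Each batch would then be embedded in a single API call, and the resulting vectors concatenated back into one array in the original order.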
Hooks
Mistral AI supports custom hooks for three events: beforeRequest, requestError, and response. Examples of the function signature for each hook type can be seen below:
```typescript
const beforeRequestHook = (req: Request): Request | void | Promise<Request | void> => {
  // Code to run before a request is processed by Mistral
};

const requestErrorHook = (err: unknown, req: Request): void | Promise<void> => {
  // Code to run when an error occurs as Mistral is processing a request
};

const responseHook = (res: Response, req: Request): void | Promise<void> => {
  // Code to run before Mistral sends a successful response
};
```
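As a concrete illustration of the beforeRequest shape, a hook might return a copy of the request with an extra header attached. This sketch uses the standard Fetch API Request type (global in Node 18+); the header name is our own example, not something Mistral requires:

```typescript
// A beforeRequest hook that returns a new Request carrying an extra header.
// Returning a Request from the hook replaces the request that will be sent.
const addTraceHeaderHook = (req: Request): Request => {
  const headers = new Headers(req.headers);
  // "x-example-trace-id" is a hypothetical header for illustration only.
  headers.set("x-example-trace-id", Date.now().toString());
  return new Request(req, { headers });
};
```

Returning nothing (void) from a beforeRequest hook leaves the original request unchanged, which is useful for purely observational hooks such as logging.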
Hooks can be passed in when the model is instantiated:
```typescript
import { ChatMistralAI } from "@langchain/mistralai";

const modelWithHooks = new ChatMistralAI({
  model: "mistral-large-latest",
  temperature: 0,
  maxRetries: 2,
  beforeRequestHooks: [beforeRequestHook],
  requestErrorHooks: [requestErrorHook],
  responseHooks: [responseHook],
  // other params...
});
```
Alternatively, hooks can be assigned after instantiation; in that case, addAllHooksToHttpClient must be called to register them:
```typescript
import { ChatMistralAI } from "@langchain/mistralai";

const model = new ChatMistralAI({
  model: "mistral-large-latest",
  temperature: 0,
  maxRetries: 2,
  // other params...
});

model.beforeRequestHooks = [...model.beforeRequestHooks, beforeRequestHook];
model.requestErrorHooks = [...model.requestErrorHooks, requestErrorHook];
model.responseHooks = [...model.responseHooks, responseHook];

model.addAllHooksToHttpClient();
```
A single hook can be removed with removeHookFromHttpClient, and all hooks with removeAllHooksFromHttpClient:
```typescript
model.removeHookFromHttpClient(beforeRequestHook);

model.removeAllHooksFromHttpClient();
```
API reference
For detailed documentation of all MistralAIEmbeddings features and configuration options, head to the API reference.