Snowflake Cortex
Snowflake Cortex gives you instant access to industry-leading large language models (LLMs) trained by researchers at companies such as Mistral, Reka, Meta, and Google, including Snowflake Arctic, an open, enterprise-grade model developed by Snowflake. This example goes over how to use LangChain to interact with Snowflake Cortex.

Installation and setup

We start by installing the `snowflake-snowpark-python` library, along with `langchain-community`, which provides the `ChatSnowflakeCortex` wrapper. Then we configure our credentials for connecting to Snowflake, either as environment variables or by passing them in directly.
pip install -qU snowflake-snowpark-python langchain-community
import getpass
import os

# First, set the environment variables used to connect to Snowflake.
# Alternatively, you can pass these credentials while instantiating the model.

if os.environ.get("SNOWFLAKE_ACCOUNT") is None:
    os.environ["SNOWFLAKE_ACCOUNT"] = getpass.getpass("Account: ")

if os.environ.get("SNOWFLAKE_USERNAME") is None:
    os.environ["SNOWFLAKE_USERNAME"] = getpass.getpass("Username: ")

if os.environ.get("SNOWFLAKE_PASSWORD") is None:
    os.environ["SNOWFLAKE_PASSWORD"] = getpass.getpass("Password: ")

if os.environ.get("SNOWFLAKE_DATABASE") is None:
    os.environ["SNOWFLAKE_DATABASE"] = getpass.getpass("Database: ")

if os.environ.get("SNOWFLAKE_SCHEMA") is None:
    os.environ["SNOWFLAKE_SCHEMA"] = getpass.getpass("Schema: ")

if os.environ.get("SNOWFLAKE_WAREHOUSE") is None:
    os.environ["SNOWFLAKE_WAREHOUSE"] = getpass.getpass("Warehouse: ")

if os.environ.get("SNOWFLAKE_ROLE") is None:
    os.environ["SNOWFLAKE_ROLE"] = getpass.getpass("Role: ")
from langchain_community.chat_models import ChatSnowflakeCortex
from langchain_core.messages import HumanMessage, SystemMessage

# By default, this uses the Cortex-provided model `mistral-large` with the function `complete`
chat = ChatSnowflakeCortex()
The cell above assumes your Snowflake credentials are set in your environment variables. If you would rather specify them manually, use the following code:
chat = ChatSnowflakeCortex(
    # Change the default cortex model and function
    model="mistral-large",
    cortex_function="complete",

    # Change the default generation parameters
    temperature=0,
    max_tokens=10,
    top_p=0.95,

    # Specify your Snowflake Credentials
    account="YOUR_SNOWFLAKE_ACCOUNT",
    username="YOUR_SNOWFLAKE_USERNAME",
    password="YOUR_SNOWFLAKE_PASSWORD",
    database="YOUR_SNOWFLAKE_DATABASE",
    schema="YOUR_SNOWFLAKE_SCHEMA",
    role="YOUR_SNOWFLAKE_ROLE",
    warehouse="YOUR_SNOWFLAKE_WAREHOUSE"
)
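
To sanity-check your credentials, or to see roughly what the `cortex_function="complete"` setting maps to, you can call the Cortex `COMPLETE` SQL function directly through a Snowpark session. This is an illustrative sketch, not the exact query the wrapper issues; it assumes the `SNOWFLAKE_*` environment variables set above.

import os
from snowflake.snowpark import Session

# Illustrative: call SNOWFLAKE.CORTEX.COMPLETE directly via Snowpark,
# using the environment variables configured earlier
connection_parameters = {
    "account": os.environ["SNOWFLAKE_ACCOUNT"],
    "user": os.environ["SNOWFLAKE_USERNAME"],
    "password": os.environ["SNOWFLAKE_PASSWORD"],
    "database": os.environ["SNOWFLAKE_DATABASE"],
    "schema": os.environ["SNOWFLAKE_SCHEMA"],
    "warehouse": os.environ["SNOWFLAKE_WAREHOUSE"],
    "role": os.environ["SNOWFLAKE_ROLE"],
}
session = Session.builder.configs(connection_parameters).create()

# Single-prompt form of COMPLETE: model name plus a prompt string
row = session.sql(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', 'What are large language models?') AS response"
).collect()[0]
print(row["RESPONSE"])
session.close()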

Calling the chat model

We can now call the chat model using the `invoke` or `stream` methods.

messages = [
    SystemMessage(content="You are a friendly assistant."),
    HumanMessage(content="What are large language models?"),
]
chat.invoke(messages)
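
Since `ChatSnowflakeCortex` is a standard LangChain chat model, it also composes with the usual runnable interface. Here is a minimal sketch, assuming the `chat` instance from above, that places it behind a prompt template:

from langchain_core.prompts import ChatPromptTemplate

# Minimal LCEL sketch: pipe a prompt template into the Cortex chat model
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a friendly assistant."),
        ("human", "{question}"),
    ]
)
chain = prompt | chat
response = chain.invoke({"question": "What are large language models?"})
print(response.content)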

Streaming

# Sample input prompt
messages = [
    SystemMessage(content="You are a friendly assistant."),
    HumanMessage(content="What are large language models?"),
]

# Invoke the public stream method and print each chunk as it arrives
print("Stream Method Response:")
for chunk in chat.stream(messages):
    print(chunk.content, end="")
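
Chunks yielded by `stream` can also be accumulated into a single message; this uses standard LangChain chunk addition from the base chat model interface, nothing Cortex-specific:

# Accumulate streamed chunks into one full message
full = None
for chunk in chat.stream(messages):
    full = chunk if full is None else full + chunk
print(full.content)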
