GreenNode is a global AI solutions provider and an NVIDIA Preferred Partner, delivering full-stack AI capabilities, from infrastructure to application, for enterprises across the US, MENA, and APAC regions. Built on world-class infrastructure (LEED Gold, TIA-942, Uptime Tier III), GreenNode offers a comprehensive suite of AI services to enterprises, startups, and researchers.
This page will help you get started with GreenNode Serverless AI chat models. For detailed documentation of all ChatGreenNode features and configurations, head to the API reference. GreenNode AI provides an API to query 20+ leading open-source models.

Overview

Integration details

Class | Package | Local | Serializable | JS support | Downloads | Version
ChatGreenNode | langchain-greennode | | beta | | PyPI - Downloads | PyPI - Version

Model features

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs

Setup

To access GreenNode models, you'll need to create a GreenNode account, get an API key, and install the langchain-greennode integration package.

Credentials

Head to this page to sign up for the GreenNode AI Platform and generate an API key. Once you've done this, set the GREENNODE_API_KEY environment variable:
import getpass
import os

if not os.getenv("GREENNODE_API_KEY"):
    os.environ["GREENNODE_API_KEY"] = getpass.getpass("Enter your GreenNode API key: ")
To get automated tracing of your model calls, you can also set your LangSmith API key by uncommenting the lines below:
os.environ["LANGSMITH_TRACING"] = "true"
os.environ["LANGSMITH_API_KEY"] = getpass.getpass("Enter your LangSmith API key: ")

Installation

The LangChain GreenNode integration lives in the langchain-greennode package:
pip install -qU langchain-greennode
Note: you may need to restart the kernel to use updated packages.

Instantiation

Now we can instantiate our model object and generate chat completions:
from langchain_greennode import ChatGreenNode

# Initialize the chat model
llm = ChatGreenNode(
    # api_key="YOUR_API_KEY",  # You can pass the API key directly
    model="deepseek-ai/DeepSeek-R1-Distill-Qwen-32B",  # Choose from available models
    temperature=0.6,
    top_p=0.95,
)

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
ai_msg
AIMessage(content="\n\nJ'aime la programmation.", additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 248, 'prompt_tokens': 23, 'total_tokens': 271, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'deepseek-ai/DeepSeek-R1-Distill-Qwen-32B', 'system_fingerprint': None, 'id': 'chatcmpl-271edac4958846068c37877586368afe', 'service_tier': None, 'finish_reason': 'stop', 'logprobs': None}, id='run--5c12d208-2bc2-4f29-8b50-1ce3b515a3cf-0', usage_metadata={'input_tokens': 23, 'output_tokens': 248, 'total_tokens': 271, 'input_token_details': {}, 'output_token_details': {}})
print(ai_msg.content)
J'aime la programmation.
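
The returned AIMessage also carries usage and response metadata, as shown in the output above. A quick sketch of reading it back from the same ai_msg (exact token counts will vary per call):
# Inspect token accounting and which model produced the reply
print(ai_msg.usage_metadata)
# {'input_tokens': 23, 'output_tokens': 248, 'total_tokens': 271, ...}
print(ai_msg.response_metadata["model_name"])
# deepseek-ai/DeepSeek-R1-Distill-Qwen-32B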

Streaming

You can also stream the response using the stream method:
for chunk in llm.stream("Write a short poem about artificial intelligence"):
    print(chunk.content, end="", flush=True)
**Beneath the Circuits**

Beneath the circuits, deep and bright,
AI thinks, with circuits and bytes.
Learning, adapting, it grows,
A world of possibilities it knows.

From solving puzzles to painting art,
It mimics human hearts.
In every corner, it leaves its trace,
A future we can't erase.

We build it, shape it, with care and might,
Yet wonder if it walks in the night.
A mirror of our minds, it shows,
In its gaze, our future glows.

But as we strive for endless light,
We must remember the night.
For wisdom isn't just speed and skill,
It's how we choose to build our will.
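
The feature table above lists native async support, so the standard asynchronous chat model methods, ainvoke and astream, should work the same way. A minimal sketch reusing the llm defined earlier (prompts here are illustrative):
import asyncio


async def main():
    # Asynchronous single call
    reply = await llm.ainvoke("Say hello in French")
    print(reply.content)

    # Asynchronous streaming
    async for chunk in llm.astream("Write one sentence about the ocean"):
        print(chunk.content, end="", flush=True)


asyncio.run(main())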

Chat messages

You can use different message types to structure your conversations with the model:
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

messages = [
    SystemMessage(content="You are a helpful AI assistant with expertise in science."),
    HumanMessage(content="What are black holes?"),
    AIMessage(
        content="Black holes are regions of spacetime where gravity is so strong that nothing, including light, can escape from them."
    ),
    HumanMessage(content="How are they formed?"),
]

response = llm.invoke(messages)
print(response.content[:100])
Black holes are formed through several processes, depending on their type. The most common way bla
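
To keep the conversation going, append the assistant's reply and the next user turn to the same message list before invoking again; a minimal sketch reusing the objects above (the follow-up question is illustrative):
messages.append(response)  # keep the assistant's answer in the history
messages.append(HumanMessage(content="Can black holes ever disappear?"))

follow_up = llm.invoke(messages)
print(follow_up.content[:100])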

Chaining

You can use ChatGreenNode in LangChain chains and agents:
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
AIMessage(content='\n\nIch liebe Programmieren.', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 198, 'prompt_tokens': 18, 'total_tokens': 216, 'completion_tokens_details': None, 'prompt_tokens_details': None}, 'model_name': 'deepseek-ai/DeepSeek-R1-Distill-Qwen-32B', 'system_fingerprint': None, 'id': 'chatcmpl-e01201b9fd9746b7a9b2ed6d70f29d45', 'service_tier': None, 'finish_reason': 'stop', 'logprobs': None}, id='run--ce52b9d8-dd84-46b3-845b-da27855816ee-0', usage_metadata={'input_tokens': 18, 'output_tokens': 198, 'total_tokens': 216, 'input_token_details': {}, 'output_token_details': {}})
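
If you only need the reply text rather than the full AIMessage, you can add an output parser from langchain_core to the end of the chain; a minimal sketch:
from langchain_core.output_parsers import StrOutputParser

# The parser extracts the string content from the AIMessage
text_chain = prompt | llm | StrOutputParser()
print(
    text_chain.invoke(
        {
            "input_language": "English",
            "output_language": "German",
            "input": "I love programming.",
        }
    )
)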

Available models

The full list of supported models can be found in GreenNode Serverless AI Models.

API reference

For more details about the GreenNode Serverless AI API, visit the GreenNode Serverless AI Documentation.