OCI Data Science is a fully managed, serverless platform for data science teams to build, train, and manage machine learning models in Oracle Cloud Infrastructure.
For the latest updates, examples, and experimental features, see the ADS LangChain Integration.
This notebook explains how to use LLMs hosted on OCI Data Science Model Deployment. For authentication, the oracle-ads library is used to automatically load the credentials required to invoke the endpoint.
!pip3 install oracle-ads

Prerequisites

Deploy model

You can easily deploy, fine-tune, and evaluate foundation models using AI Quick Actions on OCI Data Science Model Deployment. For additional deployment examples, visit the Oracle GitHub samples repository.

Policies

Make sure you have the required policies to access the OCI Data Science Model Deployment endpoint.
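
For illustration only, an OCI IAM policy statement granting access to model deployments typically takes the shape below; the group and compartment names are placeholders, and you should consult the OCI Data Science documentation for the exact statements your tenancy requires.

allow group <your-group> to use data-science-model-deployments in compartment <your-compartment>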

Setup

After deploying the model, you must set up the following required parameter for invocation:
  • endpoint: The model HTTP endpoint of the deployed model, e.g. https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict

Authentication

You can set authentication through either ads or environment variables. When you are working in an OCI Data Science notebook session, you can leverage the resource principal to access other OCI resources. See here for more options.
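
For example, when working from a local machine with an API key in the default OCI config file, you can set authentication through ads as sketched below (the profile name is an assumption; adjust it to your own configuration):

import ads

# Load API key credentials from ~/.oci/config
ads.set_auth(auth="api_key", oci_config_location="~/.oci/config", profile="DEFAULT")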

Examples

import ads
from langchain_community.llms import OCIModelDeploymentLLM

# Set authentication through ads
# Use resource principal when you are operating within an
# OCI service that has resource principal based
# authentication configured
ads.set_auth("resource_principal")

# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using the generic class as the entry point, you can
# pass model parameters through model_kwargs during
# instantiation.
llm = OCIModelDeploymentLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    model="odsc-llm",
)

# Run the LLM
llm.invoke("Who is the first president of United States?")
import ads
from langchain_community.llms import OCIModelDeploymentVLLM

# Set authentication through ads
# Use resource principal when you are operating within an
# OCI service that has resource principal based
# authentication configured
ads.set_auth("resource_principal")

# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using the framework-specific class as the entry point, you
# can pass model parameters in the constructor.
llm = OCIModelDeploymentVLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
)

# Run the LLM
llm.invoke("Who is the first president of United States?")
import os

from langchain_community.llms import OCIModelDeploymentTGI

# Set authentication through environment variables
# Use API Key setup when you are working from a local
# workstation or on a platform which does not support
# resource principals.
os.environ["OCI_IAM_TYPE"] = "api_key"
os.environ["OCI_CONFIG_PROFILE"] = "default"
os.environ["OCI_CONFIG_LOCATION"] = "~/.oci"

# Set endpoint through environment variables
# Replace the endpoint uri with your own
os.environ["OCI_LLM_ENDPOINT"] = (
    "https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict"
)

# Create an instance of OCI Model Deployment Endpoint
# Using the framework-specific class as the entry point, you
# can pass model parameters in the constructor.
llm = OCIModelDeploymentTGI()

# Run the LLM
llm.invoke("Who is the first president of United States?")

Asynchronous calls

await llm.ainvoke("Tell me a joke.")
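
The top-level await above works inside a notebook. In a plain Python script, wrap the call in asyncio.run, for example:

import asyncio


async def main():
    # ainvoke is the asynchronous counterpart of invoke
    print(await llm.ainvoke("Tell me a joke."))


asyncio.run(main())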

Streaming calls

for chunk in llm.stream("Tell me a joke."):
    print(chunk, end="", flush=True)
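
Streaming also has an asynchronous counterpart, astream; inside a notebook you can consume it with a top-level async for:

async for chunk in llm.astream("Tell me a joke."):
    print(chunk, end="", flush=True)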

API reference

For detailed information on all features and configurations, refer to the API reference documentation for each class.