Oracle Cloud Infrastructure (OCI) Generative AI is a fully managed service that provides a set of state-of-the-art, customizable large language models (LLMs) covering a wide range of use cases, all available through a single API. With the OCI Generative AI service, you can access ready-to-use pretrained models, or create and host your own fine-tuned custom models on dedicated AI clusters using your own data. Detailed documentation for the service and the API is available here and here. This notebook explains how to use OCI's Generative AI models with LangChain.
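Before running the examples below, the LangChain community integrations and the OCI Python SDK need to be installed. The package names in this sketch are the commonly used ones and may differ in your environment.

```python
# Install the LangChain community integrations and the OCI Python SDK
# (package names assumed; adjust to your environment).
%pip install -U langchain-community oci
```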
```python
from langchain_community.embeddings import OCIGenAIEmbeddings

# Use the default authentication method (API key)
embeddings = OCIGenAIEmbeddings(
    model_id="MY_EMBEDDING_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)

query = "This is a query in English."
response = embeddings.embed_query(query)
print(response)

documents = ["This is a sample document", "and here is another one"]
response = embeddings.embed_documents(documents)
print(response)
```
```python
# Use a session token for authentication
embeddings = OCIGenAIEmbeddings(
    model_id="MY_EMBEDDING_MODEL",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    auth_type="SECURITY_TOKEN",
    auth_profile="MY_PROFILE",  # replace with your profile name
    auth_file_location="MY_CONFIG_FILE_LOCATION",  # replace with the location of the config file containing the profile
)

query = "This is a sample query"
response = embeddings.embed_query(query)
print(response)

documents = ["This is a sample document", "and here is another one"]
response = embeddings.embed_documents(documents)
print(response)
```
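Once configured, the embeddings object can be passed to any LangChain component that accepts an `Embeddings` implementation, such as a vector store. The sketch below is a minimal example assuming the FAISS vector store (`faiss-cpu`) is installed and API-key authentication as in the first example; `MY_EMBEDDING_MODEL` and `MY_OCID` are placeholders as above.

```python
from langchain_community.embeddings import OCIGenAIEmbeddings
from langchain_community.vectorstores import FAISS

# Assumes the default API-key authentication as in the first example.
embeddings = OCIGenAIEmbeddings(
    model_id="MY_EMBEDDING_MODEL",  # placeholder
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",  # placeholder
)

# Build a small in-memory FAISS index from a few sample texts.
texts = [
    "OCI Generative AI offers hosted embedding models.",
    "LangChain vector stores accept any Embeddings implementation.",
    "Fine-tuned models can be hosted on dedicated AI clusters.",
]
vectorstore = FAISS.from_texts(texts, embedding=embeddings)

# Retrieve the most similar documents for a query.
docs = vectorstore.similarity_search("Which service hosts embedding models?", k=2)
for doc in docs:
    print(doc.page_content)
```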