You can change the destination project for your traces both statically through environment variables and dynamically at runtime.

Set the destination project statically

As described in the tracing concepts section, LangSmith uses the concept of a Project to group traces. If left unspecified, the project is set to default. You can set the LANGSMITH_PROJECT environment variable to configure a custom project name for an entire application run. This should be done before executing your application.
export LANGSMITH_PROJECT=my-custom-project
The LANGSMITH_PROJECT flag is only supported in JS SDK versions >= 0.2.16; if you are using an older version, use LANGCHAIN_PROJECT instead.
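For example, with an older JS SDK you would export the legacy variable instead (setting both is a reasonable way to cover a mixed setup; this is a sketch, not an additional requirement):

export LANGCHAIN_PROJECT=my-custom-project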
If the specified project does not exist, it will be created automatically when the first trace is ingested.
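If you prefer to configure this from inside the process rather than the shell, a minimal sketch is to set the variable via os.environ before any traced code or LangSmith client is created. The variable name comes from this page; setting it as early as possible is an assumption to ensure the SDK picks it up.

import os

# Must run before any @traceable function or LangSmith client is constructed,
# since the SDK reads LANGSMITH_PROJECT from the environment.
os.environ["LANGSMITH_PROJECT"] = "my-custom-project"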

Set the destination project dynamically

You can also set the project name at runtime in several ways, depending on how you annotate your code for tracing. This is useful when you want to log traces to different projects within the same application.
Setting the project name dynamically using one of the methods below overrides the project name set by the LANGSMITH_PROJECT environment variable.
import openai
from langsmith import traceable
from langsmith.run_trees import RunTree

client = openai.Client()
messages = [
  {"role": "system", "content": "You are a helpful assistant."},
  {"role": "user", "content": "Hello!"}
]

# Use the @traceable decorator with the 'project_name' parameter to log traces to LangSmith
# Ensure that the LANGSMITH_TRACING environment variable is set for @traceable to work
@traceable(
  run_type="llm",
  name="OpenAI Call Decorator",
  project_name="My Project"
)
def call_openai(
  messages: list[dict], model: str = "gpt-4o-mini"
) -> str:
  return client.chat.completions.create(
      model=model,
      messages=messages,
  ).choices[0].message.content

# Call the decorated function
call_openai(messages)

# You can also specify the project at call time via the project_name key in langsmith_extra
# This overrides the project_name specified in the @traceable decorator
call_openai(
  messages,
  langsmith_extra={"project_name": "My Overridden Project"},
)

# The wrapped OpenAI client accepts all the same langsmith_extra parameters
# as @traceable decorated functions, and logs traces to LangSmith automatically.
# Ensure that the LANGSMITH_TRACING environment variable is set for the wrapper to work.
from langsmith import wrappers
wrapped_client = wrappers.wrap_openai(client)
wrapped_client.chat.completions.create(
  model="gpt-4o-mini",
  messages=messages,
  langsmith_extra={"project_name": "My Project"},
)

# Alternatively, create a RunTree object
# You can set the project name using the project_name parameter
rt = RunTree(
  run_type="llm",
  name="OpenAI Call RunTree",
  inputs={"messages": messages},
  project_name="My Project"
)
chat_completion = client.chat.completions.create(
  model="gpt-4o-mini",
  messages=messages,
)
# End and submit the run
rt.end(outputs=chat_completion)
rt.post()
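Because langsmith_extra is applied per call, a single application can route different traces to different projects at runtime. A minimal sketch reusing the call_openai function defined above (the project names here are placeholders):

# Route individual calls from the same process to different projects.
call_openai(messages)  # logged to "My Project" (from the decorator)
call_openai(messages, langsmith_extra={"project_name": "experiments"})
call_openai(messages, langsmith_extra={"project_name": "production-debugging"})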
