LangSmith can capture traces generated by Semantic Kernel using OpenInference's OpenAI instrumentation. This guide shows you how to automatically capture traces from your Semantic Kernel application and send them to LangSmith for monitoring and analysis.

Installation

Install the required packages with your preferred package manager:
pip install langsmith semantic-kernel openinference-instrumentation-openai
LangSmith Python SDK version langsmith>=0.4.26 is required for the best OpenTelemetry support.
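
If you are unsure which version is installed, you can check it with pip, for example:
pip show langsmith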

Setup

1. Configure environment variables

Set your API keys and project name:
export LANGSMITH_API_KEY=<your_langsmith_api_key>
export LANGSMITH_PROJECT=<your_project_name>
export OPENAI_API_KEY=<your_openai_api_key>
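
If you prefer not to export these in your shell, you can keep them in a local env file and load it with python-dotenv, which is what the full example below does with .env.local. A minimal sketch of that file, using the same placeholder values; the commented model id line is only an assumption for setups where OpenAIChatCompletion() expects a default model from the environment:
LANGSMITH_API_KEY=<your_langsmith_api_key>
LANGSMITH_PROJECT=<your_project_name>
OPENAI_API_KEY=<your_openai_api_key>
# Depending on your semantic-kernel setup, a default model id may also be read from here,
# e.g. OPENAI_CHAT_MODEL_ID=gpt-4o-mini (assumption, not part of the original setup)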

2. Configure the OpenTelemetry integration

In your Semantic Kernel application, import and configure the LangSmith OpenTelemetry integration along with the OpenAI instrumentor:
from langsmith.integrations.otel import configure
from openinference.instrumentation.openai import OpenAIInstrumentor

# Configure LangSmith tracing
configure(project_name="semantic-kernel-demo")

# Instrument OpenAI calls
OpenAIInstrumentor().instrument()
You don't need to set any OpenTelemetry environment variables or manually configure exporters; configure() handles all of this automatically.

3. Create and run your Semantic Kernel application

Once configured, your Semantic Kernel application will automatically send traces to LangSmith. The example below is a minimal application that configures the kernel, defines two prompt-based functions, and invokes them to generate trace activity.
import os
import asyncio
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.prompt_template import InputVariable, PromptTemplateConfig
from openinference.instrumentation.openai import OpenAIInstrumentor
from langsmith.integrations.otel import configure
import dotenv

# Load environment variables
dotenv.load_dotenv(".env.local")

# Configure LangSmith tracing
configure(project_name="semantic-kernel-assistant")

# Instrument OpenAI calls
OpenAIInstrumentor().instrument()

# Configure Semantic Kernel
kernel = Kernel()
kernel.add_service(OpenAIChatCompletion())

# Create a code analysis prompt template
code_analysis_prompt = """
Analyze the following code and provide insights:

Code: {{$code}}

Please provide:
1. A brief summary of what the code does
2. Any potential improvements
3. Code quality assessment
"""

prompt_template_config = PromptTemplateConfig(
    template=code_analysis_prompt,
    name="code_analyzer",
    template_format="semantic-kernel",
    input_variables=[
        InputVariable(name="code", description="The code to analyze", is_required=True),
    ],
)

# Add the function to the kernel
code_analyzer = kernel.add_function(
    function_name="analyzeCode",
    plugin_name="codeAnalysisPlugin",
    prompt_template_config=prompt_template_config,
)

# Create a documentation generator
doc_prompt = """
Generate comprehensive documentation for the following function:

{{$function_code}}

Include:
- Purpose and functionality
- Parameters and return values
- Usage examples
- Any important notes
"""

doc_template_config = PromptTemplateConfig(
    template=doc_prompt,
    name="doc_generator",
    template_format="semantic-kernel",
    input_variables=[
        InputVariable(name="function_code", description="The function code to document", is_required=True),
    ],
)

doc_generator = kernel.add_function(
    function_name="generateDocs",
    plugin_name="documentationPlugin",
    prompt_template_config=doc_template_config,
)

async def main():
    # Example code to analyze
    sample_code = """
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)
    """

    # Analyze the code
    analysis_result = await kernel.invoke(code_analyzer, code=sample_code)
    print("Code Analysis:")
    print(analysis_result)
    print("\n" + "="*50 + "\n")

    # Generate documentation
    doc_result = await kernel.invoke(doc_generator, function_code=sample_code)
    print("Generated Documentation:")
    print(doc_result)

    return {"analysis": str(analysis_result), "documentation": str(doc_result)}

if __name__ == "__main__":
    asyncio.run(main())
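
With the environment variables from the setup step available (exported in your shell or loaded from .env.local), run the script to generate traces; assuming you saved it as app.py:
python app.py
Each kernel.invoke call goes through the OpenAI chat completion service, so the instrumentor captures it and sends the resulting trace to the LangSmith project you configured.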

Advanced usage

Custom metadata and tags

You can add custom metadata and tags to your traces by setting span attributes:
from opentelemetry import trace

# Get the current tracer
tracer = trace.get_tracer(__name__)

async def main():
    with tracer.start_as_current_span("semantic_kernel_workflow") as span:
        # Add custom metadata
        span.set_attribute("langsmith.metadata.workflow_type", "code_analysis")
        span.set_attribute("langsmith.metadata.user_id", "developer_123")
        span.set_attribute("langsmith.span.tags", "semantic-kernel,code-analysis")

        # Your Semantic Kernel code here
        result = await kernel.invoke(code_analyzer, code=sample_code)
        return result

Combining with other instrumentors

You can combine Semantic Kernel instrumentation with other instrumentors (for example, DSPy or AutoGen) by importing and initializing them alongside the OpenAI instrumentor:
from langsmith.integrations.otel import configure
from openinference.instrumentation.openai import OpenAIInstrumentor
from openinference.instrumentation.dspy import DSPyInstrumentor

# Configure LangSmith tracing
configure(project_name="multi-framework-app")

# Initialize multiple instrumentors
OpenAIInstrumentor().instrument()
DSPyInstrumentor().instrument()

# Your application code using multiple frameworks
