Quick Start
Option 1: LangSmith (for LangChain users)
1. Sign up
Visit smith.langchain.com, register, and create a project.
2. Configure environment variables
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY="your-api-key"
export LANGCHAIN_PROJECT="my-project"
3. Run a LangChain application
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage
llm = ChatOpenAI(model="gpt-4")
response = llm.invoke([HumanMessage(content="Hello")])
Open the LangSmith console and the trace will appear there.
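A common failure mode is tracing silently staying off because a variable from step 2 is missing. A minimal sketch that checks the three variables exported above before running the application:

```python
import os

# The variables exported in step 2; LangSmith tracing stays off if any is unset.
REQUIRED_VARS = ["LANGCHAIN_TRACING_V2", "LANGCHAIN_API_KEY", "LANGCHAIN_PROJECT"]

def check_langsmith_env() -> list:
    """Return the names of required LangSmith variables that are not set."""
    return [name for name in REQUIRED_VARS if not os.environ.get(name)]

missing = check_langsmith_env()
if missing:
    print("Missing environment variables:", ", ".join(missing))
else:
    print("LangSmith environment looks complete.")
```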
Option 2: Structured logging (framework-agnostic)
Minimal implementation
import time
import json
import uuid

def traced_llm_call(prompt: str, model: str):
    trace_id = str(uuid.uuid4())
    start = time.perf_counter()
    # actual LLM call goes here...
    response = "simulated response"
    duration = time.perf_counter() - start
    log = {
        "trace_id": trace_id,
        "event": "llm_call",
        "model": model,
        "prompt_len": len(prompt),
        "response_len": len(response),
        "duration_ms": round(duration * 1000, 2),
    }
    print(json.dumps(log, ensure_ascii=False))
    return response
Ship these log lines to ELK, Loki, or a similar system to get querying and aggregation.
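Once the JSON lines land in a log store, useful aggregations fall out directly. A minimal sketch of the kind of query ELK or Loki would run, done here in plain Python over a few collected lines (the sample records are made up for illustration):

```python
import json
import statistics

# Sample JSON log lines in the format emitted by traced_llm_call above.
log_lines = [
    '{"trace_id": "a1", "event": "llm_call", "model": "gpt-4", "duration_ms": 820.5}',
    '{"trace_id": "b2", "event": "llm_call", "model": "gpt-4", "duration_ms": 1130.0}',
    '{"trace_id": "c3", "event": "llm_call", "model": "gpt-4", "duration_ms": 905.2}',
]

records = [json.loads(line) for line in log_lines]
durations = [r["duration_ms"] for r in records if r["event"] == "llm_call"]

print("calls:", len(durations))
print("mean_ms:", round(statistics.mean(durations), 1))
print("max_ms:", max(durations))
```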
Option 3: OpenTelemetry
Installation
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp
Basic setup
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, BatchSpanProcessor

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("my-llm-app", "1.0")

with tracer.start_as_current_span("llm_call") as span:
    span.set_attribute("model", "gpt-4")
    span.set_attribute("prompt_tokens", 100)
    # actual LLM call goes here...
    span.set_attribute("completion_tokens", 50)

Spans can be exported to Jaeger, Zipkin, or any OTLP-compatible backend.
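To send spans to a real backend instead of the console, swap the exporter. A configuration sketch assuming the OTLP gRPC exporter package from the install step is present and a collector (or Jaeger with OTLP enabled) listens on localhost:4317 — the endpoint is an assumption, so adjust it for your deployment:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
# Provided by: pip install opentelemetry-exporter-otlp
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Assumed endpoint: the default OTLP gRPC port of a local collector.
exporter = OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
```

Everything else — the tracer, the span, the attributes — stays exactly as in the basic setup above.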
Verification checklist
- Traces/logs include trace_id, model, and duration
- You can query by trace_id in the console or the log system
- Basic aggregations for latency, token counts, etc. are available (if monitoring is wired up)
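The first checklist item can be automated. A minimal sketch that validates a JSON log line against the required fields (the field names match the structured-logging example above; the sample line is illustrative):

```python
import json

# Fields every trace record should carry, per the checklist.
REQUIRED_FIELDS = ["trace_id", "model", "duration_ms"]

def validate_log_line(line: str) -> list:
    """Return the required fields missing from one JSON log line."""
    record = json.loads(line)
    return [f for f in REQUIRED_FIELDS if f not in record]

sample = '{"trace_id": "a1", "event": "llm_call", "model": "gpt-4", "duration_ms": 820.5}'
print("missing fields:", validate_log_line(sample))
```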