Installation
Quick Start
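The original quick-start code did not survive extraction. The sketch below illustrates the general shape of such an integration; the `TelemetryClient` and `AgentsTraceProcessor` names and their methods are placeholders invented for this sketch, not the library's actual API (with the real OpenAI Agents SDK you would typically register a processor through its tracing hooks, e.g. `agents.add_trace_processor` in Python).

```python
# Hypothetical quick-start sketch: the client and processor names below are
# placeholders for this illustration, not the integration's real API.
from dataclasses import dataclass, field


@dataclass
class TelemetryClient:
    """Stand-in for the integration's client; buffers captured events."""
    events: list = field(default_factory=list)

    def capture(self, event_name: str, properties: dict) -> None:
        self.events.append((event_name, properties))


class AgentsTraceProcessor:
    """Sketch of a processor the integration would register with the
    Agents SDK's tracing system."""

    def __init__(self, client: TelemetryClient) -> None:
        self.client = client

    def on_trace_start(self, workflow_name: str) -> None:
        # Trace-level event: workflow name and metadata.
        self.client.capture("agent_run", {"workflow": workflow_name})

    def on_generation(self, model: str, input_tokens: int, output_tokens: int) -> None:
        # LLM generation event: model and token usage.
        self.client.capture("generation", {
            "model": model,
            "input_tokens": input_tokens,
            "output_tokens": output_tokens,
        })


client = TelemetryClient()
processor = AgentsTraceProcessor(client)

# Simulate what a real agent run would emit through the tracing system.
processor.on_trace_start("support-triage")
processor.on_generation("gpt-4o-mini", input_tokens=42, output_tokens=7)
print(client.events)
```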
What Gets Traced
The integration captures events from the OpenAI Agents SDK’s built-in tracing system:

- Agent runs — trace-level events with workflow name and metadata
- LLM generations — model, input messages, output, token usage (input_tokens/output_tokens)
- Tool/function calls — function name, input arguments, output (TypeScript only)
- Handoffs — from/to agent names for multi-agent workflows (TypeScript only)
- Errors — captured with error status on spans (TypeScript only)
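The event categories above can be pictured as a simple dispatch over span types. The `kind` and `error` field names in this sketch are assumptions for illustration, not the Agents SDK's actual span schema:

```python
def categorize_span(span: dict) -> str:
    """Map a span-like dict onto the event categories listed above.
    The 'kind'/'error' field names are illustrative assumptions, not
    the Agents SDK's real span schema."""
    if span.get("error"):
        return "error"            # captured with error status (TypeScript only)
    kind = span.get("kind")
    if kind == "trace":
        return "agent_run"        # workflow name and metadata
    if kind == "generation":
        return "llm_generation"   # model, messages, token usage
    if kind == "function":
        return "tool_call"        # name, arguments, output (TypeScript only)
    if kind == "handoff":
        return "handoff"          # from/to agent names (TypeScript only)
    return "unknown"


print(categorize_span({"kind": "generation"}))
```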
Configuration
Multi-Agent Workflows
The integration automatically captures handoffs between agents.
Flushing and Shutdown
Always call flush() before your process exits to ensure all telemetry is shipped.
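A common pattern is to register the flush in an atexit hook as a safety net, while still calling flush() explicitly at the end of main(). The client below is a stub written for this sketch; in the real integration, flush() would block until the event buffer has been sent over the network:

```python
import atexit


class StubTelemetryClient:
    """Stand-in client for this sketch: buffers events in memory and
    drains the buffer on flush()."""

    def __init__(self) -> None:
        self.buffer = ["event-1", "event-2"]
        self.shipped = []

    def flush(self) -> None:
        # The real integration would block here until all queued
        # telemetry has been delivered.
        self.shipped.extend(self.buffer)
        self.buffer.clear()


client = StubTelemetryClient()

# Safety net: flush even if the process exits without reaching the
# explicit flush() call below.
atexit.register(client.flush)

# ... run agents ...

client.flush()  # explicit flush before exit is still the safest option
print(client.shipped)
```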
Known Limitations (Python)
- Tool calls and handoffs are not captured in the Python SDK. Only LLM generation spans (model, input, output, tokens) are tracked. The TypeScript SDK captures full span trees including tools and handoffs.
- Multi-response traces: In multi-agent workflows with handoffs, only the last response’s data survives in the single tracked AI call.
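This limitation behaves like a last-write-wins fold: when several responses from a handoff chain are collapsed into one tracked call, each response overwrites the previous one's fields. The dict shapes below are illustrative assumptions, not the integration's actual data model:

```python
def fold_responses(responses: list[dict]) -> dict:
    """Last-write-wins fold: each response overwrites the previous
    one's fields, so only the final response's data survives in the
    single tracked AI call (the Python limitation described above)."""
    call: dict = {}
    for response in responses:
        call.update(response)
    return call


# Two responses from a handoff chain; only the second survives the fold.
responses = [
    {"agent": "triage", "output": "handing off", "output_tokens": 12},
    {"agent": "billing", "output": "refund issued", "output_tokens": 30},
]
print(fold_responses(responses))
```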